Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Schultz, Marc R.
2012-01-01
Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
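To make the transverse-shear point concrete, the sketch below compares a shear-rigid Euler-type buckling load with a first-order shear-corrected estimate, P_cr = P_E / (1 + P_E/(kGA)); all section properties and dimensions are hypothetical, not taken from the test article.

```python
# Minimal sketch: effect of transverse-shear flexibility on a column-type
# buckling load. All section properties below are hypothetical.
import math

E = 70e9      # axial modulus, Pa (assumed)
G = 5e9       # transverse-shear modulus, Pa (assumed)
I = 2.0e-4    # second moment of area, m^4 (assumed)
A = 1.5e-2    # effective shear area, m^2 (assumed)
k = 0.5       # shear correction factor (assumed)
L = 3.0       # effective length, m (assumed)

P_euler = math.pi**2 * E * I / L**2                 # shear-rigid prediction
P_shear = P_euler / (1.0 + P_euler / (k * G * A))   # first-order shear correction

print(f"Euler (no shear):      {P_euler/1e6:.2f} MN")
print(f"Shear-corrected:       {P_shear/1e6:.2f} MN")
print(f"Overprediction factor: {P_euler/P_shear:.2f}")
```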
Interdisciplinary Analysis and Global Policy Studies.
ERIC Educational Resources Information Center
Meeks, Philip
This paper examines ways in which interdisciplinary and multidisciplinary analysis of global policy studies can increase understanding of complex global problems. Until recently, social science has been the discipline most often turned to for techniques and methodology to analyze social problems and behaviors. However, because social science…
Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf
2018-01-01
Objective: As multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Global data segments of 1 and 2 min and local data segments of 12, 20, and 30 s were used in the MTRS analysis of BRS. Results: All the BRS values calculated on the global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis, especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.
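As an illustration of the spectral approach to BRS, here is a minimal sketch of the widely used alpha index (the MTRS algorithm itself is not reproduced): the square root of the ratio of RR-interval to systolic-pressure spectral power in the low-frequency band, computed with Welch periodograms. The sampling rate, band limits, and segment length are assumptions.

```python
# Sketch of a spectral baroreflex-sensitivity (BRS) estimate via the
# common "alpha index" (a stand-in, not the MTRS algorithm).
import numpy as np
from scipy.signal import welch

def alpha_brs(rr_ms, sbp_mmhg, fs=4.0, band=(0.04, 0.15)):
    """rr_ms, sbp_mmhg: beat series resampled to fs Hz (assumed)."""
    f, p_rr = welch(rr_ms - np.mean(rr_ms), fs=fs, nperseg=256)
    _, p_sbp = welch(sbp_mmhg - np.mean(sbp_mmhg), fs=fs, nperseg=256)
    m = (f >= band[0]) & (f <= band[1])
    # ratio of integrated LF powers; uniform frequency grid assumed
    return np.sqrt(p_rr[m].sum() / p_sbp[m].sum())
```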
Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Morelli, Eugene A.
2014-01-01
Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
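A minimal sketch of the regression core behind multivariate polynomial aerodynamic modeling follows; the flight method additionally orthogonalizes the regressors and ranks them with a predicted-squared-error criterion, none of which is shown here. The regressor set and variable names are illustrative.

```python
# Sketch of the regression step behind multivariate-polynomial aerodynamic
# modeling: candidate regressors in angle of attack (alpha) and elevator
# deflection (de) fit to a measured coefficient by least squares.
import numpy as np

def fit_aero_model(alpha, de, CZ):
    X = np.column_stack([np.ones_like(alpha), alpha, de,
                         alpha**2, alpha*de, alpha**3])
    coef, *_ = np.linalg.lstsq(X, CZ, rcond=None)
    return coef, X @ coef  # parameters and in-sample prediction
```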
Development and verification of local/global analysis techniques for laminated composites
NASA Technical Reports Server (NTRS)
Griffin, O. Hayden, Jr.
1989-01-01
Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, either from a resource or time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well-known behavior when used for analysis of laminated composites.
NASA Technical Reports Server (NTRS)
Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.
2015-01-01
The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
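The continuity-integration step can be sketched as follows, assuming the anelastic form d(rho*w)/dz = -rho*div_h(v) with w = 0 at the surface; the grid, density profile, and boundary treatment are simplifications of what HIWRAP retrievals require.

```python
# Sketch: retrieve vertical velocity w from horizontal divergence by
# integrating the anelastic mass-continuity equation upward from w = 0
# at the surface. Grid spacing and density profile are assumed.
import numpy as np

def w_from_continuity(div_h, rho, dz):
    """div_h: horizontal divergence, shape (nz, ny, nx); rho: (nz,)."""
    nz = div_h.shape[0]
    w = np.zeros_like(div_h)
    for k in range(1, nz):
        # d(rho*w)/dz = -rho * div_h, trapezoidal upward integration
        rhs = -0.5 * (rho[k]*div_h[k] + rho[k-1]*div_h[k-1]) * dz
        w[k] = (rho[k-1]*w[k-1] + rhs) / rho[k]
    return w
```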
NASA Technical Reports Server (NTRS)
Eigen, D. J.; Fromm, F. R.; Northouse, R. A.
1974-01-01
A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
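The paper's specific interval-number heuristic is not given in the abstract; as a hedged stand-in, this sketch builds per-feature frequency-distribution histograms with the common Freedman-Diaconis rule.

```python
# Stand-in sketch: choose histogram bin counts per feature with the
# Freedman-Diaconis rule and build the frequency distributions the
# clustering algorithm would consume. Not the paper's own heuristic.
import numpy as np

def feature_histograms(X):
    """X: (n_samples, n_features). Returns one (counts, edges) per feature."""
    out = []
    for j in range(X.shape[1]):
        edges = np.histogram_bin_edges(X[:, j], bins="fd")
        out.append(np.histogram(X[:, j], bins=edges))
    return out
```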
2012-09-01
Robust global image registration based on a hybrid algorithm combining Fourier and spatial domain techniques
Crabtree, Peter N.; Seanor, Collin
Results demonstrate the performance of a hybrid algorithm combining Fourier and spatial domain techniques; these results are from analysis of a set of images of an ISO 12233 [12] resolution chart captured in the…
Detrended Cross Correlation Analysis: a new way to figure out the underlying cause of global warming
NASA Astrophysics Data System (ADS)
Hazra, S.; Bera, S. K.
2016-12-01
Analysing non-stationary time series is a challenging task in earth science, seismology, solar physics, climate science, biology, finance, etc. In most cases, external noise such as oscillations, high-frequency noise, and low-frequency noise at different scales leads to erroneous results. Many statistical methods have been proposed to find the correlation between two non-stationary time series. N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), reported a strong relationship between solar flare intermittency (SFI) and global temperature anomalies (GTA) using diffusion entropy analysis. It has recently been shown that detrended cross-correlation analysis (DCCA) is a better technique for removing the effects of unwanted signals as well as local and periodic trends, and is thus more suitable for finding the correlation between two non-stationary time series. With this technique, correlation coefficients at different scales can be estimated. Motivated by this, we have applied a new DCCA technique to find the relationship between SFI and GTA. We have also applied this technique to find the relationship between GTA and carbon dioxide density, and between GTA and methane density, in the earth's atmosphere. In future work we will examine the relationship between GTA and aerosols, water vapour density, and ozone depletion in the earth's atmosphere. This analysis will help improve understanding of the causes of global warming.
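A minimal sketch of the DCCA cross-correlation coefficient rho_DCCA(n) follows: the detrended covariance of the two integrated series in overlapping boxes of size n, normalized by the two DFA fluctuation functions. Linear detrending and overlapping boxes are the usual defaults, not necessarily the authors' exact settings.

```python
# Sketch of the DCCA cross-correlation coefficient rho_DCCA(n):
# detrended covariance of the two integrated series divided by the
# product of their DFA fluctuation functions, in boxes of size n.
import numpy as np

def rho_dcca(x, y, n):
    X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
    N = len(X) - n
    f2xy = f2xx = f2yy = 0.0
    t = np.arange(n + 1)
    for i in range(N):
        # local linear trends in each overlapping box
        px = np.polyval(np.polyfit(t, X[i:i+n+1], 1), t)
        py = np.polyval(np.polyfit(t, Y[i:i+n+1], 1), t)
        dx, dy = X[i:i+n+1] - px, Y[i:i+n+1] - py
        f2xy += np.mean(dx * dy)
        f2xx += np.mean(dx * dx)
        f2yy += np.mean(dy * dy)
    return (f2xy / N) / np.sqrt((f2xx / N) * (f2yy / N))
```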
A hierarchical structure for automatic meshing and adaptive FEM analysis
NASA Technical Reports Server (NTRS)
Kela, Ajay; Saxena, Mukul; Perucchio, Renato
1987-01-01
A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.
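A toy sketch of spatially addressable quadtree subdivision is shown below; the refinement criterion is a hypothetical placeholder standing in for the geometric and error-driven tests a real mesher would apply.

```python
# Minimal sketch of quadtree subdivision of a 2-D domain. The refinement
# test is a placeholder; a mesher would refine where geometry or error
# indicators demand it. Leaves are (address, x, y, size) tuples, where
# the address string encodes the cell's position in the tree.
def build_quadtree(x, y, size, needs_refining, depth, max_depth, addr=""):
    if depth == max_depth or not needs_refining(x, y, size):
        return [(addr, x, y, size)]          # leaf cell = element
    h = size / 2.0
    cells = []
    for i, (dx, dy) in enumerate([(0, 0), (h, 0), (0, h), (h, h)]):
        cells += build_quadtree(x + dx, y + dy, h, needs_refining,
                                depth + 1, max_depth, addr + str(i))
    return cells

# Example: refine toward the domain origin (hypothetical criterion).
leaves = build_quadtree(0.0, 0.0, 1.0,
                        lambda x, y, s: (x**2 + y**2) ** 0.5 < s,
                        0, 5)
```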
Cardot, J C; Berthout, P; Verdenet, J; Bidet, A; Faivre, R; Bassand, J P; Bidet, R; Maurat, J P
1982-01-01
Regional and global left ventricular wall motion was assessed in 120 patients using radionuclide cineangiography (RCA) and contrast angiography. Functional imaging procedures based on a temporal Fourier analysis of dynamic image sequences were applied to the study of cardiac contractility. Two images were constructed by taking the phase and amplitude values of the first harmonic in the Fourier transform for each pixel. These two images aided in determining the perimeter of the left ventricle to calculate the global ejection fraction. Regional left ventricular wall motion was studied by analyzing the phase value and by examining the distribution histogram of these values. The accuracy of global ejection fraction calculation was improved by the Fourier technique. This technique increased the sensitivity of RCA for determining segmental abnormalities especially in the left anterior oblique view (LAO).
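The functional-imaging step can be sketched directly: take the temporal FFT of the gated frame stack and keep the first harmonic's amplitude and phase per pixel. Frame count and normalization conventions are assumptions.

```python
# Sketch: per-pixel first-harmonic amplitude and phase images from a
# gated dynamic image sequence (frames on axis 0), as in Fourier
# functional imaging of radionuclide cineangiography.
import numpy as np

def first_harmonic_images(frames):
    """frames: (n_frames, ny, nx) array covering one cardiac cycle."""
    spec = np.fft.fft(frames, axis=0)
    h1 = spec[1]                       # first harmonic per pixel
    amplitude = np.abs(h1) * 2.0 / frames.shape[0]
    phase = np.angle(h1)               # radians; timing of contraction
    return amplitude, phase
```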
ERIC Educational Resources Information Center
Ivy, Karen Lynne-Daniels
2017-01-01
This paper shares the findings of a study conducted on a virtual inter-cultural global leadership development learning project. Mixed Methods analysis techniques were used to examine the interviews of U.S. and Uganda youth project participants. The study, based on cultural and social constructivist learning theories, investigated the effects of…
Fitting Flux Ropes to a Global MHD Solution: A Comparison of Techniques. Appendix 1
NASA Technical Reports Server (NTRS)
Riley, Pete; Linker, J. A.; Lionello, R.; Mikic, Z.; Odstrcil, D.; Hidalgo, M. A.; Cid, C.; Hu, Q.; Lepping, R. P.; Lynch, B. J.
2004-01-01
Flux rope fitting (FRF) techniques are an invaluable tool for extracting information about the properties of a subclass of CMEs in the solar wind. However, it has proven difficult to assess their accuracy since the underlying global structure of the CME cannot be independently determined from the data. In contrast, large-scale MHD simulations of CME evolution can provide both a global view as well as localized time series at specific points in space. In this study we apply 5 different fitting techniques to 2 hypothetical time series derived from MHD simulation results. Independent teams performed the analysis of the events in "blind tests", for which no information, other than the time series, was provided. From the results, we infer the following: (1) accuracy decreases markedly with increasingly glancing encounters; (2) correct identification of the boundaries of the flux rope can be a significant limiter; and (3) results from techniques that infer global morphology must be viewed with caution. In spite of these limitations, FRF techniques remain a useful tool for describing in situ observations of flux rope CMEs.
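As a hedged illustration of one FRF flavor, the sketch below fits a Lundquist (linear force-free) axial-field profile to a synthetic crossing through the rope center; real FRF codes also estimate orientation and impact parameter, which are omitted here.

```python
# Sketch of one flux-rope-fitting flavor: least-squares fit of a Lundquist
# (linear force-free) profile to the axial field along a cut through the
# rope center. Reduced 1-D illustration only; not any team's actual code.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import j0

def lundquist_bz(r_frac, B0):
    return B0 * j0(2.405 * r_frac)     # axial field, zero at the boundary

r = np.linspace(-1, 1, 50)             # normalized distance from axis
bz_obs = lundquist_bz(np.abs(r), 12.0) + np.random.normal(0, 0.5, r.size)
(B0_fit,), _ = curve_fit(lambda rr, B0: lundquist_bz(np.abs(rr), B0),
                         r, bz_obs, p0=[10.0])
```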
Metaphors of Primary School Students Relating to the Concept of Global Warming
ERIC Educational Resources Information Center
Dogru, Mustafa; Sarac, Esra
2013-01-01
The purpose of this study is to reveal the metaphors of primary school students (n = 362) relating to the concept of global warming. Data collected by completing the expression of "global warming is like..., because..." of the students were analysed by use of qualitative and quantitative data analysis techniques. According to findings of…
Constellation Coverage Analysis
NASA Technical Reports Server (NTRS)
Lo, Martin W. (Compiler)
1997-01-01
The design of satellite constellations requires an understanding of the dynamic global coverage provided by the constellations. Even for a small constellation with a simple circular orbit propagator, the combinatorial nature of the analysis frequently renders the problem intractable. Particularly for the initial design phase where the orbital parameters are still fluid and undetermined, the coverage information is crucial to evaluate the performance of the constellation design. We have developed a fast and simple algorithm for determining the global constellation coverage dynamically using image processing techniques. This approach provides a fast, powerful and simple method for the analysis of global constellation coverage.
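The image-processing idea can be sketched as rasterizing each satellite footprint (a spherical cap) onto a latitude/longitude grid and accumulating counts; grid resolution and the footprint half-angle are assumptions.

```python
# Sketch of the image-based coverage idea: rasterize each satellite's
# footprint (a spherical cap of given Earth-central half-angle) onto a
# lat/lon grid and count how many satellites see each cell.
import numpy as np

def coverage_map(sub_points_deg, half_angle_deg, nlat=90, nlon=180):
    lat = np.deg2rad(np.linspace(-89, 89, nlat))[:, None]
    lon = np.deg2rad(np.linspace(-179, 179, nlon))[None, :]
    count = np.zeros((nlat, nlon), dtype=int)
    for slat, slon in np.deg2rad(np.asarray(sub_points_deg)):
        cosc = (np.sin(lat)*np.sin(slat) +
                np.cos(lat)*np.cos(slat)*np.cos(lon - slon))
        count += cosc >= np.cos(np.deg2rad(half_angle_deg))
    return count  # 0 = gap; >= 1 covered; repeat per time step
```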
High accuracy-nationwide differential global positioning system test and analysis : phase II report
DOT National Transportation Integrated Search
2005-07-01
The High Accuracy-Nationwide Differential Global Positioning System (HA-NDGPS) program focused on the development of compression and broadcast techniques to provide users over a large area with very accurate radio navigation solutions. The goal was ac...
Basic Features of Global Circulation in the Mesopause Lower Thermosphere Region
NASA Technical Reports Server (NTRS)
Portnyagin, Y. I.
1984-01-01
D1 and D2 techniques have been used and are being used for observations at stations located in the high, middle, and low latitudes of both hemispheres. Systematic wind velocity measurements with these techniques make it possible to specify and to refine earlier mesopause-lower thermosphere circulation models. With this in view, an effort was made to obtain global long-term average height-latitude sections of the wind field at 70 to 110 km using the analysis of long-period D1 and D2 observations. Data from 26 meteor radar and 6 ionospheric stations were taken for analysis.
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
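A minimal sketch of the EOF step via SVD of the space-time anomaly matrix follows; the data fill and spherical-harmonic modeling described above are not included.

```python
# Sketch: empirical orthogonal functions (EOFs) of a gridded ozone-like
# field via SVD of the space-time anomaly matrix.
import numpy as np

def eof_analysis(data):
    """data: (n_time, n_space); rows are maps flattened in space."""
    anom = data - data.mean(axis=0)
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    variance_frac = s**2 / np.sum(s**2)   # variance explained per mode
    pcs = U * s                           # principal-component time series
    eofs = Vt                             # spatial patterns (rows)
    return eofs, pcs, variance_frac
```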
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Gundy-Burlet, Karen
2010-01-01
A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
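A hedged sketch of the proposed combination: augment the original inputs with their principal components, split runs into behavioral and non-behavioral sets by an output threshold, and rank each dimension with a two-sample Kolmogorov-Smirnov statistic. The threshold convention is an assumption.

```python
# Sketch of Monte Carlo filtering on PCA-augmented inputs: split runs into
# "behavioral" and "non-behavioral" by an output criterion, then rank each
# input (original or principal component) by a two-sample KS statistic.
import numpy as np
from scipy.stats import ks_2samp

def mc_filter(X, y, threshold):
    """X: (n_runs, n_inputs); y: output; behavioral = y <= threshold."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    X_aug = np.hstack([X, Xc @ Vt.T])     # originals + derived PCs
    behav = y <= threshold
    return [ks_2samp(X_aug[behav, j], X_aug[~behav, j]).statistic
            for j in range(X_aug.shape[1])]
```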
A new perspective on global mean sea level (GMSL) acceleration
NASA Astrophysics Data System (ADS)
Watson, Phil J.
2016-06-01
The vast body of contemporary climate change science is largely underpinned by the premise of a measured acceleration from anthropogenic forcings evident in key climate change proxies: greenhouse gas emissions, temperature, and mean sea level. Accordingly, over recent years, the question of whether or not there is a measurable acceleration in global mean sea level has generated fierce and widespread professional, social, and political debate. Attempts to measure acceleration in global mean sea level (GMSL) have often used comparatively crude analysis techniques that provide little temporal insight into these key questions. This work proposes improved techniques to measure real-time velocity and acceleration based on five GMSL reconstructions spanning the time frame from 1807 to 2014 with substantially improved temporal resolution. While this analysis highlights key differences between the respective reconstructions, there is now more robust, convincing evidence of recent acceleration in the trend of GMSL.
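The paper's exact technique is not reproduced here; as a stand-in, the sketch below derives real-time velocity and acceleration from a GMSL series with Savitzky-Golay smoothed first and second derivatives. Window length and polynomial order are assumptions.

```python
# Stand-in sketch for real-time velocity and acceleration of a GMSL
# series: Savitzky-Golay smoothed first and second derivatives (not the
# paper's actual smoothing technique).
import numpy as np
from scipy.signal import savgol_filter

def velocity_acceleration(gmsl_mm, dt_years=1.0, window=31, poly=3):
    v = savgol_filter(gmsl_mm, window, poly, deriv=1, delta=dt_years)
    a = savgol_filter(gmsl_mm, window, poly, deriv=2, delta=dt_years)
    return v, a   # mm/yr and mm/yr^2
```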
Abdul-Nasir, Aimi Salihah; Mashor, Mohd Yusoff; Mohamed, Zeehaida
2012-01-01
Malaria is one of the most serious global health problems, causing widespread suffering and death in various parts of the world. With the large number of cases diagnosed each year, early detection and accurate diagnosis, which facilitate prompt treatment, are essential requirements for controlling malaria. For centuries now, manual microscopic examination of blood slides has remained the gold standard for malaria diagnosis. However, the low contrast of the malaria parasites and variable smear quality are factors that may influence the accuracy of interpretation by microbiologists. In order to reduce this problem, this paper investigates the performance of the proposed contrast enhancement techniques, namely modified global and modified linear contrast stretching, as well as the conventional global and linear contrast stretching, applied to malaria images of the P. vivax species. The results show that the proposed modified global and modified linear contrast stretching techniques successfully increase the contrast of the parasites and the infected red blood cells compared with the conventional global and linear contrast stretching. Hence, the resultant images would be useful to microbiologists for identification of the various stages and species of malaria.
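A minimal sketch of global (min-max) linear stretching next to a percentile-based variant follows; the authors' modified techniques differ in detail, so the percentile form is only a hedged stand-in.

```python
# Sketch: conventional global (min-max) and percentile-based linear
# contrast stretching of a grayscale blood-smear image.
import numpy as np

def global_stretch(img):
    img = np.asarray(img, dtype=float)
    lo, hi = img.min(), img.max()
    return ((img - lo) / max(hi - lo, 1.0) * 255).astype(np.uint8)

def percentile_stretch(img, p_lo=5, p_hi=95):
    img = np.asarray(img, dtype=float)
    lo, hi = np.percentile(img, [p_lo, p_hi])
    out = np.clip((img - lo) / max(hi - lo, 1.0) * 255, 0, 255)
    return out.astype(np.uint8)
```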
Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Heine, Christian; Weber, Gunther H.
2012-05-04
Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.
Moho Modeling Using FFT Technique
NASA Astrophysics Data System (ADS)
Chen, Wenjin; Tenzer, Robert
2017-04-01
To improve the numerical efficiency, the Fast Fourier Transform (FFT) technique was employed in Parker-Oldenburg's method for a regional gravimetric Moho recovery, which assumes the Earth's planar approximation. In this study, we extend this definition for global applications while assuming a spherical approximation of the Earth. In particular, we utilize the FFT technique for a global Moho recovery, which is practically realized in two numerical steps. The gravimetric forward modeling is first applied, based on methods for a spherical harmonic analysis and synthesis of the global gravity and lithospheric structure models, to compute the refined gravity field, which comprises mainly the gravitational signature of the Moho geometry. The gravimetric inverse problem is then solved iteratively in order to determine the Moho depth. The application of the FFT technique to both numerical steps reduces the computation time to a fraction of that required without applying this fast algorithm. The developed numerical procedures are used to estimate the Moho depth globally, and the gravimetric result is validated using the global (CRUST1.0) and regional (ESC) seismic Moho models. The comparison reveals a relatively good agreement between the gravimetric and seismic models, with the RMS of differences (of 4-5 km) at the level of expected uncertainties of the used input datasets, and without the presence of significant systematic bias.
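Parker's forward formula in its planar form can be sketched compactly; the paper's contribution is the spherical extension and the iterative inversion, neither shown here. Density contrast, mean depth, and series truncation are assumptions.

```python
# Sketch of Parker's FFT forward formula (planar form): gravity effect of
# relief h(x, y) about a mean interface depth, summed over powers of h.
# Constants below are assumed, not the paper's values.
import numpy as np
from math import factorial

def parker_gravity(h, dx, dy, depth, drho=480.0, nterms=4):
    G = 6.674e-11                          # gravitational constant
    ny, nx = h.shape
    kx = 2*np.pi*np.fft.fftfreq(nx, dx)
    ky = 2*np.pi*np.fft.fftfreq(ny, dy)
    k = np.hypot(*np.meshgrid(kx, ky))     # radial wavenumber grid
    total = np.zeros((ny, nx), dtype=complex)
    for n in range(1, nterms + 1):
        total += k**(n - 1) / factorial(n) * np.fft.fft2(h**n)
    g = 2*np.pi*G*drho * np.exp(-k*depth) * total
    return np.real(np.fft.ifft2(g))
```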
Is manipulation of color effective in study of the global precedence effect?
Vidal-López, Joaquín; Romera-Vivancos, Juan Antonio
2009-04-01
This article evaluates the use of color manipulation in studying the global precedence effect and the possible involvement of the magnocellular processing system. The analysis shows that the variations of color used in three studies produced changes in the global precedence effect, but findings based on this technique present some methodological problems and have little theoretical support from the magnocellular processing-system perspective. For this reason, more research is required to develop knowledge about the origin of these variations in global precedence.
2-D to 3-D global/local finite element analysis of cross-ply composite laminates
NASA Technical Reports Server (NTRS)
Thompson, D. Muheim; Griffin, O. Hayden, Jr.
1990-01-01
An example of two-dimensional to three-dimensional global/local finite element analysis of a laminated composite plate with a hole is presented. The 'zoom' technique of global/local analysis is used, where displacements of the global/local interface from the two-dimensional global model are applied to the edges of the three-dimensional local model. Three different hole diameters, one, three, and six inches, are considered in order to compare the effect of hole size on the three-dimensional stress state around the hole. In addition, three different stacking sequences are analyzed for the six inch hole case in order to study the effect of stacking sequence. The existence of a 'critical' hole size, where the interlaminar stresses are maximum, is indicated. Dispersion of plies at the same angle, as opposed to clustering, is found to reduce the magnitude of some interlaminar stress components and increase others.
Consistency of seven different GNSS global ionospheric mapping techniques during one solar cycle
NASA Astrophysics Data System (ADS)
Roma-Dollase, David; Hernández-Pajares, Manuel; Krankowski, Andrzej; Kotulak, Kacper; Ghoddousi-Fard, Reza; Yuan, Yunbin; Li, Zishen; Zhang, Hongping; Shi, Chuang; Wang, Cheng; Feltens, Joachim; Vergados, Panagiotis; Komjathy, Attila; Schaer, Stefan; García-Rigo, Alberto; Gómez-Cama, José M.
2018-06-01
In the context of the International GNSS Service (IGS), several IGS Ionosphere Associated Analysis Centers have developed different techniques to provide global ionospheric maps (GIMs) of vertical total electron content (VTEC) since 1998. In this paper we present a comparison of the performances of all the GIMs created in the frame of IGS: the classical ones (from the ionospheric analysis centers CODE, ESA/ESOC, JPL and UPC) and the new ones (NRCAN, CAS, WHU). To assess their quality in a fair and completely independent way, two assessment methods are used: a direct comparison to altimeter data (VTEC-altimeter) and to the difference of slant total electron content (STEC) observed at independent ground reference stations (dSTEC-GPS). The main conclusion of this study, performed over one solar cycle, is the consistency of the results among so many different GIM techniques and implementations.
NASA Astrophysics Data System (ADS)
Kantar, Ersin; Keskin, Mustafa; Deviren, Bayram
2012-04-01
We have analyzed the topology of 50 important Turkish companies for the period 2006-2010 using the concept of hierarchical methods (the minimal spanning tree (MST) and hierarchical tree (HT)). We investigated the statistical reliability of links between companies in the MST by using the bootstrap technique. We also used the average linkage cluster analysis (ALCA) technique to observe the cluster structures much better. The MST and HT are known as useful tools to perceive and detect global structure, taxonomy, and hierarchy in financial data. We obtained four clusters of companies according to their proximity. We also observed that the Banks and Holdings cluster always forms in the centre of the MSTs for the periods 2006-2007, 2008, and 2009-2010. The clusters match nicely with their common production activities or their strong interrelationship. The effects of the Automobile sector increased after the global financial crisis due to the temporary incentives provided by the Turkish government. We find that Turkish companies were not very affected by the global financial crisis.
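The MST construction can be sketched with the standard correlation-to-distance mapping d_ij = sqrt(2(1 - rho_ij)); bootstrap link validation and the hierarchical tree are omitted.

```python
# Sketch: correlation-based minimal spanning tree of asset returns using
# the standard distance d_ij = sqrt(2 * (1 - rho_ij)).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def correlation_mst(returns):
    """returns: (n_days, n_companies) array of log-returns."""
    rho = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(2.0 * (1.0 - rho))
    np.fill_diagonal(dist, 0.0)
    mst = minimum_spanning_tree(dist)     # sparse matrix of tree edges
    return mst.toarray()
```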
Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.
2018-01-01
This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is supposedly clean; however, it includes elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities.
NASA Technical Reports Server (NTRS)
1985-01-01
Topics covered include: data systems and quality; analysis and assimilation techniques; impacts on forecasts; tropical forecasts; analysis intercomparisons; improvements in predictability; and heat sources and sinks.
Estimating Sobol Sensitivity Indices Using Correlations
Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
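A hedged sketch of a correlation-style (pick-freeze) estimator of the first-order Sobol index S_i follows; the test model at the end is hypothetical.

```python
# Sketch of a correlation-style (pick-freeze) estimator of the first-order
# Sobol index S_i: correlate model outputs from two input samples that
# share only column i.
import numpy as np

def sobol_first_order(model, n, dim, i, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    Bi = B.copy()
    Bi[:, i] = A[:, i]                     # freeze coordinate i
    yA, yBi = model(A), model(Bi)
    return np.cov(yA, yBi)[0, 1] / np.var(yA, ddof=1)

# Example with a simple additive test model (hypothetical):
S0 = sobol_first_order(lambda X: X[:, 0] + 0.5*X[:, 1]**2, 20000, 2, 0)
```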
An Introduction to Data Analysis in Asteroseismology
NASA Astrophysics Data System (ADS)
Campante, Tiago L.
A practical guide is presented to some of the main data analysis concepts and techniques employed contemporarily in the asteroseismic study of stars exhibiting solar-like oscillations. The subjects of digital signal processing and spectral analysis are introduced first. These concern the acquisition of continuous physical signals to be subsequently digitally analyzed. A number of specific concepts and techniques relevant to asteroseismology are then presented as we follow the typical workflow of the data analysis process, namely, the extraction of global asteroseismic parameters and individual mode parameters (also known as peak-bagging) from the oscillation spectrum.
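A minimal sketch of the peak-bagging step, fitting one Lorentzian mode profile plus a flat background to a patch of the power spectrum; real pipelines typically use maximum-likelihood fitting with chi-squared (2 d.o.f.) statistics rather than the plain least squares shown here.

```python
# Sketch of peak-bagging: fit a single Lorentzian mode profile (plus flat
# background) to a patch of an oscillation power spectrum.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(nu, height, nu0, width, bkg):
    return height / (1.0 + 4.0*((nu - nu0)/width)**2) + bkg

def fit_mode(nu, power, nu0_guess):
    p0 = [power.max(), nu0_guess, 1.0, np.median(power)]
    popt, _ = curve_fit(lorentzian, nu, power, p0=p0, maxfev=10000)
    return popt   # height, centroid frequency, FWHM linewidth, background
```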
An operational global-scale ocean thermal analysis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.
1990-04-01
The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum interpolation (OI) assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented and results of operational tests of global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients and that it also gives a more accurate analysis of the thermal structure, with improvements largest below the mixed layer.
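The OI update at the heart of such a system can be sketched in a few lines; the background- and observation-error covariances B and R are assumed given, which in practice is the hard part.

```python
# Sketch of an optimum-interpolation (OI) analysis update: background
# (first guess) plus weighted observation increments, with weights from
# assumed background- and observation-error covariances.
import numpy as np

def oi_update(xb, obs, H, B, R):
    """xb: background state; obs: observations; H: obs operator
    (n_obs, n_state); B, R: background/observation error covariances."""
    innov = obs - H @ xb                   # observation increments
    S = H @ B @ H.T + R                    # innovation covariance
    return xb + B @ H.T @ np.linalg.solve(S, innov)
```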
Carminati, M Chiara; Boniotti, Cinzia; Fusini, Laura; Andreini, Daniele; Pontone, Gianluca; Pepi, Mauro; Caiani, Enrico G
2016-05-01
The aim of this study was to compare the performance of quantitative methods, either semiautomated or automated, for left ventricular (LV) nonviable tissue analysis from cardiac magnetic resonance late gadolinium enhancement (CMR-LGE) images. The investigated segmentation techniques were: (i) n-standard deviations thresholding; (ii) full width at half maximum thresholding; (iii) Gaussian mixture model classification; and (iv) fuzzy c-means clustering. These algorithms were applied either in each short axis slice (single-slice approach) or globally considering the entire short-axis stack covering the LV (global approach). CMR-LGE images from 20 patients with ischemic cardiomyopathy were retrospectively selected, and results from each technique were assessed against manual tracing. All methods provided comparable performance in terms of accuracy in scar detection, computation of local transmurality, and high correlation in scar mass compared with the manual technique. In general, no significant difference between single-slice and global approach was noted. The reproducibility of manual and investigated techniques was confirmed in all cases with slightly lower results for the nSD approach. Automated techniques resulted in accurate and reproducible evaluation of LV scars from CMR-LGE in ischemic patients with performance similar to the manual technique. Their application could minimize user interaction and computational time, even when compared with semiautomated approaches.
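Two of the compared thresholding rules can be sketched directly, assuming myocardial and remote-region masks are available; the FWHM variant here uses a simple min/max convention, which may differ from the paper's implementation.

```python
# Sketch of two of the compared LGE scar thresholds on a short-axis slice:
# n standard deviations above remote (healthy) myocardium, and full width
# at half maximum of the myocardial signal range. Masks are assumed given.
import numpy as np

def scar_nsd(img, myo_mask, remote_mask, n=5):
    thr = img[remote_mask].mean() + n * img[remote_mask].std()
    return myo_mask & (img > thr)

def scar_fwhm(img, myo_mask):
    vals = img[myo_mask]
    thr = vals.min() + 0.5 * (vals.max() - vals.min())
    return myo_mask & (img > thr)
```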
Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Peters, Jeanne M.
1987-01-01
An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
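A minimal PCG sketch with the preconditioner M taken as the orthotropic part of the stiffness matrix (anisotropic coupling terms zeroed), factored once and reused; matrix assembly is assumed done upstream.

```python
# Sketch of preconditioned conjugate gradients where the preconditioner M
# is the orthotropic part of the stiffness matrix K, applied via a
# pre-factored Cholesky solve.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def pcg(K, f, M, tol=1e-8, maxit=500):
    Mf = cho_factor(M)                     # factor orthotropic part once
    x = np.zeros_like(f)
    r = f - K @ x
    z = cho_solve(Mf, r)
    p = z.copy()
    for _ in range(maxit):
        Kp = K @ p
        alpha = (r @ z) / (p @ Kp)
        x += alpha * p
        r_new = r - alpha * Kp
        if np.linalg.norm(r_new) < tol * np.linalg.norm(f):
            break
        z_new = cho_solve(Mf, r_new)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x
```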
Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.
1980-06-02
Image processing techniques of potential value were examined, and Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of satellite imagery data at the Air Force Global Weather Central. The resolution of these measurements on the Earth's surface is 0.3 n mi.
References: Pickett, R.M., and Blackman, E.S. (1976) Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central; Blackman, E.S., and Pickett, R.M. (1977) Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central: Demonstrations of Spectral Analysis.
NASA Astrophysics Data System (ADS)
Drapeau, L.; Mangiarotti, S.; Le Jean, F.; Gascoin, S.; Jarlan, L.
2014-12-01
The global modeling technique provides a way to obtain ordinary differential equations from a single time series [1]. This technique, initiated in the 1990s, has been applied successfully to numerous theoretical and experimental systems, and more recently to environmental systems [2,3]. Here this technique is applied to seasonal snow cover area in the Pyrenees mountains (Europe) and Mount Lebanon (Mediterranean region). The snowpack evolution is complex because it results from a combination of processes driven by physiography (elevation, slope, land cover...) and meteorological variables (precipitation, temperature, wind speed...), which are highly heterogeneous in such regions. Satellite observations in visible bands offer a powerful tool to monitor snow cover areas at global scale over a large range of resolutions. Although this observable does not directly inform about snow water equivalent, its dynamical behavior strongly relies on it. Therefore, snow cover area is likely to be a good proxy of the global dynamics, and the global modeling technique a well-adapted approach. The MOD10A2 product (500 m) generated from MODIS by NASA is used after a pretreatment is applied to minimize cloud effects. The global modeling technique is then applied using two packages [4,5]. The analysis is performed with two time series for the whole period (2000-2012) and year by year. Low-dimensional chaotic models are obtained in many cases. Such models provide a strong argument for chaos since they involve the two necessary conditions in a synthetic way: determinism and strong sensitivity to initial conditions. The model comparison suggests important non-stationarities at interannual scale which prevent the detection of long-term changes. [1] Letellier et al. 2009. Frequently asked questions about global modeling. Chaos, 19, 023103. [2] Maquet et al. 2007. Global models from the Canadian lynx cycles as a direct evidence for chaos in real ecosystems. J. of Mathematical Biology, 55(1), 21-39. [3] Mangiarotti et al. 2014. Two chaotic global models for cereal crops cycles observed from satellite in Northern Morocco. Chaos, 24, 023130. [4] Mangiarotti et al. 2012. Polynomial search and global modelling: two algorithms for modeling chaos. Physical Review E, 86(4), 046205. [5] http://cran.r-project.org/web/packages/PoMoS/index.html
NASA Technical Reports Server (NTRS)
Langland, R. A.; Stephens, P. L.; Pihos, G. G.
1980-01-01
The techniques used for ingesting SEASAT-A SASS wind retrievals into the existing operational software are described. The intent is to assess the impact of SEASAT data on the marine wind fields produced by the global marine wind/sea level pressure analysis. This analysis is performed on a 2.5 deg latitude/longitude global grid and executes at three-hourly time increments. Wind fields with and without SASS winds are being compared. The problems of data volume reduction and aliased wind retrieval ambiguity are treated.
NASA Technical Reports Server (NTRS)
Armstrong, Richard; Hardman, Molly
1991-01-01
A snow model that supports the daily, operational analysis of global snow depth and age has been developed. It provides improved spatial interpolation of surface reports by incorporating digital elevation data, and by the application of regionalized variables (kriging) through the use of a global snow depth climatology. Where surface observations are inadequate, the model applies satellite remote sensing. Techniques for extrapolation into data-void mountain areas and a procedure to compute snow melt are also contained in the model.
A global optimization approach to multi-polarity sentiment analysis.
Li, Xinmiao; Li, Jing; Wu, Yukeng
2015-01-01
Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with a higher-explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement. The improvement of the two-polarity sentiment analysis was the smallest. We conclude that the PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. From the results of this comparison, we found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao
2011-08-01
The structure of the international flow of acupuncture knowledge is explored in this article so as to promote the globalization of acupuncture technology innovation. Statistical methods were adopted to reveal the geographical distribution of acupuncture patents in the U.S.A. and the influencing factors of the cumulative advantage of acupuncture techniques, as well as the innovation value of acupuncture patent applications. Social network analysis was also utilized to establish a global innovation network of acupuncture technology. The results show that the cumulative strength in acupuncture technology correlates with the patent retention period, and that the innovative value of an acupuncture invention correlates with the frequency of patent citation. The U.S.A. and Canada occupy central positions in the global acupuncture information and technology delivery system.
Local and Global Gestalt Laws: A Neurally Based Spectral Approach.
Favali, Marta; Citti, Giovanna; Sarti, Alessandro
2017-02-01
This letter presents a mathematical model of figure-ground articulation that takes into account both local and global gestalt laws and is compatible with the functional architecture of the primary visual cortex (V1). The local gestalt law of good continuation is described by means of suitable connectivity kernels that are derived from Lie group theory and quantitatively compared with long-range connectivity in V1. Global gestalt constraints are then introduced in terms of spectral analysis of a connectivity matrix derived from these kernels. This analysis performs grouping of local features and individuates perceptual units with the highest salience. Numerical simulations are performed, and results are obtained by applying the technique to a number of stimuli.
Horizontal Temperature Variability in the Stratosphere: Global Variations Inferred from CRISTA Data
NASA Technical Reports Server (NTRS)
Eidmann, G.; Offermann, D.; Jarisch, M.; Preusse, P.; Eckermann, S. D.; Schmidlin, F. J.
2001-01-01
In two separate orbital campaigns (November 1994 and August 1997), the Cryogenic Infrared Spectrometers and Telescopes for the Atmosphere (CRISTA) instrument acquired global stratospheric data of high accuracy and high spatial resolution. The standard limb-scanned CRISTA measurements resolved atmospheric spatial structures with vertical dimensions greater than or equal to 1.5-2 km and horizontal dimensions greater than or equal to 100-200 km. A fluctuation analysis of horizontal temperature distributions derived from these data is presented. This method is somewhat complementary to conventional power-spectral analysis techniques.
Satellite-enhanced dynamical downscaling for the analysis of extreme events
NASA Astrophysics Data System (ADS)
Nunes, Ana M. B.
2016-09-01
The use of regional models in the downscaling of general circulation models provides a strategy to generate more detailed climate information. In that case, boundary-forcing techniques can be useful to maintain the large-scale features from the coarse-resolution global models in agreement with the inner modes of the higher-resolution regional models. Although those procedures might improve dynamics, downscaling via regional modeling still aims for better representation of physical processes. With the purpose of improving dynamics and physical processes in regional downscaling of global reanalysis, the Regional Spectral Model—originally developed at the National Centers for Environmental Prediction—employs a newly reformulated scale-selective bias correction, together with the 3-hourly assimilation of the satellite-based precipitation estimates constructed from the Climate Prediction Center morphing technique. The two-scheme technique for the dynamical downscaling of global reanalysis can be applied in analyses of environmental disasters and risk assessment, with hourly outputs, and resolution of about 25 km. Here the satellite-enhanced dynamical downscaling added value is demonstrated in simulations of the first reported hurricane in the western South Atlantic Ocean basin through comparisons with global reanalyses and satellite products available in ocean areas.
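The scale-selective idea can be sketched as nudging only low wavenumbers of a regional field toward the driving analysis; the cutoff, relaxation coefficient, and doubly periodic FFT treatment are simplifying assumptions relative to the actual scheme.

```python
# Sketch of scale-selective bias correction / spectral nudging: relax only
# the large-scale (low-wavenumber) part of a regional field toward the
# driving global analysis each time step.
import numpy as np

def spectral_nudge(regional, global_field, kmax=6, gamma=0.1):
    """Nudge wavenumbers <= kmax toward the global field."""
    fr = np.fft.rfft2(regional)
    fg = np.fft.rfft2(global_field)
    ky = np.fft.fftfreq(regional.shape[0]) * regional.shape[0]
    kx = np.arange(fr.shape[1])
    mask = (np.abs(ky)[:, None] <= kmax) & (kx[None, :] <= kmax)
    fr[mask] = (1 - gamma) * fr[mask] + gamma * fg[mask]
    return np.fft.irfft2(fr, s=regional.shape)
```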
Sensitivity analysis of a wing aeroelastic response
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.
1991-01-01
A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
Analysis of Vlbi, Slr and GPS Site Position Time Series
NASA Astrophysics Data System (ADS)
Angermann, D.; Krügel, M.; Meisel, B.; Müller, H.; Tesmer, V.
Conventionally the IERS terrestrial reference frame (ITRF) is realized by the adoption of a set of epoch coordinates and linear velocities for a set of global tracking stations. Due to the remarkable progress of the space geodetic observation techniques (e.g., VLBI, SLR, GPS), the accuracy and consistency of the ITRF have increased continuously. The accuracy achieved today is mainly limited by technique-related systematic errors, which are often poorly characterized or quantified. Therefore it is essential to analyze the individual techniques' solutions with respect to systematic differences, models, parameters, datum definition, etc. The main subject of this presentation is the analysis of GPS, SLR and VLBI time series of site positions. The investigations are based on SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS time series are based on weekly IGS station coordinate solutions. We analyze the time series with respect to the issues mentioned above. In particular, we characterize the noise in the time series, identify periodic signals, and investigate non-linear effects that complicate the assignment of linear velocities for global tracking sites. One important aspect is the comparison of results obtained by different techniques at co-location sites.
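A common first screening model for such series, fit by least squares, is offset plus trend plus annual and semiannual harmonics; the sketch below is generic, not DGFI's processing.

```python
# Sketch: least-squares fit of offset, trend, and annual plus semiannual
# harmonics to one station-position component, a common first model when
# screening geodetic time series for periodic signals.
import numpy as np

def fit_position_series(t_years, pos_mm):
    w = 2.0 * np.pi                       # annual angular frequency, rad/yr
    X = np.column_stack([np.ones_like(t_years), t_years,
                         np.sin(w*t_years), np.cos(w*t_years),
                         np.sin(2*w*t_years), np.cos(2*w*t_years)])
    coef, *_ = np.linalg.lstsq(X, pos_mm, rcond=None)
    residuals = pos_mm - X @ coef
    return coef, residuals   # coef[1] is the linear velocity, mm/yr
```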
Pneumothorax detection in chest radiographs using local and global texture signatures
NASA Astrophysics Data System (ADS)
Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit
2015-03-01
A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: at first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The presented global representation is designed based on the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state of the art texture feature sets were experimented with (Local Binary Patterns, Maximum Response filters). The optimal configuration yielded sensitivity of 81% with specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for additional improvements and extensions.
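A hedged sketch of the local-to-global pipeline using LBP texture histograms (one of the feature sets the paper tests) pooled into an image-level descriptor for an SVM; patch extraction and the lung-shape-aware pooling are assumed to happen upstream.

```python
# Sketch: LBP texture histograms from lung patches pooled into one
# image-level descriptor, classified with an SVM. Simplified relative to
# the paper's shape-aware global representation.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def patch_lbp_histogram(patch, P=8, R=1):
    lbp = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def image_descriptor(patches):
    return np.concatenate([patch_lbp_histogram(p) for p in patches])

# Usage (labels and patch lists assumed to exist upstream):
# clf = SVC(kernel="rbf").fit(train_descriptors, train_labels)
```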
Ship Trim Optimization: Assessment of Influence of Trim on Resistance of MOERI Container Ship
Duan, Wenyang
2014-01-01
Environmental issues and rising fuel prices necessitate better energy efficiency in all sectors. The shipping industry is a stakeholder in environmental issues, being responsible for approximately 3% of global CO2 emissions, 14-15% of global NOX emissions, and 16% of global SOX emissions. Ship trim optimization has gained enormous momentum in recent years as an effective operational measure for better energy efficiency and reduced emissions. Ship trim optimization analysis has traditionally been done through tow-tank testing for a specific hullform. Computational techniques are increasingly popular in ship hydrodynamics applications. The purpose of this study is to present MOERI container ship (KCS) hull trim optimization by employing computational methods. KCS hull total resistance and computed trim and sinkage values, in the even-keel condition, are compared with experimental values and found to be in reasonable agreement. The agreement validates that the mesh, boundary conditions, and solution techniques are correct. The same mesh, boundary conditions, and solution techniques are used to obtain resistance values in different trim conditions at Fn = 0.2274. Based on the attained results, an optimum trim is suggested. This research serves as a foundation for employing computational techniques for ship trim optimization.
Sumner, T; Shephard, E; Bogle, I D L
2012-09-07
One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes and a number of key features of the system are identified.
NASA Astrophysics Data System (ADS)
Mitchell, K. E.
2006-12-01
The Environmental Modeling Center (EMC) of the National Centers for Environmental Prediction (NCEP) applies several different analyses of observed precipitation in both the data assimilation and validation components of NCEP's global and regional numerical weather and climate prediction/analysis systems (including in NCEP global and regional reanalysis). This invited talk will survey these data assimilation and validation applications and methodologies, as well as the temporal frequency, spatial domains, spatial resolution, data sources, data density and data quality control in the precipitation analyses that are applied. Some of the precipitation analyses applied by EMC are produced by NCEP's Climate Prediction Center (CPC), while others are produced by the River Forecast Centers (RFCs) of the National Weather Service (NWS), or by automated algorithms of the NWS WSR-88D Radar Product Generator (RPG). Depending on the specific type of application in data assimilation or model forecast validation, the temporal resolution of the precipitation analyses may be hourly, daily, or pentad (5-day) and the domain may be global, continental U.S. (CONUS), or Mexico. The data sources for precipitation include ground-based gauge observations, radar-based estimates, and satellite-based estimates. The precipitation analyses over the CONUS are analyses of either hourly, daily or monthly totals of precipitation, and they are of two distinct types: gauge-only or primarily radar-estimated. The gauge-only CONUS analysis of daily precipitation utilizes an orographic-adjustment technique (based on the well-known PRISM precipitation climatology of Oregon State University) developed by the NWS Office of Hydrologic Development (OHD). The primary NCEP global precipitation analysis is the pentad CPC Merged Analysis of Precipitation (CMAP), which blends both gauge observations and satellite estimates. The presentation will include a brief comparison between the CMAP analysis and other global precipitation analyses by other institutions. Other global precipitation analyses produced by other methodologies are also used by EMC in certain applications, such as CPC's well-known satellite-IR based technique known as "GPI", and satellite-microwave based estimates from NESDIS or NASA. Finally, the presentation will cover the three assimilation methods used by EMC to assimilate precipitation data, including 1) 3D-VAR variational assimilation in NCEP's Global Data Assimilation System (GDAS), 2) direct insertion of precipitation-inferred vertical latent heating profiles in NCEP's N. American Data Assimilation System (NDAS) and its N. American Regional Reanalysis (NARR) counterpart, and 3) direct use of observed precipitation to drive the Noah land model component of NCEP's Global and N. American Land Data Assimilation Systems (GLDAS and NLDAS). In the applications of precipitation analyses in data assimilation at NCEP, the analyses are temporally disaggregated to hourly or less using time-weights calculated from A) either radar-based estimates or an analysis of hourly gauge-observations for the CONUS-domain daily precipitation analyses, or B) global model forecasts of 6-hourly precipitation (followed by linear interpolation to hourly or less) for the global CMAP precipitation analysis.
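The temporal-disaggregation step described at the end can be sketched in a few lines: hourly weights from a radar or gauge proxy redistribute the daily analyzed total while preserving it.

```python
# Sketch of the temporal-disaggregation step: split a 24-h analyzed total
# into hourly amounts using time weights from hourly radar (or gauge)
# estimates, preserving the daily total.
import numpy as np

def disaggregate_daily(daily_total, hourly_proxy):
    """hourly_proxy: 24 nonnegative radar/gauge-based hourly estimates."""
    s = hourly_proxy.sum()
    weights = hourly_proxy / s if s > 0 else np.full(24, 1.0/24.0)
    return daily_total * weights          # sums back to daily_total
```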
NASA Technical Reports Server (NTRS)
Green, R. N.
1981-01-01
The shape factor, parameter estimation, and deconvolution data analysis techniques were applied to the same set of Earth emitted radiation measurements to determine the effects of different techniques on the estimated radiation field. All three techniques are defined and their assumptions, advantages, and disadvantages are discussed. Their results are compared globally, zonally, regionally, and on a spatial spectrum basis. The standard deviations of the regional differences in the derived radiant exitance varied from 7.4 W/m² to 13.5 W/m².
The most remote point method for the site selection of the future GGOS network
NASA Astrophysics Data System (ADS)
Hase, Hayo; Pedreros, Felipe
2014-10-01
The Global Geodetic Observing System (GGOS) proposes 30-40 geodetic observatories as global infrastructure for the most accurate reference frame to monitor global change. To reach this goal, several geodetic observatories have upgrade plans to become GGOS stations. Most initiatives are driven by national institutions following national interests. From a global perspective, the site distribution remains incomplete, and efforts to improve it have so far been insufficient. This article contributes to answering the question of where to install new GGOS observatories and where to add observation techniques to existing observatories. It introduces the iterative most remote point (MRP) method for filling in the largest gaps in existing technique-specific networks. A spherical version of the Voronoi diagram is used to pick the optimal location of the new observatory, while practical concerns determine its realistic location. Once a location is chosen, the process is iterated. Quality and homogeneity parameters of global networks measure the progress toward a more homogeneous global site distribution. The method is applied to the global VGOS network, and to VGOS co-located with SLR, to derive clues about where additional observatory sites, or additional observation techniques at existing observatories, would improve the GGOS network configuration. With only six additional VGOS stations, the homogeneity of the global VGOS network could be significantly improved. From the presented analysis, 25 known or new co-located VGOS and SLR sites are proposed as the future GGOS backbone: Colombo, Easter Island, Fairbanks, Fortaleza, Galapagos, GGAO, Hartebeesthoek, Honiara, Ibadan, Kokee Park, La Plata, Mauritius, McMurdo, Metsähovi, Ny-Ålesund, Riyadh, San Diego, Santa Maria, Shanghai, Syowa, Tahiti, Tristan da Cunha, Warkworth, Wettzell, and Yarragadee.
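A key fact behind the MRP method is that the point on a sphere farthest from its nearest site is always a vertex of the sites' spherical Voronoi diagram. The sketch below illustrates one iteration on a handful of hypothetical station coordinates (the station list and all values are invented for illustration):

```python
import numpy as np
from scipy.spatial import SphericalVoronoi

def lonlat_to_xyz(lon, lat):
    lon, lat = np.radians(lon), np.radians(lat)
    return np.column_stack([np.cos(lat) * np.cos(lon),
                            np.cos(lat) * np.sin(lon),
                            np.sin(lat)])

# Hypothetical existing network (lon, lat in degrees).
stations = lonlat_to_xyz(
    np.array([13.0, -70.7, 115.3, -147.5, 166.7]),
    np.array([49.1, -29.8, -29.0, 65.0, -77.8]))

# The point on the unit sphere farthest from its nearest station is a
# vertex of the stations' spherical Voronoi diagram.
sv = SphericalVoronoi(stations, radius=1.0)
ang_dist = np.arccos(np.clip(sv.vertices @ stations.T, -1.0, 1.0))
remoteness = ang_dist.min(axis=1)       # distance to nearest station
best = sv.vertices[np.argmax(remoteness)]

lon = np.degrees(np.arctan2(best[1], best[0]))
lat = np.degrees(np.arcsin(best[2]))
print(f"most remote point: lon={lon:.1f}, lat={lat:.1f}")
# In the MRP method this point is adjusted for practical constraints,
# added to the network, and the procedure is iterated.
```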
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
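As a rough illustration of the global-analysis idea, the sketch below builds a similarity thesaurus from concept co-occurrence across a toy collection and uses it to expand a query vector. The matrix, weights, and the `expand` helper are hypothetical, not the authors' implementation:

```python
import numpy as np

# Toy concept-by-image matrix of weights (rows: concepts, cols: images).
W = np.random.default_rng(1).random((6, 40))

# Global analysis: a similarity thesaurus from concept co-occurrence
# across the whole collection (cosine similarity between concept rows).
norms = np.linalg.norm(W, axis=1, keepdims=True)
S = (W / norms) @ (W / norms).T

def expand(query, S, top_k=2, alpha=0.5):
    """Add the top_k globally most similar concepts to each query concept."""
    expanded = query.astype(float)
    for c in np.flatnonzero(query):
        neighbors = np.argsort(S[c])[::-1]
        neighbors = [j for j in neighbors if j != c][:top_k]
        expanded[neighbors] += alpha * S[c, neighbors]
    return expanded

query = np.array([1, 0, 0, 0, 0, 0])   # query containing concept 0 only
print(expand(query, S))
```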
NASA Technical Reports Server (NTRS)
Estes, J. E.; Star, J. L.
1986-01-01
The basic understanding of the role of information systems technologies and artificial intelligence techniques in the integration, manipulation, and analysis of remotely sensed data for global scale studies is examined.
Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan
Falconer, Allan; Cross, Matthew D.; Orr, Donald G.
1990-01-01
Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resources and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system (GIS), and global positioning system (GPS) procedures is used in this study to meet these requirements. Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resources maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket-sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.
A collection of flow visualization techniques used in the Aerodynamic Research Branch
NASA Technical Reports Server (NTRS)
1984-01-01
Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that the visualization techniques are a precursor to thorough quantitative analysis and subsequent physical understanding of these flow fields.
Fritzsche, Marco; Fernandes, Ricardo A; Colin-York, Huw; Santos, Ana M; Lee, Steven F; Lagerholm, B Christoffer; Davis, Simon J; Eggeling, Christian
2015-11-13
Detecting intracellular calcium signaling with fluorescent calcium indicator dyes is often coupled with microscopy techniques to follow the activation state of non-excitable cells, including lymphocytes. However, the analysis of global intracellular calcium responses both at the single-cell level and in large ensembles simultaneously has yet to be automated. Here, we present a new software package, CalQuo (Calcium Quantification), which allows the automated analysis and simultaneous monitoring of global fluorescent calcium reporter-based signaling responses in up to 1000 single cells per experiment, at temporal resolutions of sub-seconds to seconds. CalQuo quantifies the number and fraction of responding cells, the temporal dependence of calcium signaling and provides global and individual calcium-reporter fluorescence intensity profiles. We demonstrate the utility of the new method by comparing the calcium-based signaling responses of genetically manipulated human lymphocytic cell lines.
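CalQuo's exact response criteria are not spelled out in this abstract, but the core computation such tools automate can be sketched as below: normalize each cell's trace to its baseline and count cells whose peak ΔF/F0 exceeds a threshold. All numbers and the threshold rule are illustrative assumptions:

```python
import numpy as np

def fraction_responding(traces, baseline_frames=20, threshold=0.5):
    """traces: cells x frames array of calcium-reporter fluorescence.

    A cell "responds" if its peak dF/F0 after the baseline window
    exceeds the threshold (criterion chosen here for illustration).
    """
    f0 = traces[:, :baseline_frames].mean(axis=1, keepdims=True)
    dff = (traces - f0) / f0
    responded = dff[:, baseline_frames:].max(axis=1) > threshold
    return responded.mean(), dff

rng = np.random.default_rng(2)
traces = 100 + rng.normal(0, 1, size=(1000, 300))
traces[:400, 150:] += 120            # 40% of cells flux calcium
frac, _ = fraction_responding(traces)
print(f"responding fraction: {frac:.2f}")
```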
Iterative Monte Carlo analysis of spin-dependent parton distributions
Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; ...
2016-04-05
We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.
Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...
2015-09-10
We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
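At its core, DIC recovers displacements from the peak of a cross-correlation between image subsets. A minimal, integer-pixel sketch of that idea (FFT-based, and without the subpixel refinement real DIC codes add) follows; the function name and test images are invented:

```python
import numpy as np

def dic_displacement(ref, cur):
    """Integer-pixel displacement of `cur` relative to `ref`, from the
    peak of their FFT-based circular cross-correlation."""
    ref = ref - ref.mean()
    cur = cur - cur.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(cur)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices above N/2 correspond to negative shifts (FFT wrap-around).
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(3)
ref = rng.random((64, 64))
cur = np.roll(ref, shift=(3, -5), axis=(0, 1))   # impose a known shift
print(dic_displacement(ref, cur))                # -> (3, -5)
```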
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
Why Is Rainfall Error Analysis Requisite for Data Assimilation and Climate Modeling?
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.
2004-01-01
Given the large temporal and spatial variability of precipitation processes, errors in rainfall observations are difficult to quantify yet crucial to making effective use of rainfall data for improving atmospheric analysis, weather forecasting, and climate modeling. We highlight the need for developing a quantitative understanding of systematic and random errors in precipitation observations by examining explicit examples of how each type of errors can affect forecasts and analyses in global data assimilation. We characterize the error information needed from the precipitation measurement community and how it may be used to improve data usage within the general framework of analysis techniques, as well as accuracy requirements from the perspective of climate modeling and global data assimilation.
Advanced study of global oceanographic requirements for EOS A/B: Appendix volume
NASA Technical Reports Server (NTRS)
1972-01-01
Tables and graphs are presented for a review of oceanographic studies using satellite-borne instruments. The topics considered include sensor requirements, error analysis for wind determination from glitter pattern measurements, coverage frequency plots, ground station rise and set times, a technique for reduction and analysis of ocean spectral data, rationale for the selection of a 2 PM descending orbit, and a priority analysis.
Application of optical correlation techniques to particle imaging velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1988-01-01
Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended 2-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSFs) are used in a pattern recognition system. MSFs are usually used to identify assembly-line parts. In this application, the MSFs are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. A global analysis technique is used in comparison to the more common particle tracking algorithms and Young's fringe analysis technique.
Finite Volume Numerical Methods for Aeroheating Rate Calculations from Infrared Thermographic Data
NASA Technical Reports Server (NTRS)
Daryabeigi, Kamran; Berry, Scott A.; Horvath, Thomas J.; Nowak, Robert J.
2006-01-01
The use of multi-dimensional finite volume heat conduction techniques for calculating aeroheating rates from measured global surface temperatures on hypersonic wind tunnel models was investigated. Both direct and inverse finite volume techniques were investigated and compared with the standard one-dimensional semi-infinite technique. Global transient surface temperatures were measured using an infrared thermographic technique on a 0.333-scale model of the Hyper-X forebody in the NASA Langley Research Center 20-Inch Mach 6 Air tunnel. In these tests the effectiveness of vortices generated via gas injection for initiating hypersonic transition on the Hyper-X forebody was investigated. An array of streamwise-orientated heating striations was generated and visualized downstream of the gas injection sites. In regions without significant spatial temperature gradients, one-dimensional techniques provided accurate aeroheating rates. In regions with sharp temperature gradients caused by the striation patterns, multi-dimensional heat transfer techniques were necessary to obtain accurate heating rates. The one-dimensional technique produced differences of 20% in the calculated heating rates relative to the two-dimensional analysis because it did not account for lateral heat conduction in the model.
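For context, the baseline one-dimensional semi-infinite technique is commonly implemented with a Cook-Felderman-type discretization of the surface heat flux from the measured temperature history. A standard form (not necessarily the exact variant used in this work, with ρ, c, and k the model material's density, specific heat, and thermal conductivity, and T_i the surface temperature at time t_i) is:

```latex
q_s(t_n) = 2\sqrt{\frac{\rho c k}{\pi}}
  \sum_{i=1}^{n} \frac{T_i - T_{i-1}}{\sqrt{t_n - t_i} + \sqrt{t_n - t_{i-1}}}
```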
Electromagnetic studies of global geodynamic processes
NASA Astrophysics Data System (ADS)
Tarits, Pascal
1994-03-01
The deep electromagnetic sounding (DES) technique is one of the few geophysical methods, along with seismology, gravity, and heat flow, which may be used to probe the structure of the Earth's mantle directly. The interpretation of DES data may provide electrical conductivity profiles down to the upper part of the lower mantle. The electrical conductivity is extremely sensitive to most of the thermodynamic processes we believe are acting in the Earth's mantle (temperature increases, partial melting, phase transitions, and, to a lesser extent, pressure). Therefore, in principle, results from DES along with laboratory measurements could be used to constrain models of these processes. The DES technique is reviewed in the light of recent results obtained in a variety of domains: data acquisition and analysis, global induction modeling, and data inversion and interpretation. The mechanisms and the importance of surface distortions of the DES data are reviewed and techniques to model them are discussed. The recent results in terms of the conductivity distribution in the mantle from local and global DES are presented and a tentative synthesis is proposed. The geodynamic interpretations of the deep conductivity structures are reviewed. The existence of mantle lateral heterogeneities in conductivity at all scales and depths for which electromagnetic data are available is now well documented. A comparison with global results from seismology is presented.
NASA Astrophysics Data System (ADS)
Wang, Audrey; Price, David T.
2007-03-01
A simple integrated algorithm was developed to relate global climatology to distributions of tree plant functional types (PFTs). Multivariate cluster analysis was performed to analyze the statistical homogeneity of the climate space occupied by individual tree PFTs. Forested regions identified from the satellite-based GLC2000 classification were separated into tropical, temperate, and boreal sub-PFTs for use in the Canadian Terrestrial Ecosystem Model (CTEM). Global data sets of monthly minimum temperature, growing degree days, an index of climatic moisture, and estimated PFT cover fractions were then used as variables in the cluster analysis. The statistical results for individual PFT clusters were found to be consistent with other global-scale classifications of dominant vegetation. Improving on simpler quantifications of the climatic limitations on PFT distributions, the results also demonstrated overlap of PFT cluster boundaries that reflects vegetation transitions, for example, between tropical and temperate biomes. The resulting global database should provide a better basis for simulating the interaction of climate change and terrestrial ecosystem dynamics using global vegetation models.
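The clustering step can be illustrated in a few lines. The sketch below runs k-means on a toy climate space built from the same three kinds of variables named above; the numbers, library choice, and cluster count are illustrative assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Toy climate space for forested grid cells: columns are minimum monthly
# temperature (C), growing degree-days, and a moisture index.
boreal    = rng.normal([-25, 1000, 0.6], [5, 200, 0.1], size=(300, 3))
temperate = rng.normal([ -5, 2500, 0.5], [4, 300, 0.1], size=(300, 3))
tropical  = rng.normal([ 18, 6000, 0.7], [3, 500, 0.1], size=(300, 3))
X = np.vstack([boreal, temperate, tropical])

# Standardize so growing degree-days does not dominate the distance
# metric, then cluster.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xz)

# Cluster centroids back in physical units approximate the climate
# envelopes of the three PFT groups.
for c in range(3):
    print(c, X[labels == c].mean(axis=0).round(1))
```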
Global Study of the Simple Pendulum by the Homotopy Analysis Method
ERIC Educational Resources Information Center
Bel, A.; Reartes, W.; Torresi, A.
2012-01-01
Techniques are developed to find all periodic solutions in the simple pendulum by means of the homotopy analysis method (HAM). This involves the solution of the equations of motion in two different coordinate representations. Expressions are obtained for the cycles and periods of oscillations with a high degree of accuracy in the whole range of…
Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman
2010-01-01
Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...
NASA Technical Reports Server (NTRS)
Li, Jing; Carlson, Barbara E.; Lacis, Andrew A.
2013-01-01
Many remote sensing techniques and passive sensors have been developed to measure global aerosol properties. While instantaneous comparisons between pixel-level data often reveal quantitative differences, here we use Empirical Orthogonal Function (EOF) analysis, also known as Principal Component Analysis, to demonstrate that satellite-derived aerosol optical depth (AOD) data sets exhibit essentially the same spatial and temporal variability and are thus suitable for large-scale studies. Analysis results show that the first four EOF modes of AOD account for the bulk of the variance and agree well across the four data sets used in this study (i.e., Aqua MODIS, Terra MODIS, MISR, and SeaWiFS). Only SeaWiFS data over land have slightly different EOF patterns. Globally, the first two EOF modes show annual cycles and are mainly related to Sahara dust in the northern hemisphere and biomass burning in the southern hemisphere, respectively. After removing the mean seasonal cycle from the data, major aerosol sources, including biomass burning in South America and dust in West Africa, are revealed in the dominant modes due to the different interannual variability of aerosol emissions. The enhancement of biomass burning associated with El Niño over Indonesia and central South America is also captured with the EOF technique.
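EOF analysis as used here is, computationally, an SVD of the mean-removed time-by-space data matrix. A minimal sketch on synthetic data (dimensions and the imposed annual cycle are invented) shows the pieces that are compared across the four data sets, namely the spatial modes, their time series, and the variance fractions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy AOD anomaly data: 120 monthly fields on a flattened 10x20 grid.
ntime, nspace = 120, 200
months = np.arange(ntime)
pattern = rng.random(nspace)                  # a fixed spatial pattern
data = (np.sin(2 * np.pi * months / 12)[:, None] * pattern
        + 0.1 * rng.normal(size=(ntime, nspace)))

# EOF analysis = SVD of the time-by-space anomaly matrix.
anom = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt                  # rows: spatial patterns (EOFs)
pcs = U * s                # columns: principal-component time series
var_frac = s**2 / np.sum(s**2)
print("EOF1 explains", round(100 * var_frac[0], 1), "% of the variance")
# pcs[:, 0] recovers the annual cycle imposed above; comparing eofs and
# pcs across data sets is the essence of this study's intercomparison.
```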
NASA Technical Reports Server (NTRS)
Mccandless, S. W.; Miller, B. P.
1974-01-01
The SEASAT satellite system is planned as a user-oriented system for timely monitoring of global ocean dynamics and mapping the global ocean geoid. The satellite instrumentation and modular concept are discussed. Operational data capabilities will include oceanographic data services, direct satellite read-out to users, and conversational retrieval and analysis of stored data. A case-study technique, generalized through physical and econometric modeling, indicates potential economic benefit from SEASAT to users in the following areas: ship routing, iceberg reconnaissance, arctic operations, Alaska pipeline ship link, and off-shore oil production.
Global sensing of gaseous and aerosol trace species using automated instrumentation on 747 airliners
NASA Technical Reports Server (NTRS)
Perkins, P. J.; Papathakos, L. C.
1978-01-01
The Global Atmospheric Sampling Program (GASP) is collecting and analyzing data on gaseous and aerosol trace contaminants in the upper troposphere and lower stratosphere. Measurements are obtained from automated systems installed on four 747 airliners flying global air routes. Improved instruments and analysis techniques are providing an expanding data base for trace species including ozone, carbon monoxide, water vapor, condensation nuclei, and mass concentration of sulfates and nitrates. Simultaneous measurements of several trace species obtained frequently can be used to identify the source of the air mass as being typically tropospheric or stratospheric.
Analysis of local delaminations caused by angle ply matrix cracks
NASA Technical Reports Server (NTRS)
Salpekar, Satish A.; O'Brien, T. Kevin; Shivakumar, K. N.
1993-01-01
Two different families of graphite/epoxy laminates with similar layups but different stacking sequences, (0/θ/−θ)s and (−θ/θ/0)s, were analyzed using three-dimensional finite element analysis for θ = 15 and 30 degrees. Delaminations were modeled in the −θ/θ interface, bounded by a matrix crack and the stress-free edge. The total strain energy release rate, G, along the delamination front was computed using three different techniques: the virtual crack closure technique (VCCT), the equivalent domain integral (EDI) technique, and a global energy balance technique. The opening-mode (mode I) component of the strain energy release rate, GI, along the delamination front was also computed for various delamination lengths using VCCT. The effect of residual thermal and moisture stresses on G was evaluated.
Sources and implications of whole-brain fMRI signals in humans
Power, Jonathan D; Plitt, Mark; Laumann, Timothy O; Martin, Alex
2016-01-01
Whole-brain fMRI signals are a subject of intense interest: variance in the global fMRI signal (the spatial mean of all signals in the brain) indexes subject arousal, and psychiatric conditions such as schizophrenia and autism have been characterized by differences in the global fMRI signal. Further, vigorous debates exist on whether global signals ought to be removed from fMRI data. However, surprisingly little research has focused on the empirical properties of whole-brain fMRI signals. Here we map the spatial and temporal properties of the global signal, individually, in 1000+ fMRI scans. Variance in the global fMRI signal is strongly linked to head motion, to hardware artifacts, and to respiratory patterns and their attendant physiologic changes. Many techniques used to prepare fMRI data for analysis fail to remove these uninteresting kinds of global signal fluctuations. Thus, many studies include, at the time of analysis, prominent global effects of yawns, breathing changes, and head motion, among other signals. Such artifacts will mimic dynamic neural activity and will spuriously alter signal covariance throughout the brain. Methods capable of isolating and removing global artifactual variance while preserving putative “neural” variance are needed; this paper adopts no position on the topic of global signal regression. PMID:27751941
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Huaying; Schuck, Peter
2015-01-01
Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in the experimental design.
Parallel algorithms for placement and routing in VLSI design. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Brouwer, Randall Jay
1991-01-01
The computational requirements for high quality synthesis, analysis, and verification of very large scale integration (VLSI) designs have rapidly increased with the fast growing complexity of these designs. Research in the past has focused on the development of heuristic algorithms, special purpose hardware accelerators, or parallel algorithms for the numerous design tasks to decrease the time required for solution. Two new parallel algorithms are proposed for two VLSI synthesis tasks, standard cell placement and global routing. The first algorithm, a parallel algorithm for global routing, uses hierarchical techniques to decompose the routing problem into independent routing subproblems that are solved in parallel. Results are then presented which compare the routing quality to the results of other published global routers and which evaluate the speedups attained. The second algorithm, a parallel algorithm for cell placement and global routing, hierarchically integrates a quadrisection placement algorithm, a bisection placement algorithm, and the previous global routing algorithm. Unique partitioning techniques are used to decompose the various stages of the algorithm into independent tasks which can be evaluated in parallel. Finally, results are presented which evaluate the various algorithm alternatives and compare the algorithm performance to other placement programs. Measurements are presented on the parallel speedups available.
Comparative and Quantitative Global Proteomics Approaches: An Overview
Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis
2013-01-01
Proteomics became a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge amount of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify, and identify all the proteins from a given sample. This review highlights only the different techniques of separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins will be discussed, as well as the corresponding mass spectrometry technology. The overview is focused on the widespread techniques, while keeping in mind that each approach is modular and the approaches often overlap. PMID:28250403
Efficient Reformulation of the Thermoelastic Higher-order Theory for Fgms
NASA Technical Reports Server (NTRS)
Bansal, Yogesh; Pindera, Marek-Jerzy; Arnold, Steven M. (Technical Monitor)
2002-01-01
Functionally graded materials (FGMs) are characterized by spatially variable microstructures which are introduced to satisfy given performance requirements. The microstructural gradation gives rise to continuously or discretely changing material properties which complicate FGM analysis. Various techniques have been developed during the past several decades for analyzing traditional composites and many of these have been adapted for the analysis of FGMs. Most of the available techniques use the so-called uncoupled approach in order to analyze graded structures. These techniques ignore the effect of microstructural gradation by employing specific spatial material property variations that are either assumed or obtained by local homogenization. The higher-order theory for functionally graded materials (HOTFGM) is a coupled approach developed by Aboudi et al. (1999) which takes the effect of microstructural gradation into consideration and does not ignore the local-global interaction of the spatially variable inclusion phase(s). Despite its demonstrated utility, however, the original formulation of the higher-order theory is computationally intensive. Herein, an efficient reformulation of the original higher-order theory for two-dimensional elastic problems is developed and validated. The local-global conductivity and local-global stiffness matrix approach is used to reduce the number of equations involved. In this approach, surface-averaged quantities are the primary variables which replace volume-averaged quantities employed in the original formulation. The reformulation decreases the size of the global conductivity and stiffness matrices by approximately sixty percent. Various thermal, mechanical, and combined thermomechanical problems are analyzed in order to validate the accuracy of the reformulated theory through comparison with analytical and finite-element solutions. The presented results illustrate the efficiency of the reformulation and its advantages in analyzing functionally graded materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Squire, J.; Bhattacharjee, A.
2014-12-10
We study magnetorotational instability (MRI) using nonmodal stability techniques. Despite the spectral instability of many forms of MRI, this proves to be a natural method of analysis that is well-suited to deal with the non-self-adjoint nature of the linear MRI equations. We find that the fastest growing linear MRI structures on both local and global domains can look very different from the eigenmodes, invariably resembling waves shearing with the background flow (shear waves). In addition, such structures can grow many times faster than the least stable eigenmode over long time periods, and be localized in a completely different region of space. These ideas lead, for both axisymmetric and non-axisymmetric modes, to a natural connection between the global MRI and the local shearing box approximation. By illustrating that the fastest growing global structure is well described by the ordinary differential equations (ODEs) governing a single shear wave, we find that the shearing box is a very sensible approximation for the linear MRI, contrary to many previous claims. Since the shear wave ODEs are most naturally understood using nonmodal analysis techniques, we conclude by analyzing local MRI growth over finite timescales using these methods. The strong growth over a wide range of wave-numbers suggests that nonmodal linear physics could be of fundamental importance in MRI turbulence.
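The central nonmodal computation, the maximum transient amplification attainable under a non-normal linear operator, reduces to the top singular value of the matrix exponential. The sketch below demonstrates it on a toy 2x2 non-normal system (the matrix is invented; it is not the MRI operator):

```python
import numpy as np
from scipy.linalg import expm

# A non-normal stable matrix: both eigenvalues decay, yet transient
# growth occurs because the eigenvectors are nearly parallel.
A = np.array([[-0.1, 5.0],
              [ 0.0, -0.2]])

def max_growth(A, t):
    """Largest possible energy amplification at time t over all
    initial conditions: the squared top singular value of exp(A t)."""
    return np.linalg.svd(expm(A * t), compute_uv=False)[0] ** 2

for t in [0.0, 2.0, 5.0, 10.0, 40.0]:
    print(f"t={t:5.1f}  G(t)={max_growth(A, t):8.2f}")
# G(t) >> 1 at intermediate times even though every eigenmode decays:
# the essence of nonmodal (transient) growth.
```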
Recent developments in fast spectroscopy for plant mineral analysis
van Maarschalkerweerd, Marie; Husted, Søren
2015-01-01
Ideal fertilizer management to optimize plant productivity and quality is more relevant than ever, as global food demands increase along with the rapidly growing world population. At the same time, sub-optimal or excessive use of fertilizers leads to severe environmental damage in areas of intensive crop production. The approaches of soil and plant mineral analysis are briefly compared and discussed here, and the new techniques using fast spectroscopy that offer cheap, rapid, and easy-to-use analysis of plant nutritional status are reviewed. The majority of these methods use vibrational spectroscopy, such as visual-near infrared and to a lesser extent ultraviolet and mid-infrared spectroscopy. Advantages of and problems with application of these techniques are thoroughly discussed. Spectroscopic techniques considered having major potential for plant mineral analysis, such as chlorophyll a fluorescence, X-ray fluorescence, and laser-induced breakdown spectroscopy are also described. PMID:25852719
Remote sensing and GIS technology in the Global Land Ice Measurements from Space (GLIMS) Project
Raup, B.; Kääb, Andreas; Kargel, J.S.; Bishop, M.P.; Hamilton, G.; Lee, E.; Paul, F.; Rau, F.; Soltesz, D.; Khalsa, S.J.S.; Beedle, M.; Helm, C.
2007-01-01
Global Land Ice Measurements from Space (GLIMS) is an international consortium established to acquire satellite images of the world's glaciers, analyze them for glacier extent and changes, and to assess these change data in terms of forcings. The consortium is organized into a system of Regional Centers, each of which is responsible for glaciers in their region of expertise. Specialized needs for mapping glaciers in a distributed analysis environment require considerable work developing software tools: terrain classification emphasizing snow, ice, water, and admixtures of ice with rock debris; change detection and analysis; visualization of images and derived data; interpretation and archival of derived data; and analysis to ensure consistency of results from different Regional Centers. A global glacier database has been designed and implemented at the National Snow and Ice Data Center (Boulder, CO); parameters have been expanded from those of the World Glacier Inventory (WGI), and the database has been structured to be compatible with (and to incorporate) WGI data. The project as a whole was originated, and has been coordinated by, the US Geological Survey (Flagstaff, AZ), which has also led the development of an interactive tool for automated analysis and manual editing of glacier images and derived data (GLIMSView). This article addresses remote sensing and Geographic Information Science techniques developed within the framework of GLIMS in order to fulfill the goals of this distributed project. Sample applications illustrating the developed techniques are also shown.
Multi objective climate change impact assessment using multi downscaled climate scenarios
NASA Astrophysics Data System (ADS)
Rana, Arun; Moradkhani, Hamid
2016-04-01
Global Climate Model (GCM) projections are often downscaled to provide climate parameters at regional scales. In the present study, we analyzed the changes in precipitation and temperature for the future scenario period of 2070-2099, relative to the historical period of 1970-2000, from a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis used two different statistically downscaled climate projections, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, for a total of 40 scenarios. The analysis considered spatial, temporal, and frequency-based parameters in the future period at a scale of 1/16th of a degree for the entire CRB region. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation shows higher variability than summer precipitation, and the reverse holds for temperature. Frequency analysis provided insight into possible explanations for the changes in precipitation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwadron, N. A.; Moebius, E.; Kucharek, H.
2014-11-01
The Interstellar Boundary Explorer (IBEX) observes the IBEX ribbon, which stretches across much of the sky observed in energetic neutral atoms (ENAs). The ribbon covers a narrow (∼20°-50°) region that is believed to be roughly perpendicular to the interstellar magnetic field. Superimposed on the IBEX ribbon is the globally distributed flux that is controlled by the processes and properties of the heliosheath. This is a second study that utilizes a previously developed technique to separate ENA emissions in the ribbon from the globally distributed flux. A transparency mask is applied over the ribbon and regions of high emissions. We then solve for the globally distributed flux using an interpolation scheme. Previously, ribbon separation techniques were applied to the first year of IBEX-Hi data at and above 0.71 keV. Here we extend the separation analysis down to 0.2 keV and to five years of IBEX data, enabling first maps of the ribbon and the globally distributed flux across the full sky of ENA emissions. Our analysis shows the broadening of the ribbon peak at energies below 0.71 keV and demonstrates the apparent deformation of the ribbon in the nose and heliotail. We show global asymmetries of the heliosheath, including both deflection of the heliotail and differing widths of the lobes, in context of the direction, draping, and compression of the heliospheric magnetic field. We discuss implications of the ribbon maps for the wide array of concepts that attempt to explain the ribbon's origin. Thus, we present the five-year separation of the IBEX ribbon from the globally distributed flux in preparation for a formal IBEX data release of ribbon and globally distributed flux maps to the heliophysics community.
Modified GMDH-NN algorithm and its application for global sensitivity analysis
NASA Astrophysics Data System (ADS)
Song, Shufang; Wang, Lu
2017-11-01
Global sensitivity analysis (GSA) is a very useful tool to evaluate the influence of input variables over their whole distribution range. The Sobol' method is the most commonly used among variance-based methods, which are efficient and popular GSA techniques. High dimensional model representation (HDMR) is a popular way to compute Sobol' indices; however, its drawbacks cannot be ignored. We show that a modified GMDH-NN algorithm can calculate the coefficients of the metamodel efficiently, so this paper combines it with HDMR and proposes the GMDH-HDMR method. The new method shows higher precision and a faster convergence rate. Several numerical and engineering examples are used to confirm its advantages.
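For readers unfamiliar with the quantities being metamodeled, the sketch below estimates first-order Sobol' indices directly by Monte Carlo with the standard Saltelli sampling scheme (no HDMR or GMDH-NN metamodel involved; the test function and sample sizes are illustrative):

```python
import numpy as np

def sobol_first_order(f, d, n=2 ** 14, seed=0):
    """Monte Carlo estimate of first-order Sobol' indices for a model
    f acting on rows of inputs uniform on [0, 1]^d (Saltelli scheme)."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]           # replace only column i
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Ishigami-style test function: its analytic indices are known, which
# makes the estimator easy to check.
def ishigami(X, a=7.0, b=0.1):
    x = np.pi * (2 * X - 1)
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

print(sobol_first_order(ishigami, d=3).round(3))  # ~ [0.31, 0.44, 0.00]
```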
Gravity anomaly map of Mars and Moon and analysis of Venus gravity field: New analysis procedures
NASA Technical Reports Server (NTRS)
1984-01-01
The technique of harmonic splines allows direct estimation of a complete planetary gravity field (geoid, gravity, and gravity gradients) everywhere over the planet's surface. Harmonic spline results for Venus are presented as a series of maps at spacecraft and constant altitudes. Global (except for polar regions) and local relations of gravity to topography are described.
Prediction of advertisement preference by fusing EEG response and sentiment analysis.
Gauba, Himaanshu; Kumar, Pradeep; Roy, Partha Pratim; Singh, Priyanka; Dogra, Debi Prosad; Raman, Balasubramanian
2017-08-01
This paper presents a novel approach to predict the rating of video advertisements based on a multimodal framework combining physiological analysis of the user and the global sentiment rating available on the internet. We have fused Electroencephalogram (EEG) waves of the user and the corresponding global textual comments on the video to understand the user's preference more precisely. In our framework, the users were asked to watch the video advertisement while EEG signals were simultaneously recorded. Valence scores were obtained using self-report for each video. A higher valence corresponds to greater intrinsic attractiveness for the user. Furthermore, the multimedia data, comprising the comments posted by global viewers, were retrieved and processed using Natural Language Processing (NLP) techniques for sentiment analysis. Textual content from the review comments was analyzed to obtain a score reflecting the sentiment of the video. A regression technique based on Random Forest was used to predict the rating of an advertisement from the EEG data. Finally, the EEG-based rating is combined with the NLP-based sentiment score to improve the overall prediction. The study was carried out using 15 video clips of advertisements available online, and twenty-five participants were involved in evaluating our proposed system. The results are encouraging and suggest that the proposed multimodal approach can achieve lower RMSE in rating prediction than prediction using only EEG data.
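The fusion step can be sketched as a feature-level combination: append the sentiment score to the EEG features and fit the same Random Forest regressor. Everything below (feature counts, the synthetic relationship, the cross-validation helper) is invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)

# Toy data: EEG band-power features per viewing, plus one sentiment
# score per advertisement; target is the self-reported valence rating.
n = 375                                      # e.g., 25 viewers x 15 ads
eeg = rng.normal(size=(n, 8))                # 8 hypothetical band powers
sentiment = rng.uniform(-1, 1, size=(n, 1))
rating = 3 + eeg[:, 0] + 1.5 * sentiment.ravel() + rng.normal(0, 0.3, n)

def cv_rmse(X, y, k=5):
    """Simple k-fold RMSE for a random-forest rating predictor."""
    idx = np.arange(len(y)) % k
    errs = []
    for fold in range(k):
        tr, te = idx != fold, idx == fold
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X[tr], y[tr])
        errs.append(np.sqrt(np.mean((model.predict(X[te]) - y[te]) ** 2)))
    return float(np.mean(errs))

print("EEG only  RMSE:", round(cv_rmse(eeg, rating), 3))
print("EEG + NLP RMSE:", round(cv_rmse(np.hstack([eeg, sentiment]), rating), 3))
```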
Flight motor set 360L001 (STS-26R). (Reconstructed dynamic loads analysis)
NASA Technical Reports Server (NTRS)
Call, V. B.
1989-01-01
A transient analysis was performed to correlate the predicted versus measured behavior of the Redesigned Solid Rocket Booster (RSRB) during Flight 360L001 (STS-26R) liftoff. Nine accelerometers, 152 strain gages, and 104 girth gages were bonded to the motors during this event. Prior to Flight 360L001, a finite element model of the RSRB was analyzed to predict the accelerations, strains, and displacements measured by this developmental flight instrumentation (DFI) within an order of magnitude. Subsequently, an analysis was performed which uses the actual Flight 360L001 liftoff loading conditions and makes more precise predictions of the RSRB structural behavior. Essential information describing the analytical model, the analytical techniques used, the correlation of predicted versus measured RSRB behavior, and conclusions is presented. A detailed model of the RSRB was developed and correlated for use in analyzing the motor behavior during liftoff loading conditions. This finite element model, referred to as the RSRB global model, uses super-element techniques to model all components of the RSRB. The objective of the RSRB global model is to accurately predict deflections and gap openings in the field joints to an accuracy of approximately 0.001 inch. The model of the field joint component was correlated to Referee and Joint Environment Simulation (JES) tests. The accuracy of the assembled RSRB global model was validated by correlation to static-fire tests such as DM-8, DM-9, QM-7, and QM-8. This validated RSRB global model was used to predict RSRB structural behavior and joint gap opening during Flight 360L001 liftoff. The results of a transient analysis of the RSRB global model with imposed liftoff loading conditions are presented. Rockwell used many gage measurements to reconstruct the load parameters which were imposed on the RSRB during the Flight 360L001 liftoff. Each load parameter, and its application, is described. Also presented are conclusions and recommendations based on the analysis of this load case and the resulting correlation between predicted and measured RSRB structural behavior.
Determination of Earth orientation using the Global Positioning System
NASA Technical Reports Server (NTRS)
Freedman, A. P.
1989-01-01
Modern spacecraft tracking and navigation require highly accurate Earth-orientation parameters. For near-real-time applications, errors in these quantities and their extrapolated values are a significant error source. A globally distributed network of high-precision receivers observing the full Global Positioning System (GPS) configuration of 18 or more satellites may be an efficient and economical method for the rapid determination of short-term variations in Earth orientation. A covariance analysis using the JPL Orbit Analysis and Simulation Software (OASIS) was performed to evaluate the errors associated with GPS measurements of Earth orientation. These GPS measurements appear to be highly competitive with those from other techniques and can potentially yield frequent and reliable centimeter-level Earth-orientation information while simultaneously allowing the oversubscribed Deep Space Network (DSN) antennas to be used more for direct project support.
Simulation studies of wide and medium field of view earth radiation data analysis
NASA Technical Reports Server (NTRS)
Green, R. N.
1978-01-01
A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.
Global tropospheric chemistry: Chemical fluxes in the global atmosphere
NASA Technical Reports Server (NTRS)
Lenschow, Donald H. (Editor); Hicks, Bruce B. (Editor)
1989-01-01
In October 1987, NSF, NASA, and NOAA jointly sponsored a workshop at Columbia University to assess the experimental tools and analysis procedures in use and under development to measure and understand gas and particle fluxes across this critical air-surface boundary. Results are presented for that workshop. It is published to summarize the present understanding of the various measurement techniques that are available, identify promising new technological developments for improved measurements, and stimulate thinking about this important measurement challenge.
NASA Technical Reports Server (NTRS)
Page, Lance; Shen, C. N.
1991-01-01
This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
Anantha M. Prasad; Louis R. Iverson; Andy Liaw; Andy Liaw
2006-01-01
We evaluated four statistical models - Regression Tree Analysis (RTA), Bagging Trees (BT), Random Forests (RF), and Multivariate Adaptive Regression Splines (MARS) - for predictive vegetation mapping under current and future climate scenarios according to the Canadian Climate Centre global circulation model.
NASA Technical Reports Server (NTRS)
Larsen, K. W.; Arvidson, R. E.; Jolliff, B. L.; Clark, B. C.
2000-01-01
Correspondence and Least Squares Mixing Analysis techniques are applied to the chemical composition of Viking 1 soils and Pathfinder rocks and soils. Implications for the parent composition of local and global materials are discussed.
GLOBAL CLIMATE CHANGE AND ITS IMPACT ON DISEASE IMBEDDED IN ECOLOGICAL COMMUNITIES
We present the techniques of qualitative analysis of complex communities and discuss the impact of climate change as a press perturbation. In particular, we focus on the difficult problem of disease and parasites embedded in animal communities, notably zoonotic diseases. Climate ...
NASA Astrophysics Data System (ADS)
Casson, David; Werner, Micha; Weerts, Albrecht; Schellekens, Jaap; Solomatine, Dimitri
2017-04-01
Hydrological modelling in the Canadian Sub-Arctic is hindered by the limited spatial and temporal coverage of local meteorological data. Local watershed modelling often relies on data from a sparse network of meteorological stations, with a rough density of 3 active stations per 100,000 km². Global datasets hold great promise for application due to their more comprehensive spatial and extended temporal coverage. A key objective of this study is to demonstrate the application of global datasets and data assimilation techniques for hydrological modelling of a data-sparse, Sub-Arctic watershed. Application of available datasets and modelling techniques is currently limited in practice due to a lack of local capacity and understanding of available tools. Due to the importance of snow processes in the region, this study also aims to evaluate the performance of global SWE products for snowpack modelling. The Snare Watershed is a 13,300 km², snowmelt-driven sub-basin of the Mackenzie River Basin, Northwest Territories, Canada. The Snare watershed is data sparse in terms of meteorological data, but is well gauged, with consistent discharge records since the late 1970s. End-of-winter snowpack surveys have been conducted every year from 1978 to present. The application of global reanalysis datasets from the EU FP7 eartH2Observe project is investigated in this study. Precipitation data are taken from Multi-Source Weighted-Ensemble Precipitation (MSWEP) and temperature data from the WATCH Forcing Data methodology applied to ERA-Interim reanalysis data (WFDEI). GlobSnow-2 is a global Snow Water Equivalent (SWE) measurement product funded by the European Space Agency (ESA) and is also evaluated over the local watershed. Downscaled precipitation, temperature, and potential evaporation datasets are used as forcing data in a distributed version of the HBV model implemented in the WFLOW framework. Results demonstrate the successful application of global datasets in local watershed modelling, but show that validation of actual frozen precipitation and snowpack conditions is very difficult. The distributed hydrological model shows good streamflow simulation performance based on statistical model evaluation techniques. Results are also promising for inter-annual variability, spring snowmelt onset, and time to peak flows. It is expected that data assimilation of streamflow using an Ensemble Kalman Filter will further improve model performance. This study shows that global reanalysis datasets hold great potential for understanding the hydrology and snowpack dynamics of the expansive and data-sparse sub-Arctic. However, global SWE products will require further validation and algorithm improvements, particularly over boreal forest and lake-rich regions.
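The planned Ensemble Kalman Filter step can be sketched compactly. The stochastic-EnKF analysis below updates an ensemble of model states toward one streamflow observation; the state layout, member count, and all numbers are invented for illustration:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, H):
    """Stochastic EnKF analysis step.

    ensemble: n_state x n_members matrix of model states
    obs:      observed value (here a streamflow measurement)
    obs_err:  observation error standard deviation
    H:        1 x n_state observation operator (picks out streamflow)
    """
    rng = np.random.default_rng(7)
    n = ensemble.shape[1]
    Hx = H @ ensemble                            # simulated observations
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    Y = Hx - Hx.mean(axis=1, keepdims=True)
    P_hy = X @ Y.T / (n - 1)                     # state-obs covariance
    P_yy = Y @ Y.T / (n - 1) + obs_err ** 2      # innovation covariance
    K = P_hy / P_yy                              # Kalman gain (scalar obs)
    perturbed = obs + rng.normal(0.0, obs_err, size=(1, n))
    return ensemble + K @ (perturbed - Hx)

rng = np.random.default_rng(8)
swe = rng.normal(100.0, 15.0, 50)            # snow water equivalent (mm)
soil = rng.normal(0.3, 0.05, 50)             # soil moisture (-)
flow = 0.2 * swe + rng.normal(0.0, 1.0, 50)  # streamflow, correlated w/ SWE
ens = np.vstack([swe, soil, flow])
H = np.array([[0.0, 0.0, 1.0]])              # observe streamflow only

analysis = enkf_update(ens, obs=25.0, obs_err=1.0, H=H)
print(ens.mean(axis=1).round(2), "->", analysis.mean(axis=1).round(2))
# Flow is pulled toward the observation, and SWE shifts with it through
# the ensemble covariance; soil moisture is nearly unchanged.
```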
A proposal to extend our understanding of the global economy
NASA Technical Reports Server (NTRS)
Hough, Robbin R.; Ehlers, Manfred
1991-01-01
Satellites acquire information on a global and repetitive basis. They are thus ideal tools for use when global scale and analysis over time are required. Data from satellites come in digital form, which means that they are ideally suited for incorporation in digital data bases and can be evaluated using automated techniques. The development of a global multi-source data set is proposed that integrates digital information on some 15,000 major industrial sites worldwide with remotely sensed images of those sites. The resulting data set would provide the basis for a wide variety of studies of the global economy. The preliminary results give promise of a new class of global policy model which is far more detailed and helpful to local policy makers than its predecessors. The central thesis of this proposal is that major industrial sites can be identified and their utilization can be tracked with the aid of satellite images.
Andrew J. Hartsell
2015-01-01
This study will investigate how global and local predictors differ with varying spatial scale in relation to species evenness and richness in the gulf coastal plain. In particular, Forest Inventory and Analysis (FIA) data for all live trees >= one-inch d.b.h. were used as the basis for the study. Watersheds are defined by the USGS 12-digit hydrologic units. The...
Global-Local Finite Element Analysis of Bonded Single-Lap Joints
NASA Technical Reports Server (NTRS)
Kilic, Bahattin; Madenci, Erdogan; Ambur, Damodar R.
2004-01-01
Adhesively bonded lap joints involve dissimilar material junctions and sharp changes in geometry, possibly leading to premature failure. Although the finite element method is well suited to model the bonded lap joints, traditional finite elements are incapable of correctly resolving the stress state at junctions of dissimilar materials because of the unbounded nature of the stresses. In order to facilitate the use of bonded lap joints in future structures, this study presents a finite element technique utilizing a global (special) element coupled with traditional elements. The global element includes the singular behavior at the junction of dissimilar materials with or without traction-free surfaces.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion, unlike other rotation techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
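The PCA-then-ICA relationship described above is easy to demonstrate on synthetic data: PCA decorrelates the mixed channels, and ICA's extra rotation recovers the independent sources. The toy signals and mixing matrix below are invented:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(9)
t = np.linspace(0, 8, 2000)

# Two independent "physical" source signals, linearly mixed into
# three observed channels (a toy stand-in for a geophysical record).
sources = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t)]
mixing = np.array([[1.0, 0.5], [0.5, 1.0], [0.9, 0.2]])
X = sources @ mixing.T + 0.02 * rng.normal(size=(len(t), 3))

# PCA decorrelates the channels (second-order statistics only) ...
pcs = PCA(n_components=2).fit_transform(X)

# ... ICA then rotates within that subspace to make the components
# statistically independent, unmixing the two physical signals.
ics = FastICA(n_components=2, random_state=0).fit_transform(X)

def corr_with_sources(Z):
    return np.abs(np.corrcoef(Z.T, sources.T)[:2, 2:]).round(2)

print("PCA vs sources:\n", corr_with_sources(pcs))   # still mixed
print("ICA vs sources:\n", corr_with_sources(ics))   # near-identity,
                                                     # up to order/sign
```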
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Chou, Shih-Hung; Jedlovec, Gary
2012-01-01
Improvements to global and regional numerical weather prediction (NWP) have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but the impact on regional forecasts has been much smaller than for global forecasts. Retrieved profiles from AIRS contain much of the information that is contained in the radiances and may be able to reveal reasons for this reduced impact. Assimilating AIRS retrieved profiles in an analysis configuration identical to that used for the radiances, tracking the quantity and quality of the assimilated data in each technique, and examining analysis increments and forecast impact from each data type can yield clues as to the reasons for the reduced impact. By doing this with regional-scale models, individual synoptic features (and the impact of AIRS on these features) can be more easily tracked. This project examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing operational techniques used for AIRS radiances and research techniques used for AIRS retrieved profiles. Parallel versions of a configuration of the Weather Research and Forecasting (WRF) model with Gridpoint Statistical Interpolation (GSI) that mimics the analysis methodology, domain, and observational datasets of the regional North American Mesoscale (NAM) model run at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) are run to examine the impact of each type of AIRS data set. The first configuration assimilates the AIRS radiance data along with other conventional and satellite data using techniques implemented within the operational system; the second configuration assimilates AIRS retrieved profiles instead of AIRS radiances in the same manner. Preliminary results of this study will be presented, focusing on the analysis impact of the radiances and profiles for selected cases.
A TRMM-Calibrated Infrared Technique for Global Rainfall Estimation
NASA Technical Reports Server (NTRS)
Negri, Andrew J.; Adler, Robert F.
2002-01-01
The development of a satellite infrared (IR) technique for estimating convective and stratiform rainfall and its application in studying the diurnal variability of rainfall on a global scale is presented. The Convective-Stratiform Technique (CST), calibrated by coincident, physically retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), is applied over the global tropics during 2001. The technique is calibrated separately over land and ocean, making ingenious use of the IR data from the TRMM Visible/Infrared Scanner (VIRS) before application to global geosynchronous satellite data. The low sampling rate of TRMM PR imposes limitations on calibrating IR-based techniques; however, our research shows that PR observations can be applied to improve IR-based techniques significantly by selecting adequate calibration areas and calibration length. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall, will be presented. The technique is validated using available data sets and compared to other global rainfall products such as the Global Precipitation Climatology Project (GPCP) IR product, calibrated with TRMM Microwave Imager (TMI) data. The calibrated CST technique has the advantages of high spatial resolution (4 km), filtering of non-raining cirrus clouds, and the stratification of the rainfall into its convective and stratiform components, the latter being important for the calculation of vertical profiles of latent heating.
Mapping the Similarities of Spectra: Global and Locally-biased Approaches to SDSS Galaxies
NASA Astrophysics Data System (ADS)
Lawlor, David; Budavári, Tamás; Mahoney, Michael W.
2016-12-01
We present a novel approach to studying the diversity of galaxies. It is based on a spectral graph technique, that of locally-biased semi-supervised eigenvectors. Our method introduces new coordinates that summarize an entire spectrum, similar to but going well beyond the widely used Principal Component Analysis (PCA). Unlike PCA, however, this technique does not assume that the Euclidean distance between galaxy spectra is a good global measure of similarity. Instead, we relax that condition to only the most similar spectra, and we show that doing so yields more reliable results for many astronomical questions of interest. The global variant of our approach can finely distinguish numerous astronomical phenomena of interest. The locally-biased variants of our basic approach enable us to explore subtle trends around a set of chosen objects. The power of the method is demonstrated in the Sloan Digital Sky Survey Main Galaxy Sample, by illustrating that the derived spectral coordinates carry an unprecedented amount of information.
NASA Astrophysics Data System (ADS)
Murthy, P. Krishna; Krishnaswamy, G.; Armaković, Stevan; Armaković, Sanja J.; Suchetan, P. A.; Desai, Nivedita R.; Suneetha, V.; SreenivasaRao, R.; Bhargavi, G.; Aruna Kumar, D. B.
2018-06-01
The title compound 2-(6-hydroxy-1-benzofuran-3-yl)acetic acid (abbreviated as HBFAA) has been synthesized and characterized by FT-IR, FT-Raman and NMR spectroscopic techniques. The solid-state crystal structure of HBFAA has been determined by the single-crystal X-ray diffraction technique. The crystal structure features O-H⋯O and C-H⋯O intermolecular interactions resulting in a two-dimensional supramolecular architecture. The presence of various intermolecular interactions is well supported by Hirshfeld surface analysis. The molecular properties of HBFAA were computed by density functional theory (DFT) at the B3LYP/6-311++G(d,p) level for the ground state in the gas phase; the results agree well with the experimental values. Vibrational spectral analysis was carried out using FT-IR and FT-Raman spectroscopic techniques, and each vibrational wavenumber was assigned on the basis of the potential energy distribution (PED). Frontier molecular orbital (FMO) analysis, global reactivity descriptors, non-linear optical (NLO) properties and natural bond orbital (NBO) analysis of HBFAA were also computed with the same method. Efforts were made to understand the global and local reactivity properties of the title compound through calculations of MEP, ALIE, BDE and Fukui function surfaces in the gas phase, together with thermodynamic properties. Molecular dynamics (MD) simulations and radial distribution functions were also used to understand the influence of water on the stability of the title compound. Charge transfer between molecules of HBFAA has been investigated by combining MD simulations and DFT calculations.
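The global reactivity descriptors mentioned above follow directly from the frontier orbital energies. A minimal sketch, using the standard Koopmans-type formulas and hypothetical HOMO/LUMO energies (not values for HBFAA):

```python
E_HOMO, E_LUMO = -6.2, -1.8   # eV, hypothetical frontier-orbital energies

I = -E_HOMO                   # ionization potential (Koopmans approximation)
A = -E_LUMO                   # electron affinity
mu = -(I + A) / 2             # chemical potential
eta = (I - A) / 2             # chemical hardness
S = 1.0 / (2.0 * eta)         # global softness
omega = mu**2 / (2.0 * eta)   # electrophilicity index

print(f"HOMO-LUMO gap = {E_LUMO - E_HOMO:.2f} eV")
print(f"mu = {mu:.2f} eV, eta = {eta:.2f} eV, "
      f"S = {S:.3f} 1/eV, omega = {omega:.2f} eV")
```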
Seminal nanotechnology literature: a review.
Kostoff, Ronald N; Koytcheff, Raymond G; Lau, Clifford G Y
2009-11-01
This paper uses complementary text mining techniques to identify and retrieve the high impact (seminal) nanotechnology literature over a span of time. Following a brief scientometric analysis of the seminal articles retrieved, these seminal articles are then used as a basis for a comprehensive literature survey of nanoscience and nanotechnology. The paper ends with a global analysis of the relation of seminal nanotechnology document production to total nanotechnology document production.
The Diamond Model of Intrusion Analysis
2013-07-05
infrastructure pivot) which were then “sinkholed”[12] to identify global victims (infrastructure-to-victim pivot). Each victim was then further identified...which would have matching social-political needs using cyber-victimology (§5.1.2) [43]. [12] “Sinkholing” is an aggressive defender technique to takeover
DOE Office of Scientific and Technical Information (OSTI.GOV)
J Squire, A Bhattacharjee
We study the magnetorotational instability (MRI) (Balbus & Hawley 1998) using non-modal stability techniques. Despite the spectral instability of many forms of the MRI, this proves to be a natural method of analysis that is well suited to deal with the non-self-adjoint nature of the linear MRI equations. We find that the fastest growing linear MRI structures on both local and global domains can look very different from the eigenmodes, invariably resembling waves shearing with the background flow (shear waves). In addition, such structures can grow many times faster than the least stable eigenmode over long time periods, and be localized in a completely different region of space. These ideas lead, for both axisymmetric and non-axisymmetric modes, to a natural connection between the global MRI and the local shearing box approximation. By illustrating that the fastest growing global structure is well described by the ordinary differential equations (ODEs) governing a single shear wave, we find that the shearing box is a very sensible approximation for the linear MRI, contrary to many previous claims. Since the shear wave ODEs are most naturally understood using non-modal analysis techniques, we conclude by analyzing local MRI growth over finite time-scales using these methods. The strong growth over a wide range of wave-numbers suggests that non-modal linear physics could be of fundamental importance in MRI turbulence (Squire & Bhattacharjee 2014).
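The transient-growth calculation at the heart of non-modal analysis can be sketched in a few lines: for a linear system dx/dt = Ax, the maximum amplification attainable at time t is the squared largest singular value of the propagator exp(At). The toy non-normal operator below is purely illustrative, not the MRI equations.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-0.1, 5.0],     # strongly non-normal toy operator
              [0.0, -0.2]])    # (both eigenvalues decay)

for t in [0.0, 2.0, 5.0, 10.0]:
    P = expm(A * t)                          # propagator exp(A t)
    s = np.linalg.svd(P, compute_uv=False)
    print(f"t = {t:4.1f}  G(t) = {s[0]**2:8.3f}")  # optimal energy growth

# Although every eigenmode decays, G(t) >> 1 at intermediate times:
# structures can transiently grow much faster than the least stable eigenmode.
print("eigenvalues:", np.linalg.eigvals(A))
```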
Ma, Hong; Xie, Rong-Ai; Gao, Li-Jian; Zhang, Jin-Ping; Wu, Wei-Chun; Wang, Hao
2015-10-01
The purpose of this study was to investigate the diagnostic value of 3-dimensional (3D) speckle-tracking echocardiography for estimating left ventricular filling pressure in patients with coronary artery disease (CAD) and a preserved left ventricular ejection fraction. Altogether, 84 patients with CAD and 30 age- and sex-matched healthy control participants in sinus rhythm were recruited prospectively. All participants underwent conventional and 3D speckle-tracking echocardiography. Global strain values were automatically calculated by 3D speckle-tracking analysis. The left ventricular end-diastolic pressure (LVEDP) was determined invasively by left heart catheterization. Echocardiography and cardiac catheterization were performed within 24 hours. Compared with the controls, patients with CAD showed lower global longitudinal strain, global circumferential strain, global area strain, and global radial strain. Patients with CAD who had an elevated LVEDP had much lower values of all four 3D speckle-tracking echocardiographic variables. Pearson correlation analysis revealed that the LVEDP correlated positively with the early transmitral flow velocity/early diastolic myocardial velocity (E/E') ratio, global longitudinal strain, global circumferential strain, and global area strain. It correlated negatively with global radial strain. Receiver operating characteristic curve analysis revealed that these 3D speckle-tracking echocardiographic indices could effectively predict elevated left ventricular filling pressure (LVEDP >15 mm Hg) in patients with CAD (areas under the curve: global longitudinal strain, 0.78; global radial strain, 0.77; global circumferential strain, 0.75; and global area strain, 0.74). These parameters, however, showed no advantages over the commonly used E/E' ratio (area under the curve, 0.84). Three-dimensional speckle-tracking echocardiography was a practical technique for predicting elevated left ventricular filling pressure, but it might not be superior to the commonly used E/E' ratio in patients with CAD who have a normal left ventricular ejection fraction. © 2015 by the American Institute of Ultrasound in Medicine.
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
NASA Technical Reports Server (NTRS)
Doyle, James D.; Warner, Thomas T.
1987-01-01
Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques is discussed.
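As a rough illustration of the successive-correction family of schemes mentioned above, the sketch below implements a generic Cressman-type analysis in one dimension; the weights, influence radii, and data are hypothetical, not the study's actual configuration.

```python
import numpy as np

def successive_correction(grid_x, background, obs_x, obs, radii):
    """Iteratively correct a background field with distance-weighted
    observation increments, shrinking the influence radius each pass."""
    analysis = background.copy()
    for R in radii:
        at_obs = np.interp(obs_x, grid_x, analysis)   # analysis at obs sites
        increments = obs - at_obs
        d2 = (grid_x[:, None] - obs_x[None, :]) ** 2
        w = np.clip((R**2 - d2) / (R**2 + d2), 0.0, None)  # Cressman weights
        wsum = w.sum(axis=1)
        mask = wsum > 0
        analysis[mask] += (w @ increments)[mask] / wsum[mask]
    return analysis

grid_x = np.linspace(0.0, 10.0, 101)
background = np.zeros_like(grid_x)                    # first-guess field
obs_x = np.array([2.0, 5.0, 8.0])                     # e.g. sounding locations
obs = np.array([1.0, -0.5, 0.8])
result = successive_correction(grid_x, background, obs_x, obs,
                               radii=[4.0, 2.0, 1.0])
print(np.round(result[::20], 3))
```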
Pragmatics and Language Learning. Monograph Series Volume 6.
ERIC Educational Resources Information Center
Bouton, Lawrence F., Ed.
The series of articles in this volume were selected from among those presented at the 8th Annual International Conference on Pragmatics and Language Learning in April 1994. Articles include: "The Right Tool for the Job: Techniques for Analysis of Natural Language Use" (Georgia M. Green); "Sinclair & Coulthard Revisited: Global-…
Exploiting symmetries in the modeling and analysis of tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Andersen, C. M.; Tanner, John A.
1989-01-01
A computational procedure is presented for reducing the size of the analysis models of tires having unsymmetric material, geometry and/or loading. The two key elements of the procedure when applied to anisotropic tires are: (1) decomposition of the stiffness matrix into the sum of an orthotropic and a nonorthotropic part; and (2) successive application of the finite-element method and the classical Rayleigh-Ritz technique. The finite-element method is first used to generate a few global approximation vectors (or modes). Then the amplitudes of these modes are computed by using the Rayleigh-Ritz technique. The proposed technique has high potential for handling practical tire problems with anisotropic materials, unsymmetric imperfections and asymmetric loading. It is also particularly well suited for use with three-dimensional finite-element models of tires.
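The two-step reduction can be sketched compactly: given a few global approximation vectors from the finite-element model, the Rayleigh-Ritz step projects the stiffness and mass matrices onto their span and solves a small eigenproblem for the mode amplitudes. The matrices below are random stand-ins, not assembled tire models.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n, m = 200, 6                        # full model size, number of global modes
X = rng.standard_normal((n, n))
K = X @ X.T + n * np.eye(n)          # SPD stand-in for the stiffness matrix
M = np.eye(n)                        # stand-in mass matrix
Phi, _ = np.linalg.qr(rng.standard_normal((n, m)))  # global approximation vectors

# Rayleigh-Ritz: project onto span(Phi) and solve the small m-by-m problem
Kr = Phi.T @ K @ Phi
Mr = Phi.T @ M @ Phi
vals, amps = eigh(Kr, Mr)            # amplitudes of the global modes
approx_modes = Phi @ amps            # approximate modes in the full space
print("reduced eigenvalues:", np.round(vals[:3], 2))
```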
A TRMM-Calibrated Infrared Technique for Global Rainfall Estimation
NASA Technical Reports Server (NTRS)
Negri, Andrew J.; Adler, Robert F.; Xu, Li-Ming
2003-01-01
This paper presents the development of a satellite infrared (IR) technique for estimating convective and stratiform rainfall and its application in studying the diurnal variability of rainfall on a global scale. The Convective-Stratiform Technique (CST), calibrated by coincident, physically retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), is applied over the global tropics during summer 2001. The technique is calibrated separately over land and ocean, making ingenious use of the IR data from the TRMM Visible/Infrared Scanner (VIRS) before application to global geosynchronous satellite data. The low sampling rate of TRMM PR imposes limitations on calibrating IR-based techniques; however, our research shows that PR observations can be applied to improve IR-based techniques significantly by selecting adequate calibration areas and calibration length. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall, will be presented. The technique is validated using available data sets and compared to other global rainfall products such as the Global Precipitation Climatology Project (GPCP) IR product, calibrated with TRMM Microwave Imager (TMI) data. The calibrated CST technique has the advantages of high spatial resolution (4 km), filtering of non-raining cirrus clouds, and the stratification of the rainfall into its convective and stratiform components, the latter being important for the calculation of vertical profiles of latent heating.
Van Steen, Kristel; Curran, Desmond; Kramer, Jocelyn; Molenberghs, Geert; Van Vreckem, Ann; Bottomley, Andrew; Sylvester, Richard
2002-12-30
Clinical and quality of life (QL) variables from an EORTC clinical trial of first line chemotherapy in advanced breast cancer were used in a prognostic factor analysis of survival and response to chemotherapy. For response, different final multivariate models were obtained from forward and backward selection methods, suggesting a disconcerting instability. Quality of life was measured using the EORTC QLQ-C30 questionnaire completed by patients. Subscales on the questionnaire are known to be highly correlated, and therefore it was hypothesized that multicollinearity contributed to model instability. A correlation matrix indicated that global QL was highly correlated with 7 out of 11 variables. In a first attempt to explore multicollinearity, we used global QL as the dependent variable in a regression model with the other QL subscales as predictors. Afterwards, standard diagnostic tests for multicollinearity were performed. An exploratory principal components analysis and factor analysis of the QL subscales identified at most three important components and indicated that inclusion of global QL made minimal difference to the loadings on each component, suggesting that it is redundant in the model. In a second approach, we advocate a bootstrap technique to assess the stability of the models. Based on these analyses, and since global QL exacerbates problems of multicollinearity, we recommend that global QL be excluded from prognostic factor analyses using the QLQ-C30. The prognostic factor analysis was rerun without global QL in the model and selected the same significant prognostic factors as before. Copyright 2002 John Wiley & Sons, Ltd.
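The bootstrap stability check can be illustrated generically: repeat a simple forward selection on resampled data and record how often each variable enters the model. With strongly correlated predictors, as with the QL subscales, the selection frequencies scatter across the collinear block. The data and the greedy selection rule below are synthetic stand-ins, not the EORTC analysis.

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy forward selection: repeatedly add the predictor most
    correlated with the current residual, refitting by least squares."""
    chosen, resid = [], y - y.mean()
    for _ in range(k):
        scores = [abs(np.corrcoef(X[:, j], resid)[0, 1])
                  if j not in chosen else -1.0 for j in range(X.shape[1])]
        chosen.append(int(np.argmax(scores)))
        Z = np.c_[np.ones(len(y)), X[:, chosen]]
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
    return chosen

rng = np.random.default_rng(2)
n, p = 200, 8
base = rng.standard_normal(n)
X = 0.8 * base[:, None] + 0.6 * rng.standard_normal((n, p))  # collinear block
y = X[:, 0] - X[:, 3] + rng.standard_normal(n)

counts = np.zeros(p)
for _ in range(200):                         # bootstrap resamples
    idx = rng.integers(0, n, n)
    counts[forward_select(X[idx], y[idx], k=2)] += 1
print("selection frequency:", counts / 200)  # unstable picks flag collinearity
```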
Carter, Patrick M; Desmond, Jeffery S; Akanbobnaab, Christopher; Oteng, Rockefeller A; Rominski, Sarah D; Barsan, William G; Cunningham, Rebecca M
2012-03-01
Although many global health programs focus on providing clinical care or medical education, improving clinical operations can have a significant effect on patient care delivery, especially in developing health systems without high-level operations management. Lean manufacturing techniques have been effective in decreasing emergency department (ED) length of stay, patient waiting times, numbers of patients leaving without being seen, and door-to-balloon times for ST-elevation myocardial infarction in developed health systems, but use of Lean in low to middle income countries with developing emergency medicine (EM) systems has not been well characterized. To describe the application of Lean manufacturing techniques to improve clinical operations at Komfo Anokye Teaching Hospital (KATH) in Ghana and to identify key lessons learned to aid future global EM initiatives. A 3-week Lean improvement program focused on the hospital admissions process at KATH was completed by a 14-person team in six stages: problem definition, scope of project planning, value stream mapping, root cause analysis, future state planning, and implementation planning. The authors identified eight lessons learned during our use of Lean to optimize the operations of an ED in a global health setting: 1) the Lean process aided in building a partnership with Ghanaian colleagues; 2) obtaining and maintaining senior institutional support is necessary and challenging; 3) addressing power differences among the team to obtain feedback from all team members is critical to successful Lean analysis; 4) choosing a manageable initial project is critical to influence long-term Lean use in a new environment; 5) data intensive Lean tools can be adapted and are effective in a less resourced health system; 6) several Lean tools focused on team problem-solving techniques worked well in a low-resource system without modification; 7) using Lean highlighted that important changes do not require an influx of resources; and 8) despite different levels of resources, root causes of system inefficiencies are often similar across health care systems, but require unique solutions appropriate to the clinical setting. Lean manufacturing techniques can be successfully adapted for use in developing health systems. Lessons learned from this Lean project will aid future introduction of advanced operations management techniques in low- to middle-income countries. © 2012 by the Society for Academic Emergency Medicine.
Carter, Patrick M.; Desmond, Jeffery S.; Akanbobnaab, Christopher; Oteng, Rockefeller A.; Rominski, Sarah; Barsan, William G.; Cunningham, Rebecca
2012-01-01
Background Although many global health programs focus on providing clinical care or medical education, improving clinical operations can have a significant effect on patient care delivery, especially in developing health systems without high-level operations management. Lean manufacturing techniques have been effective in decreasing emergency department (ED) length of stay, patient waiting times, numbers of patients leaving without being seen, and door-to-balloon times for ST-elevation myocardial infarction in developed health systems; but use of Lean in low to middle income countries with developing emergency medicine systems has not been well characterized. Objectives To describe the application of Lean manufacturing techniques to improve clinical operations at Komfo Anokye Teaching Hospital in Ghana and to identify key lessons learned to aid future global EM initiatives. Methods A three-week Lean improvement program focused on the hospital admissions process at Komfo Anokye Teaching Hospital was completed by a 14-person team in six stages: problem definition, scope of project planning, value stream mapping, root cause analysis, future state planning, and implementation planning. Results The authors identified eight lessons learned during our use of Lean to optimize the operations of an ED in a global health setting: 1) the Lean process aided in building a partnership with Ghanaian colleagues; 2) obtaining and maintaining senior institutional support is necessary and challenging; 3) addressing power differences among the team to obtain feedback from all team members is critical to successful Lean analysis; 4) choosing a manageable initial project is critical to influence long-term Lean use in a new environment; 5) data intensive Lean tools can be adapted and are effective in a less resourced health system; 6) several Lean tools focused on team problem solving techniques worked well in a low resource system without modification; 7) using Lean highlighted that important changes do not require an influx of resources; 8) despite different levels of resources, root causes of system inefficiencies are often similar across health care systems, but require unique solutions appropriate to the clinical setting. Conclusions Lean manufacturing techniques can be successfully adapted for use in developing health systems. Lessons learned from this Lean project will aid future introduction of advanced operations management techniques in low to middle income countries. PMID:22435868
Advection modes by optimal mass transfer
NASA Astrophysics Data System (ADS)
Iollo, Angelo; Lombardi, Damiano
2014-02-01
Classical model reduction techniques approximate the solution of a physical model by a limited number of global modes. These modes are usually determined by variants of principal component analysis. Global modes can lead to reduced models that perform well in terms of stability and accuracy. However, when the physics of the model is mainly characterized by advection, the nonlocal representation of the solution by global modes essentially reduces to a Fourier expansion. In this paper we describe a method to determine a low-order representation of advection. This method is based on the solution of Monge-Kantorovich mass transfer problems. Examples of application to point vortex scattering, Korteweg-de Vries equation, and hurricane Dean advection are discussed.
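In one dimension the Monge-Kantorovich problem has a closed-form solution that makes the point vividly: the optimal map between two densities is T = F2^{-1} composed with F1 (the CDFs), so a translating pulse is represented by a displacement rather than by a long sum of global modes. A minimal sketch with synthetic Gaussian densities:

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
rho1 = np.exp(-(x + 1.5) ** 2)          # density at time 1 (pulse on the left)
rho2 = np.exp(-(x - 1.5) ** 2)          # density at time 2 (advected right)
rho1 /= rho1.sum() * dx                 # normalize to unit mass
rho2 /= rho2.sum() * dx

F1 = np.cumsum(rho1) * dx               # cumulative distribution functions
F2 = np.cumsum(rho2) * dx
T = np.interp(F1, F2, x)                # optimal transport map T = F2^{-1}(F1)

# Displacement interpolation: mass moves along (1 - s) * x + s * T(x)
mean_shift = ((T - x) * rho1).sum() * dx
print(f"mean displacement: {mean_shift:.2f}")   # ~ +3, the advection distance
```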
Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera
2003-11-01
Information explosion and globalization place great demands on health care professionals to keep pace with new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem-solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.
Automated Quantitative Nuclear Cardiology Methods
Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.
2016-01-01
Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779
Global stability of a multiple infected compartments model for waterborne diseases
NASA Astrophysics Data System (ADS)
Wang, Yi; Cao, Jinde
2014-10-01
In this paper, mathematical analysis is carried out for a multiple infected compartments model for waterborne diseases, such as cholera, giardia, and rotavirus. The model accounts for both person-to-person and water-to-person transmission routes. Global stability of the equilibria is studied. In terms of the basic reproduction number R0, we prove that, if R0⩽1, then the disease-free equilibrium is globally asymptotically stable and the infection always disappears; whereas if R0>1, there exists a unique endemic equilibrium which is globally asymptotically stable for the corresponding fast-slow system. Numerical simulations verify our theoretical results and show that the decay rate of waterborne pathogens has a significant impact on the epidemic growth rate. Also, we observe numerically that the unique endemic equilibrium is globally asymptotically stable for the whole system. This observation indicates that the present method needs to be improved by other techniques.
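For such compartmental models, R0 is commonly computed via the next-generation matrix: it is the spectral radius of F V^{-1}, with F the new-infection matrix and V the transition matrix linearized at the disease-free equilibrium. The sketch below uses a generic one-infected-class-plus-water-compartment structure with hypothetical parameters, not the paper's exact model.

```python
import numpy as np

beta_p, beta_w = 0.3, 0.2           # person-to-person / water-to-person rates
gamma, xi, delta = 0.25, 0.1, 0.5   # recovery, shedding, pathogen decay rates

F = np.array([[beta_p, beta_w],     # new infections entering (I, W)
              [0.0,    0.0]])
V = np.array([[gamma,  0.0],        # transitions out of / between (I, W)
              [-xi,    delta]])     # shedding xi*I feeds the water compartment

R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
print(f"R0 = {R0:.3f} ->", "endemic" if R0 > 1 else "disease-free")
# Decreasing delta (slower pathogen decay) raises R0, consistent with the
# sensitivity of epidemic growth to the decay rate noted in the abstract.
```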
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mistry, Nilesh N., E-mail: nmistry@som.umaryland.edu; Diwanji, Tejan; Shi, Xiutao
2013-11-15
Purpose: Current implementations of methods based on Hounsfield units to evaluate regional lung ventilation do not directly incorporate tissue-based mass changes that occur over the respiratory cycle. To overcome this, we developed a 4-dimensional computed tomography (4D-CT)-based technique to evaluate fractional regional ventilation (FRV) that uses an individualized ratio of tidal volume to end-expiratory lung volume for each voxel. We further evaluated the effect of different breathing maneuvers on regional ventilation. The results from this work will help elucidate the relationship between global and regional lung function. Methods and Materials: Eight patients underwent 3 sets of 4D-CT scans during 1 session using free-breathing, audiovisual guidance, and active breathing control. FRV was estimated using a density-based algorithm with mass correction. Internal validation between global and regional ventilation was performed by use of the imaging data collected during the use of active breathing control. The impact of breathing maneuvers on FRV was evaluated comparing the tidal volume from 3 breathing methods. Results: Internal validation through comparison between the global and regional changes in ventilation revealed a strong linear correlation (slope of 1.01, R² of 0.97) between the measured global lung volume and the regional lung volume calculated by use of the “mass corrected” FRV. A linear relationship was established between the tidal volume measured with the automated breathing control system and FRV based on 4D-CT imaging. Consistently larger breathing volumes were observed when coached breathing techniques were used. Conclusions: The technique presented improves density-based evaluation of lung ventilation and establishes a link between global and regional lung ventilation volumes. Furthermore, the results obtained are comparable with those of other techniques of functional evaluation such as spirometry and hyperpolarized-gas magnetic resonance imaging. These results were demonstrated on retrospective analysis of patient data, and further research using prospective data is under way to validate this technique against established clinical tests.
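A heavily simplified sketch of the density-based idea, assuming only that a voxel's air fraction can be read off its Hounsfield unit; the paper's mass-corrected FRV algorithm involves additional steps not reproduced here, and the HU values are invented.

```python
import numpy as np

def air_fraction(hu):
    """Approximate air volume fraction of a voxel (HU in [-1000, 0])."""
    return np.clip(-hu / 1000.0, 0.0, 1.0)

# Hypothetical HU values for the same voxels at end-inspiration/expiration
hu_insp = np.array([-850.0, -700.0, -600.0])
hu_exp = np.array([-780.0, -620.0, -560.0])

v_air_insp = air_fraction(hu_insp)           # air volume per unit voxel volume
v_air_exp = air_fraction(hu_exp)
frv = (v_air_insp - v_air_exp) / v_air_exp   # tidal volume / EELV, per voxel
print("FRV per voxel:", np.round(frv, 3))
```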
SVD analysis of Aura TES spectral residuals
NASA Technical Reports Server (NTRS)
Beer, Reinhard; Kulawik, Susan S.; Rodgers, Clive D.; Bowman, Kevin W.
2005-01-01
Singular Value Decomposition (SVD) analysis is both a powerful diagnostic tool and an effective method of noise filtering. We present the results of an SVD analysis of an ensemble of spectral residuals acquired in September 2004 from a 16-orbit Aura Tropospheric Emission Spectrometer (TES) Global Survey and compare them to alternative methods such as zonal averages. In particular, the technique highlights issues such as the orbital variation of instrument response and incompletely modeled effects of surface emissivity and atmospheric composition.
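The noise-filtering use of SVD can be sketched directly: stack the residual spectra as rows of a matrix, keep the leading singular triplets that capture systematic structure, and discard the noise-dominated remainder. The residual ensemble below is synthetic, not TES data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_spec, n_chan, k = 500, 300, 3
# synthetic ensemble: a few systematic patterns plus white noise
patterns = rng.standard_normal((k, n_chan))
amps = rng.standard_normal((n_spec, k))
residuals = amps @ patterns + 0.5 * rng.standard_normal((n_spec, n_chan))

U, s, Vt = np.linalg.svd(residuals, full_matrices=False)
filtered = U[:, :k] * s[:k] @ Vt[:k]           # rank-k reconstruction

print("leading singular values:", np.round(s[:6], 1))  # sharp drop after k
print("RMS of removed noise:", np.round(np.std(residuals - filtered), 3))
```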
Instability of a solidifying binary mixture
NASA Technical Reports Server (NTRS)
Antar, B. N.
1982-01-01
An analysis is performed of the stability of a solidifying binary mixture subject to surface-tension variations on the free liquid surface. The basic-state solution is obtained numerically as a nonstationary function of time. Because the basic state is time dependent, the stability analysis is of the global type and utilizes a variational technique; because the basic state is a complex function of both space and time, the analysis is carried out numerically.
Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations
NASA Astrophysics Data System (ADS)
Schott, Katharina; Beck, Roman; Gregory, Robert Wayne
Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. IS-related services are increasingly provided not only from different geographical sites simultaneously but also by multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory in-depth single-case study design as the research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and data analysis we adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the effort required to manage global outsourcing projects with multiple vendors depends, among other things, on the maturity of the cooperation within the vendor portfolio. Furthermore, our data indicate that this cooperation maturity is positively influenced by knowledge about the client derived from existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.
Seismic Constraints on Interior Solar Convection
NASA Technical Reports Server (NTRS)
Hanasoge, Shravan M.; Duvall, Thomas L.; DeRosa, Marc L.
2010-01-01
We constrain the velocity spectral distribution of global-scale solar convective cells at depth using techniques of local helioseismology. We calibrate the sensitivity of helioseismic waves to large-scale convective cells in the interior by analyzing simulations of waves propagating through a velocity snapshot of global solar convection via methods of time-distance helioseismology. Applying identical analysis techniques to observations of the Sun, we are able to bound from above the magnitudes of solar convective cells as a function of spatial convective scale. We find that convection at a depth of r/R(solar) = 0.95 with spatial extent l < 30, where l is the spherical harmonic degree, comprises weak flow systems on the order of 15 m/s or less. Convective features deeper than r/R(solar) = 0.95 are more difficult to image due to the rapidly decreasing sensitivity of helioseismic waves.
Quad-Tree Visual-Calculus Analysis of Satellite Coverage
NASA Technical Reports Server (NTRS)
Lo, Martin W.; Hockney, George; Kwan, Bruce
2003-01-01
An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.
2011-01-01
Background Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcoming some of the numerical difficulties that arise during the global optimization task. PMID:21867520
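The structural property being exploited can be shown in a few lines: a GMA flux v_i = gamma_i * prod_j X_j^(f_ij) becomes linear in the logarithms of the variables, log v = log gamma + F log X, which is what enables the specialized global optimization algorithms described above. A minimal sketch with invented rate constants and kinetic orders:

```python
import numpy as np

gamma = np.array([2.0, 0.5])              # rate constants of two fluxes
F = np.array([[0.8, -0.3],                # kinetic orders f_ij
              [0.0,  1.2]])

def gma_flux(X):
    """Power-law (GMA) fluxes: v_i = gamma_i * prod_j X_j ** f_ij."""
    return gamma * np.prod(X ** F, axis=1)

X = np.array([1.5, 0.7])                  # metabolite concentrations
v = gma_flux(X)

# The same fluxes in log space, where the model is exactly linear:
log_v = np.log(gamma) + F @ np.log(X)
print(np.allclose(np.log(v), log_v))      # True: linearity in log coordinates
```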
NASA Astrophysics Data System (ADS)
Zhou, Xiang
Using an innovative portable holographic inspection and testing system (PHITS) developed at the Australian Defence Force Academy, fatigue cracks in riveted lap joints can be detected by visually inspecting the abnormal fringe changes recorded on holographic interferograms. In this thesis, for automatic crack detection, some modern digital image processing techniques are investigated and applied to holographic interferogram evaluation. Fringe analysis algorithms are developed for identification of the crack-induced fringe changes. Theoretical analysis of PHITS and riveted lap joints and two typical experiments demonstrate that the fatigue cracks in lightly-clamped joints induce two characteristic fringe changes: local fringe discontinuities at the cracking sites; and the global crescent fringe distribution near to the edge of the rivet hole. Both of the fringe features are used for crack detection in this thesis. As a basis of the fringe feature extraction, an algorithm for local fringe orientation calculation is proposed. For high orientation accuracy and computational efficiency, Gaussian gradient filtering and neighboring direction averaging are used to minimize the effects of image background variations and random noise. The neighboring direction averaging is also used to approximate the fringe directions in centerlines of bright and dark fringes. Experimental results indicate that for high orientation accuracy the scales of the Gaussian filter and neighboring direction averaging should be chosen according to the local fringe spacings. The orientation histogram technique is applied to detect the local fringe discontinuity due to the fatigue cracks. The Fourier descriptor technique is used to characterize the global fringe distribution change from a circular to a crescent distribution with the fatigue crack growth. Experiments and computer simulations are conducted to analyze the detectability and reliability of crack detection using the two techniques. Results demonstrate that the Fourier descriptor technique is more promising in the detection of the short cracks near the edge of the rivet head. However, it is not as reliable as the fringe orientation technique for detection of the long through cracks. For reliability, both techniques should be used in practical crack detection. Neither the Fourier descriptor technique nor the orientation histogram technique have been previously applied to holographic interferometry. While this work related primarily to interferograms of cracked rivets, the techniques would be readily applied to other areas of fringe pattern analysis.
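The fringe-orientation step described above can be sketched with standard tools: Gaussian-derivative gradients followed by neighbourhood direction averaging via the structure tensor, whose dominant direction gives the local orientation. The synthetic circular fringe pattern and filter scales below are illustrative, not PHITS parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

yy, xx = np.mgrid[-64:64, -64:64].astype(float)
fringes = np.cos(0.02 * (xx**2 + yy**2))      # synthetic circular fringes

# Gaussian-gradient filtering (scale tuned to the local fringe spacing)
gx = gaussian_filter(fringes, 2.0, order=(0, 1))  # d/dx
gy = gaussian_filter(fringes, 2.0, order=(1, 0))  # d/dy

# neighbouring direction averaging via structure-tensor components
Jxx = gaussian_filter(gx * gx, 5.0)
Jyy = gaussian_filter(gy * gy, 5.0)
Jxy = gaussian_filter(gx * gy, 5.0)
orientation = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)  # radians, modulo pi

# directly above the centre the gradient is radial, i.e. roughly vertical
print("orientation at (96, 64):",
      np.degrees(orientation[96, 64]).round(1), "deg")
```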
Aitkenhead, A H; Rowbottom, C G; Mackay, R I
2013-10-07
We report on the design of Marvin, a Model Anatomy for Radiotherapy Verification and audit In the head and Neck and present results demonstrating its use in the development of the Elekta volumetric modulated arc therapy (VMAT) technique at the Christie, and in the audit of TomoTherapy and Varian RapidArc at other institutions. The geometry of Marvin was generated from CT datasets of eight male and female patients lying in the treatment position, with removable inhomogeneities modelling the sinuses and mandible. A modular system allows the phantom to be used with a range of detectors, with the locations of the modules being based on an analysis of a range of typical treatment plans (27 in total) which were mapped onto the phantom geometry. Results demonstrate the use of Gafchromic EBT2/EBT3 film for measurement of relative dose in a plane through the target and organs-at-risk, and the use of a small-volume ionization chamber for measurement of absolute dose in the target and spinal cord. Measurements made during the development of the head and neck VMAT protocol at the Christie quantified the improvement in plan delivery resulting from the installation of the Elekta Integrity upgrade (which permits an effectively continuously variable dose rate), with plans delivered before and after the upgrade having 88.5 ± 9.4% and 98.0 ± 2.2% respectively of points passing a gamma analysis (at 4%, 4 mm, global). Audits of TomoTherapy and Varian RapidArc neck techniques at other institutions showed a similar quality of plan delivery as for post-Integrity Elekta VMAT: film measurements for both techniques had >99% of points passing a gamma analysis at the clinical criteria of 4%, 4 mm, global, and >95% of points passing at tighter criteria of 3%, 3 mm, global; and absolute dose measurements in the PTV and spinal cord were within 1.5% and 3.5% of the planned doses respectively for both techniques. The results demonstrate that Marvin is an efficient and effective means of assessing the quality of delivery of complex radiotherapy in the head and neck, and is a useful tool to assist development and audit of these techniques.
Marvin: an anatomical phantom for dosimetric evaluation of complex radiotherapy of the head and neck
NASA Astrophysics Data System (ADS)
Aitkenhead, A. H.; Rowbottom, C. G.; Mackay, R. I.
2013-10-01
We report on the design of Marvin, a Model Anatomy for Radiotherapy Verification and audit In the head and Neck and present results demonstrating its use in the development of the Elekta volumetric modulated arc therapy (VMAT) technique at the Christie, and in the audit of TomoTherapy and Varian RapidArc at other institutions. The geometry of Marvin was generated from CT datasets of eight male and female patients lying in the treatment position, with removable inhomogeneities modelling the sinuses and mandible. A modular system allows the phantom to be used with a range of detectors, with the locations of the modules being based on an analysis of a range of typical treatment plans (27 in total) which were mapped onto the phantom geometry. Results demonstrate the use of Gafchromic EBT2/EBT3 film for measurement of relative dose in a plane through the target and organs-at-risk, and the use of a small-volume ionization chamber for measurement of absolute dose in the target and spinal cord. Measurements made during the development of the head and neck VMAT protocol at the Christie quantified the improvement in plan delivery resulting from the installation of the Elekta Integrity upgrade (which permits an effectively continuously variable dose rate), with plans delivered before and after the upgrade having 88.5 ± 9.4% and 98.0 ± 2.2% respectively of points passing a gamma analysis (at 4%, 4 mm, global). Audits of TomoTherapy and Varian RapidArc neck techniques at other institutions showed a similar quality of plan delivery as for post-Integrity Elekta VMAT: film measurements for both techniques had >99% of points passing a gamma analysis at the clinical criteria of 4%, 4 mm, global, and >95% of points passing at tighter criteria of 3%, 3 mm, global; and absolute dose measurements in the PTV and spinal cord were within 1.5% and 3.5% of the planned doses respectively for both techniques. The results demonstrate that Marvin is an efficient and effective means of assessing the quality of delivery of complex radiotherapy in the head and neck, and is a useful tool to assist development and audit of these techniques.
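For reference, the gamma analysis quoted in both records can be sketched in its simplest brute-force form; the (4%, 4 mm, global) criterion follows the text, while the one-dimensional dose profiles are invented for illustration (real film analyses use 2-D dose planes, but the gamma definition is the same).

```python
import numpy as np

def gamma_pass_rate(x, measured, reference, dd=0.04, dta=4.0):
    """Global gamma: dose difference normalised to the reference maximum,
    distance-to-agreement in the same units as x (here mm)."""
    d_norm = dd * reference.max()
    gammas = np.empty_like(measured)
    for i, (xi, mi) in enumerate(zip(x, measured)):
        g2 = ((xi - x) / dta) ** 2 + ((mi - reference) / d_norm) ** 2
        gammas[i] = np.sqrt(g2.min())
    return np.mean(gammas <= 1.0)

x = np.arange(0.0, 100.0, 1.0)                           # positions (mm)
reference = 2.0 * np.exp(-((x - 50) / 20.0) ** 2)        # planned dose (Gy)
measured = 1.02 * 2.0 * np.exp(-((x - 51) / 20.0) ** 2)  # shifted and scaled

print(f"pass rate (4%, 4 mm, global): "
      f"{100 * gamma_pass_rate(x, measured, reference):.1f}%")
```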
Melenteva, Anastasiia; Galyanin, Vladislav; Savenkova, Elena; Bogomolov, Andrey
2016-07-15
A large set of fresh cow milk samples collected from many suppliers over a large geographical area in Russia during a year has been analyzed by optical spectroscopy in the range 400-1100 nm in accordance with a previously developed scatter-based technique. The global (i.e., resistant to seasonal, genetic, regional and other variations of the milk composition) models for fat and total protein content, which were built using partial least-squares (PLS) regression, exhibit satisfactory prediction performance, enabling their practical application in the dairy industry. The root mean-square errors of prediction (RMSEP) were 0.09 and 0.10 for fat and total protein content, respectively. The issues of raw milk analysis and multivariate modelling based on historical spectroscopic data are considered, and approaches to the creation of global models and their transfer between instruments are proposed. The availability of global models should significantly facilitate the dissemination of optical spectroscopic methods for laboratory and in-line quantitative milk analysis. Copyright © 2016. Published by Elsevier Ltd.
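The modelling workflow can be sketched with standard tools: regress reference fat values on the spectra with PLS and report RMSEP on held-out samples. The spectra below are synthetic stand-ins for the milk data set, and the component count is arbitrary.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n, wl = 400, 200                            # samples, wavelength channels
fat = rng.uniform(2.5, 6.0, n)              # reference fat content (%)
basis = np.sin(np.outer(np.linspace(0, 3, wl), [1, 2]))   # two spectral shapes
spectra = (np.outer(fat, basis[:, 0])       # fat-dependent contribution
           + rng.uniform(0.5, 1.5, n)[:, None] * basis[:, 1]  # interferent
           + 0.02 * rng.standard_normal((n, wl)))             # noise

X_tr, X_te, y_tr, y_te = train_test_split(spectra, fat, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
print(f"RMSEP (fat): {rmsep:.3f} %")
```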
Culinary culture and globalization: an analysis of British and German Michelin-starred restaurants.
Lane, Christel
2011-12-01
The high-end restaurant segment in Britain and Germany has long been shaped by the cultural hegemony of French haute cuisine, perpetuated by multiple processes, including the influence of the Michelin or Red Guide. Traditionally, this hegemony has been expressed in the prevalence of French expatriate chefs, culinary techniques and style and even restaurant culture. This paper investigates whether processes of globalization have weakened or even undermined this French cultural dominance in fine-dining restaurants and their culinary culture. To this end, the study identifies the various forms taken by globalization processes in this industry segment and then assesses their impact on the dominance of the French paradigm of culinary culture. The investigation focuses on British and German Michelin-starred restaurants, underlining both commonalities and divergences in the process of interaction between French, global and local influences. The study employs a qualitative method, using a number of case studies to discern cross-industry patterns. All chefs with two or three stars in the two countries, i.e. 45 chefs, were selected for the analysis of their cuisine. © London School of Economics and Political Science 2011.
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1993-01-01
This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.
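A minimal multitaper sketch in the spirit of the polar-motion spectrum analyses mentioned above: average periodograms computed with orthogonal DPSS (Slepian) tapers to trade a little resolution for much lower variance. The series is synthetic, not geodetic data.

```python
import numpy as np
from scipy.signal.windows import dpss

rng = np.random.default_rng(5)
n, fs = 2048, 1.0
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 0.12 * t) + rng.standard_normal(n)  # line + noise

NW, K = 4.0, 7                         # time-bandwidth product, taper count
tapers = dpss(n, NW, K)                # shape (K, n), orthonormal tapers
eigspec = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
spectrum = eigspec.mean(axis=0) / fs   # simple (unweighted) multitaper average
freqs = np.fft.rfftfreq(n, 1 / fs)
print("spectral peak at f =", freqs[spectrum.argmax()])   # ~0.12
```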
Data Mining in Earth System Science (DMESS 2011)
Forrest M. Hoffman; J. Walter Larson; Richard Tran Mills; Bjørn-Gustaf Brooks; Auroop R. Ganguly; William Hargrove; et al
2011-01-01
From field-scale measurements to global climate simulations and remote sensing, the growing body of very large and long time series Earth science data is increasingly difficult to analyze, visualize, and interpret. Data mining, information theoretic, and machine learning techniques, such as cluster analysis, singular value decomposition, block entropy, Fourier and...
Evaluation of the Impact of AIRS Radiance and Profile Data Assimilation in Partly Cloudy Regions
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi; Jedlovec, Gary
2013-01-01
Improvements to global and regional numerical weather prediction have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but impact on regional forecasts has been much smaller than for global forecasts. Retrieved profiles from AIRS contain much of the information that is contained in the radiances and may be able to reveal reasons for this reduced impact. Assimilating AIRS retrieved profiles in an identical analysis configuration to the radiances, tracking the quantity and quality of the assimilated data in each technique, and examining analysis increments and forecast impact from each data type can yield clues as to the reasons for the reduced impact. By doing this with regional scale models, individual synoptic features (and the impact of AIRS on these features) can be more easily tracked. This project examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing operational techniques used for AIRS radiances and research techniques used for AIRS retrieved profiles. Parallel versions of a configuration of the Weather Research and Forecasting (WRF) model with Gridpoint Statistical Interpolation (GSI) are run to examine the impact of AIRS radiances and retrieved profiles. Statistical evaluation of a long-term series of forecast runs will be compared along with preliminary results of in-depth investigations for select cases comparing the analysis increments in partly cloudy regions and short-term forecast impacts.
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi; Jedlovec, Gary
2013-01-01
Improvements to global and regional numerical weather prediction have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but impact on regional forecasts has been much smaller than for global forecasts. Retrieved profiles from AIRS contain much of the information that is contained in the radiances and may be able to reveal reasons for this reduced impact. Assimilating AIRS retrieved profiles in an identical analysis configuration to the radiances, tracking the quantity and quality of the assimilated data in each technique, and examining analysis increments and forecast impact from each data type can yield clues as to the reasons for the reduced impact. By doing this with regional scale models, individual synoptic features (and the impact of AIRS on these features) can be more easily tracked. This project examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing operational techniques used for AIRS radiances and research techniques used for AIRS retrieved profiles. Parallel versions of a configuration of the Weather Research and Forecasting (WRF) model with Gridpoint Statistical Interpolation (GSI) are run to examine the impact of AIRS radiances and retrieved profiles. Statistical evaluation of 6 weeks of forecast runs will be compared along with preliminary results of in-depth investigations for select cases comparing the analysis increments in partly cloudy regions and short-term forecast impacts.
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have attracted increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA; Forootan, PhD 2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG 2012), where we (i) define a new complex data set using a Hilbert transformation, in which the complex time series contain the observed values in their real part and the temporal rate of variability in their imaginary part; (ii) apply an ICA algorithm based on diagonalization of fourth-order cumulants to decompose the new complex data set in (i); and (iii) recognize dominant non-stationary patterns as independent complex patterns that represent the amplitude and phase propagation in space and time. We present the results of CICA on simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD 2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm; Forootan and Kusche (JoG 2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86(7), 477-497, doi:10.1007/s00190-011-0532-5
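Step (i) of the pipeline, the Hilbert-transform complexification, is easy to sketch; the cumulant-based complex ICA of step (ii) is more involved and is not reproduced here. The propagating-oscillation data below are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(6)
t = np.linspace(0, 20, 2000)
# two grid points observing a propagating oscillation plus noise
x1 = np.cos(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
x2 = np.cos(2 * np.pi * 0.5 * t - np.pi / 3) + 0.1 * rng.standard_normal(t.size)
data = np.c_[x1, x2]

# complex data set: observed values in the real part, Hilbert transform in
# the imaginary part (the analytic signal), making phase directly accessible
analytic = hilbert(data - data.mean(axis=0), axis=0)

# the phase lag between the two locations becomes a simple complex statistic
cross = np.mean(analytic[:, 0] * np.conj(analytic[:, 1]))
print(f"estimated phase lead of x1 over x2: "
      f"{np.degrees(np.angle(cross)):.1f} deg")   # ~ +60
```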
VIRTIS on Venus Express: retrieval of real surface emissivity on global scales
NASA Astrophysics Data System (ADS)
Arnold, Gabriele E.; Kappel, David; Haus, Rainer; Telléz Pedroza, Laura; Piccioni, Giuseppe; Drossart, Pierre
2015-09-01
The extraction of surface emissivity data provides the basis for surface composition analyses and enables evaluation of Venus' geology. The Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS) aboard ESA's Venus Express mission measured, inter alia, the nightside thermal emission of Venus in the near-infrared atmospheric windows between 1.0 and 1.2 μm. These data can be used to determine information about surface properties on global scales. This requires a sophisticated approach to understand and account for the effects and interferences of the different atmospheric and surface parameters influencing the retrieved values. In the present work, results of a new technique for retrieval of the 1.0-1.2 μm surface emissivity are summarized. It includes a Multi-Window Retrieval Technique (MWT), a Multi-Spectrum Retrieval technique (MSR), and a detailed reliability analysis. MWT is based on a detailed radiative transfer model that makes simultaneous use of information from different atmospheric windows of an individual spectrum. MSR regularizes the retrieval by incorporating available a priori mean values and standard deviations as well as spatial-temporal correlations of the parameters to be retrieved. The capability of this method is shown for a selected surface target area. Implications for geologic investigations are discussed. Based on these results, the work draws conclusions for future Venus surface composition analyses on global scales using spectral remote sensing techniques. In that context, requirements for observational scenarios and instrumental performance are investigated, and recommendations are derived to optimize spectral measurements for Venus surface studies.
2005-11-01
more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and...analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one...traffic, for example, time series of flux at given nodes and mean path length; outputs the time series from any node queried; calculates
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
2003-01-01
This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
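The global RS strategy reduces, at its core, to fitting one quadratic polynomial to sampled responses over the feasible design space and then querying it cheaply during optimization. A sketch with an inexpensive stand-in response in two design variables (the actual problem has 23 random variables and a buckling solver):

```python
import numpy as np

def quad_features(X):
    """Second-order RS basis: constant, linear, interaction, quadratic."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.c_[np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2]

rng = np.random.default_rng(7)
X = rng.uniform(-1.0, 1.0, (40, 2))                  # sampled design points
y = (3 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
     - 1.5 * X[:, 0]**2 + 0.05 * rng.standard_normal(40))  # "costly" response

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
print("RS coefficients:", np.round(beta, 2))         # recovers the trend above

X_new = np.array([[0.2, -0.4]])
print("cheap RS prediction:", quad_features(X_new) @ beta)
```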
Development and verification of global/local analysis techniques for laminated composites
NASA Technical Reports Server (NTRS)
Thompson, Danniella Muheim; Griffin, O. Hayden, Jr.
1991-01-01
A two-dimensional to three-dimensional global/local finite element approach was developed, verified, and applied to a laminated composite plate of finite width and length containing a central circular hole. The resulting stress fields for axial compression loads were examined for several symmetric stacking sequences and hole sizes. Verification was based on comparison of the displacements and the stress fields with accepted trends from previous free-edge investigations and with a complete three-dimensional finite element solution of the plate. The laminates in the compression study included symmetric cross-ply, angle-ply and quasi-isotropic stacking sequences. The entire plate was selected as the global model and analyzed with two-dimensional finite elements. Displacements along a region identified as the global/local interface were applied in a kinematically consistent fashion to independent three-dimensional local models. Local areas of interest in the plate included a portion of the straight free edge near the hole, and the immediate area around the hole. Interlaminar stress results obtained from the global/local analyses compare well with previously reported trends, and some new conclusions about interlaminar stress fields in plates with different laminate orientations and hole sizes are presented for compressive loading. The effectiveness of the global/local procedure in reducing the computational effort required to solve these problems is clearly demonstrated by comparing the computer time required to formulate and solve the linear, static system of equations for the global and local analyses with that required for a complete three-dimensional formulation for a cross-ply laminate. Specific processors used during the analyses are described in general terms. The application of this global/local technique is not limited to a particular software system, and it was developed and described in as general a manner as possible.
Simplified Parallel Domain Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson III, David J
2011-01-01
Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.
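DStep's API is not given in the abstract; the sketch below only illustrates the generic MapReduce-style pattern it builds on: partition the domain into blocks, map over the blocks in parallel, and reduce the partial results. The per-block statistic is an arbitrary stand-in.

```python
# Generic map-reduce style domain traversal (not DStep itself).
from multiprocessing import Pool
import numpy as np

def map_block(block):
    # Per-block traversal work; here just a local statistic (assumption).
    return block.mean(), block.size

def reduce_partials(partials):
    # Combine per-block means into a global mean.
    total = sum(m * n for m, n in partials)
    count = sum(n for _, n in partials)
    return total / count

if __name__ == "__main__":
    domain = np.random.rand(1_000_000)
    blocks = np.array_split(domain, 16)
    with Pool(4) as pool:
        partials = pool.map(map_block, blocks)
    print("global mean:", reduce_partials(partials))
```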
A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde
NASA Technical Reports Server (NTRS)
Cofer, W. R., III; Edahl, R. A., Jr.
1986-01-01
This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water-soluble trace gases (e.g., CH2O) into aqueous media, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses performed with the nebulization/reflux concentrator, using an acidified dinitrophenylhydrazine solution, indicate that quantitative analysis of CH2O at global background levels (about 0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4-acidified aqueous solution, is detected as CH2O.
NASA Astrophysics Data System (ADS)
Haley, Craig Stuart
2009-12-01
Key to understanding and predicting the effects of global environmental problems such as ozone depletion and global warming is a detailed understanding of the atmospheric processes, both dynamical and chemical. Essential to this understanding are accurate global data sets of atmospheric constituents with adequate temporal and spatial (vertical and horizontal) resolutions. For this purpose the Canadian satellite instrument OSIRIS (Optical Spectrograph and Infrared Imager System) was launched on the Odin satellite in 2001. OSIRIS is primarily designed to measure minor stratospheric constituents, including ozone (O3) and nitrogen dioxide (NO2), employing the novel limb-scattered sunlight technique, which can provide both good vertical resolution and near global coverage. This dissertation presents a method to retrieve stratospheric O3 and NO2 from the OSIRIS limb-scatter observations. The retrieval method incorporates an a posteriori optimal estimator combined with an intermediate spectral analysis, specifically differential optical absorption spectroscopy (DOAS). A detailed description of the retrieval method is presented along with the results of a thorough error analysis and a geophysical validation exercise. It is shown that OSIRIS limb-scatter observations successfully produce accurate stratospheric O3 and NO2 number density profiles throughout the stratosphere, clearly demonstrating the strength of the limb-scatter technique. The OSIRIS observations provide an extremely useful data set that is of particular importance for studies of the chemistry of the middle atmosphere. The long OSIRIS record of stratospheric ozone and nitrogen dioxide may also prove useful for investigating variability and trends.
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.
1995-01-01
The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
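The combination rule at the heart of the SGM, weighting each input field by its inverse error variance, can be stated compactly. A minimal sketch follows; the field values and error variances are invented for illustration, not the paper's data.

```python
# Inverse-error-variance weighting of two precipitation fields.
import numpy as np

multisatellite = np.array([3.1, 0.0, 7.4])   # mm/day on three grid cells
gauge_analysis = np.array([2.8, 0.2, 6.9])
var_sat = np.array([1.0, 1.0, 1.5])          # error variances (assumed)
var_gauge = np.array([0.4, 0.6, 0.5])

w_sat, w_gauge = 1.0 / var_sat, 1.0 / var_gauge
combined = (w_sat * multisatellite + w_gauge * gauge_analysis) / (w_sat + w_gauge)
print(combined)  # variance-weighted satellite-gauge estimate
```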
Optical skin friction measurement technique in hypersonic wind tunnel
NASA Astrophysics Data System (ADS)
Chen, Xing; Yao, Dapeng; Wen, Shuai; Pan, Junjie
2016-10-01
Shear-sensitive liquid-crystal coatings (SSLCCs) have the optical characteristic of being sensitive to applied shear stress. Based on this, a novel technique is developed to measure the shear stress on a model surface, in both magnitude and direction, in hypersonic flow. A system for optical skin-friction measurement was built at the China Academy of Aerospace Aerodynamics (CAAA). A series of experiments on a hypersonic vehicle was performed in the CAAA wind tunnel. The global skin-friction distribution of the model, which shows complicated flow structures, is discussed, and a brief mechanism analysis and an evaluation of the optical measurement technique are given.
NASA Technical Reports Server (NTRS)
Doyle, James D.; Warner, Thomas T.
1988-01-01
Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques is discussed.
Fractals and Spatial Methods for Mining Remote Sensing Imagery
NASA Technical Reports Server (NTRS)
Lam, Nina; Emerson, Charles; Quattrochi, Dale
2003-01-01
The rapid increase in digital remote sensing and GIS data raises a critical problem: how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform into a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem fundamental to environmental research is the issue of spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research. There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple-scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific communities. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis, that are not easily available in commercial GIS/image processing software. Bundling newer spatial methods into a user-friendly software module lets researchers test and experiment with the new spatial analysis methods and gauge scale effects using a variety of remote sensing imagery. In the following, we describe briefly the development of ICAMS and present application examples.
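As a flavor of the fractal analysis the abstract attributes to ICAMS, the sketch below estimates the box-counting dimension of a synthetic binary image. It is a generic textbook implementation under assumed inputs, not ICAMS code.

```python
# Estimate a fractal (box-counting) dimension of a binary image.
import numpy as np

def box_count_dimension(img, sizes=(1, 2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        # Trim so the image tiles evenly, then count boxes containing any pixel.
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        tiles = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((tiles.sum(axis=(1, 3)) > 0).sum())
    # Slope of log N(s) versus log(1/s) estimates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

img = np.random.rand(128, 128) > 0.6   # synthetic binary "imagery"
print("estimated dimension:", box_count_dimension(img))
```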
NASA Technical Reports Server (NTRS)
OBrien, T. Kevin (Technical Monitor); Krueger, Ronald; Minguet, Pierre J.
2004-01-01
The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to tension and three-point bending was studied. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front was used to model the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlation of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherents. In addition, the application of the submodeling technique for the simulation of skin/stringer debond was also studied. Global models made of shell elements and solid elements were studied. Solid elements were used for local submodels, which extended between three and six specimen thicknesses on either side of the delamination front to model the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from the simulations using the submodeling technique were not in agreement with results obtained from full solid models.
Non-invasive imaging of global and regional cardiac function in pulmonary hypertension
Crowe, Tim; Jayasekera, Geeshath
2017-01-01
Pulmonary hypertension (PH) is a progressive illness characterized by elevated pulmonary artery pressure; however, the main cause of mortality in PH patients is right ventricular (RV) failure. Historically, improving the hemodynamics of pulmonary circulation was the focus of treatment; however, it is now evident that cardiac response to a given level of pulmonary hemodynamic overload is variable and plays an important role in the subsequent prognosis. Non-invasive tests of RV function to determine prognosis and response to treatment in patients with PH are essential. Although the right ventricle is the focus of attention, it is clear that cardiac interaction can cause left ventricular dysfunction, thus biventricular assessment is paramount. There is also growing focus on the contribution of the atrial chambers to cardiac function in PH. Furthermore, there is evidence of regional dysfunction of the two ventricles in PH, so it would be useful to understand both global and regional components of dysfunction. In order to understand global and regional cardiac function in PH, the most obvious non-invasive imaging techniques are echocardiography and cardiac magnetic resonance imaging (CMRI). Both techniques have their advantages and disadvantages. Echocardiography is widely available, relatively inexpensive, provides information regarding RV function, and can be used to estimate RV pressures. CMRI, although expensive and less accessible, is the gold standard for biventricular functional measurements. The advent of 3D echocardiography and techniques including strain analysis and stress echocardiography have improved the usefulness of echocardiography, while new CMRI technology allows the measurement of strain and of cardiac function during stress, including exercise. In this review, we have analyzed the advantages and disadvantages of the two techniques and discuss pre-existing and novel forms of analysis where echocardiography and CMRI can be used to examine atrial, ventricular, and interventricular function in patients with PH at rest and under stress. PMID:29064323
Middendorf, Jill M; Shortkroff, Sonya; Dugopolski, Caroline; Kennedy, Stephen; Siemiatkoski, Joseph; Bartell, Lena R; Cohen, Itai; Bonassar, Lawrence J
2017-11-07
Many studies have measured the global compressive properties of tissue engineered (TE) cartilage grown on porous scaffolds. Such scaffolds are known to exhibit strain softening due to local buckling under loading. As matrix is deposited onto these scaffolds, the global compressive properties increase. However, the relationship between the amount and distribution of matrix in the scaffold and local buckling is unknown. To address this knowledge gap, we studied how local strain and construct buckling in human TE constructs change over culture time and GAG content. Confocal elastography techniques and digital image correlation (DIC) were used to measure and record buckling modes and local strains. Receiver operating characteristic (ROC) curves were used to quantify construct buckling. The results from the ROC analysis were placed into Kaplan-Meier survival function curves to establish the probability that any point in a construct buckled. These analysis techniques revealed the presence of buckling at early time points, but bending at later time points. An inverse correlation was observed between the probability of buckling and the total GAG content of each construct. These data suggest that increased GAG content prevents the onset of construct buckling and improves the microscale compressive tissue properties. This increase in GAG deposition leads to enhanced global compressive properties by preventing microscale buckling. Copyright © 2017 Elsevier Ltd. All rights reserved.
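The ROC step described above can be illustrated generically: score each tracked point with a buckling indicator and compare against binary buckling labels. The synthetic labels and scores below are assumptions, not the study's DIC data.

```python
# ROC analysis of a per-point buckling indicator against binary labels.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
buckled = rng.integers(0, 2, 200)            # 1 = point labeled as buckled
indicator = buckled * rng.normal(1.0, 0.5, 200) + rng.normal(0.0, 0.5, 200)

fpr, tpr, thresholds = roc_curve(buckled, indicator)
print("AUC:", auc(fpr, tpr))  # AUC near 0.5 would mean an uninformative indicator
```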
Analysis of airfoil leading edge separation bubbles
NASA Technical Reports Server (NTRS)
Carter, J. E.; Vatsa, V. N.
1982-01-01
A local inviscid-viscous interaction technique was developed for the analysis of low-speed airfoil leading-edge transitional separation bubbles. In this analysis an inverse boundary-layer finite difference analysis is solved iteratively with a Cauchy integral representation of the inviscid flow, which is assumed to be a linear perturbation to a known global viscous airfoil analysis. Favorable comparisons with data indicate the overall validity of the present localized interaction approach. In addition, numerical tests were performed to assess the sensitivity of the computed results to the mesh size, the limits on the Cauchy integral, and the location of the transition region.
Prell, Christina; Sun, Laixiang; Feng, Kuishuang; Myroniuk, Tyler W
2015-01-01
In this paper we investigate how structural patterns of international trade give rise to emissions inequalities across countries, and how such inequality in turn impacts countries' mortality rates. We employ Multi-regional Input-Output analysis to distinguish between sulfur-dioxide (SO2) emissions produced within a country's borders (production-based emissions) and emissions triggered by consumption in other countries (consumption-based emissions). We use social network analysis to capture countries' level of integration within the global trade network. We then apply the Prais-Winsten panel estimation technique to a panel data set across 172 countries over 20 years (1990-2010) to estimate the relationships between countries' level of integration and SO2 emissions, and the impact of trade integration and SO2 emissions on mortality rates. Our findings suggest a positive, (log-) linear relationship between a country's level of integration and both kinds of emissions. In addition, although more integrated countries are mainly responsible for both forms of emissions, our findings indicate that they also tend to experience lower mortality rates. Our approach offers a unique combination of social network analysis with multiregional input-output analysis, which better operationalizes intuitive concepts about global trade and trade structure.
NASA Astrophysics Data System (ADS)
Safaei, S.; Haghnegahdar, A.; Razavi, S.
2016-12-01
Complex environmental models are now the primary tool to inform decision makers about the current and future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that must be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in saving computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
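A simplified illustration of the variogram concept underlying VARS (not the full VARS algorithm) is sketched below: for each input factor i, estimate the directional variogram gamma_i(h) = 0.5 * E[(f(x + h*e_i) - f(x))^2] from random samples of a toy model. The model and sample sizes are assumptions.

```python
# Directional variograms as a crude factor-sensitivity screen (VARS-inspired).
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy response surface (assumption); factor 0 dominates, factor 2 is inert.
    return np.sin(3 * x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.0 * x[:, 2]

def directional_variogram(f, dim, i, h, n=5000):
    x = rng.uniform(0, 1, (n, dim))
    x2 = x.copy()
    x2[:, i] = np.clip(x2[:, i] + h, 0, 1)   # perturb only factor i
    return 0.5 * np.mean((f(x2) - f(x)) ** 2)

for i in range(3):
    print(f"factor {i}: gamma(0.1) =", directional_variogram(model, 3, i, 0.1))
```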
Studying the Diurnal Cycle of Convection Using a TRMM-Calibrated Infrared Rain Algorithm
NASA Technical Reports Server (NTRS)
Negri, Andrew J.
2005-01-01
The development of a satellite infrared (IR) technique for estimating convective and stratiform rainfall and its application in studying the diurnal variability of rainfall on a global scale is presented. The Convective-Stratiform Technique (CST), calibrated by coincident, physically retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), is applied over the global tropics. The technique makes use of the IR data from the TRMM Visible/Infrared Scanner (VIRS) before application to global geosynchronous satellite data. The calibrated CST technique has the advantages of high spatial resolution (4 km), filtering of nonraining cirrus clouds, and the stratification of the rainfall into its convective and stratiform components, the last being important for the calculation of vertical profiles of latent heating. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall, will be presented. The technique is validated using available data sets and compared to other global rainfall products such as the Global Precipitation Climatology Project (GPCP) IR product, calibrated with TRMM Microwave Imager (TMI) data. Results from five years of PR data will show the global-tropical partitioning of convective and stratiform rainfall.
Non-biased and efficient global amplification of a single-cell cDNA library
Huang, Huan; Goto, Mari; Tsunoda, Hiroyuki; Sun, Lizhou; Taniguchi, Kiyomi; Matsunaga, Hiroko; Kambara, Hideki
2014-01-01
Analysis of single-cell gene expression promises a more precise understanding of the molecular mechanisms of a living system. Most techniques allow expression studies of only limited numbers of gene species. When amplification of cDNA was carried out for analysing more genes, amplification biases were frequently reported. A non-biased and efficient global-amplification method, which uses a single-cell cDNA library immobilized on beads, was developed for analysing the entire gene expression of single cells. Every step in this analysis, from reverse transcription to cDNA amplification, was optimized. By degrading and removing excess primers, the bias due to the digestion of cDNA was prevented. Since the residual reagents, which affect the efficiency of each subsequent reaction, could be removed by washing the beads, conditions for uniform and maximized amplification of cDNAs were achieved. The differences in the amplification rates for eight randomly selected genes were within 1.5-fold, which is negligible for most applications of single-cell analysis. The global amplification gives a large amount of amplified cDNA (>100 μg) from a single cell (2-pg mRNA), an amount sufficient for downstream analysis. The proposed global-amplification method was used to analyse transcript ratios of multiple cDNA targets (from several copies to several thousand copies) quantitatively. PMID:24141095
Regional Spherical Harmonic Magnetic Modeling from Near-Surface and Satellite-Altitude Anomalies
NASA Technical Reports Server (NTRS)
Kim, Hyung Rae; von Frese, Ralph R. B.; Taylor, Patrick T.
2013-01-01
Compiled near-surface data and satellite-measured crustal magnetic data are modeled with a regionally concentrated spherical harmonic representation technique over Australia and Antarctica. Global crustal magnetic anomaly studies have used spherical harmonic analysis to represent the Earth's crustal magnetic field. This global approach, however, is best applied where the data are uniformly distributed over the entire Earth. Satellite observations generally meet this requirement, but unequally distributed data cannot be easily accommodated in global modeling. Even for satellite observations, because errors are spread over the globe, data smoothing is inevitable in global spherical harmonic representations. In addition, global high-resolution modeling requires a great number of global spherical harmonic coefficients to represent crustal magnetic anomalies regionally, whereas a smaller number of localized spherical coefficients will suffice. We compared global and regional approaches, including a case where the errors were propagated outside the region of interest. For observations from the upcoming Swarm constellation, regional modeling will allow the production of a smaller number of spherical coefficients relevant to the region of interest.
Yang, Wengui; Yu, Wenwu; Cao, Jinde; Alsaadi, Fuad E; Hayat, Tasawar
2018-02-01
This paper investigates the stability and lag synchronization for memristor-based fuzzy Cohen-Grossberg bidirectional associative memory (BAM) neural networks with mixed delays (asynchronous time delays and continuously distributed delays) and impulses. By applying the inequality analysis technique, homeomorphism theory and some suitable Lyapunov-Krasovskii functionals, some new sufficient conditions for the uniqueness and global exponential stability of equilibrium point are established. Furthermore, we obtain several sufficient criteria concerning globally exponential lag synchronization for the proposed system based on the framework of Filippov solution, differential inclusion theory and control theory. In addition, some examples with numerical simulations are given to illustrate the feasibility and validity of obtained results. Copyright © 2017 Elsevier Ltd. All rights reserved.
A self-consistent global emissions inventory spanning 1850 ...
While emissions inventory development has advanced significantly in recent years, the scientific community still lacks a global inventory utilizing consistent estimation approaches spanning multiple centuries. In this analysis, we investigate the strengths and weaknesses of current approaches to effectively address inventory development over not just a global spatial scale but also a timescale spanning two centuries, from early industrialization into the near future. We discuss the need within the scientific community for a dataset such as this and the landscape of questions it would allow the scientific community to address. In particular, we focus on questions that the scientific community cannot adequately address using the currently available techniques and information. We primarily focus on the difficulties and potential obstacles associated with developing an inventory of this scope and magnitude. We discuss many of the hurdles that the field has already overcome and also highlight the challenges that researchers in the field still face. We detail the complexities related to the extent of spatial and temporal scales required for an undertaking of this magnitude. In addition, we point to areas where the community currently lacks the necessary data to move forward. Our analysis focuses on one direction in which the development of global emissions inventories is heading rather than an in-depth analysis of the path of emissions inventory development.
NASA Technical Reports Server (NTRS)
1990-01-01
Various papers on remote sensing (RS) for the nineties are presented. The general topics addressed include: subsurface methods, radar scattering, oceanography, microwave models, atmospheric correction, passive microwave systems, RS in tropical forests, moderate resolution land analysis, SAR geometry and SNR improvement, image analysis, inversion and signal processing for geoscience, surface scattering, rain measurements, sensor calibration, wind measurements, terrestrial ecology, agriculture, geometric registration, subsurface sediment geology, radar modulation mechanisms, radar ocean scattering, SAR calibration, airborne radar systems, water vapor retrieval, forest ecosystem dynamics, land analysis, multisensor data fusion. Also considered are: geologic RS, RS sensor optical measurements, RS of snow, temperature retrieval, vegetation structure, global change, artificial intelligence, SAR processing techniques, geologic RS field experiment, stochastic modeling, topography and Digital Elevation model, SAR ocean waves, spaceborne lidar and optical, sea ice field measurements, millimeter waves, advanced spectroscopy, spatial analysis and data compression, SAR polarimetry techniques. Also discussed are: plant canopy modeling, optical RS techniques, optical and IR oceanography, soil moisture, sea ice back scattering, lightning cloud measurements, spatial textural analysis, SAR systems and techniques, active microwave sensing, lidar and optical, radar scatterometry, RS of estuaries, vegetation modeling, RS systems, EOS/SAR Alaska, applications for developing countries, SAR speckle and texture.
Investigation of Models and Estimation Techniques for GPS Attitude Determination
NASA Technical Reports Server (NTRS)
Garrick, J.
1996-01-01
Much work has been done in the Flight Dynamics Analysis Branch (FDAB) in developing algorithms to meet the new and growing field of attitude determination using the Global Positioning System (GPS) constellation of satellites. Flight Dynamics has the responsibility to investigate any new technology and incorporate the innovations in the attitude ground support systems developed to support future missions. The work presented here is an investigative analysis that will produce the needed adaptation to allow the Flight Dynamics Support System (FDSS) to incorporate GPS phase measurements and produce observation measurements compatible with the FDSS. A simulator was developed to produce the necessary measurement data to test the models developed for the different estimation techniques used by FDAB. This paper gives an overview of the current modeling capabilities of the simulator, the models and algorithms for the adaptation of GPS measurement data, and results from each of the estimation techniques. Future analysis efforts to evaluate the simulator and models against in-flight GPS measurement data are also outlined.
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
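The Monte Carlo step described above can be sketched generically: perturb the criteria weights of a simple additive MCDA model and record how each cell's susceptibility score varies. The criteria matrix and the Dirichlet jitter below are illustrative assumptions, not the paper's exact scheme.

```python
# Monte Carlo weight-sensitivity analysis for an additive MCDA model.
import numpy as np

rng = np.random.default_rng(0)
criteria = rng.uniform(0, 1, (5, 4))      # 5 map cells x 4 standardized criteria
base_weights = np.array([0.4, 0.3, 0.2, 0.1])

scores = []
for _ in range(1000):
    w = rng.dirichlet(base_weights * 50)  # weights jittered around the base
    scores.append(criteria @ w)
scores = np.array(scores)

print("mean susceptibility:", scores.mean(axis=0))
print("std (uncertainty)  :", scores.std(axis=0))
```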
Nondynamic Tracking Using The Global Positioning System
NASA Technical Reports Server (NTRS)
Yunck, T. P.; Wu, Sien-Chong
1988-01-01
Report describes technique for using Global Positioning System (GPS) to determine position of low Earth orbiter without need for dynamic models. Differential observing strategy requires GPS receiver on user vehicle and network of six ground receivers. Computationally efficient technique delivers decimeter accuracy on orbits down to lowest altitudes. New technique is nondynamic long-arc strategy having potential for accuracy of best dynamic techniques while retaining much of computational simplicity of geometric techniques.
Distress detection, location, and communications using advanced space technology
NASA Technical Reports Server (NTRS)
Sivertson, W. E., Jr.
1977-01-01
This paper briefly introduces a concept for low-cost, global, day-night, all-weather disaster warning and assistance. Evolving, advanced space technology with passive radio frequency reflectors in conjunction with an imaging synthetic aperture radar is employed to detect, identify, locate, and provide passive communication with earth users in distress. This concept evolved from broad NASA research on new global search and rescue techniques. Appropriate airborne radar test results from this research are reviewed and related to potential disaster applications. The analysis indicates the approach has promise for disaster communications relative to floods, droughts, earthquakes, volcanic eruptions, and severe storms.
Acceleration techniques in the univariate Lipschitz global optimization
NASA Astrophysics Data System (ADS)
Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.; De Franco, Angela
2016-10-01
Univariate box-constrained Lipschitz global optimization problems are considered in this contribution. Geometric and information statistical approaches are presented. The novel powerful local tuning and local improvement techniques are described in the contribution as well as the traditional ways to estimate the Lipschitz constant. The advantages of the presented local tuning and local improvement techniques are demonstrated using the operational characteristics approach for comparing deterministic global optimization algorithms on the class of 100 widely used test functions.
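The geometric approach mentioned above builds on the classic sawtooth lower bound f(x_i) - L|x - x_i|. The sketch below is the textbook Piyavskii-Shubert scheme under an assumed Lipschitz constant, without the authors' local tuning and local improvement accelerations.

```python
# Piyavskii-Shubert-type univariate Lipschitz global minimization.
import numpy as np

def piyavskii(f, a, b, L, iters=50):
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(iters):
        order = np.argsort(xs)
        xs = [xs[i] for i in order]
        ys = [ys[i] for i in order]
        # In each interval the sawtooth bound is minimized where the two
        # cones y1 - L(x - x1) and y2 - L(x2 - x) intersect.
        best_lb, best_x = np.inf, None
        for (x1, y1), (x2, y2) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
            x_new = 0.5 * (x1 + x2) + (y1 - y2) / (2 * L)
            lb = 0.5 * (y1 + y2) - 0.5 * L * (x2 - x1)
            if lb < best_lb:
                best_lb, best_x = lb, x_new
        xs.append(best_x)          # sample where the lower bound is smallest
        ys.append(f(best_x))
    i = int(np.argmin(ys))
    return xs[i], ys[i]

# |d/dx (sin x + 0.3 x)| <= 1.3, so L = 1.3 is a valid Lipschitz constant.
print(piyavskii(lambda x: np.sin(x) + 0.3 * x, 0.0, 10.0, L=1.3))
```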
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global assets exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (such as soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
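The multiple-regression step can be sketched generically: regress log losses of individual events on hazard intensity, exposure, and a vulnerability proxy. The synthetic event table below is an illustrative assumption, not the GAR data.

```python
# Multiple regression of (log) event losses on risk components.
import numpy as np

rng = np.random.default_rng(0)
n = 9000                                   # roughly the number of past events
intensity = rng.normal(0, 1, n)
exposure = rng.normal(0, 1, n)
poverty = rng.normal(0, 1, n)              # vulnerability proxy (assumption)
log_losses = 1.2 * intensity + 0.8 * exposure + 0.5 * poverty + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), intensity, exposure, poverty])
beta, *_ = np.linalg.lstsq(X, log_losses, rcond=None)
print("intercept, intensity, exposure, poverty:", np.round(beta, 2))
```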
Load balancing for massively-parallel soft-real-time systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hailperin, M.
1988-09-01
Global load balancing, if practical, would allow the effective use of massively-parallel ensemble architectures for large soft-real-time problems. The challenge is to replace quick global communication, which is impractical in a massively-parallel system, with statistical techniques. In this vein, the author proposes a novel approach to decentralized load balancing based on statistical time-series analysis. Each site estimates the system-wide average load using information about past loads of individual sites and attempts to equal that average. This estimation process is practical because the soft-real-time systems of interest naturally exhibit loads that are periodic, in a statistical sense akin to seasonality in econometrics. It is shown how this load-characterization technique can be the foundation for a load-balancing system in an architecture employing cut-through routing and an efficient multicast protocol.
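A minimal sketch of the statistical idea, under assumed parameters: each site keeps a per-phase (seasonal) average of past loads and uses it to predict the system-wide average. The period and the synthetic load model are assumptions, not the author's estimator.

```python
# Seasonal (per-phase) load estimation for decentralized balancing.
import numpy as np

class SeasonalLoadEstimator:
    def __init__(self, period):
        self.period = period
        self.history = [[] for _ in range(period)]
        self.t = 0

    def observe(self, load):
        self.history[self.t % self.period].append(load)
        self.t += 1

    def predict(self):
        # Estimate the upcoming phase from that phase's past loads.
        phase = self.history[self.t % self.period]
        return float(np.mean(phase)) if phase else 0.0

est = SeasonalLoadEstimator(period=24)
for t in range(240):                     # ten "days" of periodic load
    est.observe(50 + 20 * np.sin(2 * np.pi * t / 24) + np.random.randn())
print("predicted next-phase load:", est.predict())
```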
Earth Survey Applications Division. [a bibliography
NASA Technical Reports Server (NTRS)
Carpenter, L. (Editor)
1981-01-01
Accomplishments of research and data analysis conducted to study physical parameters and processes inside the Earth and on the Earth's surface, to define techniques and systems for remotely sensing the processes and measuring the parameters of scientific and applications interest, and the transfer of promising operational applications techniques to the user community of Earth resources monitors, managers, and decision makers are described. Research areas covered include: geobotany, magnetic field modeling, crustal studies, crustal dynamics, sea surface topography, land resources, remote sensing of vegetation and soils, and hydrological sciences. Major accomplishments include: production of global maps of magnetic anomalies using Magsat data; computation of the global mean sea surface using GEOS-3 and Seasat altimetry data; delineation of the effects of topography on the interpretation of remotely-sensed data; application of snowmelt runoff models to water resources management; and mapping of snow depth over wheat growing areas using Nimbus microwave data.
USDA-ARS?s Scientific Manuscript database
Genetic manipulation is an essential technique to analyze gene function; however, limited methods are available for Babesia bovis, a causative pathogen of the globally important cattle disease, bovine babesiosis. To date, two stable transfection systems have been developed for B. bovis, using select...
What does nonforest land contribute to the global carbon balance?
Jennifer C. Jenkins; Rachel Riemann
2002-01-01
An inventory of land traditionally called "nonforest" and therefore not sampled by the Forest Inventory and Analysis (FIA) program was implemented by the FIA unit at the Northeastern Station in 1999 for five counties in Maryland. Biomass and biomass increment were estimated from the nonforest inventory data using techniques developed for application to large-...
Gene expression profiling--Opening the black box of plant ecosystem responses to global change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leakey, A.D.B.; Ainsworth, E.A.; Bernard, S.M.
The use of genomic techniques to address ecological questions is emerging as the field of genomic ecology. Experimentation under environmentally realistic conditions to investigate the molecular response of plants to meaningful changes in growth conditions and ecological interactions is the defining feature of genomic ecology. Since the impact of global change factors on plant performance is mediated by direct effects at the molecular, biochemical and physiological scales, gene expression analysis promises important advances in understanding factors that have previously been consigned to the 'black box' of unknown mechanism. Various tools and approaches are available for assessing gene expression in model and non-model species as part of global change biology studies. Each approach has its own unique advantages and constraints. A first generation of genomic ecology studies in managed ecosystems and mesocosms has provided a testbed for the approach and has begun to reveal how the experimental design and data analysis of gene expression studies can be tailored for use in an ecological context.
Global Single and Multiple Cloud Classification with a Fuzzy Logic Expert System
NASA Technical Reports Server (NTRS)
Welch, Ronald M.; Tovinkere, Vasanth; Titlow, James; Baum, Bryan A.
1996-01-01
An unresolved problem in remote sensing concerns the analysis of satellite imagery containing both single and multiple cloud layers. While cloud parameterizations are very important both in global climate models and in studies of the Earth's radiation budget, most cloud retrieval schemes, such as the bispectral method used by the International Satellite Cloud Climatology Project (ISCCP), have no way of determining whether overlapping cloud layers exist in any group of satellite pixels. Coakley (1983) used a spatial coherence method to determine whether a region contained more than one cloud layer. Baum et al. (1995) developed a scheme for detection and analysis of daytime multiple cloud layers using merged AVHRR (Advanced Very High Resolution Radiometer) and HIRS (High-resolution Infrared Radiometer Sounder) data collected during the First ISCCP Regional Experiment (FIRE) Cirrus 2 field campaign. Baum et al. (1995) explored the use of a cloud classification technique based on AVHRR data. This study examines the feasibility of applying the cloud classifier to global satellite imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Christopher A.
In this dissertation the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas is investigated. To properly assess this possibility, data from both numerical simulations and experiment are analyzed. A large repertoire of nonlinear analysis techniques is used to identify low dimensional chaos in the data. These tools include phase portraits and Poincare sections, correlation dimension, the spectrum of Lyapunov exponents and short term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low dimensional chaos or simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
"Heroes" and "villains" of world history across cultures.
Hanke, Katja; Liu, James H; Sibley, Chris G; Paez, Dario; Gaines, Stanley O; Moloney, Gail; Leong, Chan-Hoong; Wagner, Wolfgang; Licata, Laurent; Klein, Olivier; Garber, Ilya; Böhm, Gisela; Hilton, Denis J; Valchev, Velichko; Khan, Sammyh S; Cabecinhas, Rosa
2015-01-01
Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Teresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles.
Gene Editing in Humans: Towards a Global and Inclusive Debate for Responsible Research
de Lecuona, Itziar; Casado, María; Marfany, Gemma; Lopez Baroni, Manuel; Escarrabill, Mar
2017-01-01
In December 2016, the Opinion Group of the Bioethics and Law Observatory (OBD) of the University of Barcelona launched a Declaration on Bioethics and Gene Editing in Humans analyzing the use of genome editing techniques and their social, ethical, and legal implications through a multidisciplinary approach. It focuses on CRISPR/Cas9, a genome modification technique that enables researchers to edit specific sections of the DNA sequence of humans and other living beings. This technique has generated expectations and worries that deserve an interdisciplinary analysis and an informed social debate. The research work developed by the OBD presents a set of recommendations addressed to different stakeholders and aims at being a tool to learn more about CRISPR/Cas9 while finding an appropriate ethical and legal framework for this new technology. This article gathers and compares reports that have been published in Europe and the USA since the OBD Declaration, with the aim of fostering a global and interdisciplinary discussion of this new genome editing technology. PMID:29259532
On the utilization of engineering knowledge in design optimization
NASA Technical Reports Server (NTRS)
Papalambros, P.
1984-01-01
Some current research work conducted at the University of Michigan is described to illustrate efforts to incorporate knowledge in optimization in a nontraditional way. The incorporation of available knowledge in a logic structure is examined in two circumstances. The first examines the possibility of introducing global design information into a local active set strategy implemented during the iterations of projection-type algorithms for nonlinearly constrained problems. The technique used combines global and local monotonicity analysis of the objective and constraint functions. The second examines a knowledge-based program which aids the user in creating configurations that are most desirable from the manufacturing-assembly viewpoint. The data bank used is the classification scheme suggested by Boothroyd. The important aspect of this program is that it is an aid for synthesis, intended for use in the design concept phase in a way similar to the so-called idea-triggers in creativity-enhancement techniques like brainstorming. The idea generation, however, is not random but is driven by the goal of achieving the best acceptable configuration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Weber, Gunther H.
2014-03-31
Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.
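The union-find sweep that underlies merge-tree construction can be sketched briefly: process vertices in decreasing scalar value and record the saddles where superlevel-set components merge. This toy builds only a join (merge) tree on a graph, not the full contour tree or its local-global representation.

```python
# Join (merge) tree of a scalar field on a graph via a union-find sweep.
def merge_tree(values, edges):
    parent = {}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    n = len(values)
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)

    saddles = []                             # (vertex, roots it merges)
    for v in sorted(range(n), key=lambda u: -values[u]):
        parent[v] = v
        roots = {find(u) for u in adj[v] if u in parent}
        if len(roots) > 1:
            saddles.append((v, tuple(sorted(roots))))
        for r in roots:
            parent[r] = v                    # v becomes the component root
    return saddles

# A 1D "terrain" with two peaks (vertices 1 and 3) merging at valley vertex 2.
vals = [1, 5, 2, 6, 1]
print(merge_tree(vals, [(0, 1), (1, 2), (2, 3), (3, 4)]))  # [(2, (1, 3))]
```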
Zhu, Z.; Waller, E.
2003-01-01
Many countries periodically produce national reports on the status and changes of forest resources, using statistical surveys and spatial mapping of remotely sensed data. At the global level, the Food and Agriculture Organization (FAO) of the United Nations has conducted a Forest Resources Assessment (FRA) program every 10 yr since 1980, producing statistics and analysis that give a global synopsis of forest resources in the world. For the year 2000 of the FRA program (FRA2000), a global forest cover map was produced to provide spatial context to the extensive survey. The forest cover map, produced at the U.S. Geological Survey (USGS) EROS Data Center (EDC), has five classes: closed forest, open or fragmented forest, other wooded land, other land cover, and water. The first two forested classes at the global scale were delineated using combinations of temporal compositing, modified mixture analysis, geographic stratification, and other classification techniques. The remaining three FAO classes were derived primarily from the USGS global land cover characteristics database (Loveland et al. 1999). Validated on the basis of existing reference data sets, the map is estimated to be 77% accurate for the first four classes (no reference data were available for water), and 86% accurate for the forest and nonforest classification. The final map will be published as an insert to the FAO FRA2000 report.
Requirements for a next generation global flood inundation models
NASA Astrophysics Data System (ADS)
Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.
2016-12-01
In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.
NASA Astrophysics Data System (ADS)
Elshambaky, Hossam Talaat
2018-01-01
Owing to the appearance of many global geopotential models, it is necessary to determine the most appropriate model for use in Egyptian territory. In this study, we aim to investigate three global models, namely EGM2008, EIGEN-6c4, and GECO. We use five mathematical transformation techniques, i.e., polynomial expression, exponential regression, least-squares collocation, multilayer feed forward neural network, and radial basis neural networks, to make the conversion from the regional geometrical geoid to the global geoid models and vice versa. From a statistical comparison based on quality indexes among these transformation techniques, we confirm that the multilayer feed forward neural network with two neurons is the most accurate of the examined transformation techniques, and that, based on the mean tide condition, EGM2008 represents the most suitable global geopotential model for use in Egyptian territory to date. The final product of this study is the corrector surface used to facilitate the transformation between the regional geometrical geoid model and the global geoid model.
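The best-performing transformation reported above, a feed-forward network with two hidden neurons, can be sketched with scikit-learn. The synthetic undulation differences and coordinate ranges below are assumptions, not the Egyptian data.

```python
# Corrector surface as a small feed-forward network: position -> geoid difference.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
lat = rng.uniform(22, 32, 300)               # rough latitude range of Egypt
lon = rng.uniform(25, 35, 300)
# Regional geometric geoid minus global model undulation (synthetic).
dN = 0.5 * np.sin(lat / 3) + 0.2 * np.cos(lon / 4) + rng.normal(0, 0.02, 300)

net = MLPRegressor(hidden_layer_sizes=(2,), max_iter=5000, random_state=0)
net.fit(np.column_stack([lat, lon]), dN)

# The fitted network is the corrector surface: evaluate it at any position.
print(net.predict([[27.0, 30.0]]))
```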
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present, in a clear and concise manner, the main applications of this technique; to determine the basic requirements for its use, providing a step-by-step description of its methodology; and to establish the elements that must be taken into account during its preparation so as not to produce erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, overall fit evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
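For illustration only, the factor-derivation step can be sketched with scikit-learn's FactorAnalysis on synthetic item responses; a real EFA workflow adds the assumption checks (sampling adequacy, factor retention criteria, rotation choices) the review describes.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Synthetic responses: 300 subjects, 6 items driven by 2 latent factors.
latent = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = latent @ loadings.T + rng.normal(0, 0.3, (300, 6))

# Two-factor model with varimax rotation for interpretability.
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print(np.round(fa.components_.T, 2))  # estimated loadings, items x factors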
Investigating effects of communications modulation technique on targeting performance
NASA Astrophysics Data System (ADS)
Blasch, Erik; Eusebio, Gerald; Huling, Edward
2006-05-01
One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they need reliable data. In order to facilitate reliable computational intelligence, we explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass phase-shift keying (PSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed for their effects on targeting performance.
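Not the paper's Link 16 simulation, but the textbook reference point behind it: the theoretical bit-error rate of coherent BPSK over an AWGN channel as a function of signal-to-noise ratio.

```python
import numpy as np
from scipy.special import erfc

# Theoretical bit-error rate for coherent BPSK over an AWGN channel:
# Pb = 0.5 * erfc(sqrt(Eb/N0)).
ebn0_db = np.arange(0, 11, 2)
ebn0 = 10 ** (ebn0_db / 10)
ber = 0.5 * erfc(np.sqrt(ebn0))

for db, pb in zip(ebn0_db, ber):
    print(f"Eb/N0 = {db:2d} dB  ->  BER = {pb:.2e}")
```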
Peng, Xiao; Wu, Huaiqin; Song, Ka; Shi, Jiaxin
2017-10-01
This paper is concerned with the global Mittag-Leffler synchronization and the synchronization in finite time for fractional-order neural networks (FNNs) with discontinuous activations and time delays. Firstly, the properties with respect to Mittag-Leffler convergence and convergence in finite time, which play a critical role in the investigation of the global synchronization of FNNs, are developed. Secondly, a novel state-feedback controller, which includes time delays and discontinuous factors, is designed to realize the synchronization goal. By applying fractional differential inclusion theory, inequality analysis techniques, and the proposed convergence properties, sufficient conditions to achieve the global Mittag-Leffler synchronization and the synchronization in finite time are addressed in terms of linear matrix inequalities (LMIs). In addition, the upper bound of the settling time of the global synchronization in finite time is explicitly evaluated. Finally, two examples are given to demonstrate the validity of the proposed design method and theoretical results. Copyright © 2017 Elsevier Ltd. All rights reserved.
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
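The authors develop their own experimental design; as a generic point of comparison, a variance-based global sensitivity analysis over a cheap surrogate can be sketched with the SALib library. The parameter names and the toy outcome function below are invented stand-ins for an expensive ABM run.

```python
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical 3-parameter model standing in for an expensive ABM.
problem = {
    "num_vars": 3,
    "names": ["infection_rate", "recovery_rate", "contact_rate"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)          # global space-filling design
Y = X[:, 0] * X[:, 2] + 0.5 * X[:, 1] ** 2  # cheap surrogate outcome

Si = sobol.analyze(problem, Y)
print(Si["S1"])   # first-order effects of each parameter
print(Si["ST"])   # total effects, including interactions
```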
GLOBAL SOLUTIONS TO FOLDED CONCAVE PENALIZED NONCONVEX LEARNING
Liu, Hongcheng; Yao, Tao; Li, Runze
2015-01-01
This paper is concerned with solving nonconvex learning problems with folded concave penalties. Although their global solutions entail desirable statistical properties, optimization techniques that guarantee global optimality in a general setting have been lacking. In this paper, we show that a class of nonconvex learning problems is equivalent to general quadratic programs. This equivalence enables us to develop mixed integer linear programming reformulations, which admit finite algorithms that find a provably global optimal solution. We refer to this reformulation-based technique as the mixed integer programming-based global optimization (MIPGO). To our knowledge, this is the first global optimization scheme with a theoretical guarantee for folded concave penalized nonconvex learning with the SCAD penalty (Fan and Li, 2001) and the MCP penalty (Zhang, 2010). Numerical results indicate that MIPGO significantly outperforms the state-of-the-art solution scheme, local linear approximation, and other alternative solution techniques in the literature in terms of solution quality. PMID:27141126
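For reference, a small sketch of the SCAD penalty of Fan and Li (2001) named above, evaluated elementwise; the commonly used default a = 3.7 is an assumption of this illustration, not a value taken from the paper.

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001), evaluated elementwise on |t|."""
    t = np.abs(np.asarray(t, dtype=float))
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    out = np.empty_like(t)
    out[small] = lam * t[small]                       # linear (lasso-like) part
    out[mid] = (2 * a * lam * t[mid] - t[mid] ** 2 - lam ** 2) / (2 * (a - 1))
    out[~small & ~mid] = lam ** 2 * (a + 1) / 2       # constant tail
    return out

print(scad_penalty([0.1, 1.0, 5.0], lam=0.5))
```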
NASA Technical Reports Server (NTRS)
Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.
1987-01-01
The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.
Comparison of Globally Complete Versions of GPCP and CMAP Monthly Precipitation Analyses
NASA Technical Reports Server (NTRS)
Curtis, Scott; Adler, Robert; Huffman, George
1998-01-01
In this study two global observational precipitation products, namely the Global Precipitation Climatology Project's (GPCP) community data set and CPC's Merged Analysis of Precipitation (CMAP), are compared on global to regional scales in the context of the different satellite and gauge data inputs and merger techniques. The average annual global precipitation rates, calculated from data common in regions/times to both GPCP and CMAP, are similar for the two. However, CMAP is larger than GPCP in the tropics because: (1) CMAP values in the tropics are adjusted month-by-month to atoll gauge data in the West Pacific, which are greater than any satellite observations used; and (2) CMAP is produced from a linear combination of data inputs, which tends to give higher values than the microwave emission estimates alone, to which the inputs are adjusted in the GPCP merger over the ocean. The CMAP month-to-month adjustment to the atolls also appears to introduce temporal variations throughout the tropics which are not detected by satellite-only products. On the other hand, GPCP is larger than CMAP in the high-latitude oceans, where CMAP includes the scattering-based microwave estimates, which are consistently smaller than the emission estimates used in both techniques. Also, in the polar regions GPCP transitions from the emission microwave estimates to the larger TOVS-based estimates. Finally, in high-latitude land areas GPCP can be significantly larger than CMAP because GPCP attempts to correct the gauge estimates for errors due to wind loss effects.
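As a hedged illustration of the kind of comparison involved: proper global means of gridded precipitation require cos(latitude) area weighting. The two fields below are synthetic stand-ins, not GPCP or CMAP data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two hypothetical gridded monthly precipitation analyses (mm/day) on a
# common 2.5-degree grid (72 latitude bands x 144 longitudes).
lat = np.arange(-88.75, 90, 2.5)
field_a = rng.gamma(2.0, 1.2, (72, 144))
field_b = field_a + rng.normal(0.1, 0.2, (72, 144))  # a slightly wetter variant

# Each latitude band must be weighted by cos(latitude).
w = np.cos(np.deg2rad(lat))[:, None] * np.ones((72, 144))

def global_mean(field, weights):
    return np.sum(field * weights) / np.sum(weights)

print("product A:", round(global_mean(field_a, w), 3), "mm/day")
print("product B:", round(global_mean(field_b, w), 3), "mm/day")
```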
Olijnyk, Nicholas V
2018-01-01
This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research.
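A minimal sketch of the co-word idea using networkx: build a keyword co-occurrence network and extract themes via community detection. The keyword lists below are invented; the study's actual corpus and clustering pipeline differ.

```python
import itertools
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical keyword lists from a handful of QC publications.
papers = [
    ["quantum-key-distribution", "photons", "decoy-state"],
    ["quantum-key-distribution", "secret-messages", "quantum-optics"],
    ["quantum-entanglement", "decoy-state", "photons"],
    ["network-protocols", "quantum-key-distribution", "photons"],
]

# Co-word network: keywords are nodes, co-occurrence counts are edge weights.
G = nx.Graph()
for kw in papers:
    for a, b in itertools.combinations(sorted(set(kw)), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Community detection groups keywords into candidate research themes.
for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"theme {i}:", sorted(community))
```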
Yoon, Sunmoo
2017-01-01
Background: Twitter can address the mental health challenges of dementia care. The aim of this study is to explore the content and user interactions of tweets mentioning dementia to gain insights for dementia care. Methods: We collected 35,260 tweets mentioning Alzheimer's or dementia on World Alzheimer's Day, September 21st, 2015. Topic modeling and social network analysis were applied to uncover the content and structure of user communication. Results: Global users generated keywords related to mental health and care, including #psychology and #mentalhealth. There were similarities and differences between the UK and the US in tweet content. The macro-level analysis uncovered substantial public interest in dementia. The meso-level network analysis revealed that the top leaders of communities were spiritual organizations and traditional media. Conclusions: The application of topic modeling and multi-level network analysis, incorporating visualization techniques, can promote a global-level understanding of public attention, interests, and insights regarding dementia care and mental health. PMID:27803262
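For illustration only, the topic-modeling step can be sketched with scikit-learn's LatentDirichletAllocation; the four stand-in tweets are invented, and the study's actual preprocessing and modeling choices may differ.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# A few stand-in tweets; a real analysis would use the 35,260 collected posts.
tweets = [
    "remember loved ones on world alzheimers day #mentalhealth",
    "new research into dementia care and psychology",
    "support caregivers of people living with dementia",
    "alzheimers charity walk this weekend join us",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(tweets)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top terms of each inferred topic.
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[::-1][:4]
    print(f"topic {k}:", [terms[i] for i in top])
```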
Danish; Saud, Shah; Baloch, Muhammad Awais; Lodhi, Rab Nawaz
2018-04-28
In the modern era of globalization, economic activities expand with the passage of time. This expansion may increase the demand for energy in both developing and developed countries. Therefore, this study assesses the impact of financial development on energy consumption, incorporating the role of globalization, in the Next-11 countries. A group of panel estimation techniques is used to analyze the panel and time-series data for the period 1990-2014. The empirical results of the study suggest that financial development stimulates energy consumption. Globalization also increases the demand for energy, although the single-country analysis suggests that the effect of globalization on energy demand is heterogeneous among the N-11 countries. Furthermore, a feedback hypothesis is confirmed between financial development and energy consumption, and bidirectional causality is found between economic growth and energy consumption. The findings call for the attention of policymakers in emerging countries to develop strategies that reduce the consequences of energy consumption, by controlling resource transfers through globalization to the host country and by adopting energy conservation policies.
On the global geodetic observing system: Africa's preparedness and challenges
NASA Astrophysics Data System (ADS)
Botai, O. J.; Combrinck, Ludwig; Rautenbach, C. J. Hannes
2013-02-01
Space geodetic techniques and satellite missions play a crucial role in the determination and monitoring of geo-kinematics, Earth's rotation and gravity fields. These three pillars of geodesy provide the basis for determining the geodetic reference frames with high accuracy, spatial resolution and temporal stability. Space geodetic techniques have been used for the assessment of geo-hazards, anthropogenic hazards and in the design of early warning systems for hazards and disasters. In general, space geodesy provides products for Earth observation and science and influences many activities (e.g., building and management) in a modern society. In order to further promote the application of space geodetic methods to solving Earth science problems, the Global Geodetic Observing System (GGOS) of the International Association of Geodesy (IAG) was commissioned as an important geodetic infrastructure that integrates different geodetic techniques (such as Global Navigation Satellite Systems, Very Long Baseline Interferometry, Satellite Laser Ranging, Interferometric Synthetic Aperture Radar and Doppler Orbitography and Radio-positioning Integrated by Satellite), models and analysis techniques for the purpose of ensuring long-term, precise monitoring of geodetic observables vital for monitoring Earth system processes. Since its inception, considerable progress has been made towards setting up the infrastructure necessary for the establishment of the GGOS database. While the challenges that beleaguer the GGOS are acknowledged (at least at the global level), an assessment of an attuned GGOS infrastructure in the African context is necessary, yet lacking. In the present contribution, (a) the African preparedness and response to the observing system is assessed, and (b) the specific scientific and technological challenges of establishing a regional GGOS hub for Africa are reviewed. Currently only South Africa has a fundamental geodetic observatory, located at Hartebeesthoek, Pretoria. Other countries in Africa have shown interest in participating in global geodetic activities, in particular through the development of a unified African geodetic reference frame (AFREF). Interest has also been shown in the proposed African VLBI Network (AVN), which will be partially based on existing ex-telecommunication radio antennas. Several countries, including Kenya, Nigeria and Ghana, are investigating their participation in the AVN.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
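As a concrete reference for one building block, AHP weights are commonly taken as the principal eigenvector of a pairwise comparison matrix; the matrix below is a hypothetical example, not the study's.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for landslide criteria
# (slope, lithology, land use), on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# AHP weights = principal right eigenvector, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()
print("criteria weights:", np.round(w, 3))

# Consistency check (random index RI = 0.58 for n = 3).
ci = (vals.real[k] - 3) / (3 - 1)
print("consistency ratio:", round(ci / 0.58, 3))  # should be < 0.1
```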
Cost-revenue analysis in the surgical treatment of the obstructed defecation syndrome.
Schiano di Visconte, Michele; Piccin, Alessandra; Di Bella, Raimondo; Giomo, Priscilla; Pederiva, Vania; Cina, Livio Dal; Munegato, Gabriele
2006-01-01
The obstructed defecation syndrome is a frequent condition in the female population. Rectocele and rectal intussusception may cause symptoms of obstructed defecation. The aim of this study is to carry out an economic cost-revenue analysis comparing the rectocele and the rectal intussusception surgical techniques using a double-transanal, circular stapler (Stapled Trans-Anal Rectal Resection - STARR) with other techniques used to repair the same defects. The analysis involved the systematic calculation of the costs incurred during hospitalisation. The revenue estimate was obtained according to the rate quantification of the Diagnosis Related Group (DRG) associated with each hospitalisation. Our analysis confirmed that the global expenditure for the STARR technique amounts to 3,579.09 Euro as against 5,401.15 Euro for rectocele abdominal repair and 3,469.32 Euro for perineal repair. The intussusception repair cost according to Delorme's procedure amounts to 5,877.41 Euro as against 3,579.09 Euro for the STARR technique. The revenue analysis revealed a substantial gain for the Health Authority as regards the treatment of rectocele and rectal intussusception for obstructed defecation syndrome. The highest revenue, 6,168.52 Euro, was obtained with intussusception repair with STARR as compared to Delorme's procedure, which presented revenue amounting to 2,359.04 Euro. Lower revenues are recorded if the STARR technique is intended for rectocele repair; in this case the revenue amounts to 1,778.12 Euro as against 869.67 Euro and 1,887.89 Euro for abdominal and perineal repair, respectively.
Simulated transition from RCP8.5 to RCP4.5 through three different Radiation Management techniques
NASA Astrophysics Data System (ADS)
Muri, H.; Kristjansson, J. E.; Adakudlu, M.; Grini, A.; Lauvset, S. K.; Otterå, O. H.; Schulz, M.; Tjiputra, J. F.
2016-12-01
Scenario studies have shown that in order to limit global warming to 2°C above pre-industrial levels, negative CO2 emissions are required. Currently, no safe and well-established technologies exist for achieving such negative emissions. Hence, although carbon dioxide removal may appear less risky and controversial than Radiation Management (RM) techniques, the latter type of climate engineering (CE) techniques cannot be ruled out as a future policy option. The EXPECT project, funded by the Norwegian Research Council, explores the potential and risks of RM through Earth System Model Simulations. We here describe results from a study that simulates a 21st century transition from an RCP8.5 to a RCP4.5 scenario through Radiation Management. The study uses the Norwegian Earth System Model (NorESM) to compare the results from the following three RM techniques: a) Stratospheric Aerosol Injections (SAI); b) Marine Sky Brightening (MSB); c) Cirrus Cloud Thinning (CCT). All three simulations start from the year 2020 and run until 2100. Whereas both SAI and MSB successfully simulate the desired negative radiative forcing throughout the 21st century, the CCT simulations have a +0.5 W m-2 residual forcing (on top of RCP4.5) at the end of the century. Although all three techniques obtain approximately the same global temperature evolution, precipitation responses are very different. In particular, the CCT simulation has even more globally averaged precipitation at year 2100 than RCP8.5, whereas both SAI and MSB simulate less precipitation than RCP4.5. In addition, there are significant differences in geographical patterns of precipitation. Natural variability in the Earth System also exhibits sensitivity to the choice of RM technique: Both the Atlantic Meridional Overturning Circulation and the Pacific Decadal Oscillation respond differently to the choice of SAI, MSB or CCT. We will present a careful analysis, as well as a physical interpretation of the above results.
Diagnostic features of Alzheimer's disease extracted from PET sinograms
NASA Astrophysics Data System (ADS)
Sayeed, A.; Petrou, M.; Spyrou, N.; Kadyrov, A.; Spinks, T.
2002-01-01
Texture analysis of positron emission tomography (PET) images of the brain is a very difficult task, due to the poor signal-to-noise ratio. As a consequence, very few techniques can be implemented successfully. We use a new global analysis technique known as the Trace transform triple features. This technique can be applied directly to the raw sinograms to distinguish patients with Alzheimer's disease (AD) from normal volunteers. FDG-PET images of 18 AD patients and 10 normal controls obtained from the same CTI ECAT-953 scanner were used in this study. The Trace transform triple feature technique was used to extract features that were invariant to scaling, translation and rotation, referred to as invariant features, as well as features that were sensitive to rotation but invariant to scaling and translation, referred to as sensitive features in this study. The features were used to classify the groups using discriminant function analysis. Cross-validation tests using stepwise discriminant function analysis showed that combining both sensitive and invariant features produced the best results, when compared with the clinical diagnosis. Selecting the five best features produces an overall accuracy of 93% with a sensitivity of 94% and a specificity of 90%. This is comparable with the classification accuracy achieved by Kippenhan et al. (1992) using regional metabolic activity.
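A hedged sketch of the classification step: scikit-learn's LinearDiscriminantAnalysis with leave-one-out cross-validation stands in for the stepwise discriminant function analysis used in the study, on synthetic features shaped like its 28-subject, five-feature final model.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score, LeaveOneOut

rng = np.random.default_rng(3)

# Synthetic stand-in for Trace-transform features: 18 AD subjects with a
# shifted feature distribution, plus 10 controls, 5 features each.
X = np.vstack([rng.normal(0.0, 1.0, (18, 5)) + 0.9,
               rng.normal(0.0, 1.0, (10, 5))])
y = np.array([1] * 18 + [0] * 10)

clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("cross-validated accuracy:", acc.mean())
```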
Combining Capillary Electrophoresis with Mass Spectrometry for Applications in Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, David C.; Smith, Richard D.
2005-04-01
Throughout the field of global proteomics, ranging from simple organism studies to human medical applications, the high sample complexity creates demands for improved separations and analysis techniques. Furthermore, with increased organism complexity, the correlation between proteome and genome becomes less certain due to extensive mRNA processing prior to translation. In this way, the same DNA sequence can potentially code for regions in a number of distinct proteins; quantitative differences in expression (or abundance) between these often-related species are of significant interest. Well-established proteomics techniques, which use genomic information to identify peptides that originate from protease digestion, often cannot easily distinguish between such gene products; intact protein-level analyses are required to complete the picture, particularly for identifying post-translational modifications. While chromatographic techniques are currently better suited to peptide analysis, capillary electrophoresis (CE) in combination with mass spectrometry (MS) may become important for intact protein analysis. This review focuses on CE/MS instrumentation and techniques showing promise for such applications, highlighting those with greatest potential. Reference will also be made to developments relevant to peptide-level analyses for use in time- or sample-limited situations.
NASA Astrophysics Data System (ADS)
Cicone, A.; Zhou, H.; Piersanti, M.; Materassi, M.; Spogli, L.
2017-12-01
Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed which are able to unravel hidden features of these kinds of signals. In this poster we present a new method, called Adaptive Local Iterative Filtering (ALIF). This technique, originally developed to study mono-dimensional signals, unlike any other algorithm proposed so far, can be easily generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the technique can be applied as-is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented: for instance, the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, pressure measured at ground level on a global grid, and radio power scintillation from GNSS signals.
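The ALIF algorithm itself is more involved (adaptive filter lengths, a-priori-free stopping rules); the toy sketch below conveys only the core iterative-filtering move of repeatedly subtracting a local moving average until an oscillatory, IMF-like component remains. All parameters are illustrative assumptions.

```python
import numpy as np

def moving_average(s, width):
    k = np.ones(width) / width
    return np.convolve(s, k, mode="same")

def extract_component(signal, width, n_iter=10):
    """One iterative-filtering step: repeatedly subtract a local mean
    until what remains oscillates around zero (an IMF-like component)."""
    comp = signal.copy()
    for _ in range(n_iter):
        comp = comp - moving_average(comp, width)
    return comp

# Fast 50 Hz oscillation riding on a slow 5 Hz one, sampled at 1 kHz.
t = np.linspace(0, 1, 1000)
sig = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 5 * t)

# A window of one fast period (20 samples) removes the slow component
# while leaving the fast oscillation essentially intact.
fast = extract_component(sig, width=20)
slow = sig - fast
print(np.round([fast.std(), slow.std()], 2))  # each close to 0.71
```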
Chaos in plasma simulation and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, C.; Newman, D.E.; Sprott, J.C.
1993-09-01
We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low-dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents, and short-term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low-dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
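One of the tools named above, the correlation dimension, can be sketched via the Grassberger-Procaccia correlation sum; the noisy-circle data set (true dimension near 1) is a synthetic sanity check, not plasma data.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)

# Points on a noisy circle: an attractor of dimension ~1 embedded in 2D.
theta = rng.uniform(0, 2 * np.pi, 2000)
pts = np.column_stack([np.cos(theta), np.sin(theta)])
pts += rng.normal(0, 0.01, pts.shape)

# Correlation sum C(r): fraction of point pairs closer than r.
d = pdist(pts)
radii = np.logspace(-1.2, -0.2, 10)
corr = [np.mean(d < r) for r in radii]

# Correlation dimension = slope of log C(r) vs log r in the scaling region.
slope, _ = np.polyfit(np.log(radii), np.log(corr), 1)
print("estimated correlation dimension:", round(slope, 2))
```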
NASA Astrophysics Data System (ADS)
Chan, Chun-Kai; Loh, Chin-Hsiung; Wu, Tzu-Hsiu
2015-04-01
In civil engineering, health monitoring and damage detection are typically carried out using a large number of sensors. Most methods require global measurements to extract the properties of the structure. However, some sensors, such as LVDTs, cannot be used due to in-situ limitations, so the global deformation remains unknown. An experiment is used to demonstrate the proposed algorithms: a one-story, 2-bay reinforced concrete frame under weak and strong seismic excitation. In this paper, signal processing techniques and nonlinear identification are applied to the measured seismic response of reinforced concrete structures subjected to different levels of earthquake excitation. Both modal-based and signal-based system identification and feature extraction techniques are used to study the nonlinear inelastic response of the RC frame, using either input and output response data or output-only measurements. The signal-based damage identification methods include enhanced time-frequency analysis of the acceleration responses and the estimation of permanent deformation directly from acceleration response data. Finally, local deformation measurements from a dense optical tracker are also used to quantify the damage of the RC frame structure.
NASA Astrophysics Data System (ADS)
Szafranek, K.; Schillak, S.; Araszkiewicz, A.; Figurski, M.; Lehmann, M.; Lejba, P.
2012-04-01
Current investigations in space geodesy are mostly aimed at the joint processing of data from various techniques. The poster presents solutions (North, East, Up components) of selected stations (McDonald, Yarragadee, Greenbelt, Monument Peak, Zimmerwald, Borowiec, Mt. Stromlo-Orroral, Potsdam, Graz, Herstmonceux and Wettzell) which adopted Satellite Laser Ranging (SLR) and Global Navigation Satellite System (GNSS) techniques and which were gathering data over the same period (from 1994 to 2010). Processing of both types of data was done according to Global Geodetic Observing System (GGOS) recommendations; the same models and parameters from the IERS Conventions 2010 were used in both processing strategies where possible. The main goal was to obtain coordinates and their changes in time (velocities) based on both techniques and to compare the results. The station coordinates were determined for the common reference epoch of both techniques - the first day of each month. Monthly orbital arcs for laser observations were created from solutions of several SLR sites (observations to the LAGEOS-1 and LAGEOS-2 satellites) with the best solution quality and the largest number of observations. For the GNSS coordinate determination, about 130 sites belonging to the International GNSS Service (IGS) were selected: 30 with local ties to SLR sites and others based on their geolocation (length of the baselines) and time-series analysis of their solutions. Mainly core IGS stations were used. Solutions of both techniques were analyzed to verify the agreement of the two techniques and to provide independent control of local ties.
Prell, Christina; Sun, Laixiang; Feng, Kuishuang; Myroniuk, Tyler W.
2015-01-01
In this paper we investigate how structural patterns of international trade give rise to emissions inequalities across countries, and how such inequality in turn impacts countries’ mortality rates. We employ Multi-regional Input-Output analysis to distinguish between sulfur-dioxide (SO2) emissions produced within a country’s borders (production-based emissions) and emissions triggered by consumption in other countries (consumption-based emissions). We use social network analysis to capture countries’ level of integration within the global trade network. We then apply the Prais-Winsten panel estimation technique to a panel data set across 172 countries over 20 years (1990–2010) to estimate the relationships between countries’ level of integration and SO2 emissions, and the impact of trade integration and SO2 emissions on mortality rates. Our findings suggest a positive, (log-)linear relationship between a country’s level of integration and both kinds of emissions. In addition, although more integrated countries are mainly responsible for both forms of emissions, our findings indicate that they also tend to experience lower mortality rates. Our approach offers a unique combination of social network analysis with multiregional input-output analysis, which better operationalizes intuitive concepts about global trade and trade structure. PMID:26642202
Superposed epoch analysis of ion temperatures during CME- and CIR/HSS-driven storms
NASA Astrophysics Data System (ADS)
Keesee, A. M.; Scime, E. E.
2012-12-01
The NASA Two Wide-angle Imaging Neutral atom Spectrometers (TWINS) Mission provides a global view of the magnetosphere with near-continuous coverage. Utilizing a novel technique to calculate ion temperatures from the TWINS energetic neutral atom (ENA) measurements, we generate ion temperature maps of the magnetosphere. These maps can be used to study ion temperature evolution during geomagnetic storms. A superposed epoch analysis of the ion temperature evolution during 48 storms will be presented. Zaniewski et al. [2006] performed a superposed epoch analysis of ion temperatures by storm interval using data from the MENA instrument on the IMAGE mission, demonstrating significant dayside ion heating during the main phase. The TWINS measurements provide more continuous coverage and improved spatial and temporal resolution. Denton and Borovsky [2008] noted differences in ion temperature evolution at geosynchronous orbit between coronal mass ejection (CME)- and corotating interaction region (CIR)/high speed stream (HSS)- driven storms. Using our global ion temperature maps, we have found consistent results for select individual storms [Keesee et al., 2012]. We will present superposed epoch analyses for the subgroups of CME- and CIR/HSS-driven storms to compare global ion temperature evolution during the two types of storms.
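The mechanics of a superposed epoch analysis are simple to sketch: align fixed windows on each event onset and average across events. Everything below (the temperature series, onset times, and injected heating) is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical hourly ion-temperature series and 48 storm onset times.
temp = 5.0 + rng.normal(0, 0.5, 5000)             # keV, quiet baseline
onsets = rng.choice(np.arange(100, 4900), 48, replace=False)
for t0 in onsets:                                  # inject main-phase heating
    temp[t0:t0 + 24] += 2.0 * np.exp(-np.arange(24) / 8.0)

# Superposed epoch analysis: stack windows centered on each onset, average.
window = np.arange(-24, 48)                        # hours relative to onset
stack = np.array([temp[t0 + window] for t0 in onsets])
mean_response = stack.mean(axis=0)

print("pre-onset mean :", round(mean_response[:24].mean(), 2))
print("post-onset peak:", round(mean_response[24:].max(), 2))
```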
Stochastic approach to data analysis in fluorescence correlation spectroscopy.
Rao, Ramachandra; Langoju, Rajesh; Gösch, Michael; Rigler, Per; Serov, Alexandre; Lasser, Theo
2006-09-21
Fluorescence correlation spectroscopy (FCS) has emerged as a powerful technique for measuring low concentrations of fluorescent molecules and their diffusion constants. In FCS, the experimental data are conventionally fit using standard local search techniques, for example, the Marquardt-Levenberg (ML) algorithm. A prerequisite for this category of algorithms is sound knowledge of the behavior of the fit parameters and, in most cases, good initial guesses for accurate fitting; otherwise the fit can produce artifacts. For known fit models, and with user experience about the behavior of the fit parameters, these local search algorithms work extremely well. However, for heterogeneous systems, or where automated data analysis is a prerequisite, there is a need for a procedure that treats FCS data fitting as a black box and generates reliable fit parameters with accuracy for the chosen model at hand. We present a computational approach to analyze FCS data by means of a stochastic algorithm for global search called PGSL, an acronym for Probabilistic Global Search Lausanne. This algorithm does not require any initial guesses and performs the fitting by searching for solutions through global sampling. It is flexible and at the same time computationally fast for multiparameter evaluations. We present a performance study of PGSL for two-component fits with a triplet contribution. The statistical study and the goodness-of-fit criterion for PGSL are also presented. The robustness of PGSL on noisy experimental data for parameter estimation is also verified. We further extend the scope of PGSL by a hybrid analysis wherein the output of PGSL is fed as initial guesses to ML. Reliability studies show that PGSL, and the hybrid combination of both, perform better than ML for various thresholds of the mean-squared error (MSE).
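For context, a conventional local-search fit of a single-component 3D-diffusion FCS model with scipy's curve_fit illustrates the initial-guess dependence that the paper's global-search approach avoids; the parameter values and noise level are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def fcs_model(tau, N, tau_d, kappa):
    """Single-component 3D diffusion autocorrelation for FCS."""
    return (1.0 / N) / ((1 + tau / tau_d) * np.sqrt(1 + tau / (kappa**2 * tau_d)))

# Synthetic noisy correlation curve (lag times tau in seconds).
rng = np.random.default_rng(2)
tau = np.logspace(-6, 0, 100)
g = fcs_model(tau, N=5.0, tau_d=1e-4, kappa=5.0) + rng.normal(0, 2e-3, 100)

# Local search still needs initial guesses p0; a poor choice can trap the
# fit in a local minimum, which is the paper's motivation for global search.
popt, _ = curve_fit(fcs_model, tau, g, p0=[1.0, 1e-3, 3.0])
print("N, tau_D, kappa =", np.round(popt, 5))
```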
Linear retrieval and global measurements of wind speed from the Seasat SMMR
NASA Technical Reports Server (NTRS)
Pandey, P. C.
1983-01-01
Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, based on the coefficient-of-determination selection criterion, to a statistical data base of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical data base. Global maps of WS were produced for one- and three-month periods.
NASA Technical Reports Server (NTRS)
Sivertson, W. E., Jr.
1977-01-01
This paper briefly introduces a concept for low-cost, global, day-night, all-weather disaster warning and assistance. Evolving, advanced space technology with passive radio-frequency reflectors, in conjunction with an imaging synthetic aperture radar, is employed to detect, identify, locate, and provide passive communication with Earth users in distress. This concept evolved from broad NASA research on new global search and rescue techniques. Appropriate airborne radar test results from this research are reviewed and related to potential disaster applications. The analysis indicates the approach has promise for disaster communications relative to floods, droughts, earthquakes, volcanic eruptions, and severe storms.
A strawman SLR program plan for the 1990s
NASA Technical Reports Server (NTRS)
Degnan, John J.
1994-01-01
A series of programmatic and technical goals for the satellite laser ranging (SLR) network are presented. They are: (1) standardize the performance of the global SLR network; (2) improve the geographic distribution of stations; (3) reduce costs of field operations and data processing; (4) expand the 24 hour temporal coverage to better serve the growing constellation of satellites; (5) improve absolute range accuracy to 2 mm at key stations; (6) improve satellite force, radiative propagation, and station motion models and investigate alternative geodetic analysis techniques; (7) support technical intercomparison and the Terrestrial Reference Frame through global collocations; (8) investigate potential synergisms between GPS and SLR.
Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation
NASA Astrophysics Data System (ADS)
Cauty, F.
2004-10-01
The knowledge of the solid fuel regression rate and the time evolution of the grain geometry is required for hybrid motor design and control of its operating conditions. Two non-intrusive techniques (NDT) have been applied to hybrid propulsion: both are based on wave propagation through the materials, using X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes, or 2D images (Real Time Radiography), with a link between the size of the field of view and accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires quite complex post-processing techniques. The ultrasound technique is more widely used in energetic-material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon-wheel grain of the 1.1 MN thrust HPDP motor. The effect of the physical quantities has to be taken into account in the wave-propagation analysis. With respect to the various applications, there is no unique and perfect experimental method to measure the fuel regression rate. The best solution could be obtained by combining two techniques at the same time, each technique enhancing the quality of the global data.
Satellite-Enhanced Dynamical Downscaling of Extreme Events
NASA Astrophysics Data System (ADS)
Nunes, A.
2015-12-01
Severe weather events can be the triggers of environmental disasters in regions particularly susceptible to changes in hydrometeorological conditions. In that regard, the reconstruction of past extreme weather events can help in the assessment of vulnerability and risk mitigation actions. Using novel modeling approaches, dynamical downscaling of long-term integrations from global circulation models can be useful for risk analysis, providing more accurate climate information at regional scales. Originally developed at the National Centers for Environmental Prediction (NCEP), the Regional Spectral Model (RSM) is being used in the dynamical downscaling of global reanalysis, within the South American Hydroclimate Reconstruction Project. Here, RSM combines scale-selective bias correction with assimilation of satellite-based precipitation estimates to downscale extreme weather occurrences. Scale-selective bias correction is a method employed in the downscaling, similar to the spectral nudging technique, in which the downscaled solution develops in agreement with its coarse boundaries. Precipitation assimilation acts on modeled deep-convection, drives the land-surface variables, and therefore the hydrological cycle. During the downscaling of extreme events that took place in Brazil in recent years, RSM continuously assimilated NCEP Climate Prediction Center morphing technique precipitation rates. As a result, RSM performed better than its global (reanalysis) forcing, showing more consistent hydrometeorological fields compared with more sophisticated global reanalyses. Ultimately, RSM analyses might provide better-quality initial conditions for high-resolution numerical predictions in metropolitan areas, leading to more reliable short-term forecasting of severe local storms.
Nanomaterials-Based Optical Techniques for the Detection of Acetylcholinesterase and Pesticides
Xia, Ning; Wang, Qinglong; Liu, Lin
2015-01-01
The large amount of pesticide residues in the environment is a threat to global health by inhibition of acetylcholinesterase (AChE). Biosensors for inhibition of AChE have been thus developed for the detection of pesticides. In line with the rapid development of nanotechnology, nanomaterials have attracted great attention and have been intensively studied in biological analysis due to their unique chemical, physical and size properties. The aim of this review is to provide insight into nanomaterial-based optical techniques for the determination of AChE and pesticides, including colorimetric and fluorescent assays and surface plasmon resonance. PMID:25558991
[Proteomics in infectious diseases].
Quero, Sara; Párraga-Niño, Noemí; García-Núñez, Marian; Sabrià, Miquel
2016-04-01
Infectious diseases have a high incidence in the population, causing a major impact on global health. In vitro culture of microorganisms is the first technique applied for infection diagnosis which is laborious and time consuming. In recent decades, efforts have been focused on the applicability of "Omics" sciences, highlighting the progress provided by proteomic techniques in the field of infectious diseases. This review describes the management, processing and analysis of biological samples for proteomic research. Copyright © 2014 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
NASA Astrophysics Data System (ADS)
Boisserie, Marie
The goal of this dissertation research is to produce empirical soil moisture initial conditions (a soil moisture analysis) and investigate its impact on the short-term (2 weeks) to subseasonal (2 months) forecasting skill of 2-m air temperature and precipitation. Because soil moisture has a long memory and plays a role in controlling the surface water and energy budget, an accurate soil moisture analysis is today widely recognized as having the potential to increase summertime climate forecasting skill. However, because of a lack of global observations of soil moisture, there has been no scientific consensus on how much a soil moisture initialization that is as close to the truth as possible contributes to climate forecasting skill. In this study, the initial conditions are generated using a Precipitation Assimilation Reanalysis (PAR) technique to produce a soil moisture analysis. This technique consists mainly of nudging precipitation in the atmosphere component of a land-atmosphere model by adjusting the vertical air humidity profile based on the difference between the model-derived precipitation rate and the observed rate. The unique aspects of the PAR technique are the following: (1) the soil moisture analysis is generated using a coupled land-atmosphere forecast model, so no bias between the initial conditions and the forecast model (the spinup problem) is encountered; and (2) the PAR technique is physically consistent; the surface and radiative fluxes remain consistent with the soil moisture analysis. To our knowledge, there has been no previous attempt to run a physically consistent soil moisture land assimilation system in a land-atmosphere model in coupled mode. The effect of the PAR technique on the model soil moisture estimates is evaluated using the Global Soil Wetness Project Phase 2 (GSWP-2) multimodel analysis product (used as a proxy for global soil moisture observations) and actual in-situ observations from the state of Illinois. The results show that overall the PAR technique is effective; across most of the globe, the seasonal and anomaly variability of the model soil moisture estimates reproduces the GSWP-2 values well in the top 1.5 m soil layer, and comparison with in-situ observations in Illinois shows that the seasonal and anomaly soil moisture variability is also well represented deep into the soil. Therefore, in this study, we produce a new global soil moisture analysis dataset that can be used for many land surface studies (crop modeling, water resource management, soil erosion, etc.). The contribution of the resulting soil moisture analysis (used as initial conditions) to air temperature and precipitation forecasts is then investigated. For this, we follow the experimental setup of a model intercomparison study over the time period 1986-1995, the Global Land-Atmosphere Coupling Experiment second phase (GLACE-2), in which the FSU/COAPS climate model has participated. The results of the summertime air temperature forecasts show a significant increase in skill across most of the U.S. at short-term to subseasonal time scales. No increase in summertime precipitation forecasting skill is found at short-term to subseasonal time scales between 1986 and 1995, except for the anomalous drought year of 1988. We also analyze the forecasts of two extreme hydrological events, the 1988 U.S. drought and the 1993 U.S. flood.
In general, the comparison of these two extreme hydrological event forecasts shows greater improvement for the summer of 1988 than for that of 1993, suggesting that soil moisture contributes more to the development of a drought than of a flood. This result is consistent with Dirmeyer and Brubaker [1999] and Weaver et al. [2009]. By analyzing the evaporative sources of these two extreme events using the back-trajectory methodology of Dirmeyer and Brubaker [1999], we find results similar to those of that paper; the soil moisture-precipitation feedback mechanism seems to play a greater role during the drought year of 1988 than during the flood year of 1993. Finally, the accuracy of this soil moisture initialization depends upon the quality of the precipitation dataset that is assimilated. Because of the lack of observed precipitation at a high temporal resolution (3-hourly) for the study period (1986-1995), a reanalysis product is used for precipitation assimilation in this study. It is important to keep in mind that precipitation data in reanalyses sometimes differ significantly from observations, since precipitation is often not assimilated into the reanalysis model. In order to investigate that aspect, an analysis similar to the one performed in this study could be done using the 3-hourly Tropical Rainfall Measuring Mission (TRMM) dataset available for the time period 1998-present. Since the TRMM dataset is a fully observational dataset, we expect the soil moisture initialization to improve over that obtained in this study, which, in turn, may further increase the forecast skill.
Linear approximations of global behaviors in nonlinear systems with moderate or strong noise
NASA Astrophysics Data System (ADS)
Liang, Junhao; Din, Anwarud; Zhou, Tianshou
2018-03-01
While many physical or chemical systems can be modeled by nonlinear Langevin equations (LEs), dynamical analysis of these systems is challenging in the cases of moderate and strong noise. Here we develop a linear approximation scheme, which can transform an often intractable LE into a linear set of binomial moment equations (BMEs). This scheme provides a feasible way to capture nonlinear behaviors in the sense of probability distribution and is effective even when the noise is moderate or strong. Based on BMEs, we further develop a noise reduction technique, which can effectively handle tough cases where traditional small-noise theories are inapplicable. The overall method not only provides an approximation-based paradigm for analysis of the local and global behaviors of nonlinear noisy systems but also has a wide range of applications.
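As a concrete instance of the setting, a nonlinear Langevin equation with moderate noise can be integrated by the Euler-Maruyama method; the bistable example below exhibits the kind of distribution-level (global) behavior, here a bimodal stationary density, that linearization around a single well cannot capture. All parameter choices are illustrative.

```python
import numpy as np

# Euler-Maruyama integration of the bistable Langevin equation
# dx = (x - x^3) dt + sigma dW, with wells at x = -1 and x = +1.
rng = np.random.default_rng(11)
dt, n, sigma = 1e-3, 200_000, 0.5
x = np.empty(n)
x[0] = 1.0
for i in range(1, n):
    drift = x[i - 1] - x[i - 1] ** 3
    x[i] = x[i - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

# Moderate noise drives jumps between wells: the stationary distribution
# is bimodal, a global feature of the full probability distribution.
hist, edges = np.histogram(x[n // 2:], bins=7, range=(-2, 2))
print(hist)
```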
Signal classification using global dynamical models, Part II: SONAR data analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kremliovsky, M.; Kadtke, J.
1996-06-01
In Part I of this paper, we described a numerical method for nonlinear signal detection and classification which made use of techniques borrowed from dynamical systems theory. Here in Part II of the paper, we will describe an example of data analysis using this method, for data consisting of open ocean acoustic (SONAR) recordings of marine mammal transients, supplied from NUWC sources. The purpose here is two-fold: first to give a more operational description of the technique and provide rules-of-thumb for parameter choices; and second to discuss some new issues raised by the analysis of non-ideal (real-world) data sets. The particular data set considered here is quite non-stationary, relatively noisy, and not clearly localized in the background, and as such provides a difficult challenge for most detection/classification schemes.
The integrated analysis capability (IAC Level 2.0)
NASA Technical Reports Server (NTRS)
Frisch, Harold P.; Vos, Robert G.
1988-01-01
The critical data management issues involved in the development of the integrated analysis capability (IAC), Level 2, to support the design analysis and performance evaluation of large space structures, are examined. In particular, attention is given to the advantages and disadvantages of the formalized data base; merging of the matrix and relational data concepts; data types, query operators, and data handling; sequential versus direct-access files; local versus global data access; programming languages and host machines; and data flow techniques. The discussion also covers system architecture, recent system level enhancements, executive/user interface capabilities, and technology applications.
Efficient Global Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2012-01-01
A method for identifying global aerodynamic models from flight data in an efficient manner is explained and demonstrated. A novel experiment design technique was used to obtain dynamic flight data over a range of flight conditions with a single flight maneuver. Multivariate polynomials and polynomial splines were used with orthogonalization techniques and statistical modeling metrics to synthesize global nonlinear aerodynamic models directly and completely from flight data alone. Simulation data and flight data from a subscale twin-engine jet transport aircraft were used to demonstrate the techniques. Results showed that global multivariate nonlinear aerodynamic dependencies could be accurately identified using flight data from a single maneuver. Flight-derived global aerodynamic model structures, model parameter estimates, and associated uncertainties were provided for all six nondimensional force and moment coefficients for the test aircraft. These models were combined with a propulsion model identified from engine ground test data to produce a high-fidelity nonlinear flight simulation very efficiently. Prediction testing using a multi-axis maneuver showed that the identified global model accurately predicted aircraft responses.
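The flavor of the approach, fitting multivariate polynomial terms to flight data by least squares, can be sketched as follows; the variables, model terms, and coefficients are invented, and the actual method's orthogonalization and statistical term-selection steps are not shown.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical flight data: angle of attack (rad), elevator (rad), Mach.
alpha = rng.uniform(-0.1, 0.3, 500)
delta_e = rng.uniform(-0.2, 0.2, 500)
mach = rng.uniform(0.3, 0.8, 500)

# Synthetic vertical-force coefficient with measurement noise.
cz = -4.5 * alpha - 0.6 * delta_e + 0.2 * mach * alpha + rng.normal(0, 0.01, 500)

# Candidate multivariate polynomial terms, fit by linear least squares.
X = np.column_stack([np.ones_like(alpha), alpha, delta_e, mach,
                     alpha * mach, alpha ** 2])
coef, res, *_ = np.linalg.lstsq(X, cz, rcond=None)
print(np.round(coef, 3))  # recovered term coefficients
```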
Synthesis in land change science: methodological patterns, challenges, and guidelines.
Magliocca, Nicholas R; Rudel, Thomas K; Verburg, Peter H; McConnell, William J; Mertz, Ole; Gerstner, Katharina; Heinimann, Andreas; Ellis, Erle C
Global and regional economic and environmental changes are increasingly influencing local land use, livelihoods, and ecosystems. At the same time, cumulative local land changes are driving global and regional changes in biodiversity and the environment. To understand the causes and consequences of these changes, land change science (LCS) draws on a wide array of synthetic and meta-study techniques to generate global and regional knowledge from local case studies of land change. Here, we review the characteristics and applications of synthesis methods in LCS and assess the current state of synthetic research based on a meta-analysis of synthesis studies from 1995 to 2012. Publication of synthesis research is accelerating, with a clear trend toward increasingly sophisticated and quantitative methods, including meta-analysis. Detailed trends in synthesis objectives, methods, and the land change phenomena and world regions most commonly studied are presented. Significant challenges to successful synthesis research in LCS are also identified, including issues of interpretability and comparability across case studies and the limits of and biases in the geographic coverage of case studies. Nevertheless, synthesis methods based on local case studies will remain essential for generating systematic global and regional understanding of local land change for the foreseeable future, and multiple opportunities exist to accelerate and enhance the reliability of synthetic LCS research in the future. Demand for global and regional knowledge generation will continue to grow to support adaptation and mitigation policies consistent with both the local realities and the regional and global environmental and economic contexts of land change.
Using Java to generate globally unique identifiers for DICOM objects.
Kamauu, Aaron W C; Duvall, Scott L; Avrin, David E
2009-03-01
Digital imaging and communication in medicine (DICOM) specifies that all DICOM objects have globally unique identifiers (UIDs). Creating these UIDs can be a difficult task due to the variety of techniques in use and the requirement to ensure global uniqueness. We present a simple technique of combining a root organization identifier, assigned descriptive identifiers, and Java-generated unique identifiers to construct DICOM-compliant UIDs.
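The construction is straightforward to sketch. The paper's implementation is in Java; the Python fragment below shows the same layering under assumed identifiers (the organization root is a placeholder, not a registered OID, and the UUID-derived suffix stands in for the Java-generated unique part).

```python
import uuid

def make_dicom_uid(org_root: str, *descriptive: int) -> str:
    """Build a DICOM-style UID: <org root>.<descriptive ids>.<unique part>.

    org_root is a registered organization root OID (the value used below
    is a placeholder, not a real registration). The unique part here
    derives from a random UUID, standing in for Java-generated IDs.
    """
    unique = uuid.uuid4().int % 10**12   # shorten to stay within 64 chars
    parts = [org_root, *map(str, descriptive), str(unique)]
    uid = ".".join(parts)
    assert len(uid) <= 64, "DICOM UIDs are limited to 64 characters"
    return uid

# e.g. root OID, then assumed descriptive ids for device=3, object type=7
print(make_dicom_uid("1.2.840.99999", 3, 7))
```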
Image analysis and machine learning for detecting malaria.
Poostchi, Mahdieh; Silamut, Kamolrat; Maude, Richard J; Jaeger, Stefan; Thoma, George
2018-04-01
Malaria remains a major burden on global health, with roughly 200 million cases worldwide and more than 400,000 deaths per year. Besides biomedical research and political efforts, modern information technology is playing a key role in many attempts at fighting the disease. One of the barriers toward a successful mortality reduction has been inadequate malaria diagnosis in particular. To improve diagnosis, image analysis software and machine learning methods have been used to quantify parasitemia in microscopic blood slides. This article gives an overview of these techniques and discusses the current developments in image analysis and machine learning for microscopic malaria diagnosis. We organize the different approaches published in the literature according to the techniques used for imaging, image preprocessing, parasite detection and cell segmentation, feature computation, and automatic cell classification. Readers will find the different techniques listed in tables, with the relevant articles cited next to them, for both thin and thick blood smear images. We also discuss the latest developments in sections devoted to deep learning and smartphone technology for future malaria diagnosis. Published by Elsevier Inc.
Russell, Claire L.; Smith, Edward M.; Calvo-Bado, Leonides A.; Green, Laura E.; Wellington, Elizabeth M.H.; Medley, Graham F.; Moore, Lynda J.; Grogono-Thomas, Rosemary
2014-01-01
Dichelobacter nodosus is a Gram-negative, anaerobic bacterium and the causal agent of footrot in sheep. Multiple locus variable number tandem repeat (VNTR) analysis (MLVA) is a portable technique that involves the identification and enumeration of polymorphic tandem repeats across the genome. The aims of this study were to develop an MLVA scheme for D. nodosus suitable for use as a molecular typing tool, and to apply it to a global collection of isolates. Seventy-seven isolates selected from regions with a long history of footrot (GB, Australia) and regions where footrot has recently been reported (India, Scandinavia), were characterised. From an initial 61 potential VNTR regions, four loci were identified as usable and in combination had the attributes required of a typing method for use in bacterial epidemiology: high discriminatory power (D > 0.95), typeability and reproducibility. Results from the analysis indicate that D. nodosus appears to have evolved via recombinational exchanges and clonal diversification. This has resulted in some clonal complexes that contain isolates from multiple countries and continents; and others that contain isolates from a single geographic location (country or region). The distribution of alleles between countries matches historical accounts of sheep movements, suggesting that the MLVA technique is sufficiently specific and sensitive for an epidemiological investigation of the global distribution of D. nodosus. PMID:23748018
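The discriminatory power quoted above is conventionally the Hunter-Gaston index: the probability that two isolates drawn at random belong to different types. A minimal sketch, using assumed toy MLVA profiles (each a tuple of repeat counts at four VNTR loci):

```python
from collections import Counter

def discriminatory_power(types):
    """Hunter-Gaston discriminatory index for a set of typed isolates.

    D is the probability that two isolates drawn at random belong to
    different types; D > 0.95 is the usual benchmark for a typing scheme.
    """
    n = len(types)
    counts = Counter(types).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Assumed toy MLVA profiles: each tuple is the repeat count at 4 VNTR loci
profiles = [(3, 5, 2, 7), (3, 5, 2, 7), (4, 5, 2, 7),
            (3, 6, 1, 7), (2, 5, 2, 8), (4, 4, 2, 7)]
print(f"D = {discriminatory_power(profiles):.3f}")
```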
Digital correlation detector for low-cost Omega navigation
NASA Technical Reports Server (NTRS)
Chamberlin, K. A.
1976-01-01
Techniques to lower the cost of using the Omega global navigation network with phase-locked loops (PLL) were developed. The technique that was accepted as being "optimal" is called the memory-aided phase-locked loop (MAPLL), since it allows operation on all eight Omega time slots with one PLL through the implementation of a random access memory. The receiver front-end and the signals that it transmits to the PLL were first described. A brief statistical analysis of these signals was then made to allow a rough comparison between the front-end presented in this work and a commercially available front-end. The hardware and theory of application of the MAPLL were described, ending with an analysis of data taken with the MAPLL. Some conclusions and recommendations were also given.
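The MAPLL idea, a single loop filter whose state is swapped in and out of memory for each time slot, can be sketched in a few lines. The loop structure, gains, and idealized phase-measurement model below are assumptions chosen for illustration, not the report's hardware design.

```python
import numpy as np

class MemoryAidedPLL:
    """Idealized sketch of a MAPLL: one loop filter whose state is saved
    to and restored from memory for each of the 8 Omega time slots, so a
    single PLL tracks all eight transmissions. Gains are assumed values.
    """
    def __init__(self, kp=0.2, ki=0.02, n_slots=8):
        self.kp, self.ki = kp, ki
        self.phase = np.zeros(n_slots)   # per-slot phase estimates (RAM)
        self.integ = np.zeros(n_slots)   # per-slot integrator state (RAM)

    def update(self, slot, measured_phase):
        # restore the slot's state, run one loop iteration, store it back
        err = np.angle(np.exp(1j * (measured_phase - self.phase[slot])))
        self.integ[slot] += self.ki * err
        self.phase[slot] += self.kp * err + self.integ[slot]
        return self.phase[slot]

# Track two slots with different true phases from noisy measurements
rng = np.random.default_rng(2)
pll, truth = MemoryAidedPLL(), [0.8, -1.9]
for _ in range(300):
    for slot in (0, 1):
        pll.update(slot, truth[slot] + 0.1 * rng.standard_normal())
print(np.round(pll.phase[:2], 2), "vs truth", truth)
```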
Application of the Shell/3D Modeling Technique for the Analysis of Skin-Stiffener Debond Specimens
NASA Technical Reports Server (NTRS)
Krueger, Ronald; O'Brien, T. Kevin; Minguet, Pierre J.
2002-01-01
The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to three-point bending is demonstrated. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front, was used to capture the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlations of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherends.
NASA Astrophysics Data System (ADS)
Crocker, N. A.; Kubota, S.; Peebles, W. A.; Rhodes, T. L.; Fredrickson, E. D.; Belova, E.; Diallo, A.; LeBlanc, B. P.; Sabbagh, S. A.
2018-01-01
Reflectometry measurements of compressional (CAE) and global (GAE) Alfvén eigenmodes are analyzed to obtain the amplitude and spatial structure of the density perturbations associated with the modes. A novel analysis technique developed for this purpose is presented. The analysis also naturally yields the amplitude and spatial structure of the density contour radial displacement, which is found to be 2-4 times larger than the value estimated directly from the reflectometer measurements using the much simpler ‘mirror approximation’. The modes were driven by beam ions in a high power (6 MW) neutral beam heated H-mode discharge (#141398) in the National Spherical Torus Experiment. The results of the analysis are used to assess the contribution of the modes to core energy transport and ion heating. The total displacement amplitude of the modes, which is shown to be larger than previously estimated (Crocker et al 2013 Nucl. Fusion 53 43017), is compared to the predicted threshold (Gorelenkov et al 2010 Nucl. Fusion 50 84012) for the anomalously high heat diffusion inferred from transport modeling in similar NSTX discharges. The results of the analysis also have strong implications for the energy transport via coupling of CAEs to kinetic Alfvén waves seen in simulations with the Hybrid MHD code (Belova et al 2015 Phys. Rev. Lett. 115 15001). Finally, the amplitudes of the observed CAEs fall well below the threshold for causing significant ion heating by stochastic velocity space diffusion (Gates et al 2001 Phys. Rev. Lett. 87 205003).
NASA Astrophysics Data System (ADS)
Llewellyn-Jones, David; Good, Simon; Corlett, Gary
A PC-based analysis package has been developed for the dual purposes of, firstly, providing a 'quick-look' capability to research workers inspecting long time-series of global satellite datasets of sea-surface temperature (SST); and, secondly, providing an introduction for students, either undergraduates or advanced high-school students, to the characteristics of commonly used analysis techniques for large geophysical datasets from satellites. Students can also gain insight into the behaviour of some basic climate-related large-scale or global processes. The package gives students immediate access to up to 16 years of continuous global SST data, mainly from the Advanced Along-Track Scanning Radiometer (AATSR), currently flying on ESA's Envisat satellite. The data are available and are presented in the form of monthly averages, spatially averaged to half-degree or one-sixth-degree longitude-latitude grids. There are simple button-operated facilities for defining and calculating box averages; producing time-series of such averages; defining and displaying transects and their evolution over time; and examining anomalous behaviour by displaying the difference between observed values and values derived from climatological means. By using these facilities a student rapidly gains familiarity with such processes as annual variability, the El Niño effect, as well as major current systems such as the Gulf Stream and other climatically important phenomena. In fact, the student is given immediate insights into the basic methods of examining geophysical data in a research context, without needing to acquire special analysis skills or go through the lengthy data retrieval and preparation procedures that are more generally required, as precursors to serious investigation, in the research laboratory. This software package, called the Leicester AATSR Global Analyser (LAGA), is written in a well-known and widely used analysis language, and the package can be run using software that is readily available free of charge.
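The package's core operations (box averages, their time series, and anomalies against a climatology) reduce to simple array manipulations. A minimal sketch on synthetic monthly SST fields, with assumed grid sizes and box indices:

```python
import numpy as np

# Assumed synthetic stand-in for the package's data: 16 years of monthly
# mean SSTs on a coarse lat/lon grid (the real product is half-degree).
n_months, n_lat, n_lon = 192, 90, 180
rng = np.random.default_rng(3)
seasonal = 3 * np.sin(2 * np.pi * np.arange(n_months) / 12)
sst = 15 + seasonal[:, None, None] \
      + 0.5 * rng.standard_normal((n_months, n_lat, n_lon))

# Box average: time series of mean SST inside a lat/lon index box
box = sst[:, 30:50, 60:100].mean(axis=(1, 2))

# Anomaly: observed minus the climatological mean of each calendar month
climatology = box.reshape(-1, 12).mean(axis=0)
anomaly = box - np.tile(climatology, n_months // 12)
print("largest warm anomaly: %.2f K in month index %d"
      % (anomaly.max(), anomaly.argmax()))
```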
Coxiella Burnetii Vaccine Development: Lipopolysaccharide Structural Analysis
1989-12-29
linkage, branching, and sequence, by periodate oxidation, supercritical fluid chromatography, and mass spectrometry. These techniques combine to provide the elements of a global approach to oligosaccharide structure. The utility of supercritical fluid chromatography for a determination of Lipid-A... [Figure caption fragment: Supercritical fluid chromatography of a PFBAB-labeled maltodextrin sample prepared as the acetate derivative; cyanopropyl SFC column using CO2 as the...]
Global Ray Tracing Simulations of the SABER Gravity Wave Climatology
2009-01-01
atmosphere, the residual temperature profiles are analyzed by a combination of the maximum entropy method (MEM) and harmonic analysis, thus providing the... accepted 24 February 2009; published 30 April 2009. [1] Since February 2002, the SABER (Sounding of the Atmosphere using Broadband Emission Radiometry)... satellite instrument has measured temperatures throughout the entire middle atmosphere. Employing the same techniques as previously used for CRISTA
Vanina Fissore; Renzo Motta; Brian J. Palik; Enrico Borgogno Mondino
2015-01-01
In the debate over global warming, treeline position is considered an important ecological indicator of climate change. Currently, analysis of upward treeline shift is often based on various spatial data processed by geomatic techniques. In this work, considering a selection of 31 reference papers, we assessed how the scientific community is using different methods to...
Analysis and Experimentation of Control Strategies for Underactuated Spacecraft
2009-09-01
control techniques that provide time-invariant global asymptotic stability of the fully actuated spacecraft system of equations. Although these control... momentum wheel actuators in finite time under the restriction that the total angular momentum vector of the system is zero. This control methodology... can be stabilizable to an arbitrarily small region about the equilibrium of the system via time-invariant smooth state feedback control
Two dimensional wavefront retrieval using lateral shearing interferometry
NASA Astrophysics Data System (ADS)
Mancilla-Escobar, B.; Malacara-Hernández, Z.; Malacara-Hernández, D.
2018-06-01
A new zonal two-dimensional method for wavefront retrieval from a surface under test using lateral shearing interferometry is presented. A modified Saunders method and phase shifting techniques are combined to generate a method for wavefront reconstruction. The result is a wavefront with an error below 0.7 λ and without any global high frequency filtering. A zonal analysis over square cells along the surfaces is made, obtaining a polynomial expression for the wavefront deformations over each cell. The main advantage of this method over previously published methods is that a global filtering of high spatial frequencies is not present. Thus, a global smoothing of the wavefront deformations is avoided, allowing the detection of deformations with relatively small extensions, that is, with high spatial frequencies. Additionally, local curvature and low order aberration coefficients are obtained in each cell.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mengel, S.K.; Morrison, D.B.
1985-01-01
Consideration is given to global biogeochemical issues, image processing, remote sensing of tropical environments, global processes, geology, landcover hydrology, and ecosystems modeling. Topics discussed include multisensor remote sensing strategies, geographic information systems, radars, and agricultural remote sensing. Papers are presented on fast feature extraction; a computational approach for adjusting TM imagery terrain distortions; the segmentation of a textured image by a maximum likelihood classifier; analysis of MSS Landsat data; sun angle and background effects on spectral response of simulated forest canopies; an integrated approach for vegetation/landcover mapping with digital Landsat images; geological and geomorphological studies using an image processing technique; and wavelength intensity indices in relation to tree conditions and leaf-nutrient content.
Global sensing of gaseous and aerosol trace species using automated instrumentation on 747 airliners
NASA Technical Reports Server (NTRS)
Perkins, P. J.; Papathakos, L. C.
1977-01-01
The Global Atmospheric Sampling Program (GASP) by NASA is collecting and analyzing data on gaseous and aerosol trace species in the upper troposphere and lower stratosphere. Measurements are obtained from automated systems installed on four 747 airliners flying global air routes. Advances were made in airborne sampling instrumentation. Improved instruments and analysis techniques are providing an expanding data base for trace species including ozone, carbon monoxide, water vapor, condensation nuclei and mass concentrations of sulfates and nitrates. Simultaneous measurements of several trace species obtained frequently can be used to uniquely identify the source of the air mass as being typically tropospheric or stratospheric. A quantitative understanding of the tropospheric-stratospheric exchange processes leads to better knowledge of the atmospheric impact of pollution through the development of improved simulation models of the atmosphere.
Methodological concerns for meta-analyses of meditation: Comment on Sedlmeier et al. (2012).
Orme-Johnson, David W; Dillbeck, Michael C
2014-03-01
We commend Sedlmeier et al. (2012) for their significant undertaking of meta-analysis of all meditation types on all psychological variables, but additional analyses may modify some of their conclusions. Whereas they suggest from visual inspection of funnel diagrams that there may be publication bias of underreporting low-effect studies on the Transcendental Meditation (TM) technique, quantitative tests do not indicate the presence of bias for any type of meditation. We additionally found that there was no significant difference in effect sizes between studies originating from researchers affiliated with a TM organization and studies from other universities. We found that comparison of different types of meditation on their global index was confounded because their global index aggregated different sets of variables for the different groups. That is, using composite indices that only aggregated variables for which each group had at least 3 studies confirmed the authors' conclusion that effect sizes for different research designs were not different, but found that effect sizes for the TM technique were significantly larger than effect sizes for mindfulness meditation or other meditations. We also located 35 studies on the TM technique that appear to meet the authors' inclusion criteria that were missed by their meta-analysis, and several others on important psychosocial behavioral variables, such as job performance, substance abuse, and prison recidivism that were not reviewed. In addition, we suggest that future meta-analyses on psychological variables include cross-validating physiological studies.
Radiometrically accurate scene-based nonuniformity correction for array sensors.
Ratliff, Bradley M; Hayat, Majeed M; Tyo, J Scott
2003-10-01
A novel radiometrically accurate scene-based nonuniformity correction (NUC) algorithm is described. The technique combines absolute calibration with a recently reported algebraic scene-based NUC algorithm. The technique is based on the following principle: First, detectors that are along the perimeter of the focal-plane array are absolutely calibrated; then the calibration is transported to the remaining uncalibrated interior detectors through the application of the algebraic scene-based algorithm, which utilizes pairs of image frames exhibiting arbitrary global motion. The key advantage of this technique is that it can obtain radiometric accuracy during NUC without disrupting camera operation. Accurate estimates of the bias nonuniformity can be achieved with relatively few frames, which can be fewer than ten frame pairs. Advantages of this technique are discussed, and a thorough performance analysis is presented with use of simulated and real infrared imagery.
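The calibration-transport principle is easiest to see in one dimension: two frames of the same scene related by a unit global shift expose the bias difference between neighbouring detectors, so an absolute calibration at the perimeter can be walked inward. The sketch below is a noiseless 1-D idealization of the paper's 2-D algebraic algorithm.

```python
import numpy as np

# 1-D sketch of algebraic scene-based NUC: two frames of one scene with
# unit global motion make bias differences between neighbouring
# detectors observable, so an absolute perimeter calibration can be
# transported inward. (The paper works in 2-D with arbitrary motion.)
rng = np.random.default_rng(4)
n = 64
scene = rng.uniform(10, 50, n + 1)           # true irradiance along a row
bias = rng.normal(0, 2, n)                   # unknown per-detector offsets

y1 = scene[:n] + bias                        # frame 1: x[p]   + b[p]
y2 = scene[1:n + 1] + bias                   # frame 2: x[p+1] + b[p]

# Detector 0 (perimeter) is absolutely calibrated; transport it inward:
#   b[p+1] - b[p] = y1[p+1] - y2[p]
b_hat = np.empty(n)
b_hat[0] = bias[0]                           # known from absolute calibration
for p in range(n - 1):
    b_hat[p + 1] = b_hat[p] + (y1[p + 1] - y2[p])

print("max bias estimation error:", np.abs(b_hat - bias).max())
```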
NASA Technical Reports Server (NTRS)
Heath, D. F.; Hilsenrath, E.; Krueger, A. J.; Nordberg, W.; Prabhakara, C.; Theon, J. S.
1972-01-01
Brief descriptions are given of the techniques involved in determining the global structure of the mesosphere and stratosphere based on sounding rocket observations and satellite remotely sensed measurements.
NASA Astrophysics Data System (ADS)
Petronijević, R. B.; Velebit, B.; Baltić, T.
2017-09-01
Intentional modification of food or substitution of food ingredients with the aim of gaining profit is food fraud, or economically motivated adulteration (EMA). EMA has appeared throughout the food supply chain and, following the global expansion of the food market, has become a worldwide problem for the global economy. Food frauds have involved oils, milk and meat products, infant formula, honey, juices, spices, etc. New legislation was enacted in the last decade in order to fight EMA. Effective analytical methods for food-fraud detection are few and still in development. The majority of the methods in common use today for EMA detection are time-consuming and inappropriate for use on the production line or outside the laboratory. The next step in the evolution of analytical techniques to combat food fraud is the development of fast, accurate methods applicable using portable or handheld devices. Spectrophotometric and spectroscopic methods combined with chemometric analysis, perhaps in combination with other rapid physico-chemical techniques, could be the answer. This review discusses some analytical techniques based on spectrophotometry and spectroscopy, which are used to reveal food fraud and EMA.
Yang, Xujun; Li, Chuandong; Song, Qiankun; Chen, Jiyang; Huang, Junjian
2018-05-04
This paper addresses the stability and synchronization problems of fractional-order quaternion-valued neural networks (FQVNNs) with linear threshold neurons. On account of the non-commutativity of quaternion multiplication resulting from the Hamilton rules, the FQVNN models are separated into four real-valued neural network (RVNN) models. Consequently, the dynamic analysis of FQVNNs can be realized by investigating the real-valued ones. Based on the M-matrix method, the existence and uniqueness of the equilibrium point of the FQVNNs are obtained without detailed proof. Afterwards, several sufficient criteria ensuring the global Mittag-Leffler stability of the unique equilibrium point of the FQVNNs are derived by applying the Lyapunov direct method, the theory of fractional differential equations, the theory of matrix eigenvalues, and some inequality techniques. Meanwhile, global Mittag-Leffler synchronization for the drive-response models of the addressed FQVNNs is investigated explicitly. Finally, simulation examples are designed to verify the feasibility and availability of the theoretical results. Copyright © 2018 Elsevier Ltd. All rights reserved.
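For reference, global Mittag-Leffler stability of an equilibrium x* is usually defined by a bound of the following standard form (quoted here as a reminder; the paper's precise statement may differ in detail):

```latex
% Global Mittag-Leffler stability (standard form; the paper's criteria
% guarantee this property for the unique equilibrium x^* of the FQVNN).
\[
  \lVert x(t) - x^{*} \rVert
  \le \bigl\{\, m\bigl(x(t_0)\bigr)\,
       E_{\alpha}\!\bigl(-\lambda\,(t - t_0)^{\alpha}\bigr) \bigr\}^{b},
  \qquad
  E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(k\alpha + 1)},
\]
% with order 0 < alpha < 1, rate lambda > 0, b > 0, and m >= 0 locally
% Lipschitz with m(0) = 0.
```

For 0 < α < 1 the Mittag-Leffler bound decays algebraically, like t^(-α), rather than exponentially, which is the signature of fractional-order dynamics.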
NASA Astrophysics Data System (ADS)
Radicioni, F.; Matracchi, P.; Brigante, R.; Brozzi, A.; Cecconi, M.; Stoppini, A.; Tosi, G.
2017-05-01
The Tempio della Consolazione in Todi (16th cent.) has always been one of the most significant symbols of the Umbrian landscape. Soon after its completion (1606), the structure exhibited evidence of instability, due to foundation subsidence and/or seismic activity. Structural and geotechnical countermeasures have been undertaken on the Tempio and its surroundings from the 17th century until recent times. Until now, a truly satisfactory analysis of the overall deformation and attitude of the building has not been performed, since the existing surveys record the overhangs of the pillars, the crack pattern, or the subsidence only over limited time spans. Describing the attitude of the whole church is in fact a complex operation due to the architectural character of the building, consisting of four apses (three polygonal and one semicircular) covered with half domes, which surround the central area with the large dome. The present research aims to fill this knowledge gap with a global study based on geomatic techniques for an accurate 3D reconstruction of geometry and attitude, integrated with historical research on damage and interventions and a geotechnical analysis. The geomatic survey results from the integration of different techniques: GPS-GNSS for global georeferencing, laser scanning and digital photogrammetry for an accurate 3D reconstruction, and high-precision total station and geometric leveling for a direct survey of deformations and cracks and for the alignment of the laser scans. The above analysis made it possible to assess the dynamics of the cracks that have occurred in the last 25 years by comparison with a previous survey. From the photographic colour associated with the point cloud it was also possible to map the damp patches on the intrados of the domes and their evolution over recent years.
NASA Astrophysics Data System (ADS)
Schrodt, Franziska; Shan, Hanhuai; Fazayeli, Farideh; Karpatne, Anuj; Kattge, Jens; Banerjee, Arindam; Reichstein, Markus; Reich, Peter
2013-04-01
With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power and to induce biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization-based matrix completion was developed in a recommender-system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Advanced Hierarchical Probabilistic Matrix Factorization (aHPMF), on the other hand, includes climate and soil information in the matrix factorization by regressing the environmental variables against residuals of the HPMF. One unique opportunity opened up by aHPMF is out-of-sample prediction, where traits can be predicted for specific species at locations different from those sampled in the past. This has potentially far-reaching consequences for the study of global-scale plant functional trait patterns. We test the accuracy and effectiveness of HPMF and aHPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. However, despite its unprecedented coverage, the TRY database is still very sparse, severely limiting joint trait analyses. Plant traits are the key to understanding how plants as primary producers adjust to changes in environmental conditions and in turn influence them. Forming the basis for Dynamic Global Vegetation Models (DGVMs), plant traits are also fundamental in global change studies for predicting future ecosystem changes. It is thus imperative that missing data be imputed in as accurate and precise a way as possible. In this study, we show the advantages and disadvantages of applying probabilistic matrix factorization techniques that incorporate hierarchical and environmental information for the prediction of missing plant traits, as compared to conventional imputation techniques such as the complete-case and mean approaches. We will discuss the implications of using gap-filled data for global-scale studies of the plant functional trait - environment relationship as opposed to the above-mentioned conventional techniques, using examples of out-of-sample predictions of foliar nitrogen across several species' ranges and biomes.
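Stripped of the hierarchical and environmental extensions, the underlying machinery is low-rank matrix factorization fitted on observed entries only. A self-contained sketch on synthetic data (matrix sizes, rank, learning rate, and noise level are all arbitrary choices, and plain gradient descent stands in for the probabilistic HPMF machinery):

```python
import numpy as np

# Minimal gradient-descent matrix factorization for trait imputation:
# approximate a sparse species-by-trait matrix as U @ V.T fitted on the
# observed entries only, then read predictions off the reconstruction.
rng = np.random.default_rng(5)
n_species, n_traits, rank = 200, 12, 4
truth = rng.standard_normal((n_species, rank)) @ rng.standard_normal((rank, n_traits))
mask = rng.random((n_species, n_traits)) < 0.3          # ~30% observed
X = np.where(mask, truth + 0.05 * rng.standard_normal(truth.shape), np.nan)

U = 0.1 * rng.standard_normal((n_species, rank))
V = 0.1 * rng.standard_normal((n_traits, rank))
lr, reg = 0.02, 0.01
for _ in range(500):
    resid = np.where(mask, U @ V.T - np.nan_to_num(X), 0.0)
    U -= lr * (resid @ V + reg * U)
    V -= lr * (resid.T @ U + reg * V)

rmse = np.sqrt(np.mean((U @ V.T - truth)[~mask] ** 2))
print("held-out RMSE:", round(float(rmse), 3))
```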
Global DNA methylation analysis using methyl-sensitive amplification polymorphism (MSAP).
Yaish, Mahmoud W; Peng, Mingsheng; Rothstein, Steven J
2014-01-01
DNA methylation is a crucial epigenetic process which helps control gene transcription activity in eukaryotes. Information regarding the methylation status of a regulatory sequence of a particular gene provides important knowledge of this transcriptional control. DNA methylation can be detected using several methods, including sodium bisulfite sequencing and restriction digestion using methylation-sensitive endonucleases. Methyl-Sensitive Amplification Polymorphism (MSAP) is a technique used to study the global DNA methylation status of an organism and hence to distinguish between two individuals based on the DNA methylation status determined by the differential digestion pattern. Therefore, this technique is a useful method for DNA methylation mapping and positional cloning of differentially methylated genes. In this technique, genomic DNA is first digested with a methylation-sensitive restriction enzyme such as HpaII, and then the DNA fragments are ligated to adaptors in order to facilitate their amplification. Digestion using a methylation-insensitive isoschizomer of HpaII, MspI is used in a parallel digestion reaction as a loading control in the experiment. Subsequently, these fragments are selectively amplified by fluorescently labeled primers. PCR products from different individuals are compared, and once an interesting polymorphic locus is recognized, the desired DNA fragment can be isolated from a denaturing polyacrylamide gel, sequenced and identified based on DNA sequence similarity to other sequences available in the database. We will use analysis of met1, ddm1, and atmbd9 mutants and wild-type plants treated with a cytidine analogue, 5-azaC, or zebularine to demonstrate how to assess the genetic modulation of DNA methylation in Arabidopsis. It should be noted that despite the fact that MSAP is a reliable technique used to fish for polymorphic methylated loci, its power is limited to the restriction recognition sites of the enzymes used in the genomic DNA digestion.
Carbon budgets of biological soil crusts at micro-, meso-, and global scales
Sancho, Leopoldo G; Belnap, Jayne; Colesie, Claudia; Raggio, Jose; Weber, Bettina
2016-01-01
The importance of biocrusts in the ecology of arid lands across all continents is widely recognized. In spite of this broad distribution, contributions of biocrusts to the global biogeochemical cycles have only recently been considered. While these studies opened a new view on the global role of biocrusts, they also clearly revealed the lack of data for many habitats and of overall standards for measurements and analysis. In order to understand carbon cycling in biocrusts and the progress which has been made during the last 15 years, we offer a multi-scale approach covering different climatic regions. We also include a discussion on available measurement techniques at each scale: A micro-scale section focuses on the individual organism level, including modeling based on the combination of field and lab data. The meso-scale section addresses the CO2 exchange of a complete ecosystem or at the community level. Finally, we consider the contribution of biocrusts at a global scale, giving a general perspective of the most relevant findings regarding the role of biological soil crusts in the global terrestrial carbon cycle.
“Heroes” and “Villains” of World History across Cultures
Hanke, Katja; Liu, James H.; Sibley, Chris G.; Paez, Dario; Gaines, Stanley O.; Moloney, Gail; Leong, Chan-Hoong; Wagner, Wolfgang; Licata, Laurent; Klein, Olivier; Garber, Ilya; Böhm, Gisela; Hilton, Denis J.; Valchev, Velichko; Khan, Sammyh S.; Cabecinhas, Rosa
2015-01-01
Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Theresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles. PMID:25651504
System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian
In response to the expansion of nuclear fuel cycle (NFC) activities -- and the associated suite of risks -- around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal and multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that interdependency between safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is independently evaluated. Two novel system-theoretic analysis techniques -- dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA) -- provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need -- and provide a way -- to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.
Flight Mechanics of the Entry, Descent and Landing of the ExoMars Mission
NASA Technical Reports Server (NTRS)
HayaRamos, Rodrigo; Boneti, Davide
2007-01-01
ExoMars is ESA's current mission to planet Mars. A high-mobility rover and a fixed station will be deployed on the surface of Mars. This paper addresses the flight mechanics of the Entry, Descent and Landing (EDL) phases used for the mission analysis and design of the baseline and back-up scenarios of the mission. The EDL concept is based on a ballistic entry, followed by a descent under parachutes and inflatable devices (airbags) for landing. The mission analysis and design is driven by the flexibility in terms of landing site and arrival dates, and by the very stringent requirement in terms of landing accuracy. The challenging requirements currently imposed on the mission call for innovative analysis and design techniques to support system design trade-offs and to cope with the variability in entry conditions. The concept of the Global Entry Corridor has been conceived, designed, implemented and successfully validated as a key tool to provide a global picture of the mission capabilities in terms of landing-site reachability.
A new technique for ordering asymmetrical three-dimensional data sets in ecology.
Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel
2007-02-01
The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are species densities. We approach this kind of data by the comparison of patterns, that is to say by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix that is obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices through modalities of factor B. This is because a single global correspondence analysis combines three effects of factors in a way that cannot be determined from factorial maps (factor A, factor B, and factor A x factor B interaction), whereas the applications of Foucart's correspondence analysis clearly discriminate two different issues. Using two data sets, we illustrate that this technique proves to be particularly powerful in the analyses of ecological convergence, which include several distinct data sets, and in the analyses of spatiotemporal variations of species distributions.
Dynamical analysis of the global business-cycle synchronization.
Lopes, António M; Tenreiro Machado, J A; Huffstot, John S; Mata, Maria Eugénia
2018-01-01
This paper reports the dynamical analysis of the business cycles of 12 (developed and developing) countries over the last 56 years by applying computational techniques used for tackling complex systems. They reveal long-term convergence and country-level interconnections because of close contagion effects caused by bilateral networking exposure. Interconnectivity determines the magnitude of cross-border impacts. Local features and shock propagation complexity also may be true engines for local configuration of cycles. The algorithmic modeling proves to represent a solid approach to study the complex dynamics involved in the world economies. PMID:29408909
An Efficient Analysis Methodology for Fluted-Core Composite Structures
NASA Technical Reports Server (NTRS)
Oremont, Leonard; Schultz, Marc R.
2012-01-01
The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to model the detailed mechanical response of the structure. The shell thicknesses and offsets in this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.
Scaling law analysis of paraffin thin films on different surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dotto, M. E. R.; Camargo, S. S. Jr.
2010-01-15
The dynamics of paraffin deposit formation on different surfaces was analyzed based on scaling laws. Carbon-based films were deposited onto silicon (Si) and stainless steel substrates from methane (CH4) gas using radio-frequency plasma-enhanced chemical vapor deposition. The different substrates were characterized with respect to their surface energy by contact angle measurements, surface roughness, and morphology. Paraffin thin films were obtained by the casting technique and were subsequently characterized by an atomic force microscope in noncontact mode. The results indicate that the morphology of paraffin deposits is strongly influenced by the substrates used. Scaling-law analysis for coated substrates presents two distinct dynamics: a local roughness exponent (α_local) associated with short-range surface correlations and a global roughness exponent (α_global) associated with long-range surface correlations. The local dynamics is described by the Wolf-Villain model, and the global dynamics is described by the Kardar-Parisi-Zhang model. A local correlation length (L_local) defines the transition between the local and global dynamics, with L_local ≈ 700 nm, in accordance with the spacing of planes measured from atomic force micrographs. For uncoated substrates, the growth dynamics is related to the Edwards-Wilkinson model.
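Estimating a roughness exponent from such data follows a standard recipe: compute the RMS height fluctuation w(L) within windows of size L and fit the slope of log w(L) against log L. A 1-D sketch on a synthetic profile (a random walk, whose exponent should come out near 0.5):

```python
import numpy as np

# Estimate a roughness exponent alpha from a 1-D height profile via the
# local width: w(L) ~ L**alpha, where w(L) is the RMS height fluctuation
# inside windows of size L. (The paper applies this to AFM micrographs.)
rng = np.random.default_rng(6)
h = np.cumsum(rng.standard_normal(4096))   # random-walk profile: alpha ~ 0.5

sizes = [8, 16, 32, 64, 128, 256]
widths = []
for L in sizes:
    segs = h[: len(h) // L * L].reshape(-1, L)
    w = np.sqrt(((segs - segs.mean(axis=1, keepdims=True)) ** 2).mean())
    widths.append(w)

alpha = np.polyfit(np.log(sizes), np.log(widths), 1)[0]
print("estimated roughness exponent alpha = %.2f" % alpha)   # ~0.5 expected
```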
LaPrè, A K; Price, M A; Wedge, R D; Umberger, B R; Sup, Frank C
2018-04-01
Musculoskeletal modeling and marker-based motion capture techniques are commonly used to quantify the motions of body segments, and the forces acting on them during human gait. However, when these techniques are applied to analyze the gait of people with lower limb loss, the clinically relevant interaction between the residual limb and prosthesis socket is typically overlooked. It is known that there is considerable motion and loading at the residuum-socket interface, yet traditional gait analysis techniques do not account for these factors due to the inability to place tracking markers on the residual limb inside of the socket. In the present work, we used a global optimization technique and anatomical constraints to estimate the motion and loading at the residuum-socket interface as part of standard gait analysis procedures. We systematically evaluated a range of parameters related to the residuum-socket interface, such as the number of degrees of freedom, and determined the configuration that yields the best compromise between faithfully tracking experimental marker positions while yielding anatomically realistic residuum-socket kinematics and loads that agree with data from the literature. Application of the present model to gait analysis for people with lower limb loss will deepen our understanding of the biomechanics of walking with a prosthesis, which should facilitate the development of enhanced rehabilitation protocols and improved assistive devices. Copyright © 2017 John Wiley & Sons, Ltd.
Link-prediction to tackle the boundary specification problem in social network surveys
De Wilde, Philippe; Buarque de Lima-Neto, Fernando
2017-01-01
Diffusion processes in social networks often cause the emergence of global phenomena from individual behavior within a society. The study of those global phenomena and the simulation of those diffusion processes frequently require a good model of the global network. However, survey data and data from online sources are often restricted to single social groups or features, such as age groups, single schools, companies, or interest groups. Hence, a modeling approach is required that extrapolates the locally restricted data to a global network model. We tackle this Missing Data Problem using Link-Prediction techniques from social network research, network generation techniques from the area of Social Simulation, as well as a combination of both. We found that techniques employing less information may be more adequate to solve this problem, especially when data granularity is an issue. We validated the network models created with our techniques on a number of real-world networks, investigating degree distributions as well as the likelihood of links given the geographical distance between two nodes. PMID:28426826
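The simplest of these link-prediction heuristics counts common neighbours: unconnected pairs that share many neighbours are the most plausible missing links. A minimal sketch on an assumed toy friendship network:

```python
import itertools
import numpy as np

def common_neighbor_scores(adj):
    """Score unobserved pairs by shared neighbours, one of the simplest
    link-prediction heuristics for extrapolating a partially observed
    social network toward the missing global structure.
    """
    n = adj.shape[0]
    cn = adj @ adj                       # cn[i, j] = # common neighbours
    return [(i, j, int(cn[i, j]))
            for i, j in itertools.combinations(range(n), 2)
            if not adj[i, j]]

# Assumed toy survey network: two partially observed friendship groups
adj = np.zeros((6, 6), dtype=int)
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]:
    adj[i, j] = adj[j, i] = 1

ranked = sorted(common_neighbor_scores(adj), key=lambda t: -t[2])
print("most likely missing links:", ranked[:3])
```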
Environmental assessment of Al-Hammar Marsh, Southern Iraq.
Al-Gburi, Hind Fadhil Abdullah; Al-Tawash, Balsam Salim; Al-Lafta, Hadi Salim
2017-02-01
(a) To determine the spatial distributions and levels of major and minor elements, as well as heavy metals, in water, sediment, and biota (plant and fish) in Al-Hammar Marsh, southern Iraq, and ultimately to supply more comprehensive information for policy-makers to manage the contaminant inputs into the marsh so that their concentrations do not reach toxic levels. (b) To characterize the seasonal changes in the marsh surface water quality. (c) To address the potential environmental risk of these elements by comparison with historical levels and global quality guidelines (i.e., World Health Organization (WHO) standard limits). (d) To define the sources of these elements (i.e., natural and/or anthropogenic) using combined multivariate statistical techniques such as Principal Component Analysis (PCA) and Agglomerative Hierarchical Cluster Analysis (AHCA) along with pollution analysis (i.e., enrichment factor analysis). Water, sediment, plant, and fish samples were collected from the marsh, analyzed for major and minor ions as well as heavy metals, and then compared to historical levels and global quality guidelines (WHO guidelines). Multivariate statistical techniques, such as PCA and AHCA, were then used to determine the element sourcing. Water analyses revealed unacceptable values for almost all physico-chemical and biological properties, according to WHO standard limits for drinking water. Almost all major ion and heavy metal concentrations in water showed a distinct decreasing trend at the marsh outlet station compared to other stations. In general, major and minor ions, as well as heavy metals, exhibit higher concentrations in winter than in summer. Sediment analyses using multivariate statistical techniques revealed that Mg, Fe, S, P, V, Zn, As, Se, Mo, Co, Ni, Cu, Sr, Br, Cd, Ca, N, Mn, Cr, and Pb were derived from anthropogenic sources, while Al, Si, Ti, K, and Zr were primarily derived from natural sources. Enrichment factor analysis gave results compatible with the multivariate statistical findings. Analysis of heavy metals in plant samples revealed no heavy-metal pollution in the plants of Al-Hammar Marsh. However, the concentrations of heavy metals in fish samples showed that all samples were contaminated by Pb, Mn, and Ni. Decreasing Tigris and Euphrates discharges during the past decades, due to drought conditions and upstream damming, as well as the increasing stress of wastewater effluents from anthropogenic activities, led to degradation of the downstream Al-Hammar Marsh water quality in terms of physical, chemical, and biological properties, which were found to consistently exceed historical levels and global quality objectives. However, the decreasing trend in element concentrations at the marsh outlet station compared to other stations indicates that the marsh plays an important role as a natural filtration and bioremediation system. Higher element concentrations in winter were due to runoff from the washing of the surrounding sabkha during flooding by winter rainstorms. Finally, the high concentrations of heavy metals in fish samples can be attributed to bioaccumulation and biomagnification processes.
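The PCA-plus-clustering workflow used for element sourcing can be sketched compactly. The concentrations below are random stand-ins for the sediment data, and the two-cluster cut is an assumption; in the study, such clusters separate the anthropogenic element group from the natural one.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Element-sourcing sketch: PCA on standardized sediment concentrations,
# then agglomerative clustering of the element loadings, mirroring the
# PCA + AHCA workflow of the abstract. Data below are random stand-ins.
rng = np.random.default_rng(7)
elements = ["Fe", "Zn", "Pb", "Ni", "Al", "Si", "Ti", "K"]
conc = rng.lognormal(mean=1.0, sigma=0.4, size=(30, len(elements)))

z = (conc - conc.mean(axis=0)) / conc.std(axis=0)     # standardize
u, s, vt = np.linalg.svd(z, full_matrices=False)      # PCA via SVD
loadings = vt[:2].T * s[:2] / np.sqrt(len(z) - 1)     # element loadings, PC1-2
explained = s**2 / np.sum(s**2)
print("variance explained by PC1, PC2:", np.round(explained[:2], 2))

# Cluster elements by their loading profiles (Ward linkage, 2 clusters)
groups = fcluster(linkage(loadings, method="ward"), t=2, criterion="maxclust")
for name, g in zip(elements, groups):
    print(f"{name}: cluster {g}")
```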
Advanced phenotyping and phenotype data analysis for the study of plant growth and development.
Rahaman, Md Matiur; Chen, Dijun; Gillani, Zeeshan; Klukas, Christian; Chen, Ming
2015-01-01
Due to increasing consumption of food, feed, and fuel, and to meet global food-security needs for the rapidly growing human population, there is a necessity to breed high-yielding crops that can adapt to future climate changes, particularly in developing countries. To address these global challenges, novel approaches are required to identify quantitative phenotypes and to explain the genetic basis of agriculturally important traits. These advances will facilitate the screening of germplasm with high performance characteristics in resource-limited environments. Recently, plant phenomics has offered and integrated a suite of new technologies, and we are on a path to improve the description of complex plant phenotypes. High-throughput phenotyping platforms have also been developed that capture phenotype data from plants in a non-destructive manner. In this review, we discuss recent developments in high-throughput plant phenotyping infrastructure, including imaging techniques and the corresponding principles of phenotype data analysis.
NASA Astrophysics Data System (ADS)
Jorris, Timothy R.
2007-12-01
To support the Air Force's Global Reach concept, a Common Aero Vehicle is being designed to support the Global Strike mission. "Waypoints" are specified for reconnaissance or multiple payload deployments, and "no-fly zones" are specified for geopolitical restrictions or threat avoidance. Because of time-critical targets and multiple-scenario analysis, an autonomous solution is preferred over a time-intensive, manually iterative one. Thus, a real-time or near real-time autonomous trajectory optimization technique is presented to minimize the flight time, satisfy terminal and intermediate constraints, and remain within the specified vehicle heating and control limitations. This research uses the Hypersonic Cruise Vehicle (HCV) as a simplified two-dimensional platform to compare multiple solution techniques. The solution techniques include a unique geometric approach developed herein, a derived analytical dynamic optimization technique, and a rapidly emerging collocation numerical approach. This emerging numerical technique is a direct solution method involving discretization then dualization, with pseudospectral methods and nonlinear programming used to converge to the optimal solution. This numerical approach is applied to the Common Aero Vehicle (CAV) as the test platform for the full three-dimensional reentry trajectory optimization problem. The culmination of this research is the verification of the optimality of this proposed numerical technique, as shown for both the two-dimensional and three-dimensional models. Additionally, user implementation strategies are presented to improve accuracy and enhance solution convergence. Thus, the contributions of this research are the geometric approach, the user implementation strategies, and the determination and verification of a numerical solution technique for the optimal reentry trajectory problem that minimizes time to target while satisfying vehicle dynamics and control limitations, and heating, waypoint, and no-fly-zone constraints.
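The "discretize then optimize" idea behind such collocation methods can be demonstrated on a textbook problem: a minimum-time double integrator transcribed by trapezoidal collocation and handed to a nonlinear programming solver. In this toy sketch, SLSQP stands in for the large-scale NLP solvers used in practice, and the problem has the known analytic optimum tf = 2.

```python
import numpy as np
from scipy.optimize import minimize

# Minimum-time double integrator x'' = u, |u| <= 1, driven from rest at
# x=0 to rest at x=1, transcribed by trapezoidal collocation.
N = 15

def unpack(z):
    tf = z[0]
    x, v, u = z[1:1+N], z[1+N:1+2*N], z[1+2*N:1+3*N]
    return tf, x, v, u

def defects(z):
    # trapezoidal collocation of the dynamics plus boundary conditions
    tf, x, v, u = unpack(z)
    h = tf / (N - 1)
    dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
    dv = v[1:] - v[:-1] - 0.5 * h * (u[1:] + u[:-1])
    bc = [x[0], v[0], x[-1] - 1.0, v[-1]]
    return np.concatenate([dx, dv, bc])

# initial guess: tf = 3, straight-line position, constant velocity
z0 = np.concatenate([[3.0], np.linspace(0, 1, N),
                     0.5 * np.ones(N), np.zeros(N)])
bounds = [(0.1, 10.0)] + [(None, None)] * (2 * N) + [(-1.0, 1.0)] * N

sol = minimize(lambda z: z[0], z0, method="SLSQP", bounds=bounds,
               constraints=[{"type": "eq", "fun": defects}],
               options={"maxiter": 500})
print("minimum time found: %.3f (analytic optimum 2.0)" % sol.x[0])
```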
Guevara, María Ángeles; de María, Nuria; Sáez-Laguna, Enrique; Vélez, María Dolores; Cervera, María Teresa; Cabezas, José Antonio
2017-01-01
Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is the methylation-sensitive amplified polymorphism (MSAP) technique, which is a modification of amplified fragment length polymorphism (AFLP). It has been used to study methylation of anonymous CCGG sequences in different fungi, plants, and animal species. The main variation of this technique resides in the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as the frequent-cutter restriction enzyme. For each sample, MSAP analysis is performed using both EcoRI/HpaII- and EcoRI/MspI-digested samples. A comparative analysis between the EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) methylation-insensitive polymorphisms that show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples, and (2) methylation-sensitive polymorphisms, which are associated with the amplified fragments that differ in their presence or absence or in their intensity between EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses the modifications that can be applied to adjust the technology to different species of interest.
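In practice, the HpaII/MspI pattern comparison is reduced to a per-fragment scoring table. The sketch below encodes one common interpretation of the four presence/absence combinations; this mapping reflects general MSAP usage and is an assumption, not a prescription from this chapter.

```python
# MSAP band scoring sketch: compare presence/absence of each amplified
# fragment between the EcoRI/HpaII and EcoRI/MspI digests of one sample.
def score_msap(hpaii: bool, mspi: bool) -> str:
    if hpaii and mspi:
        return "unmethylated CCGG"
    if not hpaii and mspi:
        return "internal cytosine methylation (CmCGG)"
    if hpaii and not mspi:
        return "hemimethylated external cytosine (hemi-mCCGG)"
    return "absent or fully methylated (uninformative)"

# fragments scored across the two digests for one individual
bands = {"frag_A": (True, True), "frag_B": (False, True),
         "frag_C": (True, False), "frag_D": (False, False)}
for name, (h, m) in bands.items():
    print(f"{name}: {score_msap(h, m)}")
```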
[Global patent overview of Ginkgo biloba preparation].
Cheng, Xin-Min; Lei, Hai-Min; Liu, Wei
2013-09-01
With related global patent data as analysis samples, the worldwide patent landscape of Ginkgo biloba preparations is analyzed in terms of application, applicant, technical distribution, and so on. This research shows that the most important areas for G. biloba preparations are Europe and China. The European applicants started earliest and have developed steadily; moreover, their patents have the best quality. The Chinese applicants started late but are growing fastest, and have already acquired certain research capabilities; however, the quality of their patents needs to be improved. This research result provides a reference for the development of G. biloba preparations. The authors suggest that Chinese applicants fully learn from the techniques and patent-layout experience of others to enhance the level of new drug development and patent protection.
NASA Technical Reports Server (NTRS)
Husen, Nicholas; Roozeboom, Nettie; Liu, Tianshu; Sullivan, John P.
2015-01-01
A quantitative global skin-friction measurement technique is proposed. An oil film is doped with a luminescent molecule and thereby made to fluoresce in order to resolve the oil-film thickness, and Particle Image Surface Flow Visualization is used to resolve the velocity field of the surface of the oil film. Skin friction is then calculated at location x as τ(x) = μẋ(x)/h, where ẋ(x) is the displacement rate of the oil-film surface, h is the local oil-film thickness, and μ is the dynamic viscosity of the oil. The data collection procedure and data analysis procedures are explained, and preliminary experimental skin-friction results for flow over the wing of the Common Research Model (CRM) are presented.
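Once the two measured fields are in hand, the skin-friction map is a pointwise evaluation. A minimal sketch with assumed viscosity, frame interval, and synthetic thickness and displacement maps:

```python
import numpy as np

# Pointwise skin-friction evaluation from the two measured fields: film
# thickness h (from luminescence) and surface displacement per frame
# pair (from PIV-style tracking). tau = mu * u_s / h with u_s = dx / dt.
mu = 0.048            # assumed oil dynamic viscosity, Pa*s
dt = 1.0 / 500.0      # assumed camera frame interval, s

rng = np.random.default_rng(8)
h = rng.uniform(5e-6, 2e-5, (64, 64))        # film thickness map, m
dx = rng.uniform(1e-6, 5e-6, (64, 64))       # surface displacement map, m

u_s = dx / dt                                # oil-surface velocity, m/s
tau = mu * u_s / h                           # skin-friction map, Pa
print("mean skin friction: %.1f Pa" % tau.mean())
```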
Song, Qiankun; Yu, Qinqin; Zhao, Zhenjiang; Liu, Yurong; Alsaadi, Fuad E
2018-07-01
In this paper, the boundedness and robust stability of a class of delayed complex-valued neural networks with interval parameter uncertainties are investigated. By using the homeomorphic mapping theorem, the Lyapunov method, and inequality techniques, a sufficient condition is derived that guarantees the boundedness of the networks and the existence, uniqueness, and global robust stability of the equilibrium point for the considered uncertain neural networks. The obtained robust stability criterion is expressed as a complex-valued LMI, which can be solved numerically using YALMIP with the SDPT3 solver in MATLAB. An example with simulations is supplied to show the applicability and advantages of the obtained result. Copyright © 2018 Elsevier Ltd. All rights reserved.
Global Atmosphere Watch Workshop on Measurement-Model ...
The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim to drive high-quality and high-impact science while co-producing a new generation of products and services. In line with this vision, GAW’s Scientific Advisory Group for Total Atmospheric Deposition (SAG-TAD) has a mandate to produce global maps of wet, dry and total atmospheric deposition for important atmospheric chemicals to enable research into biogeochemical cycles and assessments of ecosystem and human health effects. The most suitable scientific approach for this activity is the emerging technique of measurement-model fusion for total atmospheric deposition. This technique requires global-scale measurements of atmospheric trace gases, particles, precipitation composition and precipitation depth, as well as predictions of the same from global/regional chemical transport models. The fusion of measurement and model results requires data assimilation and mapping techniques. The objective of the GAW Workshop on Measurement-Model Fusion for Global Total Atmospheric Deposition (MMF-GTAD), an initiative of the SAG-TAD, was to review the state-of-the-science and explore the feasibility and methodology of producing, on a routine retrospective basis, global maps of atmospheric gas and aerosol concentrations as well as wet, dry and total deposition via measurement-model
Status and Plans for the WCRP/GEWEX Global Precipitation Climatology Project (GPCP)
NASA Technical Reports Server (NTRS)
Adler, Robert F.
2007-01-01
The Global Precipitation Climatology Project (GPCP) is an international project under the auspices of the World Climate Research Program (WCRP) and GEWEX (Global Water and Energy Experiment). The GPCP group consists of scientists from agencies and universities in various countries who work together to produce a set of global precipitation analyses at monthly, pentad, and daily time scales. The status of the current products will be briefly summarized, focusing on the monthly analysis. Global and large regional rainfall variations and possible long-term changes are examined using the 27-year (1979-2005) monthly dataset. In addition to global patterns associated with phenomena such as ENSO, the data set is explored for evidence of long-term change. Although the global change of precipitation in the data set is near zero, the data set does indicate a small upward change in the Tropics (25S-25N) during the period, especially over ocean. Techniques are derived to isolate and eliminate variations due to ENSO and major volcanic eruptions, and the significance of the linear change is examined. Plans for a GPCP reprocessing for a Version 3 of products, potentially including a fine-time-resolution product, will be discussed. Current and future links to the IPWG will also be addressed.
Comparison of global SST analyses for atmospheric data assimilation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phoebus, P.A.; Cummings, J.A.
1995-03-17
Traditionally, atmospheric models were executed using a climatological estimate of the sea surface temperature (SST) to define the marine boundary layer. More recently, particularly since the deployment of remote sensing instruments and the advent of multichannel SST observations, atmospheric models have been improved by using more timely estimates of the actual state of the ocean. Typically, some type of objective analysis is performed using the data from satellites along with ship, buoy, and bathythermograph observations, and perhaps even climatology, to produce a weekly or daily analysis of global SST. Some of the earlier efforts to produce real-time global temperature analyses have been described by Clancy and Pollak (1983) and Reynolds (1988). However, just as new techniques have been developed for atmospheric data assimilation, improvements have been made to ocean data assimilation systems as well. In 1988, the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC) implemented a global three-dimensional ocean temperature analysis that was based on the optimum interpolation methodology (Clancy et al., 1990). This system, the Optimum Thermal Interpolation System (OTIS 1.0), was initially distributed on a 2.5° resolution grid, and was later modified to generate fields on a 1.25° grid (OTIS 1.1; Clancy et al., 1992). Other optimum interpolation-based analyses (OTIS 3.0) were developed by FNMOC to perform high-resolution three-dimensional ocean thermal analyses in areas with strong frontal gradients and clearly defined water mass characteristics.
Chantornvong, S.; Collin, J.; Dodgson, R.; Lee, K.; McCargo, D.; Seddon, D.; Vaughan, P.; Woelk, G.
2000-01-01
Crucial to the success of the proposed Framework Convention on Tobacco Control will be an understanding of the political and economic context for tobacco control policies, particularly in low-income and middle-income countries. Policy studies in Thailand and Zimbabwe employed the analytical perspective of political economy and a research strategy that used political mapping, a technique for characterizing and evaluating the political environment surrounding a policy issue, and stakeholder analysis, which seeks to identify key actors and to determine their capacity to shape policy outcomes. These policy studies clearly revealed how tobacco control in low-income and middle-income countries is also being shaped by developments in the global and regional political economy. Hence efforts to strengthen national control policies need to be set within the context of globalization and the international context. Besides the transnational tobacco companies, international tobacco groups and foreign governments, international agencies and nongovernmental organizations are also playing influential roles. It cannot be assumed, therefore, that the tobacco control strategies being implemented in industrialized countries will be just as effective and appropriate when implemented in developing countries. There is an urgent need to expand the number of such tobacco policy studies, particularly in low-income and middle-income countries. Comprehensive guidelines for tobacco policy analysis and research are required to support this process, as is a broader international strategy to coordinate further tobacco policy research studies at country, regional and global levels. PMID:10994265
Multi-Spacecraft 3D differential emission measure tomography of the solar corona: STEREO results.
NASA Astrophysics Data System (ADS)
Vásquez, A. M.; Frazin, R. A.
We have recently developed a novel technique (called DEMT) for the empirical determination of the three-dimensional (3D) distribution of the solar corona differential emission measure through multi-spacecraft solar rotational tomography of extreme-ultraviolet (EUV) image time series (like those provided by EIT/SOHO and EUVI/STEREO). The technique allows, for the first time, the development of global 3D empirical maps of the coronal electron temperature and density in the height range 1.0 to 1.25 R_S. DEMT constitutes a simple and powerful 3D analysis tool that obviates the need for structure-specific modeling.
Plio-Pleistocene time evolution of the 100-ky cycle in marine paleoclimate records
NASA Technical Reports Server (NTRS)
Park, Jeffrey; Maasch, Kirk A.
1992-01-01
To constrain theories for the dynamical evolution of global ice mass through the late Neogene, it is important to determine whether major changes in the record were gradual or rapid. Of particular interest is the evolution of the near 100-ky ice age cycle in the middle Pleistocene. We have applied a new technique based on multiple taper spectrum analysis which allows us to model the time evolution of quasi-periodic signals. This technique uses both phase and amplitude information, and enables us to address the question of abrupt versus gradual onset of the 100-ky periodicity in the middle Pleistocene.
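A minimal sketch of the multitaper spectral estimate at the core of such analyses, using SciPy's discrete prolate spheroidal (Slepian) tapers; the parameters and synthetic record are illustrative, and the paper's full method additionally models the phase and amplitude evolution of quasi-periodic signals:

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, dt, NW=4, K=7):
    """Average the periodograms of K Slepian-tapered copies of x."""
    N = len(x)
    tapers = dpss(N, NW, K)                      # (K, N) orthogonal tapers
    eigspec = np.abs(np.fft.rfft(tapers * x, axis=1))**2
    psd = eigspec.mean(axis=0) * dt              # simple (unweighted) average
    freqs = np.fft.rfftfreq(N, dt)
    return freqs, psd

# Synthetic 'paleoclimate' record: 100-ky and 41-ky cycles plus noise
t = np.arange(0, 800, 2.0)                       # time in ky, 2-ky sampling
x = np.sin(2 * np.pi * t / 100) + 0.5 * np.sin(2 * np.pi * t / 41)
x += 0.3 * np.random.default_rng(7).normal(size=t.size)
f, p = multitaper_psd(x, dt=2.0)
print(f[np.argmax(p)])                           # ~0.01 cycles/ky (100-ky peak)
```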
[The global impression technique in fixed dentures].
Lamy, M; Mainjot, A
2001-01-01
The global impression technique makes it possible to obtain, in a single stage, the impression of the abutments as well as of their neighboring teeth. This technique often requires the placement of one or two retraction cords in the sulcus. The impression technique described herein is the double-mix method. This method is based on the use of two elastomers of different viscosities, but from the same group, thus allowing simultaneous polymerization.
Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities
NASA Astrophysics Data System (ADS)
Esposito, Gaetano
Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty, due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species and 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species predicts the global ignition/propagation/extinction phenomena of ethylene-air mixtures within 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected mostly by first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty in predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its uncertainty bounds. This study therefore demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
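A minimal sketch of the Principal Component Analysis step described above, assuming a local sensitivity matrix S whose rows are different canonical combustion cases/conditions and whose columns are reactions; the data and retention threshold are random placeholders, not the study's mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_reactions = 40, 200                    # placeholder problem size
S = rng.normal(size=(n_cases, n_reactions))       # local sensitivities (placeholder)

# PCA via SVD of the column-centered sensitivity matrix
Sc = S - S.mean(axis=0)
U, sing, Vt = np.linalg.svd(Sc, full_matrices=False)

# Keep enough principal components to explain 99% of the variance
var = sing**2 / np.sum(sing**2)
k = int(np.searchsorted(np.cumsum(var), 0.99)) + 1

# A reaction is retained in the skeletal model if it loads strongly
# on any retained component (threshold is an illustrative choice)
loadings = np.abs(Vt[:k]).max(axis=0)
keep = np.where(loadings > 0.1 * loadings.max())[0]
print(f"{k} components, {keep.size} of {n_reactions} reactions retained")
```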
Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid
2016-01-01
A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific-application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud, without much emphasis on the issue of secure global scheduling. In this paper, a scientific-application scheduling technique using the Global League Championship Algorithm (GBLCA) optimization method is presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution, suitable for scientific-application task execution in the cloud computing environment, than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239
Mapping the zebrafish brain methylome using reduced representation bisulfite sequencing
Chatterjee, Aniruddha; Ozaki, Yuichi; Stockwell, Peter A; Horsfield, Julia A; Morison, Ian M; Nakagawa, Shinichi
2013-01-01
Reduced representation bisulfite sequencing (RRBS) has been used to profile DNA methylation patterns in mammalian genomes such as human, mouse and rat. The methylome of the zebrafish, an important animal model, has not yet been characterized at base-pair resolution using RRBS. Therefore, we evaluated the technique of RRBS in this model organism by generating four single-nucleotide resolution DNA methylomes of adult zebrafish brain. We performed several simulations to show the distribution of fragments and enrichment of CpGs in different in silico reduced representation genomes of zebrafish. Four RRBS brain libraries generated 98 million sequenced reads and had higher frequencies of multiple mapping than equivalent human RRBS libraries. The zebrafish methylome indicates that global DNA methylation is higher in the zebrafish genome than in the equivalent human methylome. This observation was confirmed by RRBS of zebrafish liver. High-coverage CpG dinucleotides are enriched in CpG island shores more than in the CpG island core. We found that 45% of the mapped CpGs reside in gene bodies, and 7% in gene promoters. This analysis provides a roadmap for generating reproducible base-pair level methylomes for zebrafish using RRBS, and our results provide the first evidence that RRBS is a suitable technique for global methylation analysis in zebrafish. PMID:23975027
NASA Astrophysics Data System (ADS)
Zani, Hiran; Assine, Mario Luis; McGlue, Michael Matthew
2012-08-01
Traditional Shuttle Radar Topography Mission (SRTM) topographic datasets hold limited value in the geomorphic analysis of low-relief terrains. To address this shortcoming, this paper presents a series of techniques designed to enhance digital elevation models (DEMs) of environments dominated by low-amplitude landforms, such as a fluvial megafan system. These techniques were validated through the study of a wide depositional tract composed of several megafans located within the Brazilian Pantanal. The Taquari megafan is the most remarkable of these features, covering an area of approximately 49,000 km². To enhance the SRTM DEM, the megafan global topography was calculated and found to be accurately represented by a second-order polynomial. Simple subtraction of this global topography from altitude produced a new DEM product, which greatly enhanced low-amplitude landforms within the Taquari megafan. A field campaign and optical satellite images were used to ground-truth features on the enhanced DEM, which consisted of both depositional (constructional) and erosional features. The results demonstrate that depositional lobes are the dominant landforms on the megafan. A model linking base-level change, avulsion, clastic sedimentation, and erosion is proposed to explain the microtopographic features on the Taquari megafan surface. The study confirms the promise of enhanced DEMs for geomorphological research in alluvial settings.
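A minimal numpy sketch of the enhancement step described above: fit a second-order polynomial surface (the megafan's "global topography") to the DEM by least squares and subtract it. The grid, coefficients and synthetic data are placeholders, not the Taquari values:

```python
import numpy as np

def enhance_dem(dem):
    """Subtract a best-fit second-order polynomial surface from a DEM.

    dem : 2-D array of elevations; returns (residual, trend).
    """
    ny, nx = dem.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    A = np.column_stack([np.ones(dem.size), x.ravel(), y.ravel(),
                         x.ravel()**2, (x * y).ravel(), y.ravel()**2])
    coef, *_ = np.linalg.lstsq(A, dem.ravel(), rcond=None)
    trend = (A @ coef).reshape(ny, nx)
    return dem - trend, trend

# Synthetic example: a gentle quadratic fan surface plus 2-m lobes
ny, nx = 200, 200
y, x = np.mgrid[0:ny, 0:nx].astype(float)
dem = 150 - 0.002 * (x - 100)**2 - 0.001 * y**2 + 2.0 * np.sin(x / 7)
residual, _ = enhance_dem(dem)   # residual now exposes the low-amplitude lobes
```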
Vulnerability of global food production to extreme climatic events.
Yeni, F; Alpas, H
2017-06-01
It is known that the frequency, intensity and duration of extreme climatic events have been changing substantially. The ultimate goal of this study was to identify current vulnerabilities of global primary food production to extreme climatic events, and to discuss potential entry points for adaptation planning by means of an explorative vulnerability analysis. Outcomes of this analysis were presented as a composite index in which the performance of 118 countries in maintaining the safety of food production under climate change was compared and ranked. In order to better interpret the results, cluster analysis was used to group the countries based on their vulnerability index (VI) scores. Results suggested that one sixth of the countries analyzed were subject to a high level of exposure (0.45-1), and one third to a high to very high level of sensitivity (0.41-1) and a low to moderate level of adaptive capacity (0-0.59). Proper adaptation strategies for reducing the microbial and chemical contamination of food products, soil and waters in the field were proposed. Finally, the availability of data on food safety management systems and on the occurrence of foodborne outbreaks with global coverage was proposed as a key factor for improving the robustness of future vulnerability assessments. Copyright © 2017 Elsevier Ltd. All rights reserved.
Energy Input Flux in the Global Quiet-Sun Corona
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mac Cormack, Cecilia; Vásquez, Alberto M.; López Fuentes, Marcelo
We present first results of a novel technique that provides, for the first time, constraints on the energy input flux at the coronal base (r ~ 1.025 R_⊙) of the quiet Sun at a global scale. By combining differential emission measure tomography of EUV images with global models of the coronal magnetic field, we estimate the energy input flux at the coronal base that is required to maintain thermodynamically stable structures. The technique is described in detail and first applied to data provided by the Extreme Ultraviolet Imager instrument, on board the Solar TErrestrial RElations Observatory mission, and the Atmospheric Imaging Assembly instrument, on board the Solar Dynamics Observatory mission, for two solar rotations with different levels of activity. Our analysis indicates that the typical energy input flux at the coronal base of magnetic loops in the quiet Sun is in the range ~0.5-2.0 × 10^5 erg s^-1 cm^-2, depending on the structure size and level of activity. A large fraction of this energy input, or even its totality, could be accounted for by Alfvén waves, as shown by recent independent observational estimates derived from determinations of the non-thermal broadening of spectral lines at the coronal base of quiet-Sun regions. This new tomography product will be useful for the validation of coronal heating models in magnetohydrodynamic simulations of the global corona.
The Contribution of the IGS to a Globally Integrated Geodetic Observing System
NASA Astrophysics Data System (ADS)
WEBER, R.
2002-05-01
The dedicated goal of the International GPS Service (IGS) is 'to provide a service to support geodetic and geophysical research activities through GPS data and data products'. To accomplish its mission, the IGS began routine operations in January 1994. Nowadays operations are based on a large number of components: a globally distributed tracking network of about 200 stations, local and regional data centers, and eight analysis centers. This presentation summarizes the measurement principles of the GPS and GLONASS microwave satellite navigation systems. An overview of current IGS products is given, and the factors limiting the accuracy of these products are discussed. Moreover, the IGS serves as one of the technique centers of the IERS, and the delivered products therefore follow designated IERS standards as closely as possible. It can be anticipated that the IGS will also play an important role within the framework of an upcoming Globally Integrated Geodetic Observing System. Even today there are a number of scientific crosslinks to other space geodetic techniques and services, e.g., to the ILRS in the determination of the geocentre, or to the IVS in questions of a temporal and spatial densification of the reference frame. The above-mentioned initiative will further strengthen this cooperation and increase the scientific outcome.
Mihelcic, James R; Zimmerman, Julie B; Ramaswami, Anu
2007-05-15
Sustainable development in both the developed and developing world has the common fundamental themes of advancing economic and social prosperity while protecting and restoring natural systems. While many recent efforts have been undertaken to transfer knowledge from the developed to the developing world to achieve a more sustainable future, indigenous knowledge that often originates in developing nations also can contribute significantly to this global dialogue. Selected case studies are presented to describe important knowledge, methodologies, techniques, principles, and practices for sustainable development emerging from developing countries in two critical challenge areas to sustainability: water and energy. These, with additional analysis and quantification, can be adapted and expanded for transfer throughout the developed and developing world in advancing sustainability. A common theme in all of the case studies presented is the integration of natural processes and material flows into the anthropogenic system. Some of these techniques, originating in rural settings, have recently been adapted for use in cities, which is especially important as the global trend of urban population growth accelerates. Innovations in science and technology, specifically applied to two critical issues of today, water and energy, are expected to fundamentally shift the type and efficiency of energy and materials utilized to advance prosperity while protecting and restoring natural systems.
Toh, Su San; Treves, David S; Barati, Michelle T; Perlin, Michael H
2016-10-01
Microbotryum lychnidis-dioicae is a member of a species complex infecting host plants in the Caryophyllaceae. It is used as a model system in many areas of research, but attempts to make this organism tractable for reverse genetic approaches have not been fruitful. Here, we exploited the recently obtained genome sequence and transcriptome analysis to inform our design of constructs for use in Agrobacterium-mediated transformation techniques currently available for other fungi. Reproducible transformation was demonstrated at the genomic, transcriptional and functional levels. Moreover, these initial proof-of-principle experiments provide evidence that supports the findings from initial global transcriptome analysis regarding expression from the respective promoters under different growth conditions of the fungus. The technique thus provides for the first time the ability to stably introduce transgenes and over-express target M. lychnidis-dioicae genes.
Proteomics: a new approach to the study of disease.
Chambers, G; Lawrie, L; Cash, P; Murray, G I
2000-11-01
The global analysis of cellular proteins has recently been termed proteomics and is a key area of research that is developing in the post-genome era. Proteomics uses a combination of sophisticated techniques including two-dimensional (2D) gel electrophoresis, image analysis, mass spectrometry, amino acid sequencing, and bio-informatics to resolve comprehensively, to quantify, and to characterize proteins. The application of proteomics provides major opportunities to elucidate disease mechanisms and to identify new diagnostic markers and therapeutic targets. This review aims to explain briefly the background to proteomics and then to outline proteomic techniques. Applications to the study of human disease conditions ranging from cancer to infectious diseases are reviewed. Finally, possible future advances are briefly considered, especially those which may lead to faster sample throughput and increased sensitivity for the detection of individual proteins. Copyright 2000 John Wiley & Sons, Ltd.
Robust passivity analysis for discrete-time recurrent neural networks with mixed delays
NASA Astrophysics Data System (ADS)
Huang, Chuan-Kuei; Shu, Yu-Jeng; Chang, Koan-Yuh; Shou, Ho-Nien; Lu, Chien-Yu
2015-02-01
This article considers the robust passivity analysis of a class of discrete-time recurrent neural networks (DRNNs) with mixed time delays and uncertain parameters. The mixed time delays consist of both discrete time-varying and distributed delays in a given range, and the uncertain parameters are norm-bounded. The activation functions are assumed to be globally Lipschitz continuous. Based on a new bounding technique and an appropriate type of Lyapunov functional, a sufficient condition is derived that guarantees the desired robust passivity of the DRNNs; it is expressed in terms of a linear matrix inequality (LMI). Some free-weighting matrices are introduced through the bounding technique to reduce the conservatism of the criterion. A numerical example is given to illustrate the effectiveness and applicability of the approach.
Secondary ion mass spectrometry: The application in the analysis of atmospheric particulate matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Di; Hua, Xin; Xiu, Guang-Li
Currently, considerable attention is being paid to the investigation of atmospheric particulate matter (PM) due to its importance for human health and global climate change. Surface characterization of PM is important since chemical heterogeneity between the surface and the bulk may alter its impact on the environment and on human beings. Secondary ion mass spectrometry (SIMS) is a surface technique with high surface sensitivity, capable of high-spatial-resolution chemical imaging and depth profiling. Recent research shows that SIMS holds great potential for analyzing both surface and bulk chemical information of PM. In this review, we present the working principle of SIMS in PM characterization, summarize recent applications in PM analysis from different sources, discuss its advantages and limitations, and propose future developments of this technique from an environmental sciences perspective.
Real-Time Global Nonlinear Aerodynamic Modeling for Learn-To-Fly
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2016-01-01
Flight testing and modeling techniques were developed to accurately identify global nonlinear aerodynamic models for aircraft in real time. The techniques were developed and demonstrated during flight testing of a remotely-piloted subscale propeller-driven fixed-wing aircraft using flight test maneuvers designed to simulate a Learn-To-Fly scenario. Prediction testing was used to evaluate the quality of the global models identified in real time. The real-time global nonlinear aerodynamic modeling algorithm will be integrated and further tested with learning adaptive control and guidance for NASA Learn-To-Fly concept flight demonstrations.
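A minimal sketch of one common way to perform such real-time identification: recursive least squares updating a linear-in-parameters aerodynamic model as each data point arrives. The regressor choice, coefficient names and data are illustrative assumptions, not NASA's actual Learn-To-Fly algorithm:

```python
import numpy as np

class RecursiveLS:
    """Recursive least squares for y = theta^T phi + noise."""
    def __init__(self, n_params, forgetting=1.0):
        self.theta = np.zeros(n_params)
        self.P = 1e6 * np.eye(n_params)   # large initial covariance
        self.lam = forgetting

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)        # gain vector
        self.theta += k * (y - phi @ self.theta)  # innovation correction
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Example: identify CL = CL0 + CLa*alpha + CLq*qhat + CLde*de in real time
rng = np.random.default_rng(0)
true = np.array([0.2, 5.0, 4.0, 0.3])             # illustrative "truth"
rls = RecursiveLS(4)
for _ in range(500):
    phi = np.array([1.0, *rng.uniform(-0.2, 0.2, 3)])  # [1, alpha, qhat, de]
    y = true @ phi + 0.01 * rng.normal()               # measured CL
    theta = rls.update(phi, y)
print(theta)  # converges toward `true` as maneuver data accumulate
```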
Woods, Sarah; Taylor, Betsy
2013-12-01
Global endometrial ablation techniques are a relatively new surgical technology for the treatment of heavy menstrual bleeding that can now be used even in an outpatient clinic setting. A comparison of global ablation versus earlier ablation technologies notes no significant differences in success rates and some improvement in patient satisfaction. The advantages of the newer global endometrial ablation systems include less operative time, improved recovery time, and decreased anesthetic risk. Ablation procedures performed in an outpatient surgical or clinic setting provide advantages both of potential cost savings for patients and the health care system and improved patient convenience. Copyright © 2013. Published by Elsevier Inc.
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow-element failures demonstrated in the Rover/NERVA testing, investigate the performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: detailed analysis of a multichannel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible against testing performed by other related efforts.
On the causal structure between CO2 and global temperature
Stips, Adolf; Macias, Diego; Coughlan, Clare; Garcia-Gorriz, Elisa; Liang, X. San
2016-01-01
We use a newly developed technique based on the information flow concept to investigate the causal structure between the global radiative forcing and the annual global mean surface temperature anomalies (GMTA) since 1850. Our study unambiguously shows a one-way causality between the total greenhouse gases and GMTA. Specifically, it is confirmed that the former, especially CO2, are the main causal drivers of the recent warming. A significant but smaller information flow comes from aerosol direct and indirect forcing and, on short time periods, volcanic forcings. In contrast, the causality contribution from natural forcings (solar irradiance and volcanic forcing) to the long-term trend is not significant. The spatially explicit analysis reveals that the anthropogenic forcing fingerprint varies significantly by region in both hemispheres. On paleoclimate time scales, however, the cause-effect direction is reversed: temperature changes cause subsequent CO2/CH4 changes. PMID:26900086
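A compact sketch of the bivariate information-flow estimator (in the style of Liang, 2014) on which this kind of analysis builds, written from memory under the assumptions of a linear model and a unit time step; the series are synthetic stand-ins for forcing and GMTA, not the study's data, and the formula should be checked against the original reference before use:

```python
import numpy as np

def liang_flow(x1, x2, dt=1.0):
    """Estimate the information flow T(2->1) from series x2 to series x1
    (bivariate linear maximum-likelihood estimator; assumption: Liang 2014)."""
    dx1 = (x1[1:] - x1[:-1]) / dt           # Euler-forward derivative of x1
    x1s, x2s = x1[:-1], x2[:-1]
    C = np.cov(x1s, x2s)
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1 = np.cov(x1s, dx1)[0, 1]
    c2d1 = np.cov(x2s, dx1)[0, 1]
    return (c11 * c12 * c2d1 - c12**2 * c1d1) / (c11**2 * c22 - c11 * c12**2)

# Synthetic stand-ins: x2 drives x1 with a one-step lag, not vice versa
rng = np.random.default_rng(1)
n = 5000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(n - 1):
    x2[t + 1] = 0.8 * x2[t] + rng.normal()
    x1[t + 1] = 0.7 * x1[t] + 0.4 * x2[t] + rng.normal()
print(liang_flow(x1, x2), liang_flow(x2, x1))  # expect |T(2->1)| >> |T(1->2)|
```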
Xiao, Qiang; Zeng, Zhigang
2017-10-01
Existing results on Lagrange stability and finite-time synchronization for memristive recurrent neural networks (MRNNs) are scale-free with respect to time evolution, and some restrictions appear naturally. In this paper, two novel scale-limited comparison principles are established by means of inequality techniques and the induction principle on time scales. Results concerning Lagrange stability and global finite-time synchronization of MRNNs on time scales are then obtained. Scale-limited Lagrange stability criteria are derived in detail via nonsmooth analysis and the theory of time scales. Moreover, novel criteria for achieving global finite-time synchronization are acquired. In addition, the derived method can also be used to study global finite-time stabilization. The proposed results extend or improve existing ones in the literature. Two numerical examples are chosen to show the effectiveness of the obtained results.
Understanding the Global Structure and Evolution of Coronal Mass Ejections in the Solar Wind
NASA Technical Reports Server (NTRS)
Riley, Pete
2004-01-01
This report summarizes the technical progress made during the first six months of the second year of the NASA Living with a Star program contract 'Understanding the global structure and evolution of coronal mass ejections in the solar wind', between NASA and Science Applications International Corporation, and covers the period November 18, 2003 - May 17, 2004. Under this contract SAIC has conducted numerical and data analysis related to fundamental issues concerning the origin, intrinsic properties, global structure, and evolution of coronal mass ejections in the solar wind. During this working period we focused on a quantitative assessment of five flux-rope fitting techniques. In the following sections we summarize the main aspects of this work and our proposed investigation plan for the next reporting period. Thus far, our investigation has resulted in six refereed scientific publications, and we have presented the results at a number of scientific meetings and workshops.
User Selection Criteria of Airspace Designs in Flexible Airspace Management
NASA Technical Reports Server (NTRS)
Lee, Hwasoo E.; Lee, Paul U.; Jung, Jaewoo; Lai, Chok Fung
2011-01-01
A method for identifying global aerodynamic models from flight data in an efficient manner is explained and demonstrated. A novel experiment design technique was used to obtain dynamic flight data over a range of flight conditions with a single flight maneuver. Multivariate polynomials and polynomial splines were used with orthogonalization techniques and statistical modeling metrics to synthesize global nonlinear aerodynamic models directly and completely from flight data alone. Simulation data and flight data from a subscale twin-engine jet transport aircraft were used to demonstrate the techniques. Results showed that global multivariate nonlinear aerodynamic dependencies could be accurately identified using flight data from a single maneuver. Flight-derived global aerodynamic model structures, model parameter estimates, and associated uncertainties were provided for all six nondimensional force and moment coefficients for the test aircraft. These models were combined with a propulsion model identified from engine ground test data to produce a high-fidelity nonlinear flight simulation very efficiently. Prediction testing using a multi-axis maneuver showed that the identified global model accurately predicted aircraft responses.
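A sketch of the batch counterpart of the approach described above: fit a multivariate polynomial model to flight data by ordinary least squares and evaluate it by prediction on held-out data. The regressor structure, coefficients and data are illustrative assumptions, not the paper's identified models:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 2000
alpha = rng.uniform(-0.1, 0.35, N)      # angle of attack [rad]
beta = rng.uniform(-0.1, 0.1, N)        # sideslip [rad]
de = rng.uniform(-0.3, 0.3, N)          # elevator deflection [rad]

# Candidate multivariate polynomial regressors (illustrative structure)
X = np.column_stack([np.ones(N), alpha, alpha**2, alpha**3, beta**2,
                     de, alpha * de])
# Synthetic lift-coefficient "measurements" with sensor noise
CL = 0.2 + 5.2 * alpha - 8.0 * alpha**3 + 0.35 * de + 0.01 * rng.normal(size=N)

train, test = slice(0, 1500), slice(1500, None)
theta, *_ = np.linalg.lstsq(X[train], CL[train], rcond=None)

# Prediction testing on data not used for identification
resid = CL[test] - X[test] @ theta
print(theta.round(3), "prediction RMS:", resid.std().round(4))
```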
Location estimation in wireless sensor networks using spring-relaxation technique.
Zhang, Qing; Foh, Chuan Heng; Seet, Boon-Chong; Fong, A C M
2010-01-01
Accurate and low-cost autonomous self-localization is a critical requirement of various applications of a large-scale distributed wireless sensor network (WSN). Due to the massive deployment of sensors, explicit measurements based on specialized localization hardware such as the Global Positioning System (GPS) are not practical. In this paper, we propose a low-cost WSN localization solution. Our design uses received signal strength indicators for ranging, lightweight distributed algorithms based on the spring-relaxation technique for location computation, and a cooperative approach to achieve a given location estimation accuracy with a low number of nodes with known locations. We provide analysis to show the suitability of the spring-relaxation technique for WSN localization with the cooperative approach, and perform simulation experiments to illustrate its localization accuracy.
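A minimal sketch of the spring-relaxation idea: each node with an unknown position is iteratively pulled along "spring forces" proportional to the mismatch between ranged and estimated inter-node distances. The network layout, noise level and step size are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

rng = np.random.default_rng(3)
n_anchor, n_free = 4, 12
anchors = rng.uniform(0, 100, (n_anchor, 2))     # nodes with known positions
truth = rng.uniform(0, 100, (n_free, 2))         # unknown true positions
nodes = np.vstack([anchors, truth])

# Ranged distances (e.g., RSSI-derived) with measurement noise
d = np.linalg.norm(nodes[:, None] - nodes[None, :], axis=2)
d += rng.normal(0, 0.5, d.shape)
np.fill_diagonal(d, 0)

pos = np.vstack([anchors, rng.uniform(0, 100, (n_free, 2))])  # initial guesses
for _ in range(500):                             # relaxation iterations
    for i in range(n_anchor, n_anchor + n_free):
        vec = pos - pos[i]                       # vectors to all other nodes
        dist = np.linalg.norm(vec, axis=1)
        dist[i] = 1.0                            # avoid division by zero
        force = ((dist - d[i]) / dist)[:, None] * vec   # spring forces
        force[i] = 0
        pos[i] += 0.1 * force.sum(axis=0) / (len(pos) - 1)  # damped step
print(np.linalg.norm(pos[n_anchor:] - truth, axis=1).mean())  # mean error [m]
```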
NASA Astrophysics Data System (ADS)
Donner, Reik; Balasis, Georgios; Stolbova, Veronika; Wiedermann, Marc; Georgiou, Marina; Kurths, Jürgen
2016-04-01
Magnetic storms are the most prominent global manifestations of out-of-equilibrium magnetospheric dynamics. Investigating the dynamical complexity exhibited by geomagnetic observables can provide valuable insights into the relevant physical processes as well as the temporal scales associated with this phenomenon. In this work, we introduce several innovative data analysis techniques enabling a quantitative analysis of the non-stationary behavior of the Dst index. Using recurrence quantification analysis (RQA) and recurrence network analysis (RNA), we obtain a variety of complexity measures serving as markers of quiet-time and storm-time magnetospheric dynamics. We additionally apply these techniques to the main driver of Dst index variations, the VB_South coupling function, and to the interplanetary medium parameters Bz and Pdyn, in order to discriminate internal processes from the magnetosphere's response directly induced by the external solar wind forcing. The derived recurrence-based measures allow us to improve the accuracy with which magnetospheric storms can be classified based on ground-based observations. The new methodology presented here could be of significant interest for the space weather research community working on time series analysis for magnetic storm forecasts.
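A minimal sketch of the recurrence quantification step, assuming a fixed recurrence threshold and a trivial one-dimensional embedding: it builds the recurrence matrix of a scalar series (synthetic here, not Dst) and computes the recurrence rate and determinism, two of the complexity measures mentioned above:

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate and determinism of a scalar series (1-D embedding)."""
    R = (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)
    rr = R.mean()                              # recurrence rate
    n, det_pts = len(x), 0
    for k in range(1, n):                      # scan upper off-diagonals
        run = 0
        for v in np.append(np.diag(R, k), 0):  # trailing 0 flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_pts += run             # points on diagonal lines
                run = 0
    det = 2 * det_pts / max(R.sum() - np.trace(R), 1)  # determinism
    return rr, det

t = np.linspace(0, 20 * np.pi, 600)
x = np.sin(t) + 0.1 * np.random.default_rng(4).normal(size=t.size)
print(rqa_measures(x, eps=0.2))   # periodic dynamics -> high determinism
```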
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.; ...
2016-02-10
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using the Underwood equations and reformulating the resulting constraints as bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with previously developed techniques that enumerate basic and thermally coupled distillation configurations to provide, for the first time, a global-optimization-based rank list of distillation configurations.
Bural, Gonca; Torigian, Drew; Basu, Sandip; Houseni, Mohamed; Zhuge, Ying; Rubello, Domenico; Udupa, Jayaram; Alavi, Abass
2015-12-01
Our aim was to explore a novel quantitative method [based upon an MRI-based image segmentation that allows actual calculation of grey matter, white matter and cerebrospinal fluid (CSF) volumes] for overcoming the difficulties associated with conventional techniques for measuring actual metabolic activity of the grey matter. We included four patients with normal brain MRI and fluorine-18 fluorodeoxyglucose (F-FDG)-PET scans (two women and two men; mean age 46±14 years) in this analysis. The time interval between the two scans was 0-180 days. We calculated the volumes of grey matter, white matter and CSF by using a novel segmentation technique applied to the MRI images. We measured the mean standardized uptake value (SUV) representing the whole metabolic activity of the brain from the F-FDG-PET images. We also calculated the white matter SUV from the upper transaxial slices (centrum semiovale) of the F-FDG-PET images. The whole brain volume was calculated by summing up the volumes of the white matter, grey matter and CSF. The global cerebral metabolic activity was calculated by multiplying the mean SUV with total brain volume. The whole brain white matter metabolic activity was calculated by multiplying the mean SUV for the white matter by the white matter volume. The global cerebral metabolic activity only reflects those of the grey matter and the white matter, whereas that of the CSF is zero. We subtracted the global white matter metabolic activity from that of the whole brain, resulting in the global grey matter metabolism alone. We then divided the grey matter global metabolic activity by grey matter volume to accurately calculate the SUV for the grey matter alone. The brain volumes ranged between 1546 and 1924 ml. The mean SUV for total brain was 4.8-7. Total metabolic burden of the brain ranged from 5565 to 9617. The mean SUV for white matter was 2.8-4.1. On the basis of these measurements we generated the grey matter SUV, which ranged from 8.1 to 11.3. The accurate metabolic activity of the grey matter can be calculated using the novel segmentation technique that we applied to MRI. By combining these quantitative data with those generated from F-FDG-PET images we were able to calculate the accurate metabolic activity of the grey matter. These types of measurements will be of great value in accurate analysis of the data from patients with neuropsychiatric disorders.
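The arithmetic of this method lends itself to a tiny worked sketch; the input numbers below are illustrative values drawn from the ranges reported above, not a specific patient:

```python
def grey_matter_suv(suv_brain, vol_brain, suv_wm, vol_wm, vol_gm):
    """Grey-matter SUV via segmentation-based partitioning.

    Global activity = mean SUV x volume; CSF contributes zero, so
    grey-matter activity = whole-brain activity - white-matter activity.
    """
    global_activity = suv_brain * vol_brain     # whole-brain metabolic burden
    wm_activity = suv_wm * vol_wm               # white-matter contribution
    return (global_activity - wm_activity) / vol_gm

# Illustrative values within the reported ranges (volumes in ml)
print(grey_matter_suv(suv_brain=5.5, vol_brain=1600,
                      suv_wm=3.0, vol_wm=500, vol_gm=700))  # ~10.4
```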
Global distribution of ozone for various seasons
NASA Technical Reports Server (NTRS)
Koprova, L. I.
1979-01-01
A technique which was used to obtain a catalog of the seasonal global distribution of ozone is presented. The technique is based on the simultaneous use of 1964-1975 data on the total ozone content from a worldwide network of ozonometric stations and on the vertical ozone profile from ozone sounding stations.
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is a crucial component of strategic control across a broad variety of disciplines in business, science and engineering. Time series data are a sequence of observations collected over intervals of time, each series describing a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data, and its techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt data mining techniques to analyze time series data: maximal frequent patterns are discovered and used to predict future sequences or trends, where a trend describes the behavior of a sequence. In order to include different types of time series (e.g., irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of 'similar' to emphasize real-life time series in which two sequences can be completely different (in values, shapes, etc.) yet still react to the same conditions in a dependent way. We propose the Dependence Mining Technique for predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use the trend pattern vectors to predict future time series sequences.
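A minimal sketch of phases (a)-(c) described above, under simplifying assumptions: trends are symbolized as up/down/flat, frequent patterns are counted over a sliding window, and the next trend symbol is read off the best-supported matching pattern. All names, lengths and thresholds are illustrative:

```python
from collections import Counter
import numpy as np

def trend_sequence(x, tol=0.0):
    """Phase (a): turn a numeric series into a trend symbol sequence."""
    d = np.diff(x)
    return "".join("U" if v > tol else "D" if v < -tol else "F" for v in d)

def frequent_patterns(seq, length, min_support):
    """Phase (b): count sliding-window patterns, keep the frequent ones."""
    c = Counter(seq[i:i + length] for i in range(len(seq) - length + 1))
    return {p: n for p, n in c.items() if n >= min_support}

def predict_next(seq, patterns):
    """Phase (c): among frequent patterns whose prefix matches the most
    recent trends, pick the best-supported one; its last symbol is the
    predicted next trend."""
    length = len(next(iter(patterns)))
    suffix = seq[-(length - 1):]
    cands = {p: n for p, n in patterns.items() if p.startswith(suffix)}
    return max(cands, key=cands.get)[-1] if cands else None

x = np.sin(np.linspace(0, 12 * np.pi, 200))      # stand-in local series
seq = trend_sequence(x)
pats = frequent_patterns(seq, length=4, min_support=5)
print(predict_next(seq, pats))                   # predicted next trend symbol
```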
NASA Technical Reports Server (NTRS)
McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.
2012-01-01
This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamics analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and center-of-gravity (cg) location and requirements on adaptor stiffnesses, while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis in which payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP-based approaches are also examined.
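A minimal sketch of the sampling-based idea, assuming a toy one-degree-of-freedom payload/adaptor model; the frequency requirement and input distributions are illustrative assumptions, not the study's vehicle model:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000
m = rng.uniform(4000, 9000, n)           # payload mass [kg]
k = rng.lognormal(np.log(4e8), 0.2, n)   # adaptor stiffness [N/m]

f = np.sqrt(k / m) / (2 * np.pi)         # toy payload mode frequency [Hz]
ok = f >= 30.0                           # illustrative frequency requirement

# Simple sensitivity read-out: correlate inputs with the response
print("P(requirement met) =", ok.mean())
print("corr(f, m) =", np.corrcoef(f, m)[0, 1].round(2),
      " corr(f, k) =", np.corrcoef(f, k)[0, 1].round(2))
```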
Hoffman, Steven J; Hughsam, Matthew; Randhawa, Harkanwal; Sritharan, Lathika; Guyatt, Gordon; Lavis, John N; Røttingen, John-Arne
2016-04-16
In recent years, there have been numerous calls for global institutions to develop and enforce new international laws. International laws are, however, often blunt instruments with many uncertain benefits, costs, risks of harm, and trade-offs. Thus, they are probably not always appropriate solutions to global health challenges. Given these uncertainties and international law's potential importance for improving global health, the paucity of synthesized evidence addressing whether international laws achieve their intended effects or whether they are superior in comparison to other approaches is problematic. Ten electronic bibliographic databases were searched using predefined search strategies, including MEDLINE, Global Health, CINAHL, Applied Social Sciences Index and Abstracts, Dissertations and Theses, International Bibliography of Social Sciences, International Political Science Abstracts, Social Sciences Abstracts, Social Sciences Citation Index, PAIS International, and Worldwide Political Science Abstracts. Two reviewers will independently screen titles and abstracts using predefined inclusion criteria. Pairs of reviewers will then independently screen the full-text of articles for inclusion using predefined inclusion criteria and then independently extract data and assess risk of bias for included studies. Where feasible, results will be pooled through subgroup analyses, meta-analyses, and meta-regression techniques. The findings of this review will contribute to a better understanding of the expected benefits and possible harms of using international law to address different kinds of problems, thereby providing important evidence-informed guidance on when and how it can be effectively introduced and implemented by countries and global institutions. PROSPERO CRD42015019830.
Global Monitoring of Clouds and Aerosols Using a Network of Micro-Pulse Lidar Systems
NASA Technical Reports Server (NTRS)
Welton, Ellsworth J.; Campbell, James R.; Spinhirne, James D.; Scott, V. Stanley
2000-01-01
Long-term global radiation programs, such as AERONET and BSRN, have shown success in monitoring column averaged cloud and aerosol optical properties. Little attention has been focused on global measurements of vertically resolved optical properties. Lidar systems are the preferred instrument for such measurements. However, global usage of lidar systems has not been achieved because of limits imposed by older systems that were large, expensive, and logistically difficult to use in the field. Small, eye-safe, and autonomous lidar systems are now currently available and overcome problems associated with older systems. The first such lidar to be developed is the Micro-pulse lidar System (MPL). The MPL has proven to be useful in the field because it can be automated, runs continuously (day and night), is eye-safe, can easily be transported and set up, and has a small field-of-view which removes multiple scattering concerns. We have developed successful protocols to operate and calibrate MPL systems. We have also developed a data analysis algorithm that produces data products such as cloud and aerosol layer heights, optical depths, extinction profiles, and the extinction-backscatter ratio. The algorithm minimizes the use of a priori assumptions and also produces error bars for all data products. Here we present an overview of our MPL protocols and data analysis techniques. We also discuss the ongoing construction of a global MPL network in conjunction with the AERONET program. Finally, we present some early results from the MPL network.
Global parameter estimation for thermodynamic models of transcriptional regulation.
Suleimenov, Yerzhan; Ay, Ahmet; Samee, Md Abul Hassan; Dresch, Jacqueline M; Sinha, Saurabh; Arnosti, David N
2013-07-15
Deciphering the mechanisms involved in gene regulation holds the key to understanding the control of central biological processes, including human disease, population variation, and the evolution of morphological innovations. New experimental techniques including whole genome sequencing and transcriptome analysis have enabled comprehensive modeling approaches to study gene regulation. In many cases, it is useful to be able to assign biological significance to the inferred model parameters, but such interpretation should take into account features that affect these parameters, including model construction and sensitivity, the type of fitness calculation, and the effectiveness of parameter estimation. This last point is often neglected, as estimation methods are often selected for historical reasons or for computational ease. Here, we compare the performance of two parameter estimation techniques broadly representative of local and global approaches, namely, a quasi-Newton/Nelder-Mead simplex (QN/NMS) method and a covariance matrix adaptation-evolutionary strategy (CMA-ES) method. The estimation methods were applied to a set of thermodynamic models of gene transcription applied to regulatory elements active in the Drosophila embryo. Measuring overall fit, the global CMA-ES method performed significantly better than the local QN/NMS method on high-quality data sets, but this difference was negligible on lower-quality data sets with increased noise or on data sets simplified by stringent thresholding. Our results suggest that the choice of parameter estimation technique for evaluation of gene expression models depends on the quality of the data, the nature of the models, and the aims of the modeling effort. Copyright © 2013 Elsevier Inc. All rights reserved.
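A compact sketch contrasting the two families of estimators compared above, using SciPy's Nelder-Mead simplex as the local method and differential evolution standing in for a global strategy (CMA-ES itself is available separately, e.g., in the `cma` package); the objective is a deliberately multimodal placeholder, not a transcription model:

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

def loss(p):
    # Multimodal placeholder for a model-vs-data fitness function (Rastrigin)
    return np.sum(p**2) + 10 * np.sum(1 - np.cos(2 * np.pi * p))

bounds = [(-5, 5)] * 4
x0 = np.full(4, 3.0)                   # deliberately poor starting point

local = minimize(loss, x0, method="Nelder-Mead")     # local simplex search
glob = differential_evolution(loss, bounds, seed=0)  # global strategy

print(f"local : {local.fun:.3f}")   # often trapped in a nearby local minimum
print(f"global: {glob.fun:.3f}")    # typically reaches the basin at the origin
```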
Moment-based metrics for global sensitivity analysis of hydrological systems
NASA Astrophysics Data System (ADS)
Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto
2017-12-01
We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
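A minimal Monte Carlo sketch in the spirit of the moment-based metrics described above: for each parameter, compare conditional moments of the output (with that parameter fixed) against the unconditional ones. The test function and the exact index definition are illustrative assumptions, not the paper's metrics or its gPCE surrogate:

```python
import numpy as np

rng = np.random.default_rng(6)

def model(x):
    # Placeholder model y = f(x1, x2, x3); x2 dominates the variance
    return np.sin(x[:, 0]) + 2 * x[:, 1]**2 + 0.3 * x[:, 2]

n, d = 4000, 3
X = rng.uniform(-np.pi, np.pi, (n, d))
y = model(X)
m0, v0 = y.mean(), y.var()

for i in range(d):
    dm, dv = [], []
    for xi in np.linspace(-np.pi, np.pi, 25):   # condition on x_i = xi
        Xc = X.copy()
        Xc[:, i] = xi
        yc = model(Xc)
        dm.append(abs(yc.mean() - m0))          # shift of the mean
        dv.append(abs(yc.var() - v0))           # shift of the variance
    print(f"x{i+1}: mean-based {np.mean(dm)/abs(m0):.2f}, "
          f"variance-based {np.mean(dv)/v0:.2f}")
```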
NASA Technical Reports Server (NTRS)
Ho, C.; Wilson, B.; Mannucci, A.; Lindqwister, U.; Yuan, D.
1997-01-01
Global ionospheric mapping (GIM) is a new, emerging technique for determining global ionospheric TEC (total electron content) based on measurements from a worldwide network of Global Positioning System (GPS) receivers.
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used to investigate carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant-production-related parameters (e.g., PPDF1 and PRDX) are most sensitive with respect to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on the variable and season considered. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
NASA Astrophysics Data System (ADS)
Cicone, Antonio; Zhou, Haomin; Piersanti, Mirko; Materassi, Massimo; Spogli, Luca
2017-04-01
Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades new kinds of nonlinear methods have been developed which are able to unravel hidden features of these kinds of signals. In this talk we will review the state of the art and present a new method, called Adaptive Local Iterative Filtering (ALIF). This method, developed originally to study one-dimensional signals, can, unlike any other technique proposed so far, be easily generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the method can be applied as-is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented: for instance, the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, the temperature and pressure measured at ground level on a global grid, and the radio power scintillation from GNSS signals.
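A minimal sketch of the iterative-filtering idea underlying ALIF: repeatedly subtract a local moving average until the remainder oscillates around zero, yielding the first intrinsic mode function. The fixed window and iteration count are simplified assumptions; ALIF's adaptive, data-driven filters are considerably more sophisticated:

```python
import numpy as np

def moving_average(x, w):
    """Centered moving average with reflective end handling."""
    pad = np.pad(x, w, mode="reflect")
    kernel = np.ones(2 * w + 1) / (2 * w + 1)
    return np.convolve(pad, kernel, mode="same")[w:-w]

def first_imf(x, w=25, n_iter=30):
    """Extract the first intrinsic mode function by iterative filtering."""
    imf = x.copy()
    for _ in range(n_iter):
        imf = imf - moving_average(imf, w)   # peel off the local mean
    return imf

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 3 * t)  # fast + slow
imf1 = first_imf(signal)        # recovers mainly the fast 50 Hz component
residual = signal - imf1        # left with the slow 3 Hz oscillation
```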
NASA Technical Reports Server (NTRS)
Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.
2014-01-01
We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, −j · E (the negative of the current density dotted with the measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 mV/m and/or greater than 1 nT) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
NASA Astrophysics Data System (ADS)
Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.
2014-12-01
Rivers are among the main sources of freshwater all over the world; hence, analysis and maintenance of this water resource is globally considered a matter of major concern. This paper deals with the assessment of the surface water quality of the Ichamati river using multivariate statistical techniques. Samples were collected at eight distinct surface-water-quality observation stations, and statistical techniques were applied to the physico-chemical parameters and the depth of siltation. Cluster analysis was performed to determine the relations between surface water quality and the siltation depth of the river Ichamati. Multiple regression and mathematical equation modeling were used to characterize the surface water quality of the Ichamati river on the basis of physico-chemical parameters. Surface water quality downstream was found to differ from that upstream. The analysis of the water quality parameters of the Ichamati river clearly indicates a high pollution load on the river water, attributable to agricultural discharge, tidal effects, and soil erosion. The results further reveal that water quality degraded as the depth of siltation increased.
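A compact illustration of the two statistical steps named above — clustering of stations and multiple regression on physico-chemical parameters — is sketched below with invented station data; it shows the shape of the computation, not the paper's actual dataset.

```python
# Hierarchical clustering of monitoring stations, then multiple linear
# regression of a water-quality index on parameters and siltation depth.
# All station values are invented for illustration only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# columns: pH, dissolved oxygen (mg/L), siltation depth (m) for 8 stations
X = rng.normal([7.5, 6.0, 1.0], [0.4, 1.2, 0.5], size=(8, 3))
wqi = 80 - 5.0 * X[:, 2] + rng.normal(0, 1, 8)   # synthetic quality index

# Cluster stations on standardized parameters (Ward linkage)
Z = linkage((X - X.mean(0)) / X.std(0), method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
print("station groups:", groups)

# Multiple regression: WQI ~ intercept + parameters
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, wqi, rcond=None)
print("regression coefficients:", coef)
```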
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty does not depend on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty decreases; for an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented that converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
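The two quoted precision figures are consistent with simple averaging statistics, and the propagation to orthogonal components follows the standard linear rule, as the back-of-the-envelope sketch below shows. The transformation matrix here is illustrative, not the paper's viewing geometry.

```python
# Averaging N images reduces precision uncertainty as 1/sqrt(N), and a
# linear transform A from measured to orthogonal components propagates
# covariance as A C A^T. The matrix A below is an invented example.
import numpy as np

sigma_single = 5.0                      # m/s, single-image precision
N = 100
print(sigma_single / np.sqrt(N))        # -> 0.5 m/s, matching the abstract

# Propagation to orthogonal components U, V, W (illustrative geometry)
A = np.array([[0.9, 0.3, 0.1],
              [0.2, 1.2, 0.4],
              [0.1, 0.5, 1.1]])
C_meas = np.diag([sigma_single**2] * 3)        # independent measured vectors
C_uvw = A @ C_meas @ A.T
print(np.sqrt(np.diag(C_uvw)))                 # precision of U, V, W (m/s)
```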
The effects of missing data on global ozone estimates
NASA Technical Reports Server (NTRS)
Drewry, J. W.; Robbins, J. L.
1981-01-01
The effects of missing data and model truncation on estimates of the global mean, zonal distribution, and global distribution of ozone are considered. It is shown that missing data can introduce biased estimates with errors that are not accounted for in the accuracy calculations of empirical modeling techniques. Data-fill techniques are introduced and used for evaluating error bounds and constraining the estimate in areas of sparse and missing data. It is found that the accuracy of the global mean estimate is more dependent on data distribution than model size. Zonal features can be accurately described by 7th order models over regions of adequate data distribution. Data variance accounted for by higher order models appears to represent climatological features of columnar ozone rather than pure error. Data-fill techniques can prevent artificial feature generation in regions of sparse or missing data without degrading high order estimates over dense data regions.
NASA Technical Reports Server (NTRS)
Hashemi-Kia, Mostafa; Toossi, Mostafa
1990-01-01
A computational procedure for the reduction of large finite element models was developed. This procedure is used to obtain a significantly reduced model while retaining the essential global dynamic characteristics of the full-size model. The reduction procedure is applied to the airframe finite element model of the AH-64A Attack Helicopter. The resulting reduced model is then validated by application to a vibration reduction study.
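The report does not detail the reduction algorithm; as one plausible illustration, the sketch below implements classical Guyan (static) condensation, a standard way to shrink a finite element model while approximating its global dynamics.

```python
# Guyan (static) condensation sketch: slave DOFs s are condensed onto
# master DOFs m via the transformation T = [I; -Kss^-1 Ksm]. This is a
# generic technique, not necessarily the report's exact procedure.
import numpy as np

def guyan_reduce(K, M, masters):
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kss = K[np.ix_(slaves, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    T = np.zeros((n, len(masters)))
    T[masters, :] = np.eye(len(masters))
    T[np.ix_(slaves, range(len(masters)))] = -np.linalg.solve(Kss, Ksm)
    return T.T @ K @ T, T.T @ M @ T        # reduced stiffness and mass

# 3-DOF spring-mass chain, keeping only the end DOFs as masters
K = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
M = np.eye(3)
Kr, Mr = guyan_reduce(K, M, masters=[0, 2])
print(Kr, Mr, sep="\n")
```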
Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering
NASA Astrophysics Data System (ADS)
Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.
2004-05-01
Tissue engineering attempts to address the ever-widening gap between the demand for and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials depends on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high-resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phases is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of the different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
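Two of the evaluation criteria listed above are simple to state for binary segmentations; the sketch below computes misclassification error and relative foreground error against a ground-truth mask, using common formulations that may differ in detail from the paper's.

```python
# Segmentation-evaluation sketch for binary masks: misclassification error
# (fraction of wrongly assigned pixels) and relative foreground area error.
# These are common formulations, assumed here rather than taken verbatim
# from the paper.
import numpy as np

def misclassification_error(truth, seg):
    return 1.0 - np.mean(truth == seg)

def relative_foreground_error(truth, seg):
    a_truth, a_seg = truth.sum(), seg.sum()
    return abs(a_truth - a_seg) / a_truth

rng = np.random.default_rng(0)
truth = rng.random((128, 128)) > 0.6            # "polymeric" phase mask
noisy = truth ^ (rng.random((128, 128)) > 0.98) # flip ~2% of pixels
print(misclassification_error(truth, noisy))
print(relative_foreground_error(truth, noisy))
```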
Singh, Sheetal; Shih, Shyh-Jen; Vaughan, Andrew T M
2014-01-01
Current techniques for examining the global creation and repair of DNA double-strand breaks are restricted in their sensitivity, and such techniques mask any site-dependent variations in breakage and repair rate or fidelity. We present here a system for analyzing the fate of documented DNA breaks, using the MLL gene as an example, through application of ligation-mediated PCR. Here, a simple asymmetric double-stranded DNA adapter molecule is ligated to experimentally induced DNA breaks and subjected to seminested PCR using adapter- and gene-specific primers. The rate of appearance and loss of specific PCR products allows detection of both the break and its repair. Using the additional technique of inverse PCR, the presence of misrepaired products (translocations) can be detected at the same site, providing information on the fidelity of the ligation reaction in intact cells. Such techniques may be adapted for the analysis of DNA breaks and rearrangements introduced into any identifiable genomic location. We have also applied parallel sequencing for the high-throughput analysis of inverse PCR products to facilitate the unbiased recording of all rearrangements located at a specific genomic location.
Sgarlata, Carmelo; Raymond, Kenneth N
2016-07-05
The entropic and enthalpic driving forces for encapsulation versus sequential exterior guest binding to the [Ga4L6](12-) supramolecular host in solution are very different, which significantly complicates the determination of these thermodynamic parameters. The simultaneous use of complementary techniques, such as NMR, UV-vis, and isothermal titration calorimetry, enables the disentanglement of such multiple host-guest interactions. Indeed, data collected by each technique measure different components of the host-guest equilibria and together provide a complete picture of the solution thermodynamics. Unfortunately, commercially available programs do not allow for global analysis of different physical observables. We thus resorted to a novel procedure for the simultaneous refinement of multiple parameters (ΔG°, ΔH°, and ΔS°) by treating different observables through a weighted nonlinear least-squares analysis of a constrained model. The refinement procedure is discussed for the multiple binding of the Et4N(+) guest, but it is broadly applicable to the deconvolution of other intricate host-guest equilibria.
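The weighted nonlinear least-squares idea can be illustrated compactly: residuals from different observables, each scaled by its own noise level, are stacked into one vector and fit with shared parameters. The toy 1:1-binding model below is an assumption for illustration, not the paper's constrained host-guest model.

```python
# Weighted global fit sketch: two synthetic observables (an NMR-like shift
# and a calorimetric heat) depending on the same binding constant K are fit
# simultaneously by stacking weighted residuals.
import numpy as np
from scipy.optimize import least_squares

K_true, dH_true = 5.0e3, -40.0                # shared parameters (M^-1, kJ/mol)
conc = np.linspace(1e-5, 1e-3, 20)            # guest concentration (M)
frac = K_true * conc / (1 + K_true * conc)    # bound fraction, 1:1 model
obs_nmr = 2.0 * frac + np.random.default_rng(0).normal(0, 0.01, 20)
obs_cal = dH_true * frac + np.random.default_rng(1).normal(0, 0.5, 20)

def residuals(p):
    K, dH = p
    f = K * conc / (1 + K * conc)
    r1 = (2.0 * f - obs_nmr) / 0.01           # each block weighted by its noise
    r2 = (dH * f - obs_cal) / 0.5
    return np.concatenate([r1, r2])

fit = least_squares(residuals, x0=[1e3, -10.0])
print("K, dH =", fit.x)
```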
Relative performance of academic departments using DEA with sensitivity analysis.
Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P
2009-05-01
The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
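As a sketch of the underlying computation, the input-oriented CCR envelopment model can be solved as a small linear program per department; the example below uses invented input/output data and scipy's linprog.

```python
# Input-oriented CCR DEA sketch: for each department (DMU), minimize the
# input-contraction factor theta subject to envelopment by peer departments.
# Inputs/outputs are invented for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[5., 8., 6.], [3., 4., 5.]])    # 2 inputs x 3 departments
Y = np.array([[10., 9., 12.]])                # 1 output x 3 departments
n = X.shape[1]

def ccr_efficiency(j0):
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [j0]], X])                    # sum(lam*x) <= theta*x0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])    # sum(lam*y) >= y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for j in range(n):
    print(f"department {j}: technical efficiency = {ccr_efficiency(j):.3f}")
```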
Pulmonary Infiltrates in Immunosuppressed Patients: Analysis of a Diagnostic Protocol
Danés, Cristina; González-Martín, Julián; Pumarola, Tomàs; Rañó, Ana; Benito, Natividad; Torres, Antoni; Moreno, Asunción; Rovira, Montserrat; Puig de la Bellacasa, Jorge
2002-01-01
A diagnostic protocol was started to study the etiology of pulmonary infiltrates in immunosuppressed patients. The diagnostic yields of the different techniques were analyzed, with special emphasis on the importance of the sample quality and the role of rapid techniques in the diagnostic strategy. In total, 241 patients with newly developed pulmonary infiltrates within a period of 19 months were included. Noninvasive or invasive evaluation was performed according to the characteristics of the infiltrates. Diagnosis was achieved in 202 patients (84%); 173 patients (72%) had pneumonia, and specific etiologic agents were found in 114 (66%). Bronchoaspirate and bronchoalveolar lavage showed the highest yields, either on global analysis (23 of 35 specimens [66%] and 70 of 134 specimens [52%], respectively) or on analysis of each type of pneumonia. A tendency toward better results with optimal-quality samples was observed, and a statistically significant difference was found in sputum bacterial culture. Rapid diagnostic tests yielded results in 71 of 114 (62.2%) diagnoses of etiological pneumonia. PMID:12037077
NASA Technical Reports Server (NTRS)
Comfort, R. H.; Horwitz, J. L.
1986-01-01
The temperature and density analyses in the Automated Analysis Program (for the global empirical model) were modified to use the flow velocities produced by the flow velocity analysis. Revisions were started to construct an interactive version of the technique for temperature and density analysis used in the Automated Analysis Program. A study of ion and electron heating at high altitudes in the outer plasmasphere was initiated. The analysis of the electron gun experiments on SCATHA was also extended to include eclipse operations, in order to test the hypothesis that there are interactions between the 50 to 100 eV beam and spacecraft-generated photoelectrons. The MASSCOMP software to be used in taking and displaying data in the two-ion plasma experiment was tested and is now working satisfactorily. Papers published during the report period are listed.
Fixation and chemical analysis of single fog and rain droplets
NASA Astrophysics Data System (ADS)
Kasahara, M.; Akashi, S.; Ma, C.-J.; Tohno, S.
Over the last decade, the importance of global environmental problems has been recognized worldwide. Acid rain is one of the most important global environmental problems, as is global warming. A grasp of the physical and chemical properties of fog and rain droplets is essential to clarify the physical and chemical processes of acid rain and their effects on forests, materials, and ecosystems. We examined the physical and chemical properties of single fog and rain drops by applying a fixation technique. The sampling method and treatment procedure for fixing the liquid droplets as solid particles were investigated. Small liquid particles such as fog droplets could easily be fixed within a few minutes by exposure to cyanoacrylate vapor. Large liquid particles such as raindrops were also fixed successfully, although some of the fixations were imperfect; a freezing method was therefore applied to fix the large raindrops. Frozen liquid particles remained stable on exposure to cyanoacrylate vapor after freezing. Particle size measurement and elemental analysis of the fixed particles were performed on an individual basis using a microscope and SEM-EDX, particle-induced X-ray emission (PIXE), and micro-PIXE analyses, respectively. The concentration in raindrops depended upon the droplet size and the elapsed time from the beginning of rainfall.
Diurnal global variability of the Earth's magnetic field during geomagnetically quiet conditions
NASA Astrophysics Data System (ADS)
Klausner, V.
2012-12-01
This work proposes a methodology (or treatment) to establish a representative signal of the global magnetic diurnal variation. It is based on a spatial distribution in both longitude and latitude of a set of magnetic stations, as well as their magnetic behavior on a time basis. We apply the Principal Component Analysis (PCA) technique using the gapped wavelet transform and wavelet correlation. This new approach was used to describe the characteristics of the magnetic variations at Vassouras (Brazil) and 12 other magnetic stations spread around the terrestrial globe. Using magnetograms from 2007, we have investigated the globally dominant pattern of the Sq variation as a function of low solar activity. The year was divided into two seasons for seasonal variation analysis: solstices (June and December) and equinoxes (March and September). We aim to reconstruct the original geomagnetic data series of the H component taking into account only the diurnal variations with periods of 24 hours on geomagnetically quiet days. We propose to reconstruct the Sq baseline using only the first PCA mode. A first interpretation of the results suggests that the PCA/wavelet method could be used for the reconstruction of the Sq baseline.
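The reconstruction step can be sketched in a few lines: decompose the multi-station records with PCA (via an SVD) and rebuild the series from the first mode only. The gapped-wavelet preprocessing used in the paper is not reproduced here.

```python
# PCA first-mode reconstruction sketch on synthetic multi-station H-component
# records sharing a common 24 h diurnal variation.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 24 * 10)                       # hourly data, 10 quiet days
sq = np.sin(2 * np.pi * t / 24)                 # common diurnal variation
# 13 stations = scaled common signal + station noise
H = np.outer(sq, rng.uniform(5, 15, 13)) + rng.normal(0, 1, (len(t), 13))

anom = H - H.mean(axis=0)
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
first_mode = S[0] * np.outer(U[:, 0], Vt[0])    # PCA mode-1 reconstruction
sq_baseline = first_mode + H.mean(axis=0)       # Sq baseline estimate
print("variance explained by mode 1:", S[0]**2 / np.sum(S**2))
```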
Model reference tracking control of an aircraft: a robust adaptive approach
NASA Astrophysics Data System (ADS)
Tanyer, Ilker; Tatlicioglu, Enver; Zergeroglu, Erkan
2017-05-01
This work presents the design and the corresponding analysis of a nonlinear robust adaptive controller for model reference tracking of an aircraft that has parametric uncertainties in its system matrices and additive state- and/or time-dependent nonlinear disturbance-like terms in its dynamics. Specifically, a robust integral of the sign of the error feedback term and an adaptive term are fused with a proportional integral controller. Lyapunov-based stability analysis techniques are utilised to prove global asymptotic convergence of the output tracking error. Extensive numerical simulations are presented to illustrate the performance of the proposed robust adaptive controller.
Experimental modal analysis of the fuselage panels of an Aero Commander aircraft
NASA Technical Reports Server (NTRS)
Geisler, D.
1981-01-01
The reduction of interior noise in light aircraft was investigated, with emphasis on the thin fuselage sidewall. The approach used is theoretical and involves modeling of the sidewall panels and stiffeners. Experimental data obtained from tests investigating the effects of mass and stiffness treatments to the sidewalls are presented. The dynamic characteristics of treated panels are contrasted with those of the untreated sidewall panels using experimental modal analysis techniques. The results include the natural frequencies, modal damping, and mode shapes of selected panels. Frequency response functions, data relating to the global fuselage response, and acoustic response are also presented.
NASA Technical Reports Server (NTRS)
Tornatore, Vincenza
2013-01-01
The main activities carried out at the PMD (Politecnico di Milano DIIAR) IVS Analysis Center during 2012 are briefly highlighted, and future plans for 2013 are sketched out. We principally continued to process European VLBI sessions, using different approaches to evaluate possible differences due to various processing choices. The VLBI solutions were then also compared with the GPS solutions, as well as with those calculated at co-located sites. Concerning the observational aspect, several tests were performed to identify the most suitable method for achieving the highest possible accuracy in the determination of GNSS (Global Navigation Satellite System) satellite positions using the VLBI technique.
Effects on noise properties of GPS time series caused by higher-order ionospheric corrections
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Deng, Liansheng; Li, Zhao; Zhou, Xiaohui; Liu, Hongfei
2014-04-01
Higher-order ionospheric (HOI) effects are one of the principal technique-specific error sources in precise global positioning system (GPS) analysis. These effects also influence the non-linear characteristics of GPS coordinate time series. In this paper, we investigate these effects on coordinate time series in terms of seasonal variations and noise amplitudes. Both power spectral techniques and maximum likelihood estimators (MLE) are used to evaluate these effects quantitatively and qualitatively. Our results show an overall improvement for the analysis of global sites if HOI effects are considered. We note that the noise spectral index used for the determination of the optimal noise models in our analysis ranged between -1 and 0, both with and without HOI corrections, implying that the coloured noise cannot be removed by these corrections. However, the corrections were found to improve the noise properties of global sites. After the corrections were applied, the noise amplitudes at most sites decreased, and the white noise amplitudes decreased remarkably: the white noise amplitude decreased at 81.8% of the selected sites in the up component, and the flicker noise decreased at 67.5% of the sites in the north component. Stacked periodogram results show that, whether or not HOI effects are considered, a common fundamental period of 1.04 cycles per year (cpy), together with the expected annual and semi-annual signals, can explain all peaks of the north and up components well. For the east component, however, reasonable results can be obtained only when HOI corrections are applied. HOI corrections are thus useful for better detecting periodic signals in GPS coordinate time series. Moreover, the corrections contributed partly to the seasonal variations at the selected sites, especially in the up component. Statistically, HOI corrections reduced the annual and semi-annual amplitudes at the selected sites by more than 50% and more than 65%, respectively.
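As an illustration of the noise characterization, the spectral index can be estimated from the log-log slope of a periodogram, as sketched below for a synthetic white-noise series. The paper's analysis uses maximum likelihood estimation, which this simple slope fit only approximates.

```python
# Spectral-index sketch: fit the log-log slope kappa of the periodogram,
# P(f) ~ f^kappa. White noise gives kappa ~ 0 and flicker noise kappa ~ -1,
# the range quoted in the abstract.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(3)
x = rng.normal(0, 1, 4096)                    # synthetic daily series (mm)
f, P = periodogram(x, fs=365.25)              # frequencies in cycles/year
mask = f > 0                                  # drop the zero frequency
kappa = np.polyfit(np.log(f[mask]), np.log(P[mask]), 1)[0]
print("estimated spectral index:", kappa)     # expect ~0 for white noise
```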
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
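The Kreisselmeier-Steinhauser aggregation mentioned above has a compact closed form, sketched below with illustrative objective values: KS approaches the maximum of the aggregated functions as the draw-down factor rho increases.

```python
# Kreisselmeier-Steinhauser (KS) function sketch: fold multiple objectives
# or constraints f_i into one smooth envelope,
#   KS = f_max + (1/rho) * ln(sum exp(rho * (f_i - f_max))),
# which tends to max(f_i) as rho grows. Values below are illustrative.
import numpy as np

def ks_function(f, rho=50.0):
    f = np.asarray(f, dtype=float)
    fmax = f.max()                      # shift for numerical stability
    return fmax + np.log(np.sum(np.exp(rho * (f - fmax)))) / rho

objectives = [0.82, 0.91, 0.88]         # normalized objective values
for rho in (5, 50, 500):
    print(rho, ks_function(objectives, rho))   # approaches max = 0.91
```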
Van’t Hoff global analyses of variable temperature isothermal titration calorimetry data
Freiburger, Lee A.; Auclair, Karine; Mittermaier, Anthony K.
2016-01-01
Isothermal titration calorimetry (ITC) can provide detailed information on the thermodynamics of biomolecular interactions in the form of equilibrium constants, KA, and enthalpy changes, ΔHA. A powerful application of this technique involves analyzing the temperature dependences of ITC-derived KA and ΔHA values to gain insight into thermodynamic linkage between binding and additional equilibria, such as protein folding. We recently developed a general method for global analysis of variable temperature ITC data that significantly improves the accuracy of extracted thermodynamic parameters and requires no prior knowledge of the coupled equilibria. Here we report detailed validation of this method using Monte Carlo simulations and an application to study coupled folding and binding in an aminoglycoside acetyltransferase enzyme. PMID:28018008
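A minimal sketch of the idea, assuming a temperature-independent enthalpy: ITC-derived K_A values at several temperatures are fit globally to the integrated van't Hoff relation ln K = -ΔH/(RT) + ΔS/R. The coupled-equilibrium machinery of the paper's method is omitted, and the data are invented.

```python
# Van't Hoff global-fit sketch: shared dH and dS fit to ln K_A(T).
import numpy as np
from scipy.optimize import curve_fit

R = 8.314                                          # J/(mol K)
T = np.array([278.0, 288.0, 298.0, 308.0])         # temperatures (K)
K_A = np.array([2.1e5, 1.1e5, 6.0e4, 3.4e4])       # illustrative ITC values

def lnK(T, dH, dS):
    # integrated van't Hoff relation with temperature-independent dH
    return -dH / (R * T) + dS / R

popt, _ = curve_fit(lnK, T, np.log(K_A), p0=[-5e4, 0.0])
dH, dS = popt
print(f"dH = {dH/1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
```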
Huang, Haiying; Du, Qiaosheng; Kang, Xibing
2013-11-01
In this paper, a class of neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays is investigated. The jumping parameters are modeled as a continuous-time finite-state Markov chain. First, the existence of an equilibrium point for the addressed neural networks is studied. By utilizing Lyapunov stability theory, stochastic analysis theory, and the linear matrix inequality (LMI) technique, new delay-dependent stability criteria are presented in terms of linear matrix inequalities that guarantee the neural networks to be globally exponentially stable in the mean square. Numerical simulations are carried out to illustrate the main results. © 2013 ISA. Published by ISA. All rights reserved.
Ozone measurement system for NASA global air sampling program
NASA Technical Reports Server (NTRS)
Tiefermann, M. W.
1979-01-01
The ozone measurement system used in the NASA Global Air Sampling Program (GASP) is described. The system uses a commercially available ozone concentration monitor that was modified and repackaged so as to operate unattended in an aircraft environment. The modifications required for aircraft use are described, along with the calibration techniques, the measurement of ozone loss in the sample lines, and the operating procedures that were developed for use in the program. Based on calibrations with JPL's 5-meter ultraviolet photometer, all previously published GASP ozone data are biased high by 9 percent. A system error analysis showed that the total system measurement random error is from 3 to 8 percent of reading (depending on the pump diaphragm material) or 3 ppbv, whichever is greater.
NASA Astrophysics Data System (ADS)
Abdeh-Kolahchi, A.; Satish, M.; Datta, B.
2004-05-01
A state-of-the-art groundwater monitoring network design method is introduced. The method combines groundwater flow and transport results with a Genetic Algorithm (GA) to identify optimal monitoring well locations. Optimization theory uses different techniques to find a set of parameter values that minimize or maximize objective functions. The suggested optimal monitoring network design is based on the objective of maximizing the probability of tracking a transient contamination plume by determining sequential monitoring locations. The MODFLOW and MT3DMS models, included as separate modules within the Groundwater Modeling System (GMS), are used to develop three-dimensional groundwater flow and contaminant transport simulations. The simulation results are used as input to the optimization model, which applies the GA to identify the optimal monitoring network design from several candidate monitoring locations. The design model uses a GA with binary variables representing potential monitoring locations. As the number of decision variables and constraints increases, the non-linearity of the objective function also increases, making it difficult to obtain optimal solutions. The genetic algorithm is an evolutionary global optimization technique capable of finding the optimal solution for many complex problems. In this study, the GA approach, capable of finding the global optimal solution to a monitoring network design problem involving 18.4 × 10^18 feasible solutions, will be discussed. However, to ensure the efficiency of the solution process and the global optimality of the solution obtained using the GA, it is necessary that appropriate GA parameter values be specified. A sensitivity analysis of GA parameters such as the random number seed, crossover probability, mutation probability, and elitism is also discussed for the solution of the monitoring network design problem.
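A condensed sketch of such a binary GA is given below: candidate networks are bit strings, evolved with tournament selection, single-point crossover, mutation, and elitism. The plume-tracking objective is replaced by a toy fitness function for brevity.

```python
# Binary-GA sketch for monitoring network design: 1 = place a well at that
# candidate location. The fitness function is an invented stand-in for the
# plume-tracking objective.
import numpy as np

rng = np.random.default_rng(4)
N_LOC, POP, GENS = 30, 40, 60
target = rng.random(N_LOC) > 0.7                  # "informative" locations

def fitness(pop):
    # reward detecting informative locations, penalize number of wells
    return (pop & target).sum(1) - 0.3 * pop.sum(1)

pop = rng.random((POP, N_LOC)) > 0.5
for _ in range(GENS):
    f = fitness(pop)
    elite = pop[np.argmax(f)].copy()              # elitism: keep best design
    i, j = rng.integers(0, POP, (2, POP))         # tournament selection
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    cut = rng.integers(1, N_LOC, POP)             # single-point crossover
    mask = np.arange(N_LOC) < cut[:, None]
    children = np.where(mask, parents, parents[::-1])
    children ^= rng.random((POP, N_LOC)) < 0.01   # mutation
    children[0] = elite
    pop = children
print("best network:", pop[np.argmax(fitness(pop))].astype(int))
```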
NASA Technical Reports Server (NTRS)
Bonavito, N. L.; Gordon, C. L.; Inguva, R.; Serafino, G. N.; Barnes, R. A.
1994-01-01
NASA's Mission to Planet Earth (MTPE) will address important interdisciplinary and environmental issues such as global warming, ozone depletion, deforestation, acid rain, and the like with its long term satellite observations of the Earth and with its comprehensive Data and Information System. Extensive sets of satellite observations supporting MTPE will be provided by the Earth Observing System (EOS), while more specific process related observations will be provided by smaller Earth Probes. MTPE will use data from ground and airborne scientific investigations to supplement and validate the global observations obtained from satellite imagery, while the EOS satellites will support interdisciplinary research and model development. This is important for understanding the processes that control the global environment and for improving the prediction of events. In this paper we illustrate the potential for powerful artificial intelligence (AI) techniques when used in the analysis of the formidable problems that exist in the NASA Earth Science programs and of those to be encountered in the future MTPE and EOS programs. These techniques, based on the logical and probabilistic reasoning aspects of plausible inference, strongly emphasize the synergetic relation between data and information. As such, they are ideally suited for the analysis of the massive data streams to be provided by both MTPE and EOS. To demonstrate this, we address both the satellite imagery and model enhancement issues for the problem of ozone profile retrieval through a method based on plausible scientific inferencing. Since in the retrieval problem, the atmospheric ozone profile that is consistent with a given set of measured radiances may not be unique, an optimum statistical method is used to estimate a 'best' profile solution from the radiances and from additional a priori information.
NASA Technical Reports Server (NTRS)
Li, Jing; Carlson, Barbara E.; Lacis, Andrew A.
2014-01-01
Satellite measurements of global aerosol properties are very useful in constraining aerosol parameterization in climate models. The reliability of different data sets in representing global and regional aerosol variability becomes an essential question. In this study, we present the results of a comparison using combined principal component analysis (CPCA), applied to monthly mean, mapped (Level 3) aerosol optical depth (AOD) product from Moderate Resolution Imaging Spectroradiometer (MODIS), Multiangle Imaging Spectroradiometer (MISR), and Ozone Monitoring Instrument (OMI). This technique effectively finds the common space-time variability in the multiple data sets by decomposing the combined AOD field. The results suggest that all of the sensors capture the globally important aerosol regimes, including dust, biomass burning, pollution, and mixed aerosol types. Nonetheless, differences are also noted. Specifically, compared with MISR and OMI, MODIS variability is significantly higher over South America, India, and the Sahel. MODIS deep blue AOD has a lower seasonal variability in North Africa, accompanied by a decreasing trend that is not found in either MISR or OMI AOD data. The narrow swath of MISR results in an underestimation of dust variability over the Taklamakan Desert. The MISR AOD data also exhibit overall lower variability in South America and the Sahel. OMI does not capture the Russian wild fire in 2010 nor the phase shift in biomass burning over East South America compared to Central South America, likely due to cloud contamination and the OMI row anomaly. OMI also indicates a much stronger (boreal) winter peak in South Africa compared with MODIS and MISR.
Multiscale analysis and computation for flows in heterogeneous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Efendiev, Yalchin; Hou, T. Y.; Durlofsky, L. J.
Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics. Below, we present a brief overview of each of these contributions.
Hashemifar, Somaye; Xu, Jinbo
2014-09-01
High-throughput experimental techniques have produced a large amount of protein-protein interaction (PPI) data. The study of PPI networks, for example through comparative analysis, can benefit the understanding of life processes and diseases at the molecular level. One way of performing comparative analysis is to align PPI networks to identify conserved or species-specific subnetwork motifs. A few methods have been developed for global PPI network alignment, but it remains challenging in terms of both accuracy and efficiency. This paper presents a novel global network alignment algorithm, denoted HubAlign, that makes use of both network topology and sequence homology information, based upon the observation that topologically important proteins in a PPI network are usually much more conserved and thus more likely to be aligned. HubAlign uses a minimum-degree heuristic algorithm to estimate the topological and functional importance of a protein from the global network topology information. HubAlign then aligns topologically important proteins first and gradually extends the alignment to the whole network. Extensive tests indicate that HubAlign greatly outperforms several popular methods in terms of both accuracy and efficiency, especially in detecting functionally similar proteins. HubAlign is available freely for non-commercial purposes at http://ttic.uchicago.edu/~hashemifar/software/HubAlign.zip. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
Han, Qingyuan; Rossow, William B.; Chou, Joyce; Welch, Ronald M.
1997-01-01
Cloud microphysical parameterizations have attracted a great deal of attention in recent years due to their effect on cloud radiative properties and cloud-related hydrological processes in large-scale models. The parameterization of cirrus particle size has been demonstrated to be an indispensable component of climate feedback analysis. Therefore, global-scale, long-term observations of cirrus particle sizes are required both as a basis for and as a validation of parameterizations for climate models. While there is a global-scale, long-term survey of water cloud droplet sizes (Han et al. 1994), there is no comparable study for cirrus ice crystals. In this paper a near-global survey of cirrus ice crystal sizes is conducted using ISCCP satellite data analysis. The retrieval scheme uses phase functions based upon hexagonal crystals calculated by a ray tracing technique. The results show that global mean values of De are about 60 μm. This study also investigates the possible reasons for the significant difference between satellite-retrieved effective radii (approximately 60 μm) and aircraft-measured particle sizes (approximately 200 μm) during the FIRE I IFO experiment. They are (1) vertical inhomogeneity of cirrus particle sizes; (2) the lower limit of the instrument used in aircraft measurements; (3) different definitions of effective particle size; and (4) possibly inappropriate phase functions used in the satellite retrieval.
The poleward shift of storm tracks under global warming: A Lagrangian perspective
NASA Astrophysics Data System (ADS)
Tamarin, T.; Kaspi, Y.
2017-10-01
Comprehensive models of climate change projections have shown that the latitudinal band of extratropical storms will likely shift poleward under global warming. Here we study this poleward shift from a Lagrangian storm perspective, through simulations with an idealized general circulation model. By employing a feature tracking technique to identify the storms, we demonstrate that the poleward motion of individual cyclones increases with increasing global mean temperature. A potential vorticity tendency analysis of the cyclone composites highlights two leading mechanisms responsible for enhanced poleward motion: nonlinear horizontal advection and diabatic heating associated with latent heat release. Our results imply that for a 4 K rise in the global mean surface temperature, the mean poleward displacement of cyclones increases by about 0.85° of latitude, and this occurs in addition to a poleward shift of about 0.6° in their mean genesis latitude. Changes in cyclone tracks may have a significant impact on midlatitude climate, especially in localized storm tracks such as the Atlantic and Pacific storm tracks, which may exhibit a more poleward deflected shape.
FliPer: checking the reliability of global seismic parameters from automatic pipelines
NASA Astrophysics Data System (ADS)
Bugnet, L.; García, R. A.; Davies, G. R.; Mathur, S.; Corsaro, E.
2017-12-01
Our understanding of stars through asteroseismic data analysis is limited by our ability to take advantage of the huge number of stars observed by space missions such as CoRoT, Kepler, K2, and soon TESS and PLATO. Global seismic pipelines provide global stellar parameters such as mass and radius using the mean seismic parameters, as well as the effective temperature. These pipelines are commonly run automatically on thousands of stars observed by K2 for 3 months (and soon TESS for at least ∼1 month). However, pipelines are not immune from misidentifying noise peaks and stellar oscillations. Therefore, new validation techniques are required to assess the quality of their results. We present a new metric called FliPer (Flicker in Power), which takes into account the average variability at all measured time scales. The proper calibration of FliPer enables us to obtain good estimates of global stellar parameters such as surface gravity that are robust against the influence of noise peaks and hence are an excellent way to find faults in asteroseismic pipelines.
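Schematically, the metric amounts to averaging the power density spectrum over all measured frequencies and subtracting a noise floor; the sketch below shows that shape of computation on synthetic photometry. The paper's calibration details are omitted, and the noise-floor estimate here is an assumption.

```python
# FliPer-like sketch: mean of the power density spectrum minus a noise
# floor estimated from the highest frequencies. Illustrative only; the
# published metric's calibration is not reproduced.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(5)
dt = 60.0                                        # 1-minute cadence (s)
t = np.arange(0, 30 * 86400, dt)                 # ~1 month of photometry
flux = 1e-4 * np.sin(2 * np.pi * t / 9000) + rng.normal(0, 1e-4, t.size)

f, psd = periodogram(flux, fs=1.0 / dt)
noise_floor = np.mean(psd[f > 0.9 * f.max()])    # photon-noise estimate
fliper = np.mean(psd[f > 0]) - noise_floor
print("FliPer-like index:", fliper)
```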
The positive Indian Ocean Dipole-like response in the tropical Indian Ocean to global warming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Yiyong; Lu, Jian; Liu, Fukai
2016-02-04
Climate models project a positive Indian Ocean Dipole (pIOD)-like SST response in the tropical Indian Ocean to global warming. By employing the Community Earth System Model (CESM) and applying an overriding technique to its ocean component, Parallel Ocean Program version 2 (POP2), this study investigates the similarities and differences of the formation mechanisms for the changes in the tropical Indian Ocean during the pIOD versus global warming. Results show that their formation processes and related seasonality are quite similar; in particular, the Bjerknes feedback is the leading mechanism in producing the anomalous cooling over the eastern tropics in both cases. Some differences are also found, including that the cooling effect of the vertical advection over the eastern tropical Indian Ocean is dominated by the anomalous vertical velocity during the pIOD, while it is dominated by the anomalous upper-ocean stratification under global warming. Lastly, these findings are further examined with an analysis of the mixed layer heat budget.
Global Neuromagnetic Cortical Fields Have Non-Zero Velocity
Alexander, David M.; Nikolaev, Andrey R.; Jurica, Peter; Zvyagintsev, Mikhail; Mathiak, Klaus; van Leeuwen, Cees
2016-01-01
Globally coherent patterns of phase can be obscured by analysis techniques that aggregate brain activity measures across-trials, whether prior to source localization or for estimating inter-areal coherence. We analyzed, at single-trial level, whole head MEG recorded during an observer-triggered apparent motion task. Episodes of globally coherent activity occurred in the delta, theta, alpha and beta bands of the signal in the form of large-scale waves, which propagated with a variety of velocities. Their mean speed at each frequency band was proportional to temporal frequency, giving a range of 0.06 to 4.0 m/s, from delta to beta. The wave peaks moved over the entire measurement array, during both ongoing activity and task-relevant intervals; direction of motion was more predictable during the latter. A large proportion of the cortical signal, measurable at the scalp, exists as large-scale coherent motion. We argue that the distribution of observable phase velocities in MEG is dominated by spatial filtering considerations in combination with group velocity of cortical activity. Traveling waves may index processes involved in global coordination of cortical activity. PMID:26953886
Scale-up of ecological experiments: Density variation in the mobile bivalve Macomona liliana
Schneider, David C.; Walters, R.; Thrush, S.; Dayton, P.
1997-01-01
At present the problem of scaling up from controlled experiments (necessarily at a small spatial scale) to questions of regional or global importance is perhaps the most pressing issue in ecology. Most of the proposed techniques recommend iterative cycling between theory and experiment. We present a graphical technique that facilitates this cycling by allowing the scope of experiments, surveys, and natural history observations to be compared to the scope of models and theory. We apply the scope analysis to the problem of understanding the population dynamics of a bivalve exposed to environmental stress at the scale of a harbour. Previous lab and field experiments were found not to be 1:1 scale models of harbour-wide processes. Scope analysis allowed small scale experiments to be linked to larger scale surveys and to a spatially explicit model of population dynamics.
Light water reactor lower head failure analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rempe, J.L.; Chavez, S.A.; Thinnes, G.L.
1993-10-01
This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and for examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response.
Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere
NASA Technical Reports Server (NTRS)
Fok, Mei-Ching H.
2011-01-01
Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves for the 3D distribution in space, energy, and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been flown on past and current satellite missions, and the global morphology of energetic ions has been revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distributions during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINEMA mission.
DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.
Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien
2017-09-01
Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
Loudiyi, M; Rutledge, D N; Aït-Kaddour, A
2018-10-30
The Common Dimension (ComDim) chemometric method for multi-block data analysis was employed to evaluate the impact of different added salts and ripening times on the physicochemical properties, color, dynamic low-amplitude oscillatory rheology, texture profile, and molecular structure (fluorescence and MIR spectroscopies) of five Cantal-type cheeses. First, Independent Components Analysis (ICA) was applied separately to the fluorescence and MIR spectra in order to extract the relevant signal sources and the associated proportions related to molecular structure characteristics. ComDim was then applied to the 31 data tables corresponding to the proportions of ICA signals obtained for the spectral methods and to the global analysis of the cheeses by the other techniques. The ComDim results indicated that cheeses made with 50% NaCl or with 75:25% NaCl/KCl generally exhibit equivalent structural, textural, meltability, and color properties. The proposed methodology demonstrates the applicability of ComDim for the characterization of samples when different techniques describe the same samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohd, Shukri; Holford, Karen M.; Pullin, Rhys
2014-02-12
Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with that of other AE source location methods, such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results than the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure.
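For context, the classical TOA baseline against which WTML is compared can be sketched as a least-squares problem: with an assumed wave speed, source coordinates and emission time are recovered from arrival times at several sensors on the unwrapped (planar) surface. The geometry and wave speed below are illustrative.

```python
# Time-of-arrival (TOA) planar source-location sketch: solve for source
# position (x, y) and emission time t0 from sensor arrival times, given a
# known wave speed. Values are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

c = 3000.0                                        # assumed wave speed (m/s)
sensors = np.array([[0.0, 0.0], [1.5, 0.0], [1.5, 0.69], [0.0, 0.69]])
source_true = np.array([0.9, 0.3])
t0_true = 1e-3
arrivals = t0_true + np.linalg.norm(sensors - source_true, axis=1) / c

def residuals(p):
    x, y, t0 = p
    return t0 + np.linalg.norm(sensors - [x, y], axis=1) / c - arrivals

sol = least_squares(residuals, x0=[0.5, 0.5, 0.0])
print("located source:", sol.x[:2])               # ~ [0.9, 0.3]
```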
Topological characterization versus synchronization for assessing (or not) dynamical equivalence
NASA Astrophysics Data System (ADS)
Letellier, Christophe; Mangiarotti, Sylvain; Sendiña-Nadal, Irene; Rössler, Otto E.
2018-04-01
Model validation from experimental data is an important and not trivial topic which is too often reduced to a simple visual inspection of the state portrait spanned by the variables of the system. Synchronization was suggested as a possible technique for model validation. By means of a topological analysis, we revisited this concept with the help of an abstract chemical reaction system and data from two electrodissolution experiments conducted by Jack Hudson's group. The fact that it was possible to synchronize topologically different global models led us to conclude that synchronization is not a recommendable technique for model validation. A short historical preamble evokes Jack Hudson's early career in interaction with Otto E. Rössler.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Ghent, D.; Rayner, N. A.
2017-12-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-2018, https://www.eustaceproject.eu) we have developed an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. This includes developing new "Big Data" analysis methods as the data volumes involved are considerable. We will present recent progress along this road in the EUSTACE project, i.e.: • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina
2010-04-01
DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check the global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any numerical estimates obtained will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
Use of MODIS Cloud Top Pressure to Improve Assimilation Yields of AIRS Radiances in GSI
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi
2014-01-01
Improvements to global and regional numerical weather prediction have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but impact on regional forecasts has been much smaller than for global forecasts. Previously, it has been shown that cloud top designation associated with quality control procedures within the Gridpoint Statistical Interpolation (GSI) system used operationally by a number of Joint Center for Satellite Data Assimilation (JCSDA) partners may not provide the best representation of cloud top pressure (CTP). Because this designated CTP determines which channels are cloud-free and, thus, available for assimilation, ensuring the most accurate representation of this value is imperative to obtaining the greatest impact from satellite radiances. This paper examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing analysis increments and numerical forecasts generated using operational techniques with a research technique that swaps CTP from the Moderate-resolution Imaging Spectroradiometer (MODIS) for the value of CTP calculated from the radiances within GSI.
Anatomic partial nephrectomy: technique evolution.
Azhar, Raed A; Metcalfe, Charles; Gill, Inderbir S
2015-03-01
Partial nephrectomy provides long-term oncologic outcomes equivalent to, and functional outcomes superior to, those of radical nephrectomy for T1a renal masses. Herein, we review the various vascular clamping techniques employed during minimally invasive partial nephrectomy, describe the evolution of our partial nephrectomy technique, and provide an update on contemporary thinking about the impact of ischemia on renal function. Recently, partial nephrectomy surgical technique has shifted away from main artery clamping and towards minimizing or eliminating global renal ischemia during partial nephrectomy. Supported by high-fidelity three-dimensional imaging, novel anatomic-based partial nephrectomy techniques have recently been developed, wherein partial nephrectomy can now be performed with segmental, minimal, or zero global ischemia to the renal remnant. Sequential innovations have included early unclamping, segmental clamping, and super-selective clamping, culminating in anatomic zero-ischemia surgery. By eliminating the 'under-the-gun' time pressure of ischemia for the surgeon, these techniques allow an unhurried, tightly contoured tumour excision with point-specific sutured haemostasis. Recent data indicate that zero-ischemia partial nephrectomy may provide better functional outcomes by minimizing or eliminating global ischemia and preserving greater vascularized kidney volume. Contemporary partial nephrectomy includes a spectrum of surgical techniques ranging from conventional clamped to novel zero-ischemia approaches. Technique selection should be tailored to each individual case on the basis of tumour characteristics, surgical feasibility, surgeon experience, patient demographics, and baseline renal function.
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736
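The GSA step lends itself to a compact sketch. The code below computes first-order Sobol indices with the SALib package for a stand-in hydrolysis surrogate; the surrogate function, parameter names, and bounds are illustrative assumptions, not the authors' agent-based model.

```python
# Hedged sketch of global sensitivity analysis via Sobol indices (SALib),
# applied to a toy surrogate for cellulose hydrolysis yield.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    'num_vars': 3,
    'names': ['half_life', 'exo_activity', 'complexed_fraction'],
    'bounds': [[1.0, 48.0], [0.1, 10.0], [0.0, 1.0]],
}

def hydrolysis_yield(x):
    half_life, exo, frac = x
    # Toy surrogate: yield grows with enzyme lifetime and exoglucanase
    # activity, with a bonus for complexed (cellulosome-like) systems.
    return (1 - np.exp(-half_life / 12.0)) * exo ** 0.5 * (1 + 0.5 * frac)

X = saltelli.sample(problem, 1024)            # Sobol-sequence sample design
Y = np.apply_along_axis(hydrolysis_yield, 1, X)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem['names'], Si['S1'])))  # first-order Sobol indices
```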
High throughput gene expression profiling: a molecular approach to integrative physiology
Liang, Mingyu; Cowley, Allen W; Greene, Andrew S
2004-01-01
Integrative physiology emphasizes the importance of understanding multiple pathways with overlapping, complementary, or opposing effects and their interactions in the context of intact organisms. DNA microarray technology, the most commonly used method for high-throughput gene expression profiling, has been touted as an integrative tool that provides insights into regulatory pathways. However, the physiology community has been slow to accept these techniques because of early failures to generate useful data and the lack of a cohesive theoretical framework in which experiments can be analysed. With recent advances in both technology and analysis, we propose a concept of multidimensional integration of physiology that incorporates data generated by DNA microarray and other functional, genomic, and proteomic approaches to achieve a truly integrative understanding of physiology. Analysis of several studies performed in simpler organisms or in mammalian model animals supports the feasibility of such multidimensional integration and demonstrates the power of DNA microarray as an indispensable molecular tool for such integration. Evaluation of DNA microarray techniques indicates that these techniques, despite limitations, have advanced to a point where question-driven profiling research has become a feasible complement to conventional, hypothesis-driven research. With a keen sense of homeostasis, global regulation, and quantitative analysis, integrative physiologists are uniquely positioned to apply these techniques to enhance the understanding of complex physiological functions. PMID:14678487
Inference of Stochastic Nonlinear Oscillators with Applications to Physiological Problems
NASA Technical Reports Server (NTRS)
Smelyanskiy, Vadim N.; Luchinsky, Dmitry G.
2004-01-01
A new method for the inference of coupled stochastic nonlinear oscillators is described. The technique does not require extensive global optimization, provides optimal compensation for noise-induced errors and is robust in a broad range of dynamical models. We illustrate the main ideas of the technique by inferring a model of five globally and locally coupled noisy oscillators. Specific modifications of the technique for inferring hidden degrees of freedom of coupled nonlinear oscillators are discussed in the context of physiological applications.
Data Assimilation Cycling for Weather Analysis
NASA Technical Reports Server (NTRS)
Tran, Nam; Li, Yongzuo; Fitzpatrick, Patrick
2008-01-01
This software package runs the atmospheric model MM5 in data assimilation cycling mode to produce an optimized weather analysis, including the ability to insert or adjust a hurricane vortex. The program runs MM5 through a cycle of short forecasts every three hours in which the vortex is adjusted to match the observed hurricane location and storm intensity. This technique adjusts the surrounding environment so that the proper steering current and environmental shear are achieved. MM5cycle uses a Cressman analysis to blend observations into model fields to obtain a more accurate weather analysis. Quality control of observations is also done in every cycle to remove bad data that may contaminate the analysis. This technique can assimilate and propagate data in time from intermittent and infrequent observations while maintaining the atmospheric field in a dynamically balanced state. The software consists of a C-shell script (MM5cycle.driver) and three FORTRAN programs (splitMM5files.F, comRegrid.F, and insert_vortex.F), which are contained in the pre-processor component of MM5 called "Regridder." The model is first initialized with data from a global model such as the Global Forecast System (GFS), which also provides lateral boundary conditions. These data are separated into single-time files using splitMM5files.F. The hurricane vortex is then bogussed in at the correct location and with the correct wind field using insert_vortex.F. The modified initial and boundary conditions are then recombined into the model fields using comRegrid.F. The model then makes a three-hour forecast. The three-hour forecast data from MM5 then become the analysis for the next short forecast run, in which the vortex is again adjusted. The process repeats until the desired time of analysis is reached. This code can also assimilate observations if desired.
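As a rough schematic of the cycling logic (the real driver is the MM5cycle.driver C-shell script), the Python pseudo-driver below mirrors the component names from the abstract; the function interfaces and data flow shown here are assumptions for illustration only, not the actual program interfaces.

```python
# Schematic of the three-hourly vortex-adjusting assimilation cycle.
# Function names mirror the FORTRAN components named in the abstract;
# bodies are placeholders, since the real interfaces are not documented here.
from datetime import timedelta

def split_mm5_files(gfs_fields): ...        # splitMM5files.F: one file per time
def insert_vortex(analysis, storm_obs): ... # insert_vortex.F: bogus the vortex
def com_regrid(analysis, boundaries): ...   # comRegrid.F: recombine IC/BC
def run_mm5_forecast(ic_bc, hours): ...     # short MM5 forecast

def cycle(gfs_fields, storm_obs, start, end):
    analysis, boundaries = split_mm5_files(gfs_fields)  # init from GFS
    t = start
    while t < end:
        analysis = insert_vortex(analysis, storm_obs[t])  # match observed
        ic_bc = com_regrid(analysis, boundaries)          # location/intensity
        analysis = run_mm5_forecast(ic_bc, hours=3)       # 3-h forecast becomes
        t += timedelta(hours=3)                           # next cycle's analysis
    return analysis
```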
Global Optimization of a Periodic System using a Genetic Algorithm
NASA Astrophysics Data System (ADS)
Stucke, David; Crespi, Vincent
2001-03-01
We use a novel application of a genetic algorithm global optimization technique to find the lowest-energy structures of periodic systems. We apply this technique to colloidal crystals for several different stoichiometries of binary and ternary colloidal crystals. This application of a genetic algorithm is described and likely candidate structures are presented.
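A generic version of the approach is easy to sketch. The following minimal genetic algorithm minimizes a toy "energy" over a periodic binary arrangement; the objective, genome encoding, and GA settings are stand-ins, not the authors' colloidal-crystal energy model.

```python
# Minimal generic genetic algorithm: evolve a periodic binary arrangement
# toward low "energy" (here, few equal-species nearest-neighbor pairs on a ring).
import random

def energy(genome):
    # Toy periodic objective: count neighboring sites with equal species.
    return sum(genome[i] == genome[(i + 1) % len(genome)] for i in range(len(genome)))

def mutate(g, rate=0.05):
    return [random.randrange(2) if random.random() < rate else x for x in g]

def crossover(a, b):
    cut = random.randrange(1, len(a))   # single-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randrange(2) for _ in range(32)] for _ in range(50)]
for generation in range(200):
    pop.sort(key=energy)                # lowest "energy" first
    survivors = pop[:10]                # elitist selection
    pop = survivors + [mutate(crossover(*random.sample(survivors, 2)))
                       for _ in range(40)]
print(energy(min(pop, key=energy)))     # best energy found (0 is optimal here)
```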
Global Magnetosphere Modeling With Kinetic Treatment of Magnetic Reconnection
NASA Astrophysics Data System (ADS)
Toth, G.; Chen, Y.; Gombosi, T. I.; Cassak, P.; Markidis, S.; Peng, B.; Henderson, M. G.
2017-12-01
Global magnetosphere simulations with a kinetic treatment of magnetic reconnection are very challenging because of the large separation of global and kinetic scales. We have developed two algorithms that can overcome these difficulties: 1) the two-way coupling of the global magnetohydrodynamic code with an embedded particle-in-cell model (MHD-EPIC) and 2) the artificial increase of the ion and electron kinetic scales. Both of these techniques improve the efficiency of the simulations by many orders of magnitude. We will describe the techniques and show that they provide correct and meaningful results. Using the coupled model and the increased kinetic scales, we will present global magnetosphere simulations with the PIC domains covering the dayside and/or tail reconnection sites. The simulation results will be compared to and validated with MMS observations.
Neoliberal Optimism: Applying Market Techniques to Global Health.
Mei, Yuyang
2017-01-01
Global health and neoliberalism are becoming increasingly intertwined as organizations utilize markets and profit motives to solve the traditional problems of poverty and population health. I use field work conducted over 14 months in a global health technology company to explore how the promise of neoliberalism re-envisions humanitarian efforts. In this company's vaccine refrigerator project, staff members expect their investors and their market to allow them to achieve scale and develop accountability to their users in developing countries. However, the translation of neoliberal techniques to the global health sphere falls short of the ideal, as profits are meager and purchasing power remains with donor organizations. The continued optimism in market principles amidst such a non-ideal market reveals the tenacious ideological commitment to neoliberalism in these global health projects.
NASA Technical Reports Server (NTRS)
Zebker, Howard A.; Rosen, Paul A.; Goldstein, Richard M.; Gabriel, Andrew; Werner, Charles L.
1994-01-01
We present a map of the coseismic displacement field resulting from the Landers, California, June 28, 1992, earthquake derived using data acquired from an orbiting high-resolution radar system. We achieve results more accurate than previous space studies and similar in accuracy to those obtained by conventional field survey techniques. Data from the ERS 1 synthetic aperture radar instrument acquired in April, July, and August 1992 are used to generate a high-resolution, wide area map of the displacements. The data represent the motion in the direction of the radar line of sight to centimeter level precision of each 30-m resolution element in a 113 km by 90 km image. Our coseismic displacement contour map gives a lobed pattern consistent with theoretical models of the displacement field from the earthquake. Fine structure observed as displacement tiling in regions several kilometers from the fault appears to be the result of local surface fracturing. Comparison of these data with Global Positioning System and electronic distance measurement survey data yields a correlation of 0.96; thus the radar measurements are a means to extend the point measurements acquired by traditional techniques to an area map format. The technique we use is (1) more automatic, (2) more precise, and (3) better validated than previous similar applications of differential radar interferometry. Since we require only remotely sensed satellite data, with no additional requirements for ancillary information, the technique is well suited for global seismic monitoring and analysis.
Description of the GMAO OSSE for Weather Analysis Software Package: Version 3
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.;
2017-01-01
The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
Global electromagnetic induction in the moon and planets. [poloidal eddy current transient response
NASA Technical Reports Server (NTRS)
Dyal, P.; Parkin, C. W.
1973-01-01
Experiments and analyses concerning electromagnetic induction in the moon and other extraterrestrial bodies are summarized. The theory of classical electromagnetic induction in a sphere is first considered, and this treatment is extended to the case of the moon, where poloidal eddy-current response has been found experimentally to dominate other induction modes. Analysis of lunar poloidal induction yields lunar internal electrical conductivity and temperature profiles. Two poloidal-induction analytical techniques are discussed: a transient-response method applied to time-series magnetometer data, and a harmonic-analysis method applied to data numerically Fourier-transformed to the frequency domain, with emphasis on the former technique. Attention is given to complicating effects of the solar wind interaction with both induced poloidal fields and remanent steady fields. The static magnetization field induction mode is described, from which are calculated bulk magnetic permeability profiles. Magnetic field measurements obtained from the moon and from fly-bys of Venus and Mars are studied to determine the feasibility of extending theoretical and experimental induction techniques to other bodies in the solar system.
NASA Astrophysics Data System (ADS)
Singh, K.; Sandu, A.; Bowman, K. W.; Parrington, M.; Jones, D. B. A.; Lee, M.
2011-08-01
Chemistry transport models determine the evolving chemical state of the atmosphere by solving the fundamental equations that govern physical and chemical transformations subject to initial conditions of the atmospheric state and surface boundary conditions, e.g., surface emissions. Data assimilation techniques synthesize model predictions with measurements in a rigorous mathematical framework that provides observational constraints on these conditions. Two families of data assimilation methods are currently widely used: variational and Kalman filter (KF). The variational approach is based on control theory and formulates data assimilation as a minimization problem of a cost functional that measures the model-observations mismatch. The Kalman filter approach is rooted in statistical estimation theory and provides the analysis covariance together with the best state estimate. Suboptimal Kalman filters employ different approximations of the covariances in order to make the computations feasible with large models. Each family of methods has both merits and drawbacks. This paper compares several data assimilation methods used for global chemical data assimilation. Specifically, we evaluate data assimilation approaches for improving estimates of the summertime global tropospheric ozone distribution in August 2006 based on ozone observations from the NASA Tropospheric Emission Spectrometer and the GEOS-Chem chemistry transport model. The resulting analyses are compared against independent ozonesonde measurements to assess the effectiveness of each assimilation method. All assimilation methods provide notable improvements over the free model simulations, which differ from the ozonesonde measurements by about 20% (below 200 hPa). Four-dimensional variational data assimilation with window lengths between five days and two weeks is the most accurate method, with mean differences between analysis profiles and ozonesonde measurements of 1-5%. Two sequential assimilation approaches (three-dimensional variational and suboptimal KF), although derived from different theoretical considerations, provide similar ozone estimates, with relative differences of 5-10% between the analyses and ozonesonde measurements. Adjoint sensitivity analysis techniques are used to explore the role of uncertainties in ozone precursors and their emissions in the distribution of tropospheric ozone. A novel technique is introduced that projects three-dimensional variational increments back to an equivalent initial condition, which facilitates comparison with four-dimensional variational techniques.
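For a linear toy problem, the two families being compared can be written down in a few lines: the variational approach minimizes a cost functional, while the Kalman analysis solves the same linear-Gaussian problem in closed form. All matrices and values below are illustrative stand-ins, not the GEOS-Chem configuration.

```python
# 3D-Var cost function vs. Kalman analysis update for a linear toy problem.
import numpy as np

n, m = 3, 2
B = np.eye(n) * 0.5              # background error covariance (assumed)
R = np.eye(m) * 0.2              # observation error covariance (assumed)
H = np.array([[1., 0., 0.],
              [0., 1., 0.]])     # observation operator
xb = np.array([1.0, 2.0, 3.0])   # background state
y = np.array([1.2, 1.7])         # observations

def cost(x):
    """3D-Var cost: background mismatch plus observation mismatch."""
    dxb, dy = x - xb, y - H @ x
    return dxb @ np.linalg.solve(B, dxb) + dy @ np.linalg.solve(R, dy)

# The Kalman analysis minimizes this cost exactly in the linear case:
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain
xa = xb + K @ (y - H @ xb)                     # analysis state
print(xa, cost(xa) <= cost(xb))                # analysis never worsens the cost
```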
Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques
NASA Astrophysics Data System (ADS)
Schwanghart, Wolfgang; Scherler, Dirk
2017-12-01
The analysis of longitudinal river profiles is an important tool for studying landscape evolution. However, characterizing river profiles based on digital elevation models (DEMs) suffers from errors and artifacts that particularly prevail along valley bottoms. The aim of this study is to characterize uncertainties that arise from the analysis of river profiles derived from different, near-globally available DEMs. We devised new algorithms - quantile carving and the CRS algorithm - that rely on quantile regression to enable hydrological correction and the uncertainty quantification of river profiles. We find that globally available DEMs commonly overestimate river elevations in steep topography. The distributions of elevation errors become increasingly wider and right skewed if adjacent hillslope gradients are steep. Our analysis indicates that the AW3D DEM has the highest precision and lowest bias for the analysis of river profiles in mountainous topography. The new 12 m resolution TanDEM-X DEM has a very low precision, most likely due to the combined effect of steep valley walls and the presence of water surfaces in valley bottoms. Compared to the conventional approaches of carving and filling, we find that our new approach is able to reduce the elevation bias and errors in longitudinal river profiles.
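As a hedged illustration of the underlying idea, the sketch below fits a low quantile of elevation versus downstream distance with statsmodels, so that positively biased DEM errors are suppressed; it is a generic quantile-regression example, not the authors' quantile-carving or CRS algorithms, and the synthetic profile and error model are assumptions.

```python
# Quantile regression tracing a river profile beneath positively biased
# DEM elevations (the bias direction reported in the study).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
d = np.linspace(0.0, 10.0, 400)                 # downstream distance (km)
true_z = 500.0 * np.exp(-d / 6.0)               # smooth true profile (m)
z = true_z + rng.exponential(15.0, d.size)      # DEM errors skewed upward

X = sm.add_constant(np.column_stack([d, d ** 2]))  # quadratic trend in d
fit = sm.QuantReg(z, X).fit(q=0.1)                 # 10th-percentile profile
z_low = fit.predict(X)                             # hugs the profile's base
print(float(np.mean(z_low - true_z)))              # far below the ~15 m raw bias
```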
A Global Health Elective Course in a PharmD Curriculum
Dutta, Arjun; Kovera, Craig
2014-01-01
Objective. To describe the design, development, and the first 4 implementations of a Global Health elective course intended to prepare pharmacy students to pursue global health careers and to evaluate student perceptions of the instructional techniques used and of skills developed during the course. Design. Following the blended curriculum model used at Touro College of Pharmacy, the Global Health course combined team-based learning (TBL) sessions in class, out-of-class team projects, and online self-directed learning with classroom teaching and discussion sessions. Assessment. Student performance was assessed with TBL sessions, team projects, class presentations, online quizzes, and final examinations. A precourse and postcourse survey showed improvement in global health knowledge and attitudes, and in the perception of pharmacists’ role and career opportunities in global health. Significant improvement in skills applicable to global health work was reported and students rated highly the instructional techniques, value, and relevance of the course. Conclusion. The Global Health elective course is on track to achieve its intended goal of equipping pharmacy students with the requisite knowledge and applicable skills to pursue global health careers and opportunities. After taking this course, students have gone on to pursue global field experiences. PMID:25657374
A chaotic model for the epidemic of Ebola virus disease in West Africa (2013-2016)
NASA Astrophysics Data System (ADS)
Mangiarotti, Sylvain; Peyre, Marisa; Huc, Mireille
2016-11-01
An epidemic of Ebola Virus Disease (EVD) broke out in Guinea in December 2013. It was only identified in March 2014, by which time it had already spread to Liberia and Sierra Leone. The spread of the disease became uncontrollable and the epidemic could not be stopped before 2016. The time evolution of this epidemic is revisited here with the global modeling technique, which was designed to obtain deterministic models from single time series. A generalized formulation of this technique for multivariate time series is introduced. It is applied to the epidemic of EVD in West Africa, focusing on the period between March 2014 and January 2015, that is, before any detected signs of weakening. Data gathered by the World Health Organization, based on the official publications of the Ministries of Health of the three main countries involved in this epidemic, are considered in our analysis. Two observed time series are used: the daily numbers of infections and deaths. A four-dimensional model producing a very complex dynamical behavior is obtained. The model is tested in order to investigate its skills and drawbacks. Our global analysis clearly helps to distinguish three main stages during the epidemic. A characterization of the obtained attractor is also performed. In particular, the topology of the chaotic attractor is analyzed and a skeleton is obtained for its structure.
ERIC Educational Resources Information Center
Bushell, Brenda
Rationale and techniques for incorporating global environmental education into second language instruction are discussed. The approach suggested combines infusion of environmental issues into the curriculum and presentation of a global perspective on environmental problems and their solutions. Six concepts of global education are outlined:…
Surface conversion techniques for low energy neutral atom imagers
NASA Technical Reports Server (NTRS)
Quinn, J. M.
1995-01-01
This investigation has focused on development of key technology elements for low energy neutral atom imaging. More specifically, we have investigated the conversion of low energy neutral atoms to negatively charged ions upon reflection from specially prepared surfaces. This 'surface conversion' technique appears to offer a unique capability of detecting, and thus imaging, neutral atoms at energies of 0.01 - 1 keV with high enough efficiencies to make practical its application to low energy neutral atom imaging in space. Such imaging offers the opportunity to obtain the first instantaneous global maps of macroscopic plasma features and their temporal variation. Through previous in situ plasma measurements, we have a statistical picture of large scale morphology and local measurements of dynamic processes. However, with in situ techniques it is impossible to characterize or understand many of the global plasma transport and energization processes. A series of global plasma images would greatly advance our understanding of these processes and would provide the context for interpreting previous and future in situ measurements. Fast neutral atoms, created from ions that are neutralized in collisions with exospheric neutrals, offer the means for remotely imaging plasma populations. Energy and mass analysis of these neutrals provides critical information about the source plasma distribution. The flux of neutral atoms available for imaging depends upon a convolution of the ambient plasma distribution with the charge exchange cross section for the background neutral population. Some of the highest signals are at relatively low energies (well below 1 keV). This energy range also includes some of the most important plasma populations to be imaged, for example the base of the cleft ion fountain.
Using hadron-in-jet data in a global analysis of D* fragmentation functions
NASA Astrophysics Data System (ADS)
Anderle, Daniele P.; Kaufmann, Tom; Stratmann, Marco; Ringer, Felix; Vitev, Ivan
2017-08-01
We present a novel global QCD analysis of charged D*-meson fragmentation functions at next-to-leading order accuracy. This is achieved by making use of the available data for single-inclusive D*-meson production in electron-positron annihilation, hadron-hadron collisions, and, for the first time, in-jet fragmentation in proton-proton scattering. It is shown how to include all relevant processes efficiently and without approximations within the Mellin moment technique, specifically for the in-jet fragmentation cross section. The presented technical framework is generic and can be straightforwardly applied to future analyses of fragmentation functions for other hadron species, as soon as more in-jet fragmentation data become available. We choose to work within the zero mass variable flavor number scheme which is applicable for sufficiently high energies and transverse momenta. The obtained optimum set of parton-to-D* fragmentation functions is accompanied by Hessian uncertainty sets which allow one to propagate hadronization uncertainties to other processes of interest.
Advanced phenotyping and phenotype data analysis for the study of plant growth and development
Rahaman, Md. Matiur; Chen, Dijun; Gillani, Zeeshan; Klukas, Christian; Chen, Ming
2015-01-01
Due to increases in the consumption of food, feed, and fuel, and to meet global food security needs for the rapidly growing human population, there is a need to breed high-yielding crops that can adapt to future climate changes, particularly in developing countries. To solve these global challenges, novel approaches are required to identify quantitative phenotypes and to explain the genetic basis of agriculturally important traits. These advances will facilitate the screening of germplasm with high performance characteristics in resource-limited environments. Recently, plant phenomics has offered and integrated a suite of new technologies, and we are on a path to improve the description of complex plant phenotypes. High-throughput phenotyping platforms have also been developed that capture phenotype data from plants in a non-destructive manner. In this review, we discuss recent developments of high-throughput plant phenotyping infrastructure including imaging techniques and corresponding principles for phenotype data analysis. PMID:26322060
Global Analysis of Yeast Endosomal Transport Identifies the Vps55/68 Sorting Complex
Schluter, Cayetana; Lam, Karen K.Y.; Brumm, Jochen; Wu, Bella W.; Saunders, Matthew; Stevens, Tom H.
2008-01-01
Endosomal transport is critical for cellular processes ranging from receptor down-regulation and retroviral budding to the immune response. A full understanding of endosome sorting requires a comprehensive picture of the multiprotein complexes that orchestrate vesicle formation and fusion. Here, we use unsupervised, large-scale phenotypic analysis and a novel computational approach for the global identification of endosomal transport factors. This technique effectively identifies components of known and novel protein assemblies. We report the characterization of a previously undescribed endosome sorting complex that contains two well-conserved proteins with four predicted membrane-spanning domains. Vps55p and Vps68p form a complex that acts with or downstream of ESCRT function to regulate endosomal trafficking. Loss of Vps68p disrupts recycling to the TGN as well as onward trafficking to the vacuole without preventing the formation of lumenal vesicles within the MVB. Our results suggest the Vps55/68 complex mediates a novel, conserved step in the endosomal maturation process. PMID:18216282
NASA Astrophysics Data System (ADS)
Preusse, Peter; Dörnbrack, Andreas; Eckermann, Stephen D.; Riese, Martin; Schaeler, Bernd; Bacmeister, Julio T.; Broutman, Dave; Grossmann, Klaus U.
2002-09-01
The Cryogenic Infrared Spectrometers and Telescopes for the Atmosphere (CRISTA) instrument measured stratospheric temperatures and trace species concentrations with high precision and spatial resolution during two missions. The measuring technique is infrared limb-sounding of optically thin emissions. In a general approach, we investigate the applicability of the technique to measure gravity waves (GWs) in the retrieved temperature data. It is shown that GWs with wavelengths of the order of 100-200 km horizontally can be detected. The results are applicable to any instrument using the same technique. We discuss additional constraints inherent to the CRISTA instrument. The vertical field of view and the influence of the sampling and retrieval imply that waves with vertical wavelengths ~3-5 km or larger can be retrieved. Global distributions of GW fluctuations were extracted from temperature data measured by CRISTA using Maximum Entropy Method (MEM) and Harmonic Analysis (HA), yielding height profiles of vertical wavelength and peak amplitude for fluctuations in each scanned profile. The method is discussed and compared to Fourier transform analyses and standard deviations. Analysis of data from the first mission reveals large GW amplitudes in the stratosphere over southernmost South America. These waves obey the dispersion relation for linear two-dimensional mountain waves (MWs). The horizontal structure on 6 November 1994 is compared to temperature fields calculated by the Pennsylvania State University (PSU)/National Center for Atmospheric Research (NCAR) mesoscale model (MM5). It is demonstrated that precise knowledge of the instrument's sensitivity is essential. Particularly good agreement is found at the southern tip of South America where the MM5 accurately reproduces the amplitudes and phases of a large-scale wave with 400 km horizontal wavelength. Targeted ray-tracing simulations allow us to interpret some of the observed wave features. A companion paper will discuss MWs on a global scale and estimates the fraction that MWs contribute to the total GW energy (Preusse et al., in preparation, 2002).
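The spectral step can be imitated in a few lines. The sketch below recovers the dominant vertical wavelength and amplitude of a synthetic temperature-perturbation profile with an FFT, standing in for the MEM/harmonic-analysis machinery actually applied to the CRISTA data; the profile, grid, and windowing are illustrative assumptions.

```python
# Estimate the dominant vertical wavelength of a temperature perturbation
# profile via FFT (a simplified stand-in for the paper's MEM/HA analysis).
import numpy as np

dz = 0.25                                        # vertical sampling (km)
z = np.arange(20, 60, dz)                        # altitude grid (km)
temp_pert = 2.0 * np.sin(2 * np.pi * z / 8.0)    # 8 km wave, 2 K amplitude

spec = np.fft.rfft(temp_pert * np.hanning(z.size))  # windowed spectrum
freqs = np.fft.rfftfreq(z.size, d=dz)               # cycles per km
k = np.argmax(np.abs(spec[1:])) + 1                 # skip the mean component
print(f"dominant vertical wavelength ~ {1 / freqs[k]:.1f} km")
```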
Efficient QoS-aware Service Composition
NASA Astrophysics Data System (ADS)
Alrifai, Mohammad; Risse, Thomas
Web service composition requests are usually combined with end-to-end QoS requirements, which are specified in terms of non-functional properties (e.g. response time, throughput and price). The goal of QoS-aware service composition is to find the best combination of services such that their aggregated QoS values meet these end-to-end requirements. Local selection techniques are very efficient but fall short in handling global QoS constraints. Global optimization techniques, on the other hand, can handle global constraints, but their poor performance renders them inappropriate for applications with dynamic and real-time requirements. In this paper we address this problem and propose a solution that combines global optimization with local selection techniques to achieve better performance. The proposed solution consists of two steps: first, we use mixed integer linear programming (MILP) to find the optimal decomposition of global QoS constraints into local constraints. Second, we use local search to find the best web services that satisfy these local constraints. Unlike existing MILP-based global planning solutions, the size of the MILP model in our case is much smaller and independent of the number of available services, yielding faster computation and more scalability. Preliminary experiments have been conducted to evaluate the performance of the proposed solution.
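The second (local selection) stage is straightforward to sketch once the decomposition is given. In the toy example below the per-task budgets are hard-coded rather than produced by the MILP stage, and the service data and utility weighting are illustrative assumptions.

```python
# Local service selection under decomposed (per-task) QoS budgets.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    response_ms: float
    price: float

candidates = {
    'payment':  [Service('A', 120, 0.05), Service('B', 60, 0.20)],
    'shipping': [Service('C', 300, 0.01), Service('D', 150, 0.04)],
}
local_latency_budget = {'payment': 100, 'shipping': 200}  # from the MILP stage

def utility(s):
    # Lower is better: weighted latency plus price.
    return 0.01 * s.response_ms + s.price

selection = {}
for task, services in candidates.items():
    feasible = [s for s in services
                if s.response_ms <= local_latency_budget[task]]
    selection[task] = min(feasible, key=utility)   # best local pick per task
print({task: s.name for task, s in selection.items()})
```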
Tang, Haijing; Wang, Siye; Zhang, Yanjun
2013-01-01
Clustering has become a common trend in very long instruction word (VLIW) architectures to solve the problems of area, energy consumption, and design complexity. Register-file-connected clustered (RFCC) VLIW architecture uses the mechanism of a global register file to accomplish inter-cluster data communication, thus eliminating the performance and energy consumption penalty caused by explicit inter-cluster data move operations in traditional bus-connected clustered (BCC) VLIW architecture. However, the limited number of access ports to the global register file has become an issue which must be well addressed; otherwise the performance and energy consumption would be harmed. In this paper, we present compiler optimization techniques for an RFCC VLIW architecture called Lily, which is designed for encryption systems. These techniques aim at optimizing performance and energy consumption for the Lily architecture through appropriate manipulation of the code generation process to maintain better management of accesses to the global register file. All the techniques have been implemented and evaluated. The results show that our techniques can significantly reduce the penalty of performance and energy consumption due to the access port limitation of the global register file. PMID:23970841
Quantitative analysis of cardiovascular MR images.
van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H
1997-06-01
The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.
NASA Astrophysics Data System (ADS)
Zellweger, Christoph; Emmenegger, Lukas; Firdaus, Mohd; Hatakka, Juha; Heimann, Martin; Kozlova, Elena; Spain, T. Gerard; Steinbacher, Martin; van der Schoot, Marcel V.; Buchmann, Brigitte
2016-09-01
Until recently, atmospheric carbon dioxide (CO2) and methane (CH4) measurements were made almost exclusively using nondispersive infrared (NDIR) absorption and gas chromatography with flame ionisation detection (GC/FID) techniques, respectively. Recently, commercially available instruments based on spectroscopic techniques such as cavity ring-down spectroscopy (CRDS), off-axis integrated cavity output spectroscopy (OA-ICOS) and Fourier transform infrared (FTIR) spectroscopy have become more widely available and affordable. This resulted in a widespread use of these techniques at many measurement stations. This paper is focused on the comparison between a CRDS "travelling instrument" that has been used during performance audits within the Global Atmosphere Watch (GAW) programme of the World Meteorological Organization (WMO) with instruments incorporating other, more traditional techniques for measuring CO2 and CH4 (NDIR and GC/FID). We demonstrate that CRDS instruments and likely other spectroscopic techniques are suitable for WMO/GAW stations and allow a smooth continuation of historic CO2 and CH4 time series. Moreover, the analysis of the audit results indicates that the spectroscopic techniques have a number of advantages over the traditional methods which will lead to the improved accuracy of atmospheric CO2 and CH4 measurements.
VLBI tracking of GNSS satellites: recent achievements
NASA Astrophysics Data System (ADS)
Liu, Li; Heinkelmann, Robert; Tornatore, Vincenza; Li, Jinling; Mora-Diaz, Julian; Nilsson, Tobias; Karbon, Maria; Raposo-Pulido, Virginia; Soja, Benedikt; Xu, Minghui; Lu, Cuixian; Schuh, Harald
2014-05-01
While the ITRF (International Terrestrial Reference Frame) is realized by the combination of the various space geodetic techniques, VLBI (Very Long Baseline Interferometry) is the only technique for determining the ICRF (International Celestial Reference Frame) through its observations of extragalactic radio sources. Therefore, small inconsistencies between the two important frames do exist. According to recent comparisons of parameters derived by GNSS (Global Navigation Satellite Systems) and VLBI (e.g. troposphere delays, gradients, UT1-UTC), evidence of discrepancies becomes obvious from the vast amounts of data. Terrestrial local ties can provide a way to interlink the otherwise independent technique-specific reference frames, but only to some degree. It is evident that errors in the determination of the terrestrial ties, e.g. due to errors in transforming the locally surveyed coordinates into global Cartesian three-dimensional coordinates, introduce significant errors in the combined analysis of space geodetic techniques. A new concept for linking the space geodetic techniques might be to introduce celestial ties, e.g. realized by technique co-location on board satellites. A small satellite carrying a variety of space geodetic techniques is under investigation at GFZ. Such a satellite would provide a new observing platform with its own additional unknowns, such as the orbit or atmospheric drag parameters. A link between the two techniques VLBI and GNSS might be achieved in a more direct way as well: by VLBI tracking of GNSS satellites. Several tests of this type of observation have already been carried out successfully. This new kind of hybrid VLBI-GNSS observation would comprise a new direct inter-technique tie without the involvement of surveying methods and would enable improving the consistency of the two space geodetic techniques VLBI and GNSS, in particular of their celestial frames. Recently the radio telescopes at Wettzell and Onsala successfully observed a GNSS satellite for the first time, also using new receiver developments made at Wettzell. In this contribution we develop the motivation for this kind of innovative observation and show first results of the test observations.
One technique for refining the global Earth gravity models
NASA Astrophysics Data System (ADS)
Koneshov, V. N.; Nepoklonov, V. B.; Polovnev, O. V.
2017-01-01
The results of theoretical and experimental research on a technique for refining global Earth geopotential models such as EGM2008 in continental regions are presented. The discussed technique is based on high-resolution satellite data for the Earth's surface topography, which enables allowance for the fine structure of the Earth's gravitational field without additional gravimetry data. The experimental studies are conducted using the example of the new GGMplus global gravity model of the Earth with a resolution of about 0.5 km, which is obtained by expanding the EGM2008 model to degree 2190 with corrections for the topography calculated from the SRTM data. The GGMplus and EGM2008 models are compared with regional geoid models in 21 regions of North America, Australia, Africa, and Europe. The obtained estimates largely support the possibility of refining global geopotential models such as EGM2008 by the procedure implemented in GGMplus, particularly in regions with relatively high elevation differences.
Pilot Ionosonde Network for Identification of Traveling Ionospheric Disturbances
NASA Astrophysics Data System (ADS)
Reinisch, Bodo; Galkin, Ivan; Belehaki, Anna; Paznukhov, Vadym; Huang, Xueqin; Altadill, David; Buresova, Dalia; Mielich, Jens; Verhulst, Tobias; Stankov, Stanimir; Blanch, Estefania; Kouba, Daniel; Hamel, Ryan; Kozlov, Alexander; Tsagouri, Ioanna; Mouzakis, Angelos; Messerotti, Mauro; Parkinson, Murray; Ishii, Mamoru
2018-03-01
Traveling ionospheric disturbances (TIDs) are the ionospheric signatures of atmospheric gravity waves. Their identification and tracking is important because the TIDs affect all services that rely on predictable ionospheric radio wave propagation. Although various techniques have been proposed to measure TID characteristics, their real-time implementation still has several difficulties. In this contribution, we present a new technique, based on the analysis of oblique Digisonde-to-Digisonde "skymap" observations, to directly identify TIDs and specify the TID wave parameters based on the measurement of angle of arrival, Doppler frequency, and time of flight of ionospherically reflected high-frequency radio pulses. The technique has been implemented for the first time for the Network for TID Exploration project with data streaming from the network of European Digisonde DPS4D observatories. The performance is demonstrated during a period of moderate auroral activity, assessing its consistency with independent measurements such as data from auroral magnetometers and electron density perturbations from Digisondes and Global Navigation Satellite System stations. Given that the different types of measurements used for this assessment were not made at exactly the same time and location, and that there was insufficient coverage in the area between the atmospheric gravity wave sources and the measurement locations, we can only consider our interpretation as plausible and indicative for the reliability of the extracted TID characteristics. In the framework of the new TechTIDE project (European Commission H2020), a retrospective analysis of the Network for TID Exploration results in comparison with those extracted from Global Navigation Satellite System total electron content-based methodologies is currently being attempted, and the results will be the objective of a follow-up paper.
Automated brainstem co-registration (ABC) for MRI.
Napadow, Vitaly; Dhond, Rupali; Kennedy, David; Hui, Kathleen K S; Makris, Nikos
2006-09-01
Group data analysis in brainstem neuroimaging is predicated on accurate co-registration of anatomy. As the brainstem is comprised of many functionally heterogeneous nuclei densely situated adjacent to one another, relatively small errors in co-registration can manifest in increased variance or decreased sensitivity (or significance) in detecting activations. We have devised a 2-stage automated, reference mask guided registration technique (Automated Brainstem Co-registration, or ABC) for improved brainstem co-registration. Our approach utilized a brainstem mask dataset to weight an automated co-registration cost function. Our method was validated through measurement of RMS error at 12 manually defined landmarks. These landmarks were also used as guides for a secondary manual co-registration option, intended for outlier individuals that may not adequately co-register with our automated method. Our methodology was tested on 10 healthy human subjects and compared to traditional co-registration techniques (Talairach transform and automated affine transform to the MNI-152 template). We found that ABC had a significantly lower mean RMS error (1.22 +/- 0.39 mm) than Talairach transform (2.88 +/- 1.22 mm, mu +/- sigma) and the global affine (3.26 +/- 0.81 mm) method. Improved accuracy was also found for our manual-landmark-guided option (1.51 +/- 0.43 mm). Visualizing individual brainstem borders demonstrated more consistent and uniform overlap for ABC compared to traditional global co-registration techniques. Improved robustness (lower susceptibility to outliers) was demonstrated with ABC through lower inter-subject RMS error variance compared with traditional co-registration methods. The use of easily available and validated tools (AFNI and FSL) for this method should ease adoption by other investigators interested in brainstem data group analysis.
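The mask-weighting idea can be shown schematically: a similarity metric in which voxels inside a brainstem mask dominate the registration cost. The sketch below is a generic illustration, not the AFNI/FSL cost function actually used, and the weights and synthetic volumes are assumptions.

```python
# Mask-weighted sum-of-squared-differences registration cost (schematic).
import numpy as np

def weighted_ssd(moving, fixed, mask, w_inside=1.0, w_outside=0.05):
    """SSD in which voxels inside the brainstem mask are up-weighted."""
    w = np.where(mask, w_inside, w_outside)
    return float(np.sum(w * (moving - fixed) ** 2))

rng = np.random.default_rng(3)
fixed = rng.normal(size=(8, 8, 8))                    # reference volume
moving = fixed + rng.normal(0, 0.1, size=fixed.shape) # candidate alignment
mask = np.zeros_like(fixed, dtype=bool)
mask[2:6, 2:6, 2:6] = True                            # toy "brainstem" region
print(weighted_ssd(moving, fixed, mask))              # cost an optimizer would minimize
```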
VLBI real-time analysis by Kalman Filtering
NASA Astrophysics Data System (ADS)
Karbon, M.; Nilsson, T.; Soja, B.; Heinkelmann, R.; Raposo-Pulido, V.; Schuh, H.
2013-12-01
Geodetic Very Long Baseline Interferometry (VLBI) is one of the primary space geodetic techniques providing the full set of Earth Orientation Parameter (EOP) and is unique for observing long term Universal Time (UT1) and precession/nutation. Accurate and continuous EOP obtained in near real-time are essential for satellite based navigation and positioning and for enabling the precise tracking of interplanetary spacecrafts. To meet this necessity the International VLBI Service for Geodesy and Astrometry (IVS) increased its efforts to reduce the time span between the VLBI observations and the availability of the final results. Currently the timeliness is about two weeks, but the goal is to reduce it to less than one day with the future VGOS (VLBI2010 Global Observing System) network. The FWF project VLBI-ART contributes to this new generation VLBI system by considerably accelerating the VLBI analysis procedure through the implementation of an elaborate Kalman filter. This true real-time Kalman filter will be embedded in the Vienna VLBI Software (VieVS) as a completely automated tool with no need of human interaction. This filter also allows the prediction and combination of EOP from various space geodetic techniques by implementing stochastic models to statistically account for unpredictable changes in EOP. Additionally, atmospheric angular momenta calculated from numerical weather prediction models are introduced to support the short-term EOP prediction. To optimize the performance of the new software various investigations with real as well as simulated data are foreseen. The results are compared to the ones obtained by conventional VLBI parameter estimation methods (e.g. least squares method) and to corresponding parameter series from other techniques, such as from the Global Navigation Satellite Systems (GNSS).
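A bare-bones version of such a filter is easy to write down. The sketch below runs a scalar random-walk Kalman filter for a single EOP-like quantity; the noise levels, state model, and simulated observations are illustrative assumptions rather than the VieVS implementation.

```python
# Scalar Kalman filter with random-walk dynamics for one EOP component.
import numpy as np

q, r = 1e-4, 1e-2          # process / measurement noise variances (assumed)
x, p = 0.0, 1.0            # state estimate (e.g., UT1-UTC) and its variance

def kalman_step(x, p, z):
    p = p + q              # predict: random walk, variance grows by q
    k = p / (p + r)        # Kalman gain
    return x + k * (z - x), (1 - k) * p   # update with new observation z

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, q ** 0.5, 100))    # simulated EOP drift
for z in truth + rng.normal(0, r ** 0.5, 100):     # noisy observations
    x, p = kalman_step(x, p, z)
print(x, truth[-1])        # filtered estimate tracks the drifting truth
```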
A simple, physically-based method for evaluating the economic costs of geo-engineering schemes
NASA Astrophysics Data System (ADS)
Garrett, T. J.
2009-04-01
The consumption of primary energy (e.g coal, oil, uranium) by the global economy is done in expectation of a return on investment. For geo-engineering schemes, however, the relationship between the primary energy consumption required and the economic return is, at first glance, quite different. The energy costs of a given scheme represent a removal of economically productive available energy to do work in the normal global economy. What are the economic implications of the energy consumption associated with geo-engineering techniques? I will present a simple thermodynamic argument that, in general, real (inflation-adjusted) economic value has a fixed relationship to the rate of global primary energy consumption. This hypothesis will be shown to be supported by 36 years of available energy statistics and a two millennia period of statistics for global economic production. What is found from this analysis is that the value in any given inflation-adjusted 1990 dollar is sustained by a constant 9.7 +/- 0.3 milliwatts of global primary energy consumption. Thus, insofar as geo-engineering is concerned, any scheme that requires some nominal fraction of continuous global primary energy output necessitates a corresponding inflationary loss of real global economic value. For example, if 1% of global energy output is required, at today's consumption rates of 15 TW this corresponds to an inflationary loss of 15 trillion 1990 dollars of real value. The loss will be less, however, if the geo-engineering scheme also enables a demonstrable enhancement to global economic production capacity through climate modification.
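The closing example is a direct unit conversion, checked below; the only inputs are the 9.7 mW per 1990 dollar figure and the 15 TW consumption rate quoted in the abstract.

```python
# Check of the closing arithmetic: 1% of 15 TW at 9.7 mW per 1990 dollar.
watts_per_1990_dollar = 9.7e-3     # 9.7 milliwatts per inflation-adjusted $
global_consumption_w = 15e12       # 15 TW
diverted_w = 0.01 * global_consumption_w
loss_1990_dollars = diverted_w / watts_per_1990_dollar
print(f"{loss_1990_dollars:.2e}")  # ~1.5e13, i.e. about 15 trillion 1990 dollars
```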
In situ sensors for measurements in the global troposphere
NASA Technical Reports Server (NTRS)
Saeger, M. L.; Eaton, W. C.; Wright, R. S.; White, J. H.; Tommerdahl, J. B.
1981-01-01
Current techniques available for the in situ measurement of ambient trace gas species, particulate composition, and particulate size distribution are reviewed. The operational specifications of the various techniques are described. Most of the techniques described are those that have been used in airborne applications or show promise of being adaptable to airborne applications. Some of the instruments described are specialty items that are not commercially available. In situ measurement techniques for several meteorological parameters important in the study of the distribution and transport of ambient air pollutants are discussed. Some remote measurement techniques for meteorological parameters are also discussed. State-of-the-art measurement capabilities are compared with a list of capabilities and specifications desired by NASA for ambient measurements in the global troposphere.
Monitoring beach changes using GPS surveying techniques
Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.
1993-01-01
The adaptation of Global Positioning System (GPS) surveying techniques to beach monitoring activities is a promising response to this challenge. An experiment that employed both GPS and conventional beach surveying was conducted, and a new beach monitoring method employing kinematic GPS surveys was devised. This new method involves the collection of precise shore-parallel and shore-normal GPS positions from a moving vehicle so that an accurate two-dimensional beach surface can be generated. Results show that the GPS measurements agree with conventional shore-normal surveys at the 1 cm level, and repeated GPS measurements employing the moving vehicle demonstrate a precision of better than 1 cm. In addition, the nearly continuous sampling and increased resolution provided by the GPS surveying technique reveals alongshore changes in beach morphology that are undetected by conventional shore-normal profiles. The application of GPS surveying techniques combined with the refinement of appropriate methods for data collection and analysis provides a better understanding of beach changes, sediment transport, and storm impacts.
Villalobos, Michael J; Betti, Christopher J; Vaughan, Andrew T M
2006-01-01
Current techniques for examining the global creation and repair of DNA double-strand breaks are restricted in their sensitivity, and such techniques mask any site-dependent variations in breakage and repair rate or fidelity. We present here a system for analyzing the fate of documented DNA breaks, using the MLL gene as an example, through application of ligation-mediated PCR. Here, a simple asymmetric double-stranded DNA adapter molecule is ligated to experimentally induced DNA breaks and subjected to seminested PCR using adapter and gene-specific primers. The rate of appearance and loss of specific PCR products allows detection of both the break and its repair. Using the additional technique of inverse PCR, the presence of misrepaired products (translocations) can be detected at the same site, providing information on the fidelity of the ligation reaction in intact cells. Such techniques may be adapted for the analysis of DNA breaks introduced into any identifiable genomic location.
Classification of Clouds and Deep Convection from GEOS-5 Using Satellite Observations
NASA Technical Reports Server (NTRS)
Putman, William; Suarez, Max
2010-01-01
With the increased resolution of global atmospheric models and the push toward global cloud resolving models, the resemblance of model output to satellite observations has become strikingly similar. As we progress with our adaptation of the Goddard Earth Observing System Model, Version 5 (GEOS-5) as a high resolution cloud system resolving model, evaluation of cloud properties and deep convection require in-depth analysis beyond a visual comparison. Outgoing long-wave radiation (OLR) provides a sufficient comparison with infrared (IR) satellite imagery to isolate areas of deep convection. We have adopted a binning technique to generate a series of histograms for OLR which classify the presence and fraction of clear sky versus deep convection in the tropics that can be compared with a similar analyses of IR imagery from composite Geostationary Operational Environmental Satellite (GOES) observations. We will present initial results that have been used to evaluate the amount of deep convective parameterization required within the model as we move toward cloud system resolving resolutions of 10- to 1-km globally.
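The binning technique reduces to a histogram plus a threshold, as the toy sketch below shows; the 190 W/m^2 convective cutoff and the synthetic OLR field are illustrative assumptions, not the GEOS-5 analysis settings.

```python
# Histogram OLR and flag cold bins as deep convection (toy version of the
# binning/classification described above).
import numpy as np

rng = np.random.default_rng(2)
olr = np.concatenate([rng.normal(280, 15, 9000),    # clear/warm scenes (W/m^2)
                      rng.normal(160, 20, 1000)])   # deep convective scenes

counts, edges = np.histogram(olr, bins=np.arange(80, 341, 10))
deep = edges[:-1] < 190                             # convective OLR bins (assumed cutoff)
fraction_deep = counts[deep].sum() / counts.sum()
print(f"deep-convective fraction: {fraction_deep:.3f}")
```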
Global Environmental Data for Mapping Infectious Disease Distribution
Hay, S.I.; Tatem, A.J.; Graham, A.J.; Goetz, S.J.; Rogers, D.J.
2011-01-01
This contribution documents the satellite data archives, data processing methods and temporal Fourier analysis (TFA) techniques used to create the remotely sensed datasets on the DVD distributed with this volume. The aim is to provide a detailed reference guide to the genesis of the data, rather than a standard review. These remotely sensed data cover the entire globe at either 1 × 1 or 8 × 8 km spatial resolution. We briefly evaluate the relationships between the 1 × 1 and 8 × 8 km global TFA products to explore their inter-compatibility. The 8 × 8 km TFA surfaces are used in the mapping procedures detailed in the subsequent disease mapping reviews, since the 1 × 1 km products have been validated less widely. Details are also provided on additional, current and planned sensors that should be able to provide continuity with these environmental variable surfaces, as well as other sources of global data that may be used for mapping infectious disease. PMID:16647967
Global meteorological data facility for real-time field experiments support and guidance
NASA Technical Reports Server (NTRS)
Shipham, Mark C.; Shipley, Scott T.; Trepte, Charles R.
1988-01-01
A Global Meteorological Data Facility (GMDF) has been constructed to provide economical real-time meteorological support to atmospheric field experiments. After collection and analysis of meteorological data sets at a central station, tailored meteorological products are transmitted to experiment field sites using conventional ground link or satellite communication techniques. The GMDF supported the Global Tropospheric Experiment Amazon Boundary Layer Experiment (GTE-ABLE II) based in Manaus, Brazil, during July and August 1985; an arctic airborne lidar survey mission for the Polar Stratospheric Clouds (PSC) experiment during January 1986; and the Genesis of Atlantic Lows Experiment (GALE) during January, February and March 1986. GMDF structure is similar to the UNIDATA concept, including meteorological data from the Zephyr Weather Transmission Service, a mode AAA GOES downlink, and dedicated processors for image manipulation, transmission and display. The GMDF improved field experiment operations in general, with the greatest benefits arising from the ability to communicate with field personnel in real time.
Global analysis of an impulsive delayed Lotka-Volterra competition system
NASA Astrophysics Data System (ADS)
Xia, Yonghui
2011-03-01
In this paper, a retarded impulsive n-species Lotka-Volterra competition system with feedback controls is studied. Some sufficient conditions are obtained to guarantee the global exponential stability (GES) and global asymptotic stability (GAS) of a unique equilibrium for such a high-dimensional biological system. The problem considered in this paper is in many aspects more general and incorporates as special cases various problems which have been extensively studied in the literature. Moreover, applying the obtained results to some special cases, I derive some new criteria which generalize and greatly improve some well known results. A method is proposed to investigate biological systems subjected to the effect of both impulses and delays. The method is based on Banach fixed point theory and matrix spectral theory, as well as Lyapunov functions. Moreover, some novel analytic techniques are employed to study GAS and GES. It is believed that the method can be extended to other high-dimensional biological systems and complex neural networks. Finally, two examples show the feasibility of the results.
Allen, Trevor I.; Wald, David J.
2009-01-01
Regional differences in ground-motion attenuation have long been thought to add uncertainty in the prediction of ground motion. However, a growing body of evidence suggests that regional differences in ground-motion attenuation may not be as significant as previously thought and that the key differences between regions may be a consequence of limitations in ground-motion datasets over incomplete magnitude and distance ranges. Undoubtedly, regional differences in attenuation can exist owing to differences in crustal structure and tectonic setting, and these can contribute to differences in ground-motion attenuation at larger source-receiver distances. Herein, we examine the use of a variety of techniques for the prediction of several ground-motion metrics (peak ground acceleration and velocity, response spectral ordinates, and macroseismic intensity) and compare them against a global dataset of instrumental ground-motion recordings and intensity assignments. The primary goal of this study is to determine whether existing ground-motion prediction techniques are applicable for use in the U.S. Geological Survey's Global ShakeMap and Prompt Assessment of Global Earthquakes for Response (PAGER). We seek the most appropriate ground-motion predictive technique, or techniques, for each of the tectonic regimes considered: shallow active crust, subduction zone, and stable continental region.
A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis
Padula, Matthew P.; Berry, Iain J.; O'Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.
2017-01-01
Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712
Non-dynamic decimeter tracking of earth satellites using the Global Positioning System
NASA Technical Reports Server (NTRS)
Yunck, T. P.; Wu, S. C.
1986-01-01
A technique is described for employing the Global Positioning System (GPS) to determine the position of a low earth orbiter with decimeter accuracy without the need for user dynamic models. A differential observing strategy is used requiring a GPS receiver on the user vehicle and a network of six ground receivers. The technique uses the continuous record of position change obtained from GPS carrier phase to smooth position measurements made with pseudo-range. The result is a computationally efficient technique that can deliver decimeter accuracy down to the lowest altitude orbits.
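The core of the technique, using the precise epoch-to-epoch range change from carrier phase to filter the noisy but unambiguous pseudo-range, is what is now commonly called carrier smoothing (the Hatch filter). A minimal sketch with synthetic measurements follows; the window length and noise levels are hypothetical, not values from the paper.

```python
import numpy as np

def carrier_smooth(pseudorange, carrier_range, window=100):
    """Hatch filter: blend absolute pseudorange with carrier-phase deltas.

    pseudorange   -- noisy absolute range measurements (m), one per epoch
    carrier_range -- carrier phase converted to metres (ambiguous but smooth)
    window        -- effective averaging length in epochs (hypothetical value)
    """
    smoothed = np.empty_like(pseudorange)
    smoothed[0] = pseudorange[0]
    for k in range(1, len(pseudorange)):
        n = min(k + 1, window)
        # Propagate the last estimate with the precise carrier-derived range
        # change, then nudge it toward the new absolute pseudorange.
        predicted = smoothed[k - 1] + (carrier_range[k] - carrier_range[k - 1])
        smoothed[k] = predicted + (pseudorange[k] - predicted) / n
    return smoothed

# Synthetic demo: true range drifts linearly; code noise 3 m, carrier noise 3 mm.
rng = np.random.default_rng(0)
t = np.arange(600.0)
true_range = 2.0e7 + 100.0 * t
pr = true_range + rng.normal(0.0, 3.0, t.size)
cp = true_range + rng.normal(0.0, 0.003, t.size) + 12345.6  # unknown ambiguity
sm = carrier_smooth(pr, cp)
print("raw RMS error     :", np.sqrt(np.mean((pr - true_range) ** 2)))
print("smoothed RMS error:", np.sqrt(np.mean((sm - true_range) ** 2)))
```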
Analysis of the DORIS, GNSS, SLR, VLBI and gravimetric time series at the GGOS core sites
NASA Astrophysics Data System (ADS)
Moreaux, G.; Lemoine, F. G.; Luceri, V.; Pavlis, E. C.; MacMillan, D. S.; Bonvalot, S.; Saunier, J.
2017-12-01
Since the installation of a new DORIS station in Wettzell (Germany) in June 2016, four geodetic sites (Badary, Greenbelt, Wettzell and Yarragadee) have been equipped with all four space geodetic techniques (DORIS, GNSS, SLR and VLBI). In line with the GGOS (Global Geodetic Observing System) objective of achieving a terrestrial reference frame at the millimetric level of accuracy, the combination centers of the four space techniques initiated a joint study to assess the level of agreement among these space geodetic techniques. Beyond these four sites, we will consider all the GGOS core sites, including the seven sites with at least two space geodetic techniques in addition to DORIS. Starting from the coordinate time series, we will estimate and compare the mean positions and velocities of the co-located instruments. The temporal evolution of the coordinate differences will also be evaluated with respect to the local tie vectors, and discrepancies will be investigated. Then, the analysis of the signal content of the time series will be carried out. Amplitudes and phases of the common signals among the techniques, and possibly from gravity data, will be compared. The first objective of this talk is to describe our joint study: the sites, the data, and the objectives. The second purpose is to present the first results obtained from the GGAO (Goddard Geophysical and Astronomic Observatory) site of Greenbelt.
Computational methods for global/local analysis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.
1992-01-01
Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.
The relation between global migration and trade networks
NASA Astrophysics Data System (ADS)
Sgrignoli, Paolo; Metulini, Rodolfo; Schiavo, Stefano; Riccaboni, Massimo
2015-01-01
In this paper we develop a methodology to analyze and compare multiple global networks, focusing our analysis on the relation between human migration and trade. First, we identify the subset of products for which the presence of a community of migrants significantly increases trade intensity; to ensure comparability across networks, we apply a hypergeometric filter that identifies links whose intensity is significantly higher than expected. Next, proposing a new way to define country neighbors based on the most intense links in the trade network, we use spatial econometrics techniques to measure the effect of migration on international trade while controlling for network interdependences. Overall, we find that migration significantly boosts trade across countries, and we are able to identify product categories for which this effect is particularly strong.
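One common formulation of such a hypergeometric link filter (the paper does not spell out its exact null model here, so the details below are an assumption) tests each link weight against a hypergeometric null in which the units of flow on a link are drawn at random given the total strengths of its endpoints:

```python
import numpy as np
from scipy.stats import hypergeom

def hypergeometric_filter(weights, alpha=0.01):
    """Keep links whose integer weight is significantly larger than expected.

    weights -- square matrix of integer link intensities (e.g. trade volumes
               in discrete units); weights[i, j] is the flow from i to j.
    Null model (one common formulation): the w_ij units on link i->j are drawn
    at random from the W total units, given the out-strength of i and the
    in-strength of j.
    """
    w = np.asarray(weights)
    W = int(w.sum())
    s_out = w.sum(axis=1).astype(int)   # total outflow of each source node
    s_in = w.sum(axis=0).astype(int)    # total inflow of each target node
    keep = np.zeros_like(w, dtype=bool)
    for i in range(w.shape[0]):
        for j in range(w.shape[1]):
            if i == j or w[i, j] == 0:
                continue
            # P(X >= w_ij) with X ~ Hypergeom(W, s_out[i], s_in[j])
            p = hypergeom.sf(w[i, j] - 1, W, s_out[i], s_in[j])
            keep[i, j] = p < alpha
    return keep

# Tiny hypothetical flow matrix: only unusually concentrated links survive.
demo = np.array([[0, 40, 2], [5, 0, 50], [1, 3, 0]])
print(hypergeometric_filter(demo, alpha=0.05))
```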
Seq-ing answers: uncovering the unexpected in global gene regulation.
Otto, George Maxwell; Brar, Gloria Ann
2018-04-19
The development of techniques for measuring gene expression globally has greatly expanded our understanding of gene regulatory mechanisms in depth and scale. We can now quantify, genome-wide, every intermediate and transition in the canonical pathway of gene expression, from DNA to mRNA to protein. Employing such measurements in parallel can produce rich datasets, but extracting the most information requires careful experimental design and analysis. Here, we argue for the value of genome-wide studies that measure multiple outputs of gene expression over many timepoints during the course of a natural developmental process. We discuss our findings from a highly parallel gene expression dataset of meiotic differentiation, and those of others, to illustrate how leveraging these features can provide new and surprising insight into fundamental mechanisms of gene regulation.
A multilevel control system for the large space telescope. [numerical analysis/optimal control
NASA Technical Reports Server (NTRS)
Siljak, D. D.; Sundareshan, S. K.; Vukcevic, M. B.
1975-01-01
A multilevel scheme was proposed for control of the Large Space Telescope (LST), modeled by a three-axis, sixth-order nonlinear equation. Local controllers were used on the subsystem level to stabilize motions corresponding to the three axes. Global controllers were applied to reduce (and sometimes nullify) the interactions among the subsystems. A multilevel optimization method was developed whereby local quadratic optimizations were performed on the subsystem level, and global control was again used to reduce (nullify) the effect of interactions. The multilevel stabilization and optimization methods are presented as general tools for design and then used in the design of the LST control system. The methods are entirely computerized, so that they can accommodate higher-order LST models, with both conceptual and numerical advantages over standard straightforward design techniques.
Stability and bifurcation analysis on a ratio-dependent predator-prey model with time delay
NASA Astrophysics Data System (ADS)
Xu, Rui; Gan, Qintao; Ma, Zhien
2009-08-01
A ratio-dependent predator-prey model with time delay due to the gestation of the predator is investigated. By analyzing the corresponding characteristic equations, the local stability of a positive equilibrium and a semi-trivial boundary equilibrium is discussed, respectively. Further, it is proved that the system undergoes a Hopf bifurcation at the positive equilibrium. Using the normal form theory and the center manifold reduction, explicit formulae are derived to determine the direction of bifurcations and the stability and other properties of bifurcating periodic solutions. By means of an iteration technique, sufficient conditions are obtained for the global attractiveness of the positive equilibrium. By comparison arguments, the global stability of the semi-trivial equilibrium is also addressed. Numerical simulations are carried out to illustrate the main results.
Ouar, N; Guillier, D; Moris, V; Revol, M; Francois, C; Cristofari, S
2017-06-01
Labia minora reduction procedures are on the rise in Europe and North America, and several techniques have been described. The objective of this study was to compare the postoperative complications of the two most widely practiced interventions: wedge resection and edge resection. Primary labia minora reductions performed in our unit between October 2009 and July 2016 were retrospectively identified. Two techniques were used by two surgeons: edge resection and wedge resection. The main evaluation criterion was the occurrence and extent of wound dehiscence: greater than 50% (total or subtotal) or less than 50% (partial). Patients were systematically examined at 1 week, 1 month and 6 months postoperatively. Data were compared between the two groups with Fisher's exact test. Mean follow-up was 5.3 months after the intervention. Sixty-four patients were included: 42 wedge resections (group C) and 22 edge resections (group L). The global complication rate at 1 month was 13% (n=8): 14% (n=6) among wedge resections and 9% (n=2) among edge resections. Seven surgical revisions were necessary: 5 for wound dehiscence (4 in group C and 1 in group L) and 2 for hematoma, one in each group. Three (5%) partial wound dehiscences (less than 50%) were identified and left to heal by secondary intention: 2 (19%) in group C and 1 (27%) in group L. Complication rates did not differ significantly between the two techniques. Postoperative wound dehiscence is the main complication of labia minora reduction. Our global complication rate of 13% is consistent with the current literature. A trend was observed toward more wound dehiscence after wedge resection than after edge resection. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Analysis of Protein Expression in Cell Microarrays: A Tool for Antibody-based Proteomics
Andersson, Ann-Catrin; Strömberg, Sara; Bäckvall, Helena; Kampf, Caroline; Uhlen, Mathias; Wester, Kenneth; Pontén, Fredrik
2006-01-01
Tissue microarray (TMA) technology provides a possibility to explore protein expression patterns in a multitude of normal and disease tissues in a high-throughput setting. Although TMAs have been used for analysis of tissue samples, robust methods for studying in vitro cultured cell lines and cell aspirates in a TMA format have been lacking. We have adopted a technique to homogeneously distribute cells in an agarose gel matrix, creating an artificial tissue. This enables simultaneous profiling of protein expression in suspension- and adherent-grown cell samples assembled in a microarray. In addition, the present study provides an optimized strategy for the basic laboratory steps to efficiently produce TMAs. Presented modifications resulted in an improved quality of specimens and a higher section yield compared with standard TMA production protocols. Sections from the generated cell TMAs were tested for immunohistochemical staining properties using 20 well-characterized antibodies. Comparison of immunoreactivity in cultured dispersed cells and corresponding cells in tissue samples showed congruent results for all tested antibodies. We conclude that a modified TMA technique, including cell samples, provides a valuable tool for high-throughput analysis of protein expression, and that this technique can be used for global approaches to explore the human proteome. PMID:16957166
Uniform Foam Crush Testing for Multi-Mission Earth Entry Vehicle Impact Attenuation
NASA Technical Reports Server (NTRS)
Patterson, Byron W.; Glaab, Louis J.
2012-01-01
Multi-Mission Earth Entry Vehicles (MMEEVs) are blunt-body vehicles designed with the purpose of transporting payloads from outer space to the surface of the Earth. To achieve high reliability and minimum weight, MMEEVs avoid use of limited-reliability systems, such as parachutes and retro-rockets, instead using built-in impact attenuators to absorb energy remaining at impact to meet landing loads requirements. The Multi-Mission Systems Analysis for Planetary Entry (M-SAPE) parametric design tool is used to facilitate the design of MMEEVs and develop the trade space. Testing was conducted to characterize the material properties of several candidate impact foam attenuators to enhance M-SAPE analysis. In the current effort, four different Rohacell foams are tested at three different uniform strain rates (approximately 0.17%/s, 100%/s, and 13,600%/s). The primary data analysis method uses a global data smoothing technique in the frequency domain to remove noise and system natural frequencies. The results from the data indicate that the filter and smoothing technique are successful in identifying the foam crush event and removing aberrations. The effect of strain rate increases with increasing foam density. The 71-WF-HT foam may support Mars Sample Return requirements. Several recommendations to improve the drop tower test technique are identified.
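As a rough illustration of frequency-domain smoothing of drop-test data (the report's actual filter design is not given here), a zero-phase low-pass Butterworth filter can remove sensor noise and a fixture ringing frequency while preserving the crush pulse; the cutoff, sample rate, and signal below are hypothetical:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_crush_signal(accel, fs, cutoff_hz=500.0, order=4):
    """Zero-phase low-pass filtering of drop-test acceleration data.

    accel     -- raw accelerometer trace
    fs        -- sample rate in Hz
    cutoff_hz -- hypothetical cutoff below the drop-tower natural frequencies
    """
    b, a = butter(order, cutoff_hz / (fs / 2.0))   # normalized cutoff frequency
    return filtfilt(b, a, accel)                   # forward-backward: no phase lag

# Synthetic demo: a foam-crush pulse plus a 2 kHz fixture ring and sensor noise.
fs = 50_000.0
t = np.arange(0.0, 0.02, 1.0 / fs)
pulse = 200.0 * np.exp(-((t - 0.01) / 0.002) ** 2)
raw = pulse + 30.0 * np.sin(2 * np.pi * 2000.0 * t) + np.random.normal(0, 5, t.size)
clean = smooth_crush_signal(raw, fs)
print("peak raw / filtered:", raw.max(), clean.max())
```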
The Global GNSS, SLR, VLBI, and DORIS Networks and their Support of GGOS: IGS+ILRS+IVS+IDS
NASA Technical Reports Server (NTRS)
Noll, Carey
2008-01-01
The global network of the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International DORIS Service (IDS) are part of the ground-based infrastructure for GGOS. The observations obtained from these global networks provide for the determination and maintenance of the International Terrestrial Reference Frame (ITRF), an accurate set of positions and velocities that provides a stable coordinate system allowing scientists to link measurements over space and time. Many of these sites offer co-location of two or more techniques. Co-location provides integration of technique-specific networks into the ITRF as well as an assessment/validation of the quality and accuracy of the resulting measurements. As of fall 2008, these networks consisted of 410 GNSS sites, 42 laser ranging sites, 45 VLBI sites, and 58 DORIS sites. This poster will illustrate the global coverage of these networks, highlighting inter-technique co-locations, and show the importance of these networks to the underlying goals of GGOS, including providing the observational basis to maintain a stable, accurate, global reference frame.
NASA Astrophysics Data System (ADS)
Taniguchi, Haruhito
Electric power generation that relies on a diverse mix of primary energy sources is expected to bring down CO2 emission levels and support the overall strategy to curb global warming. Accordingly, utilities are moving toward integrating more renewable generation sources, mostly dispersed, and adopting smart grid technologies for system control. To construct, operate, and maintain power systems stably and economically against this background, a thorough understanding of the characteristics of power systems and their components is essential. This paper presents modeling and simulation techniques available for the analysis of critical aspects such as thermal capacity, stability, voltage stability, and frequency dynamics, all vital for the stable operation of power systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Dipendra, E-mail: d-11sharma@rediffmail.com; Tiwari, S. N., E-mail: sntiwari123@rediffmail.com; Dwivedi, M. K., E-mail: dwivedi-ji@gmail.com
2016-05-06
Electronic structure properties of 4-n-methoxy-4′-cyanobiphenyl, a pure nematic liquid crystal, have been examined using an ab initio HF/6-31G(d,p) technique with the GAMESS program. Conformational and charge distribution analyses have been carried out. MEP, HOMO and LUMO surfaces have been scanned. The ionization potential, electron affinity, electronegativity, global hardness and softness of the liquid crystal molecule have been calculated. Further, stacking, side-by-side and end-to-end interactions between a molecular pair have been evaluated. The results have been used to elucidate the physico-chemical and liquid crystalline properties of the system.
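For reference, the global descriptors named in the abstract follow from the frontier-orbital energies under Koopmans' approximation; the orbital energies below are hypothetical placeholders, and note that some texts define hardness without the factor of two:

```python
# Global reactivity descriptors from frontier-orbital energies (Koopmans'
# approximation: IP = -E_HOMO, EA = -E_LUMO). The factor-of-two conventions
# below are the common ones; some texts define hardness as IP - EA.
E_HOMO, E_LUMO = -8.9, -0.4   # eV, hypothetical values for illustration

IP = -E_HOMO                  # ionization potential
EA = -E_LUMO                  # electron affinity
chi = (IP + EA) / 2.0         # electronegativity
eta = (IP - EA) / 2.0         # global hardness
S = 1.0 / (2.0 * eta)         # global softness

print(f"IP={IP:.2f} eV  EA={EA:.2f} eV  chi={chi:.2f} eV  "
      f"eta={eta:.2f} eV  S={S:.3f} eV^-1")
```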
Adapting GOMS to Model Human-Robot Interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drury, Jill; Scholtz, Jean; Kieras, David
2007-03-09
Human-robot interaction (HRI) has been maturing in tandem with robots’ commercial success. In the last few years HRI researchers have been adopting—and sometimes adapting—human-computer interaction (HCI) evaluation techniques to assess the efficiency and intuitiveness of HRI designs. For example, Adams (2005) used Goal Directed Task Analysis to determine the interaction needs of officers from the Nashville Metro Police Bomb Squad. Scholtz et al. (2004) used Endsley’s (1988) Situation Awareness Global Assessment Technique to determine robotic vehicle supervisors’ awareness of when vehicles were in trouble and thus required closer monitoring or intervention. Yanco and Drury (2004) employed usability testing to determine (among other things) how well a search-and-rescue interface supported use by first responders. One set of HCI tools that has so far seen little exploration in the HRI domain, however, is the class of modeling and evaluation techniques known as formal methods.
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond the classical meteorological forecasts has grown in recent years owing to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazard early-warning systems, and questions of global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems, in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability, may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work stems from the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
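A common way Kalman filters are applied to model-output post-processing is to track the slowly varying systematic bias of the forecast as a random-walk state; the scalar sketch below illustrates that idea with hypothetical noise variances and synthetic data, and is not the specific filter variant developed in the paper.

```python
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.05, r=1.0):
    """Track the systematic forecast bias with a scalar Kalman filter.

    State: the slowly varying bias b_k, random-walk model b_k = b_{k-1} + w.
    Measurement: the observed error y_k = forecast_k - observation_k.
    q, r -- hypothetical process/measurement noise variances.
    """
    b, P = 0.0, 1.0                     # initial bias estimate and variance
    corrected = np.empty_like(forecasts)
    for k, (f, o) in enumerate(zip(forecasts, observations)):
        corrected[k] = f - b            # correct using only past information
        P += q                          # predict step (random-walk state)
        K = P / (P + r)                 # Kalman gain
        b += K * ((f - o) - b)          # update with the newly observed error
        P *= 1.0 - K
    return corrected

rng = np.random.default_rng(1)
truth = 10.0 + np.sin(np.linspace(0, 6, 200))
fc = truth + 1.5 + rng.normal(0, 0.5, 200)   # forecasts with a +1.5 bias
corr = kalman_bias_correction(fc, truth)
print("mean error before/after:", (fc - truth).mean(), (corr - truth).mean())
```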
Computer-aided assessment of pulmonary disease in novel swine-origin H1N1 influenza on CT
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Dwyer, Andrew J.; Summers, Ronald M.; Mollura, Daniel J.
2011-03-01
The 2009 pandemic is a global outbreak of novel H1N1 influenza. Radiologic images can be used to assess the presence and severity of pulmonary infection. We develop a computer-aided assessment system to analyze the CT images from Swine-Origin Influenza A virus (S-OIV) novel H1N1 cases. The technique is based on the analysis of lung texture patterns and classification using a support vector machine (SVM). Pixel-wise tissue classification is computed from the SVM value. The method was validated on four H1N1 cases and ten normal cases. We demonstrated that the technique can detect regions of pulmonary abnormality in novel H1N1 patients and differentiate these regions from visually normal lung (area under the ROC curve is 0.993). This technique can also be applied to differentiate regions infected by different pulmonary diseases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using Underwood equations and reformulating the resulting constraints as bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations to provide, for the first time, a global optimization based rank-list of distillation configurations.
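At the core of the formulation are the Underwood equations; the sketch below (a simplification, not the GMA itself) finds the active Underwood root for a single sharp split of a hypothetical ternary mixture with saturated liquid feed and evaluates the corresponding minimum vapor flow:

```python
import numpy as np
from scipy.optimize import brentq

def underwood_vmin(alpha, z, x_D, D, q=1.0):
    """Minimum vapor duty for one column section via Underwood's equations.

    alpha -- relative volatilities (descending), z -- feed mole fractions,
    x_D   -- distillate mole fractions, D -- distillate flow, q -- feed quality.
    Sketch for a single split with one active Underwood root lying between
    the volatilities of the two key components.
    """
    alpha, z, x_D = map(np.asarray, (alpha, z, x_D))
    # Feed equation: sum_i alpha_i z_i / (alpha_i - theta) = 1 - q
    f = lambda th: np.sum(alpha * z / (alpha - th)) - (1.0 - q)
    eps = 1e-6
    theta = brentq(f, alpha[1] + eps, alpha[0] - eps)   # root between the keys
    # Minimum vapor: Vmin = sum_i alpha_i x_Di D / (alpha_i - theta)
    return np.sum(alpha * x_D * D / (alpha - theta))

# Hypothetical ternary feed (A/B/C), sharp A/BC split, saturated liquid feed.
alpha = [4.0, 2.0, 1.0]
z = [0.3, 0.3, 0.4]
x_D = [1.0, 0.0, 0.0]
print("Vmin =", underwood_vmin(alpha, z, x_D, D=0.3))
```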
ERIC Educational Resources Information Center
Chattuchai, Sakkarin; Singseewo, Adisak; Suksringarm, Paitool
2015-01-01
This study aims to investigate the effects of learning environmental education on the knowledge, awareness, global warming decreasing behavior, and critical thinking of eighty grade 11 students from two classes. The Four Noble Truths method with metacognitive techniques and traditional teaching method were used for the investigation. The sample…
A new technique in the global reliability of cyclic communications network
NASA Technical Reports Server (NTRS)
Sjogren, Jon A.
1989-01-01
The global reliability of a communications network is the probability that given any pair of nodes, there exists a viable path between them. A characterization of connectivity, for a given class of networks, can enable one to find this reliability. Such a characterization is described for a useful class of undirected networks called daisy-chained or braided networks. This leads to a new method of quickly computing the global reliability of these networks. Asymptotic behavior in terms of component reliability is related to geometric properties of the given graph. Generalization of the technique is discussed.
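For comparison with such closed-form characterizations, the definition of global (all-terminal) reliability can always be estimated by brute force; below is a Monte Carlo sketch on a hypothetical six-node braided ring, assuming identical independent edge survival probabilities p:

```python
import random

def global_reliability_mc(n_nodes, edges, p, trials=100_000, seed=0):
    """Monte Carlo estimate of all-terminal reliability.

    Each edge survives independently with probability p; the network is 'up'
    when every pair of nodes remains connected (one connected component).
    """
    rng = random.Random(seed)
    up = 0
    for _ in range(trials):
        alive = [e for e in edges if rng.random() < p]
        # Depth-first search from node 0 over the surviving edges
        adj = {v: [] for v in range(n_nodes)}
        for a, b in alive:
            adj[a].append(b)
            adj[b].append(a)
        seen, stack = {0}, [0]
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        up += len(seen) == n_nodes
    return up / trials

# A six-node braided (daisy-chained) ring: ring edges plus skip links.
ring = [(i, (i + 1) % 6) for i in range(6)]
braid = [(i, (i + 2) % 6) for i in range(6)]
print(global_reliability_mc(6, ring + braid, p=0.9))
```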
Modeling, Analysis, and Control of Swarming Agents in a Probabilistic Framework
2012-11-01
configurations, which can ultimately lead the swarm towards configurations close to the global minimum of the total potential of interactions. The drawback of this approach is that the assumed form of the field can be unrealistic.
Ruiz-García, Leonor; Cabezas, Jose Antonio; de María, Nuria; Cervera, María-Teresa
2010-01-01
Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is a modification of the Amplified Fragment Length Polymorphism (AFLP) technique that has been used to study methylation of anonymous CCGG sequences in different fungi, plant and animal species. The main variation of this technique is based on the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as a frequent cutter restriction enzyme. For each sample, AFLP analysis is performed using both EcoRI/HpaII and EcoRI/MspI digested samples. Comparative analysis between EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) "Methylation-insensitive polymorphisms" that show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples; and (2) "Methylation-sensitive polymorphisms" that are associated with amplified fragments differing in their presence or absence or in their intensity between EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses modifications that can be applied to adjust the technology to different species of interest.
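The comparative EcoRI/HpaII versus EcoRI/MspI logic described above reduces to simple set operations on band presence/absence matrices; a minimal sketch with hypothetical banding data follows, with classification labels matching the two polymorphism types defined in the text:

```python
import numpy as np

def classify_msap_fragments(hpa, msp):
    """Classify AFLP fragments from paired EcoRI/HpaII and EcoRI/MspI digests.

    hpa, msp -- boolean arrays (samples x fragments): band present or absent.
    Fragments with identical HpaII/MspI patterns that still vary across
    samples are methylation-insensitive polymorphisms; fragments whose HpaII
    and MspI patterns differ are methylation-sensitive polymorphisms.
    """
    hpa, msp = np.asarray(hpa, bool), np.asarray(msp, bool)
    same_pattern = (hpa == msp).all(axis=0)         # per fragment, all samples
    varies = hpa.any(axis=0) & ~hpa.all(axis=0)     # polymorphic across samples
    labels = np.full(hpa.shape[1], "monomorphic", dtype=object)
    labels[same_pattern & varies] = "methylation-insensitive polymorphism"
    labels[~same_pattern] = "methylation-sensitive polymorphism"
    return labels

# Three samples x four fragments (hypothetical banding patterns)
hpa = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 0, 1]]
msp = [[1, 1, 0, 0], [1, 0, 0, 1], [1, 1, 0, 0]]
print(classify_msap_fragments(hpa, msp))
```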
2014-01-01
Background: Poor quality medicines threaten the lives of millions of patients and are alarmingly common in many parts of the world. Nevertheless, the global extent of the problem remains unknown. Accurate estimates of the epidemiology of poor quality medicines are sparse and are influenced by sampling methodology and diverse chemical analysis techniques. In order to understand the existing data, the Antimalarial Quality Scientific Group at WWARN built a comprehensive, open-access, global database and linked Antimalarial Quality Surveyor, an online visualization tool. Analysis of the database is described here; the limitations of the studies and data reported, and their public health implications, are discussed. Methods: The database collates customized summaries of 251 published anti-malarial quality reports in English, French and Spanish by time and location since 1946. It also includes information on assays to determine quality, sampling and medicine regulation. Results: No publicly available reports for 60.6% (63) of the 104 malaria-endemic countries were found. Out of 9,348 anti-malarials sampled, 30.1% (2,813) failed chemical/packaging quality tests, with 39.3% classified as falsified, 2.3% as substandard and 58.3% as poor quality without evidence available to categorize them as either substandard or falsified. Only 32.3% of the reports explicitly described their definitions of medicine quality, and just 9.1% (855) of the samples were collected in the 4.6% (six) surveys conducted using random sampling techniques. Packaging analysis was only described in 21.5% of publications, and up to twenty wrong active ingredients were found in falsified anti-malarials. Conclusions: There are severe neglected problems with anti-malarial quality, but there are important caveats to accurately estimating the prevalence and distribution of poor quality anti-malarials. The lack of reports in many malaria-endemic areas, inadequate sampling techniques, and inadequate chemical analytical methods and instrumental procedures emphasize the need to interpret medicine quality results with caution. The available evidence demonstrates the need for more investment to improve both sampling and analytical methodology and to achieve consensus in defining different types of poor quality medicines. PMID:24712972
Deines, Andrew M.; Bunnell, David B.; Rogers, Mark W.; Beard, T. Douglas; Taylor, William W.
2015-01-01
The relationship between autotrophic activity and freshwater fish populations is an important consideration for ecologists describing trophic structure in aquatic communities, fisheries managers tasked with increasing sustainable fisheries development, and fish farmers seeking to maximize production. Previous studies of the empirical relationships of autotrophic activity and freshwater fish yield have found positive relationships but were limited by small sample sizes, small geographic scopes, and the inability to compare patterns among many types of measurement techniques. Individual studies and reviews have also lacked consistent consideration of regional climate factors which may inform relationships between fisheries and autotrophic activity. We compiled data from over 700 freshwater systems worldwide and used meta-analysis and linear models to develop a comprehensive global synthesis between multiple metrics of autotrophic activity, fisheries, and climate indicators. Our results demonstrate that multiple metrics of fish (i.e., catch per unit effort, yield, and production) increase with autotrophic activity across a variety of fisheries. At the global scale additional variation in this positive relationship can be ascribed to regional climate differences (i.e., temperature and precipitation) across systems. Our results provide a method and proof-of-concept for assessing inland fisheries production at the global scale, where current estimates are highly uncertain, and may therefore inform the continued sustainable use of global inland fishery resources.
The Space Geodesy Project and Radio Frequency Interference Characterization and Mitigation
NASA Technical Reports Server (NTRS)
Lawrence, Hilliard M.; Beaudoin, C.; Corey, B. E.; Tourain, C. L.; Petrachenko, B.; Dickey, John
2013-01-01
The Space Geodesy Project (SGP) development by NASA is an effort to co-locate the four international geodetic techniques Satellite Laser Ranging (SLR) and Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI), Global Navigation Satellite System (GNSS), and Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) into one tightly referenced campus and coordinated reference frame analysis. The SGP requirement locates these stations within a small area to maintain line-of-sight and frequent automated survey, known as the vector tie system. This causes a direct conflict with the new broadband VLBI technique, which operates over 2-14 GHz and is susceptible to RFI at levels of -80 dBW or higher because of the sensitive RF components in the front end of the radio receiver.
Computerized data reduction techniques for nadir viewing remote sensors
NASA Technical Reports Server (NTRS)
Tiwari, S. N.; Gormsen, Barbara B.
1985-01-01
Computer resources have been developed for the analysis and reduction of MAPS experimental data from the OSTA-1 payload. The MAPS Research Project is concerned with the measurement of the global distribution of mid-tropospheric carbon monoxide. The measurement technique for the MAPS instrument is based on a non-dispersive gas filter radiometer operating in the nadir viewing mode. The MAPS experiment has two passive remote sensing instruments: the prototype instrument, which is used to measure tropospheric air pollution from aircraft platforms, and the third generation (OSTA) instrument, which is used to measure carbon monoxide in the mid and upper troposphere from space platforms. Extensive effort was also expended in support of the MAPS/OSTA-3 shuttle flight. Specific capabilities and resources developed are discussed.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Rayner, Nick
2017-04-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project: 1. providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; 2. identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; 3. estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; 4. using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Ghent, D.; Rayner, N. A.
2016-12-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project, i.e.: • providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
NASA Astrophysics Data System (ADS)
van der Linden, Sebastian
2016-05-01
Compiling a good book on urban remote sensing is probably as hard as the research in this disciplinary field itself. Urban areas comprise various environments and show high heterogeneity in many respects, they are highly dynamic in time and space and at the same time of greatest influence on connected and even tele-connected regions due to their great economic importance. Urban remote sensing is therefore of great importance, yet as manifold as its study area: mapping urban areas (or sub-categories thereof) plays an important (and challenging) role in land use and land cover (change) monitoring; the analysis of urban green and forests is by itself a specialization of ecological remote sensing; urban climatology asks for spatially and temporally highly resolved remote sensing products; the detection of artificial objects is not only a common and important remote sensing application but also a typical benchmark for image analysis techniques, etc. Urban analyses are performed with all available spaceborne sensor types and at the same time they are one of the most relevant fields for airborne remote sensing. Several books on urban remote sensing have been published during the past 10 years, each taking a different perspective. The book Global Urban Monitoring and Assessment through Earth Observation is motivated by the objectives of the Global Urban Observation and Information Task (SB-04) in the GEOSS (Global Earth Observation System of Systems) 2012-2015 workplan (compare Chapter 2) and wants to highlight the global aspects of state-of-the-art urban remote sensing.
Perioperative Assessment of Myocardial Deformation
Duncan, Andra E.; Alfirevic, Andrej; Sessler, Daniel I.; Popovic, Zoran B.; Thomas, James D.
2014-01-01
Evaluation of left ventricular performance improves risk assessment and guides anesthetic decisions. However, the most common echocardiographic measure of myocardial function, the left ventricular ejection fraction (LVEF), has important limitations. LVEF is limited by subjective interpretation which reduces accuracy and reproducibility, and LVEF assesses global function without characterizing regional myocardial abnormalities. An alternative objective echocardiographic measure of myocardial function is thus needed. Myocardial deformation analysis, which performs quantitative assessment of global and regional myocardial function, may be useful for perioperative care of surgical patients. Myocardial deformation analysis evaluates left ventricular mechanics by quantifying strain and strain rate. Strain describes percent change in myocardial length in the longitudinal (from base to apex) and circumferential (encircling the short-axis of the ventricle) direction and change in thickness in the radial direction. Segmental strain describes regional myocardial function. Strain is a negative number when the ventricle shortens longitudinally or circumferentially and is positive with radial thickening. Reference values for normal longitudinal strain from a recent meta-analysis using transthoracic echocardiography are (mean ± SD) −19.7 ± 0.4%, while radial and circumferential strain are 47.3 ± 1.9 and −23.3 ± 0.7%, respectively. The speed of myocardial deformation is also important and is characterized by strain rate. Longitudinal systolic strain rate in healthy subjects averages −1.10 ± 0.16 sec^-1. Assessment of myocardial deformation requires consideration of both strain (change in deformation), which correlates with LVEF, and strain rate (speed of deformation), which correlates with rate of rise of left ventricular pressure (dP/dt). Myocardial deformation analysis also evaluates ventricular relaxation, twist, and untwist, providing new and noninvasive methods to assess components of myocardial systolic and diastolic function. Myocardial deformation analysis is based on either Doppler or a non-Doppler technique, called speckle-tracking echocardiography. Myocardial deformation analysis provides quantitative measures of global and regional myocardial function for use in the perioperative care of the surgical patient. For example, coronary graft occlusion after coronary artery bypass grafting is detected by an acute reduction in strain in the affected coronary artery territory. In addition, assessment of left ventricular mechanics detects underlying myocardial pathology before abnormalities become apparent on conventional echocardiography. Certainly, patients with aortic regurgitation demonstrate reduced longitudinal strain before reduction in LVEF occurs, which allows detection of subclinical left ventricular dysfunction and predicts increased risk for heart failure and impaired myocardial function after surgical repair. In this review we describe the principles, techniques, and clinical application of myocardial deformation analysis. PMID:24557101
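The strain and strain-rate definitions above reduce to a simple calculation on a segment-length time series; below is a sketch with a synthetic, hypothetical contraction profile:

```python
import numpy as np

def lagrangian_strain(lengths, L0):
    """Percent strain relative to end-diastolic length L0 (negative values
    indicate longitudinal/circumferential shortening, as described above)."""
    return 100.0 * (np.asarray(lengths) - L0) / L0

def strain_rate(strain_pct, dt):
    """Strain rate in 1/s from percent strain sampled every dt seconds."""
    return np.gradient(np.asarray(strain_pct) / 100.0, dt)

# Hypothetical myocardial segment: 10 cm at end-diastole shortening to ~8 cm.
dt = 0.02                                        # 50 frames per second
t = np.arange(0.0, 0.4, dt)
lengths = 10.0 - 2.0 * np.sin(np.pi * t / 0.4)   # cm, synthetic contraction
eps = lagrangian_strain(lengths, L0=lengths[0])
print("peak strain       %.1f%%" % eps.min())    # ~ -20%, near normal values
print("peak strain rate  %.2f 1/s" % strain_rate(eps, dt).min())
```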
Global Synthesis and Critical Evaluation of Pharmaceutical Data Sets Collected from River Systems
2012-01-01
Pharmaceuticals have emerged as a major group of environmental contaminants over the past decade but relatively little is known about their occurrence in freshwaters compared to other pollutants. We present a global-scale analysis of the presence of 203 pharmaceuticals across 41 countries and show that contamination is extensive due to widespread consumption and subsequent disposal to rivers. There are clear regional biases in current understanding with little work outside North America, Europe, and China, and no work within Africa. Within individual countries, research is biased around a small number of populated provinces/states and the majority of research effort has focused upon just 14 compounds. Most research has adopted sampling techniques that are unlikely to provide reliable and representative data. This analysis highlights locations where concentrations of antibiotics, cardiovascular drugs, painkillers, contrast media, and antiepileptic drugs have been recorded well above thresholds known to cause toxic effects in aquatic biota. Studies of pharmaceutical occurrence and effects need to be seen as a global research priority due to increasing consumption, particularly among societies with aging populations. Researchers in all fields of environmental management need to work together more effectively to identify high risk compounds, improve the reliability and coverage of future monitoring studies, and develop new mitigation measures. PMID:23227929
Sensitivity Analysis of the Static Aeroelastic Response of a Wing
NASA Technical Reports Server (NTRS)
Eldred, Lloyd B.
1993-01-01
A technique to obtain the sensitivity of the static aeroelastic response of a three dimensional wing model is designed and implemented. The formulation is quite general and accepts any aerodynamic and structural analysis capability. A program to combine the discipline level, or local, sensitivities into global sensitivity derivatives is developed. A variety of representations of the wing pressure field are developed and tested to determine the most accurate and efficient scheme for representing the field outside of the aerodynamic code. Chebyshev polynomials are used to globally fit the pressure field. This approach had some difficulties in representing local variations in the field, so a variety of local interpolation polynomial pressure representations are also implemented. These panel based representations use a constant pressure value, a bilinearly interpolated value, or a biquadratically interpolated value. The interpolation polynomial approaches do an excellent job of reducing the numerical problems of the global approach for comparable computational effort. Regardless of the pressure representation used, sensitivity and response results with excellent accuracy have been produced for large integrated quantities such as wing tip deflection and trim angle of attack. The sensitivities of such things as individual generalized displacements have been found with fair accuracy. In general, accuracy is found to be proportional to the relative size of the derivatives to the quantity itself.
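The trade-off the report describes, a global Chebyshev fit versus panel-based local interpolation, can be reproduced in miniature; the sketch below fits a hypothetical chordwise pressure distribution with a sharp leading-edge suction peak both ways and compares the residuals:

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Hypothetical chordwise pressure coefficient with a sharp suction peak near
# the leading edge -- the kind of local feature a global fit smooths over.
x = np.linspace(0.0, 1.0, 200)                     # x/c
cp = -0.5 * (1 - x) - 1.5 * np.exp(-((x - 0.05) / 0.03) ** 2)

# Global fit: Chebyshev.fit maps the data domain to [-1, 1] internally.
cheb = Chebyshev.fit(x, cp, deg=12)
resid_global = cp - cheb(x)
print("max |residual|, global Chebyshev fit:", np.abs(resid_global).max())

# Local alternative: piecewise low-order polynomial fits on panels.
panels = np.array_split(np.arange(x.size), 10)
resid_local = np.concatenate(
    [cp[idx] - np.polyval(np.polyfit(x[idx], cp[idx], 2), x[idx])
     for idx in panels])
print("max |residual|, panel-wise fits     :", np.abs(resid_local).max())
```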
Kumar, Sameer; Honkanen, Erik J; Karl, Chad C
2009-01-01
This study examines the idea of developing a global health diplomacy supply chain as an important foreign policy approach with the aim of improving the lives of vulnerable populations and serving the best interests of the United States. The study was based on the review of academic literature, news events, and military communiques, and historical writings were studied to determine the feasibility of the idea and the extent of costs and benefits of such an endeavor. An integrated strategic business model, supported by a medical care delivery process, was developed to create a framework for a feasible global health diplomacy supply chain. The findings indicate that extremism can be contained by creating and efficiently executing an effective supply chain to get medical care units to those that need them. The limitations are the potential exit strategies required, the tactical abilities, and diplomatic techniques needed in order to create positive diplomatic change in aid distribution. Managers must consider how supply chains will affect other organizations giving aid and the potential public response. Moreover, determining the level of care necessary to achieve the greatest positive health diplomacy continues to require vigilant scrutiny over the potential cost/benefit analysis. The analysis is valuable to policymakers considering the impacts of health diplomacy by utilizing supply chain management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach, based on the engineering machine/system concept, to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique that will be referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). The CO2 flux data observed from some sites of AmeriFlux are used to illustrate and validate the analysis, prediction and globalization capabilities of the proposed engineering approach and the SF-ANOVA technology. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks. The comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
NASA Technical Reports Server (NTRS)
Welch, Bryan W.
2016-01-01
NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts toward demonstrating the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution space approach is utilized. The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that initiative, which increases in complexity and fidelity with each phase, is based on a purely geometrically derived access technique. The first phase of analysis has been completed, and the results are documented in this paper.
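A purely geometric access test of this kind reduces, at its simplest, to checking whether the satellite-to-user line of sight clears the Earth; the sketch below implements that segment-sphere test (antenna beamwidth and link-budget constraints, which a full SSV analysis also applies, are omitted, and the 50 km grazing buffer is a hypothetical choice):

```python
import numpy as np

R_EARTH = 6378.137  # km, equatorial radius

def has_line_of_sight(sat, user, grazing_radius=R_EARTH + 50.0):
    """True when the sat-user segment does not intersect a sphere of
    grazing_radius (Earth plus a hypothetical 50 km atmosphere buffer)."""
    sat, user = np.asarray(sat, float), np.asarray(user, float)
    d = user - sat
    t = -np.dot(sat, d) / np.dot(d, d)      # parameter of closest approach
    t = np.clip(t, 0.0, 1.0)                # restrict to the segment itself
    closest = sat + t * d
    return np.linalg.norm(closest) > grazing_radius

# A GNSS satellite at MEO radius vs users near GEO (illustrative geometry).
gnss = np.array([26_560.0, 0.0, 0.0])           # km, roughly the GPS radius
user_blocked = np.array([-42_164.0, 0.0, 0.0])  # GEO, directly opposite
user_visible = np.array([0.0, 42_164.0, 0.0])   # GEO, off to the side
print(has_line_of_sight(gnss, user_blocked))    # False: Earth in the way
print(has_line_of_sight(gnss, user_visible))    # True
```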
Current trends in satellite based emergency mapping - the need for harmonisation
NASA Astrophysics Data System (ADS)
Voigt, Stefan
2013-04-01
During the past years, the availability and use of satellite image data to support disaster management and humanitarian relief organisations has increased substantially. Automation and data-processing techniques are improving greatly, and the capacity for accessing and processing satellite imagery is getting better globally. More and more global activities, via the internet and through global organisations like the United Nations or the International Charter Space and Major Disasters, engage in the topic, while at the same time more and more national or local centres engage in rapid mapping operations and activities. To make even more effective use of this very positive increase in capacity, whether for the operational provision of analysis results, for fast validation of satellite-derived damage assessments, for better cooperation in the joint inter-agency generation of rapid mapping products, or for general scientific use, rapid mapping results need to be better harmonised, if not standardised. In this presentation, experiences from several years of rapid mapping gained by the DLR Center for Satellite Based Crisis Information (ZKI) within the context of national activities, the International Charter Space and Major Disasters, GMES/Copernicus, etc. are reported. Furthermore, an overview is given on how automation, quality assurance and optimisation can be achieved through standard operating procedures within a rapid mapping workflow. Building on this long-term rapid mapping experience, and on the DLR initiative to set in place an "International Working Group on Satellite Based Emergency Mapping", current trends in rapid mapping are discussed, and thoughts are presented on how the sharing of rapid mapping information can be optimised by harmonising analysis results and data structures. Such a harmonisation of analysis procedures, nomenclatures and representations of data, as well as metadata, is the basis for better cooperation within the global rapid mapping community across local/national, regional/supranational and global scales.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Cunningham, Kevin; Hill, Melissa A.
2013-01-01
Flight test and modeling techniques were developed for efficiently identifying global aerodynamic models that can be used to accurately simulate stall, upset, and recovery on large transport airplanes. The techniques were developed and validated in a high-fidelity fixed-base flight simulator using a wind-tunnel aerodynamic database, realistic sensor characteristics, and a realistic flight deck representative of a large transport aircraft. Results demonstrated that aerodynamic models for stall, upset, and recovery can be identified rapidly and accurately using relatively simple piloted flight test maneuvers. Stall maneuver predictions and comparisons of identified aerodynamic models with data from the underlying simulation aerodynamic database were used to validate the techniques.
Global-Local Analysis and Optimization of a Composite Civil Tilt-Rotor Wing
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masound
1999-01-01
This report gives highlights of an investigation on the design and optimization of a thin composite wing box structure for a civil tilt-rotor aircraft. Two different concepts are considered for the cantilever wing: (a) a thin monolithic skin design, and (b) a thick sandwich skin design. Each concept is examined with three different skin ply patterns based on various combinations of 0, +/-45, and 90 degree plies. The global-local technique is used in the analysis and optimization of the six design models. The global analysis is based on a finite element model of the wing-pylon configuration, while the local analysis uses a uniformly supported plate representing a wing panel. Design allowables include those on vibration frequencies, panel buckling, and material strength. The design optimization problem is formulated as one of minimizing the structural weight subject to strength, stiffness, and dynamic constraints. Six different loading conditions based on three different flight modes are considered in the design optimization. The results of this investigation reveal that of all the loading conditions the one corresponding to the rolling pull-out in the airplane mode is the most stringent. Also, the frequency constraints are found to drive the skin thickness limits, rendering the buckling constraints inactive. The optimum skin ply pattern for the monolithic skin concept is found to be ((0/±45/90/(0/90)_2)_s)_s, while for the sandwich skin concept the optimal ply pattern is found to be ((0/±45/90)_2s)_s.
NASA Technical Reports Server (NTRS)
Merewitz, L.
1973-01-01
The following step-wise procedure for making a benefit-cost analysis of using remote sensing techniques could be used either in the limited context of California water resources, or a context as broad as the making of integrated resource surveys of the entire earth resource complex on a statewide, regional, national, or global basis. (1) Survey all data collection efforts which can be accomplished by remote sensing techniques. (2) Carefully inspect the State of California budget and the Budget of the United States Government to find annual cost of data collection efforts. (3) Decide the extent to which remote sensing can obviate each of the collection efforts. (4) Sum the annual costs of all data collection which can be equivalently accomplished through remote sensing. (5) Decide what additional data could and would be collected through remote sensing. (6) Estimate the value of this information. It is not harmful to do a benefit-cost analysis so long as its severe limitations are recalled and it is supplemented with socio-economic impact studies.
The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2002-01-01
The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation, and ground-based and satellite based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper will discuss the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode; and to flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on the local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.
Park, Hee-Won; In, Gyo; Kim, Jeong-Han; Cho, Byung-Goo; Han, Gyeong-Ho; Chang, Il-Moo
2013-01-01
Discriminating between two herbal medicines (Panax ginseng and Panax quinquefolius), with similar chemical and physical properties but different therapeutic effects, is a very serious and difficult problem. Differentiation between two processed ginseng genera is even more difficult because the characteristics of their appearance are very similar. An ultraperformance liquid chromatography-quadrupole time-of-flight mass spectrometry (UPLC-QTOF MS)-based metabolomic technique was applied for the metabolite profiling of 40 processed P. ginseng and processed P. quinquefolius. Currently known biomarkers such as ginsenoside Rf and F11 have been used for the analysis using the UPLC-photodiode array detector. However, this method was not able to fully discriminate between the two processed ginseng genera. Thus, an optimized UPLC-QTOF-based metabolic profiling method was adapted for the analysis and evaluation of two processed ginseng genera. As a result, all known biomarkers were identified by the proposed metabolomics, and additional potential biomarkers were extracted from the huge amounts of global analysis data. Therefore, it is expected that such metabolomics techniques would be widely applied to the ginseng research field. PMID:24558312
NASA Astrophysics Data System (ADS)
Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.
2017-12-01
Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because precipitation uncertainty propagation through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula was used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived using a Quantile Regression Forests (QRF) technique. The QRF technique utilizes three satellite precipitation products (CMORPH, PERSIANN, and 3B42 (V7)); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. A high-resolution precipitation dataset driven by ground-based observations (SAFRAN), available at 5 km/1 h resolution, is used as reference. Through the QRF blending framework the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures, manifesting as a reduction of error in the simulated hydrologic variables.
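As a rough illustration of the QRF idea (not the authors' implementation): true quantile regression forests (Meinshausen, 2006) weight training responses by leaf membership, but taking quantiles over per-tree predictions, as sketched below with synthetic stand-ins for the satellite and reanalysis predictors, conveys the same notion of ensemble precipitation realizations.

    # Approximate quantile-regression-forest sketch; a crude stand-in for true QRF.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for predictors (satellite rain, reanalysis, soil moisture, elevation)
    X = rng.random((2000, 4))
    y = 10.0 * X[:, 0] + 2.0 * rng.standard_normal(2000)   # synthetic "reference rain"

    forest = RandomForestRegressor(n_estimators=200, min_samples_leaf=5, random_state=0)
    forest.fit(X, y)

    X_new = rng.random((5, 4))
    per_tree = np.stack([tree.predict(X_new) for tree in forest.estimators_])
    # Ensemble "error-adjusted" realizations: quantiles of the predictive spread
    q10, q50, q90 = np.percentile(per_tree, [10, 50, 90], axis=0)
    print(q10, q50, q90)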
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
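A minimal CPU-side sketch of the multiple importance sampling idea for the ambient occlusion integral, combining uniform and cosine-weighted hemisphere sampling with the balance heuristic; the visibility test is a hypothetical stand-in for the screen-space depth comparison, not the paper's shader:

    import numpy as np

    rng = np.random.default_rng(1)

    def visibility(d):
        # Hypothetical stand-in for the screen-space depth-buffer occlusion test.
        return (d[:, 2] > 0.3).astype(float)

    def sample_uniform(n):        # pdf = 1/(2*pi) over the hemisphere
        z, phi = rng.random(n), 2 * np.pi * rng.random(n)
        r = np.sqrt(1.0 - z * z)
        return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

    def sample_cosine(n):         # pdf = cos(theta)/pi over the hemisphere
        u1, phi = rng.random(n), 2 * np.pi * rng.random(n)
        r, z = np.sqrt(u1), np.sqrt(1.0 - u1)
        return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

    pdf_uniform = lambda d: np.full(len(d), 1.0 / (2.0 * np.pi))
    pdf_cosine = lambda d: d[:, 2] / np.pi

    n = 8                         # small budget per strategy: the point of MIS
    estimate = 0.0
    for sampler, pdf in [(sample_uniform, pdf_uniform), (sample_cosine, pdf_cosine)]:
        d = sampler(n)
        f = visibility(d) * d[:, 2] / np.pi              # AO integrand
        w = pdf(d) / (pdf_uniform(d) + pdf_cosine(d))    # balance heuristic weight
        estimate += np.mean(w * f / pdf(d))
    print("ambient occlusion estimate:", estimate)       # analytic value for this toy visibility: 0.91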
Design studies for a technology assessment receiver for global positioning system
NASA Technical Reports Server (NTRS)
Painter, J. H.
1981-01-01
The operational conditions of a radio receiver/microprocessor for the Global Positioning System are studied. Navigation fundamentals and orbit characterization are reviewed. The Global Positioning System is described with emphasis upon signal structure and satellite positioning. Ranging and receiver processing techniques are discussed.
Towards global patterns in the diversity and community structure of ectomycorrhizal fungi.
Tedersoo, Leho; Bahram, Mohammad; Toots, Märt; Diédhiou, Abdala G; Henkel, Terry W; Kjøller, Rasmus; Morris, Melissa H; Nara, Kazuhide; Nouhra, Eduardo; Peay, Kabir G; Põlme, Sergei; Ryberg, Martin; Smith, Matthew E; Kõljalg, Urmas
2012-09-01
Global species richness patterns of soil micro-organisms remain poorly understood compared to those of macro-organisms. We use a global analysis to disentangle the global determinants of diversity and community composition for ectomycorrhizal (EcM) fungi, microbial symbionts that play key roles in plant nutrition in most temperate and many tropical forest ecosystems. Host plant family has the strongest effect on the phylogenetic community composition of fungi, whereas temperature and precipitation mostly affect EcM fungal richness, which peaks in the temperate and boreal forest biomes, contrasting with latitudinal patterns of macro-organisms. Tropical ecosystems experience rapid turnover of organic material and have weak soil stratification, suggesting that poor habitat conditions may contribute to the relatively low richness of EcM fungi, and perhaps other soil biota, in most tropical ecosystems. For EcM fungi, greater evolutionary age and the larger total area of EcM host vegetation may also contribute to the higher diversity in temperate ecosystems. Our results provide useful biogeographic and ecological hypotheses for explaining the distribution of fungi that remain to be tested with next-generation sequencing techniques and relevant soil metadata. © 2012 Blackwell Publishing Ltd.
Marin-Oyaga, Victor A; Salavati, Ali; Houshmand, Sina; Pasha, Ahmed Khurshid; Gharavi, Mohammad; Saboury, Babak; Basu, Sandip; Torigian, Drew A; Alavi, Abass
2015-01-01
Treatment of malignant pleural mesothelioma (MPM) remains very challenging. Assessment of response to treatment is necessary for modifying treatment and using new drugs. Global disease assessment (GDA), implementing image processing methods to extract more information from positron emission tomography (PET) images, may provide reliable information. In this study we show the feasibility of this method of semi-quantification in patients with mesothelioma and compare it with conventional methods. We also present a review of the literature on this topic. Nineteen subjects with histologically proven MPM who had undergone fluorine-18-fluorodeoxyglucose PET/computed tomography ((18)F-FDG PET/CT) before and after treatment were included in this study. An adaptive contrast-oriented thresholding algorithm was used for image analysis and semi-quantification. Metabolic tumor volume (MTV), maximum and mean standardized uptake values (SUVmax, SUVmean), and total lesion glycolysis (TLG) were calculated for each region of interest. The global tumor glycolysis (GTG) was obtained by summing the TLG over all regions. Treatment response was assessed by the European Organisation for Research and Treatment of Cancer (EORTC) criteria and by the changes in GTG. Agreement between the global disease assessment and the conventional method was also determined. In patients with progressive disease based on EORTC criteria, GTG showed an increase of 150.7, but in patients with stable disease or partial response, GTG showed a decrease of 433.1. The SUVmax of patients before treatment was 5.95 (SD: 2.93), and after treatment it increased to 6.38 (SD: 3.19). The overall concordance of the conventional method with the GDA method was 57%. Concordance for progression of disease based on the conventional method was 44%, for stable disease 85%, and for partial response 33%; discordance was 55%, 14%, and 66%, respectively. The adaptive contrast-oriented thresholding algorithm is a promising method to quantify whole-tumor glycolysis in patients with mesothelioma. We were able to assess the total metabolic lesion volume, lesion glycolysis, SUVmax, tumor SUVmean, and GTG for this tumor, and to demonstrate the potential use of this technique in the monitoring of treatment response. More studies comparing this technique with conventional and other global disease assessment methods are needed in order to clarify its role in the assessment of treatment response and prognosis in these patients.
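The GTG bookkeeping reduces to a per-lesion sum; a minimal sketch with hypothetical lesion values:

    # Sketch of global tumor glycolysis (GTG) as the sum of per-lesion TLG,
    # where TLG = MTV x SUVmean. Lesion values below are hypothetical.
    lesions = [
        {"mtv_ml": 42.0, "suv_mean": 3.1},
        {"mtv_ml": 17.5, "suv_mean": 4.6},
        {"mtv_ml":  8.2, "suv_mean": 2.4},
    ]
    tlg = [l["mtv_ml"] * l["suv_mean"] for l in lesions]
    gtg = sum(tlg)
    print(f"per-lesion TLG: {tlg}; GTG = {gtg:.1f}")
    # Treatment response could then be tracked as the percent change in GTG
    # between the baseline and post-treatment scans.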
NASA Astrophysics Data System (ADS)
Klooster, S.; Potter, C. S.; Genovese, V. B.; Gross, P. M.; Kumar, V.; Boriah, S.; Mithal, V.; Castilla-Rubio, J.
2009-12-01
Widely cited forest carbon values from look-up tables and statistical correlations with aboveground biomass have proven inadequate to discern details of national carbon stocks in forest pools. Similarly, global estimates based on biome-average (tropical, temperate, boreal, etc.) carbon measurements are generally insufficient to support REDD incentives (Reductions in Emissions from Deforestation in Developing countries). The NASA-CASA (Carnegie-Ames-Stanford Approach) ecosystem model published by Potter et al. (1999 and 2003) offers several unique advantages for carbon accounting that cannot be provided by conventional inventory techniques. First, CASA uses continuous satellite observations to map land cover status and changes in vegetation on a monthly time interval over the past 25 years. NASA satellites observe areas that are too remote or rugged for conventional inventory-based techniques to measure. Second, CASA estimates both aboveground and belowground pools of carbon in all ecosystems (forests, shrublands, croplands, and rangelands). Carbon storage estimates for forests globally are currently being generated for the Cisco Planetary Skin open collaborative platform (www.planetaryskin.org) in a new series of CASA model runs using the latest input data from the NASA MODIS satellites, from 2000 to the present. We have also developed an approach for the detection of large-scale ecosystem disturbance (LSED) events based on sustained declines in the same satellite greenness data used for CASA modeling. This approach is global in scope, covers more than a decade of observations, and encompasses all potential categories of major ecosystem disturbance (physical, biogenic, and anthropogenic), using advanced methods of data mining and analysis. In addition to quantifying forest areas at various levels of risk for loss of carbon storage capacity, our data mining approaches for LSED events can be adapted to detect and map biophysically unsuitable areas for deforestation worldwide and to develop carbon risk scoring algorithms that can enable large-scale finance for conservation and reforestation efforts globally.
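A minimal sketch of the sustained-decline idea; the flagging rule, threshold, and run length below are illustrative assumptions, not the published CASA/LSED criteria:

    import numpy as np

    def sustained_decline(greenness, baseline, drop=0.1, min_run=12):
        """Flag a large-scale ecosystem disturbance (LSED) candidate when
        greenness stays below (baseline - drop) for min_run consecutive
        time steps. Thresholds here are illustrative placeholders."""
        below = greenness < (baseline - drop)
        run = 0
        for flag in below:
            run = run + 1 if flag else 0
            if run >= min_run:
                return True
        return False

    rng = np.random.default_rng(2)
    series = 0.6 + 0.02 * rng.standard_normal(120)    # 10 years of monthly greenness
    series[60:80] -= 0.2                              # inject a 20-month decline
    print(sustained_decline(series, baseline=0.6))    # True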
Analysis and Optimization of Building Energy Consumption
NASA Astrophysics Data System (ADS)
Chuah, Jun Wei
Energy is one of the most important resources required by modern human society. In 2010, energy expenditures represented 10% of global gross domestic product (GDP). By 2035, global energy consumption is expected to increase by more than 50% from current levels. The increased pace of global energy consumption leads to significant environmental and socioeconomic issues: (i) carbon emissions, from the burning of fossil fuels for energy, contribute to global warming, and (ii) increased energy expenditures lead to reduced standard of living. Efficient use of energy, through energy conservation measures, is an important step toward mitigating these effects. Residential and commercial buildings represent a prime target for energy conservation, comprising 21% of global energy consumption and 40% of the total energy consumption in the United States. This thesis describes techniques for the analysis and optimization of building energy consumption. The thesis focuses on building retrofits and building energy simulation as key areas in building energy optimization and analysis. The thesis first discusses and evaluates building-level renewable energy generation as a solution toward building energy optimization. The thesis next describes a novel heating system, called localized heating. Under localized heating, building occupants are heated individually by directed radiant heaters, resulting in a considerably reduced heated space and significant heating energy savings. To support localized heating, a minimally-intrusive indoor occupant positioning system is described. The thesis then discusses occupant-level sensing (OLS) as the next frontier in building energy optimization. OLS captures the exact environmental conditions faced by each building occupant, using sensors that are carried by all building occupants. The information provided by OLS enables fine-grained optimization for unprecedented levels of energy efficiency and occupant comfort. The thesis also describes a retrofit-oriented building energy simulator, ROBESim, that natively supports building retrofits. ROBESim extends existing building energy simulators by providing a platform for the analysis of novel retrofits, in addition to simulating existing retrofits. Using ROBESim, retrofits can be automatically applied to buildings, obviating the need for users to manually update building characteristics for comparisons between different building retrofits. ROBESim also includes several ease-of-use enhancements to support users of all experience levels.
Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment
Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek
2017-01-01
The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. PMID:29186754
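A minimal sketch of the SVM classification step; the sensor responses are synthetic stand-ins for the e-nose array readings, and the class shift is artificial:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n_per_class, n_sensors = 30, 8
    fresh = rng.normal(0.0, 1.0, (n_per_class, n_sensors))
    spoiled = rng.normal(1.5, 1.0, (n_per_class, n_sensors))   # shifted responses
    X = np.vstack([fresh, spoiled])
    y = np.array([0] * n_per_class + [1] * n_per_class)        # shelf-life class

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())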
Sensitivity Analysis for Coupled Aero-structural Systems
NASA Technical Reports Server (NTRS)
Giunta, Anthony A.
1999-01-01
A novel method has been developed for calculating gradients of aerodynamic force and moment coefficients for an aeroelastic aircraft model. This method uses the Global Sensitivity Equations (GSE) to account for the aero-structural coupling, and a reduced-order modal analysis approach to condense the coupling bandwidth between the aerodynamic and structural models. Parallel computing is applied to reduce the computational expense of the numerous high fidelity aerodynamic analyses needed for the coupled aero-structural system. Good agreement is obtained between aerodynamic force and moment gradients computed with the GSE/modal analysis approach and the same quantities computed using brute-force, computationally expensive, finite difference approximations. A comparison between the computational expense of the GSE/modal analysis method and a pure finite difference approach is presented. These results show that the GSE/modal analysis approach is the more computationally efficient technique if sensitivity analysis is to be performed for two or more aircraft design parameters.
Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)
NASA Astrophysics Data System (ADS)
Hancher, M.
2013-12-01
Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
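A minimal sketch using the Earth Engine Python client; the collection ID, region, dates, and cloud threshold below are illustrative choices, not the processing behind the products described above:

    # Minimal sketch with the Earth Engine Python client (pip install earthengine-api).
    import ee

    ee.Authenticate()   # one-time browser-based authentication
    ee.Initialize()

    # Build a cloud-screened Landsat 8 median composite over a region of interest;
    # Earth Engine evaluates this lazily and in parallel in Google's datacenters.
    roi = ee.Geometry.Rectangle([-122.6, 37.2, -121.8, 37.9])
    composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
                 .filterBounds(roi)
                 .filterDate("2021-01-01", "2022-01-01")
                 .filter(ee.Filter.lt("CLOUD_COVER", 10))
                 .median())

    # Reduce to a regional mean NDVI statistic; computation happens server-side.
    ndvi = composite.normalizedDifference(["B5", "B4"])
    stats = ndvi.reduceRegion(ee.Reducer.mean(), roi, scale=30)
    print(stats.getInfo())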
An analysis of the impact of LHC Run I proton-lead data on nuclear parton densities.
Armesto, Néstor; Paukkunen, Hannu; Penín, José Manuel; Salgado, Carlos A; Zurita, Pía
2016-01-01
We report on an analysis of the impact of available experimental data on hard processes in proton-lead collisions during Run I at the Large Hadron Collider on nuclear modifications of parton distribution functions. Our analysis is restricted to the EPS09 and DSSZ global fits. The measurements that we consider comprise the production of massive gauge bosons, jets, charged hadrons, and pions. This is the first time a study of nuclear PDFs includes this number of different observables. The goal of the paper is twofold: (i) checking the description of the data by nPDFs, as well as the relevance of these nuclear effects, in a quantitative manner; and (ii) testing the constraining power of these data in eventual global fits, for which we use the Bayesian reweighting technique. We find an overall good, even too good, description of the data, indicating that more constraining power would require better control over the systematic uncertainties and/or a proper proton-proton reference from LHC Run II. Some of the observables, however, show sizeable tension with specific choices of proton and nuclear PDFs. We also comment on the corresponding improvements as regards the theoretical treatment.
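A minimal sketch of the Bayesian reweighting step with one simple likelihood-based weight choice (variants including a (chi^2)^((n-1)/2) prefactor appear in the literature); the chi-square values and replica-level predictions below are synthetic, not the paper's fits:

    import numpy as np

    rng = np.random.default_rng(4)
    n_rep, n_data = 1000, 20
    chi2 = rng.chisquare(n_data, size=n_rep)          # chi^2 of each replica vs new data
    observable = rng.normal(1.0, 0.1, size=n_rep)     # some replica-level prediction

    # Weights proportional to the likelihood of each replica given the new data.
    w = np.exp(-0.5 * (chi2 - chi2.min()))            # subtract min for numerical stability
    w *= n_rep / w.sum()                              # normalize so sum(w) = n_rep

    new_mean = np.average(observable, weights=w)
    # Effective number of replicas (Shannon-entropy form): gauges the
    # constraining power of the data; a small N_eff means a strong pull.
    n_eff = np.exp(np.sum(w * np.log(n_rep / w)) / n_rep)
    print(f"reweighted mean: {new_mean:.3f}, N_eff = {n_eff:.0f}")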
Bednarkiewicz, Artur; Whelan, Maurice P
2008-01-01
Fluorescence lifetime imaging (FLIM) is very demanding from a technical and computational perspective, and the output is usually a compromise between acquisition/processing time and data accuracy and precision. We present a new approach to acquisition, analysis, and reconstruction of microscopic FLIM images by employing a digital micromirror device (DMD) as a spatial illuminator. In the first step, the whole field fluorescence image is collected by a color charge-coupled device (CCD) camera. Further qualitative spectral analysis and sample segmentation are performed to spatially distinguish between spectrally different regions on the sample. Next, the fluorescence of the sample is excited segment by segment, and fluorescence lifetimes are acquired with a photon counting technique. FLIM image reconstruction is performed by either raster scanning the sample or by directly accessing specific regions of interest. The unique features of the DMD illuminator allow the rapid on-line measurement of global good initial parameters (GIP), which are supplied to the first iteration of the fitting algorithm. As a consequence, a decrease of the computation time required to obtain a satisfactory quality-of-fit is achieved without compromising the accuracy and precision of the lifetime measurements.
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
More than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization globally, so GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, which have been grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service of the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information will be released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
The Global Precipitation Climatology Project (GPCP): Results, Status and Future
NASA Technical Reports Server (NTRS)
Adler, Robert F.
2007-01-01
The Global Precipitation Climatology Project (GPCP) is one of a number of long-term, satellite-based global analyses routinely produced under the auspices of the World Climate Research Program (WCRP) and its Global Energy and Water cycle EXperiment (GEWEX) program. The research-quality analyses are produced a few months after real time through the efforts of scientists at various national agencies and universities in the U.S., Europe, and Japan. The primary product is a monthly analysis of surface precipitation that is globally complete and spans the period from 1979 to the present. There are also pentad analyses for the same period and a daily analysis for the period from 1997 to the present. Although generated with somewhat different data sets and analysis schemes, the pentad and daily data sets are forced to agree with the primary monthly analysis on a grid-box-by-grid-box basis. The primary input data sets are low-orbit passive microwave observations, geostationary infrared observations, and surface rain gauge information. Examples of research with the data sets are discussed, focusing on tropical (25N-25S) rainfall variations and possible long-term changes in the 28-year (1979-2006) monthly dataset. Techniques are used to discriminate among the variations due to ENSO, volcanic events, and possible long-term changes in rainfall over both land and ocean. The impact of the two major volcanic eruptions of the past 25 years is estimated to be about a 5% maximum reduction in tropical rainfall during each event. Although the global change of precipitation in the data set is near zero, a small upward linear change over tropical ocean (0.06 mm/day/10yr) and a slight downward linear change over tropical land (-0.03 mm/day/10yr) are examined to understand the impact of the inhomogeneity in the data record and the length of the data set. These changes correspond to about a 5% increase (ocean) and 3% increase (ocean plus land) during this time period. Relations between variations in surface temperature and precipitation are analyzed on seasonal to inter-decadal time scales. A new version 3 of GPCP is being planned to incorporate new satellite information (e.g., TRMM) and provide higher spatial and temporal resolution for at least part of the data record. The goals and plans for that GPCP reprocessing will be outlined.
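For reference, linear changes of the kind quoted above are ordinary least-squares trends on monthly rain-rate anomalies; a minimal sketch on a synthetic series (not GPCP data):

    # Least-squares linear change estimate on a monthly rainfall anomaly series,
    # expressed in mm/day per decade; the series below is synthetic.
    import numpy as np

    rng = np.random.default_rng(5)
    months = np.arange(28 * 12)                         # 1979-2006, monthly
    t_years = months / 12.0
    anomaly = 0.006 * t_years + 0.3 * rng.standard_normal(months.size)  # mm/day

    slope, intercept = np.polyfit(t_years, anomaly, 1)  # slope in mm/day per year
    print(f"linear change: {10 * slope:+.3f} mm/day per decade")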
Mixed models and reduction method for dynamic analysis of anisotropic shells
NASA Technical Reports Server (NTRS)
Noor, A. K.; Peters, J. M.
1985-01-01
A time-domain computational procedure is presented for predicting the dynamic response of laminated anisotropic shells. The two key elements of the procedure are: (1) use of mixed finite element models having independent interpolation (shape) functions for stress resultants and generalized displacements for the spatial discretization of the shell, with the stress resultants allowed to be discontinuous at interelement boundaries; and (2) use of a dynamic reduction method, with the global approximation vectors consisting of the static solution and an orthogonal set of Lanczos vectors. The dynamic reduction is accomplished by means of successive application of the finite element method and the classical Rayleigh-Ritz technique. The finite element method is first used to generate the global approximation vectors. Then the Rayleigh-Ritz technique is used to generate a reduced system of ordinary differential equations in the amplitudes of these modes. The temporal integration of the reduced differential equations is performed by using an explicit half-station central difference scheme (Leap-frog method). The effectiveness of the proposed procedure is demonstrated by means of a numerical example and its advantages over reduction methods used with the displacement formulation are discussed.
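A toy sketch of the reduction-plus-leap-frog idea on a small undamped spring-mass chain; for brevity, a few low vibration modes stand in for the paper's basis of the static solution plus Lanczos vectors, and no mixed finite element model is built:

    import numpy as np
    from scipy.linalg import eigh

    n = 20
    M = np.eye(n)
    K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # fixed-fixed chain stiffness
    f = np.zeros(n); f[n // 2] = 1.0                       # point load at midspan

    # Reduction basis: lowest m vibration modes (Rayleigh-Ritz projection)
    m = 4
    vals, vecs = eigh(K, M)
    Phi = vecs[:, :m]
    Mr, Kr, fr = Phi.T @ M @ Phi, Phi.T @ K @ Phi, Phi.T @ f

    # Explicit central-difference (leap-frog) integration of the reduced system,
    # started from rest; a production code would use a consistent startup step.
    dt = 0.01
    q_prev = np.zeros(m)
    q = np.zeros(m)
    for step in range(2000):
        acc = np.linalg.solve(Mr, fr - Kr @ q)
        q_next = 2 * q - q_prev + dt**2 * acc
        q_prev, q = q, q_next

    u = Phi @ q                                            # recover physical response
    print("midspan displacement:", u[n // 2])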
Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic
NASA Astrophysics Data System (ADS)
Li, Yan; Dai, Shifang; Wu, Weiwei
2016-12-01
Recently, with the soaring traffic among optical network units (ONUs), network coding (NC) has become an appealing technique for improving the performance of passive optical networks (PONs) with such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established, and this passive, uncertain waiting severely limits the effectiveness of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution; the performance of PONs with inter-ONU traffic can thereby be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
The Global Signature of Ocean Wave Spectra
NASA Astrophysics Data System (ADS)
Portilla-Yandún, Jesús
2018-01-01
A global atlas of ocean wave spectra is developed and presented. The development is based on a new technique for deriving wave spectral statistics, which is applied to the extensive ERA-Interim database from the European Centre for Medium-Range Weather Forecasts. The spectral statistics are based on the idea of long-term wave systems, which are unique and distinct at every geographical point. The identification of those wave systems allows their separation from the overall spectrum using the partition technique. Their further characterization is made using standard integrated parameters, which turn out to be much more meaningful when applied to the individual components than to the total spectrum. The parameters developed include the density distribution of spectral partitions, which is the main descriptor; the identified wave systems; the individual distributions of the characteristic frequencies, directions, wave height, and wave age; the seasonal variability of wind and waves; return periods derived from extreme value analysis; and crossing-sea probabilities. This information is made available in web format for public use at http://www.modemat.epn.edu.ec/#/nereo. It is found that wave spectral statistics offers the possibility to synthesize data while providing a direct and comprehensive view of local and regional wave conditions.
Global Interdependence--Knocking the World We Know Off Its Axis.
ERIC Educational Resources Information Center
Hamilton, John Maxwell; Roberts, Lesley
1989-01-01
Discusses the need for teaching about global interdependence. Points out that nearly every dimension of life in the United States shows proliferating connections to other nations. Describes techniques for finding global links and appreciating their importance. Notes that information emanating from the school will increase general knowledge…
Global existence of the three-dimensional viscous quantum magnetohydrodynamic model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Jianwei, E-mail: yangjianwei@ncwu.edu.cn; Ju, Qiangchang, E-mail: qiangchang-ju@yahoo.com
2014-08-15
The global-in-time existence of weak solutions to the viscous quantum magnetohydrodynamic equations in a three-dimensional torus with large data is proved, using the Faedo-Galerkin method and weak compactness techniques.
Bühlmann, Andreas; Dreo, Tanja; Rezzonico, Fabio; Pothier, Joël F; Smits, Theo H M; Ravnikar, Maja; Frey, Jürg E; Duffy, Brion
2014-07-01
Erwinia amylovora causes a major disease of pome fruit trees worldwide, and is regulated as a quarantine organism in many countries. While some diversity of isolates has been observed, molecular epidemiology of this bacterium is hindered by a lack of simple molecular typing techniques with sufficiently high resolution. We report a molecular typing system for E. amylovora based on variable number of tandem repeats (VNTR) analysis. Repeats in the E. amylovora genome were identified with comparative genomic tools, and VNTR markers were developed and validated. A Multiple-Locus VNTR Analysis (MLVA) was applied to E. amylovora isolates from bacterial collections representing the global and regional distribution of the pathogen. Based on six repeats, MLVA allowed the distinction of 227 haplotypes among a collection of 833 isolates of worldwide origin. Three geographically separated groups were recognized among global isolates using Bayesian clustering methods. Analysis of regional outbreaks confirmed the presence of diverse haplotypes but also the high representation of certain haplotypes during outbreaks. MLVA is a practical method for epidemiological studies of E. amylovora, identifying previously unresolved population structure within outbreaks. Knowledge of such structure can increase our understanding of how plant diseases emerge and spread over a given geographical region. © 2013 Society for Applied Microbiology and John Wiley & Sons Ltd.
Refinement of Earth's gravity field with Topex GPS measurements
NASA Technical Reports Server (NTRS)
Wu, Sien-Chong; Wu, Jiun-Tsong
1989-01-01
The NASA Ocean Topography Experiment satellite TOPEX will carry a microwave altimeter accurate to a few centimeters for the measurement of ocean height. This capability can be fully exploited only if the TOPEX altitude can be independently determined to 15 cm or better, which in turn requires an accurate gravity model. The gravity model will be tuned with nine selected 10-day arcs of laser ranging, which will be the baseline tracking data type, collected in the first six months of TOPEX flight. TOPEX will also carry an onboard experimental Global Positioning System (GPS) flight receiver capable of simultaneously observing six GPS satellites above its horizon, to demonstrate the capability of GPS carrier phase and P-code pseudorange for precise determination of the TOPEX orbit. It was found that subdecimeter orbit accuracy can be achieved with a mere two-hour arc of GPS tracking data, provided that simultaneous measurements are also made at six or more ground tracking sites. The precision GPS data from TOPEX are also valuable for refining the gravity model. An efficient technique is presented for gravity tuning using GPS measurements. Unlike conventional global gravity tuning, this technique solves for far fewer gravity parameters in each filter run. These gravity parameters yield local gravity anomalies which can later be combined with solutions over other parts of the earth to generate a global gravity map. No supercomputing power will be needed for such combining. The approaches used in this study are described and preliminary results of a covariance analysis are presented.
Wang, Xiaotong; Liu, Jing; Yang, Xiaomei; Zhang, Qian; Zhang, Yiwen; Li, Qing; Bi, Kaishun
2018-03-30
To rapidly identify and classify the complicated components and metabolites of traditional Chinese medicines, a liquid chromatography-quadrupole time-of-flight mass spectrometry method combined with multiple data-processing approaches was established. In this process, Kai-Xin-San, a widely used classic traditional Chinese medicine preparation, was chosen as a model prescription. Initially, the fragmentation patterns, diagnostic product ions, and neutral losses of each category of compounds were summarized by collision-induced dissociation analysis of representative standards. In vitro, the multiple product ion filtering technique was utilized to identify the chemical constituents, globally covering trace components. With this strategy, 108 constituents were identified, and a compound database was successfully established. In vivo, the prototype compounds were extracted based on the established database, and the neutral loss filtering technique combined with drug metabolism reaction rules was employed to identify metabolites. Overall, 69 constituents, including prototypes and metabolites, were characterized in rat plasma, and nine constituents were characterized in rat brain for the first time; these may be the potential active constituents producing curative effects through synergistic interaction. In conclusion, this study provides a generally applicable strategy for global metabolite identification of complicated components in complex matrices, and a chemical basis for further pharmacological research on Kai-Xin-San. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger, or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf, or they used tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root-cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.
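The equivalence-class idea reduces to grouping tasks by a classification parameter such as the stack-trace signature and attaching a traditional debugger to one representative per class; a minimal sketch with illustrative traces, not STAT's actual data structures:

    from collections import defaultdict

    # Illustrative per-rank stack-trace signatures (placeholders).
    traces = {
        0: ("main", "solve", "mpi_allreduce"),
        1: ("main", "solve", "mpi_allreduce"),
        2: ("main", "io_write"),                 # the odd one out
        3: ("main", "solve", "mpi_allreduce"),
    }

    # Group ranks into equivalence classes by identical trace signature.
    classes = defaultdict(list)
    for rank, trace in traces.items():
        classes[trace].append(rank)

    for trace, ranks in classes.items():
        rep = ranks[0]    # sample one debug target per class
        print(f"{len(ranks)} rank(s) at {' > '.join(trace)}; attach debugger to rank {rep}")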
NASA Technical Reports Server (NTRS)
Englander, Arnold C.; Englander, Jacob A.
2017-01-01
Interplanetary trajectory optimization problems are highly complex and are characterized by a large number of decision variables and equality and inequality constraints, as well as many locally optimal solutions. Stochastic global search techniques, coupled with a large-scale NLP solver, have been shown to solve such problems but are inadequately robust when the problem constraints become very complex. In this work, we present a novel search algorithm that takes advantage of the fact that equality constraints effectively collapse the solution space to lower dimensionality. This new approach walks the "filament" of feasibility to efficiently find the globally optimal solution.
Exploring New Pathways in Precipitation Assimilation
NASA Technical Reports Server (NTRS)
Hou, Arthur; Zhang, Sara Q.
2004-01-01
Precipitation assimilation poses a special challenge in that the forward model for rain in a global forecast system is based on parameterized physics, which can have large systematic errors that must be rectified to use precipitation data effectively within a standard statistical analysis framework. We examine some key issues in precipitation assimilation and describe several exploratory studies in assimilating rainfall and latent heating information in NASA's global data assimilation systems using the forecast model as a weak constraint. We present results from two research activities. The first is the assimilation of surface rainfall data using time-continuous variational assimilation based on a column model of the full moist physics. The second is the assimilation of convective and stratiform latent heating retrievals from microwave sensors using a variational technique with physical parameters in the moist physics schemes as a control variable. We show the impact of assimilating these data on analyses and forecasts. Among the lessons learned are (1) that the time-continuous application of moisture/temperature tendency corrections to mitigate model deficiencies offers an effective strategy for assimilating precipitation information, and (2) that the model prognostic variables must be allowed to directly respond to an improved rain and latent heating field within an analysis cycle to reap the full benefit of assimilating precipitation information. Looking to the future, we discuss new research directions, including the assimilation of microwave radiances versus retrieval information in raining areas, and initial efforts in developing ensemble techniques such as the Kalman filter/smoother for precipitation assimilation.
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as chemical transport models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota, a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating hourly ozone concentrations. The computational domains consisted of 120x120x32, 121x121x32 and 121x121x32 grid points with horizontal resolutions of 27, 9 and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP's Final Analysis (FNL) at 1°x1° (~111 km x 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric (RETRO) chemical composition and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques were used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulation errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can serve as a reference for future air quality modeling exercises over Bogota and other Colombian cities.
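A minimal sketch of the multiple-linear-regression downscaling step, with synthetic arrays standing in for the WRF-Chem output and the monitor observations:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(6)
    n_hours = 500
    model_o3 = rng.normal(40, 10, n_hours)          # coarse WRF-Chem ozone (ppb)
    temp = rng.normal(15, 5, n_hours)               # predictors at the station
    wind = rng.normal(3, 1, n_hours)
    obs_o3 = 0.8 * model_o3 + 0.5 * temp - 2.0 * wind + rng.normal(0, 4, n_hours)

    # Regress station observations on collocated model output plus meteorology.
    X = np.column_stack([model_o3, temp, wind])
    reg = LinearRegression().fit(X, obs_o3)

    raw_rmse = np.sqrt(np.mean((model_o3 - obs_o3) ** 2))
    ds_rmse = np.sqrt(np.mean((reg.predict(X) - obs_o3) ** 2))
    print(f"RMSE raw: {raw_rmse:.1f} ppb, downscaled: {ds_rmse:.1f} ppb")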
On the potential of Galileo E5 for time transfer.
Martínez-Belda, Mari Carmen; Defraigne, Pascale; Bruyninx, Carine
2013-01-01
The main global navigation satellite system (GNSS) technique currently used for accurate time and frequency transfer is based on an analysis of the ionosphere-free combinations of dual-frequency code and carrier phase measurements in precise point positioning (PPP) mode. This technique analyses the observations of one GNSS station, using external products for satellite clocks and orbits, to determine the position and clock synchronization errors of that station. The frequency stability of this time transfer is limited by the noise and multipath of the Global Positioning System (GPS) and Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS) codes. In the near future, Galileo will offer a broadband signal, E5, with low noise in the centimeter range and with the lowest multipath error ever observed. This paper investigates new analysis procedures based on the E5 code-plus-carrier (CPC) combination for time transfer. The CPC combination with E5 provides a noise level 10 times lower than the ionosphere-free combination of Galileo E1 and E5, which is very promising for improving GNSS time transfer performance. From tests with simulated Galileo data, it is shown here that the use of the CPC combination with E5 does not at present improve the medium- and long-term stability of time transfer with respect to the ionosphere-free combination of Galileo E1 and E5 codes, because of the need for a second frequency signal to correct for the ionospheric delays and ambiguities.
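As a hedged sketch of the two combinations discussed, in standard GNSS notation (first-order ionosphere only; not the authors' exact formulation): the ionosphere-free code combination of frequencies f1 and f2 is

    P_{\mathrm{IF}} = \frac{f_1^2 P_1 - f_2^2 P_2}{f_1^2 - f_2^2},

while the code-plus-carrier combination on a single frequency exploits the opposite signs of the ionospheric term in code and phase. Writing the code as P = \rho + I + \varepsilon_P and the carrier phase in range units as L = \rho - I + \lambda N + \varepsilon_L,

    \mathrm{CPC} = \frac{P + L}{2} = \rho + \frac{\lambda N}{2} + \frac{\varepsilon_P + \varepsilon_L}{2},

so the first-order ionospheric delay cancels and the code noise is halved, but a floating ambiguity term \lambda N / 2 remains; this residual ambiguity, together with the ionospheric correction, is consistent with the abstract's finding that a second frequency signal is still needed.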
Visualization of Heart Sounds and Motion Using Multichannel Sensor
NASA Astrophysics Data System (ADS)
Nogata, Fumio; Yokota, Yasunari; Kawamura, Yoko
2010-06-01
As there are various difficulties associated with auscultation techniques, we have devised a technique for visualizing heart motion in order to assist in the understanding of the heartbeat for both doctors and patients. Auscultatory sounds were first visualized using FFT and wavelet analysis. Next, to show global and simultaneous heart motion, a new visualization technique was established. The visualization system consists of a 64-channel unit (63 acceleration sensors and one ECG sensor) and a signal/image analysis unit. The acceleration sensors were arranged in a square array (8×8) with a 20-mm pitch interval, which was adhered to the chest surface. The heart motion of one cycle was visualized at a sampling frequency of 3 kHz and a quantization of 12 bits. The visualized results showed the typical waveform motion of the strong pressure shock due to the closing of the tricuspid and mitral valves at the cardiac apex (first sound), followed by the closing of the aortic and pulmonic valves (second sound). To overcome difficulties in auscultation, the system can be applied to the detection of heart disease and to the digital database management of auscultation examinations in medical settings.
Analysis of Multi-Antenna GNSS Receiver Performance under Jamming Attacks.
Vagle, Niranjana; Broumandan, Ali; Lachapelle, Gérard
2016-11-17
Although antenna array-based Global Navigation Satellite System (GNSS) receivers can be used to mitigate both narrowband and wideband electronic interference sources, the measurement distortions induced by array processing methods make them unsuitable for high-precision applications. The measurement distortions have an adverse effect on carrier phase ambiguity resolution, affecting the navigation solution. Depending on the availability of array attitude information and calibration parameters, different spatial processing methods can be implemented, although they distort carrier phase measurements in some cases. This paper provides a detailed investigation of the effect of different array processing techniques on array-based GNSS receiver measurements and navigation performance. The main novelty of the paper is a thorough analysis of array-based GNSS receivers employing different beamforming techniques, from tracking to the navigation solution. Two beamforming techniques, namely Power Minimization (PM) and Minimum Power Distortionless Response (MPDR), are investigated. In the tracking domain, the carrier Doppler, Phase Lock Indicator (PLI), and Carrier-to-Noise Ratio (C/N₀) are analyzed. Pseudorange and carrier phase measurement distortions and carrier phase position performance are also evaluated. Performance analysis results from simulated GNSS signals and field tests are provided.
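A minimal numpy sketch of the two beamformers under the usual narrowband model; the covariance matrix and steering vectors are synthetic, and element 0 is assumed to be the PM reference antenna:

    import numpy as np

    rng = np.random.default_rng(7)
    n_ant = 4
    # Synthetic array covariance: unit noise plus one strong interferer
    jam = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(0.5))
    R = np.eye(n_ant) + 100.0 * np.outer(jam, jam.conj())

    # Power Minimization (PM): minimize w^H R w subject to w[0] = 1.
    # Needs no satellite direction knowledge: w = R^-1 e / (e^H R^-1 e).
    e = np.zeros(n_ant); e[0] = 1.0
    Rinv_e = np.linalg.solve(R, e)
    w_pm = Rinv_e / (e @ Rinv_e)

    # MPDR: minimize output power subject to a distortionless response toward
    # the satellite steering vector a: w = R^-1 a / (a^H R^-1 a).
    a = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(-0.2))
    Rinv_a = np.linalg.solve(R, a)
    w_mpdr = Rinv_a / (a.conj() @ Rinv_a)

    for name, w in [("PM", w_pm), ("MPDR", w_mpdr)]:
        print(name, "output power:", np.real(w.conj() @ R @ w))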
NASA Astrophysics Data System (ADS)
Vergados, P.; Mannucci, A. J.; Ao, C. O.; Verkhoglyadova, O. P.; Iijima, B.
2017-12-01
This presentation introduces the fundamentals of the Global Positioning System radio occultation (GPS RO) remote sensing technique for retrieving atmospheric temperature and humidity information, and presents the use of these observations in climate research. Our objective is to demonstrate and establish the GPS RO remote sensing technique as a complementary data set to existing state-of-the-art space-based platforms for climate studies. We show how GPS RO measurements in the 1.2-1.6 GHz frequency band can be used to infer the upper-tropospheric water vapor and temperature feedbacks, and we present a decade-long specific humidity (SH) record from January 2007 through December 2015. We cross-compare the GPS RO-estimated climate feedbacks and the long-term SH record with independent data sets from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), the European Center for Medium-range Weather Forecasts Re-Analysis Interim (ERA-Interim), and the Atmospheric Infrared Sounder (AIRS) instrument. These cross-comparisons serve as a performance guide for the GPS RO observations with respect to other data sets by providing an independent measure of climate feedbacks and short-term humidity trends.
NASA Astrophysics Data System (ADS)
Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.
2004-08-01
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty first century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of individual and combinational effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help in addressing JBI analysis challenges. The DIEMS team has been tasked utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.
Segmentation of the Speaker's Face Region with Audiovisual Correlation
NASA Astrophysics Data System (ADS)
Liu, Yuyu; Sato, Yoichi
The ability to find the speaker's face region in a video is useful for various applications. In this work, we develop a novel technique to find this region within different time windows that is robust against changes of view, scale, and background. The main thrust of our technique is to integrate audiovisual correlation analysis into a video segmentation framework. We analyze the audiovisual correlation locally by computing the quadratic mutual information between our audiovisual features. The computation of quadratic mutual information is based on probability density functions estimated by kernel density estimation with adaptive kernel bandwidth. The results of this audiovisual correlation analysis are incorporated into graph-cut-based video segmentation to obtain a globally optimal extraction of the speaker's face region. The setting of any heuristic threshold in this segmentation is avoided by learning the correlation distributions of speaker and background via expectation maximization. Experimental results demonstrate that our method can detect the speaker's face region accurately and robustly for different views, scales, and backgrounds.
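For intuition, a discrete histogram analogue of quadratic mutual information; the paper uses kernel density estimates with adaptive bandwidth, so this binned version is only illustrative:

    import numpy as np

    rng = np.random.default_rng(8)
    n = 5000
    audio = rng.standard_normal(n)
    visual = 0.7 * audio + 0.3 * rng.standard_normal(n)   # correlated, as in a face region

    pxy, _, _ = np.histogram2d(audio, visual, bins=16)
    pxy = pxy / pxy.sum()                       # joint probability table
    px = pxy.sum(axis=1, keepdims=True)         # marginals
    py = pxy.sum(axis=0, keepdims=True)

    # Quadratic MI: sum over bins of (P(x,y) - P(x)P(y))^2; zero iff independent
    qmi = np.sum((pxy - px * py) ** 2)
    print("quadratic mutual information:", qmi)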
Spatially distributed modal signals of free shallow membrane shell structronic system
NASA Astrophysics Data System (ADS)
Yue, H. H.; Deng, Z. Q.; Tzou, H. S.
2008-11-01
Based on the smart material and structronics technology, distributed sensor and control of shell structures have been rapidly developed for the last 20 years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring technique and its resulting global spatially distributed sensing signals of shallow paraboloidal membrane shells are not clearly understood. In this paper, modeling of free flexible paraboloidal shell with spatially distributed sensor, micro-sensing signal characteristics, and location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.
Spatial Signal Characteristics of Shallow Paraboloidal Shell Structronic Systems
NASA Astrophysics Data System (ADS)
Yue, H. H.; Deng, Z. Q.; Tzou, H. S.
Based on the smart material and structronics technology, distributed sensor and control of shell structures have been rapidly developed for the last twenty years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring technique and its resulting global spatially distributed sensing signals of thin flexible membrane shells are not clearly understood. In this paper, modeling of free thin paraboloidal shell with spatially distributed sensor, micro-sensing signal characteristics, and location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.
Secondary ion mass spectrometry: The application in the analysis of atmospheric particulate matter
Huang, Di; Hua, Xin; Xiu, Guang-Li; ...
2017-07-24
Currently, considerable attention has been paid to the investigation of atmospheric particulate matter (PM) due to its importance to human health and global climate change. Surface characterization, single-particle analysis, and depth profiling of PM are important for a better understanding of its formation processes and for predicting its impact on the environment and human beings. Secondary ion mass spectrometry (SIMS) is a surface technique with high surface sensitivity, high-spatial-resolution chemical imaging, and unique depth profiling capabilities. Recent research shows that SIMS has great potential for analyzing both surface and bulk chemical information of PM. In this review, we give a brief introduction to the SIMS working principle and survey recent applications of SIMS in PM characterization. In particular, analyses of different types of PM sources by various SIMS techniques are discussed concerning their advantages and limitations. Finally, we discuss the future development and needs of SIMS in atmospheric aerosol measurement, with a perspective on broader environmental sciences.
Kappagantu, Madhu; Villamor, Dan Edward V; Bullock, Jeff M; Eastwell, Kenneth C
2017-07-01
Hop stunt disease caused by Hop stunt viroid (HSVd) is a growing threat to hop cultivation globally. HSVd spreads mainly by use of contaminated planting material and by mechanical means. Thorough testing of hop yards and removal of infected bines are critical components of efforts to control the spread of the disease. Reverse transcription-polymerase chain reaction (RT-PCR) has become the primary technique used for HSVd detection; however, sample handling and analysis are technically challenging. In this study, a robust reverse transcription-recombinase polymerase amplification (RT-RPA) assay was developed to facilitate analysis of multiple samples. The assay was optimized with all major variants of HSVd from other host species in addition to hop variants. Used in conjunction with sample collection cards, RT-RPA accommodates large sample numbers. Greenhouse and farm samples tested with RT-RPA were also tested with RT-PCR and a 100% correlation between the two techniques was found. Copyright © 2017. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Mogk, D. W.
2002-12-01
One of the hallmarks of geoscience research is the process of moving between observations and interpretations on local and global scales to develop an integrated understanding of Earth processes. Understanding this interplay is an important aspect of student geoscience learning which leads to an understanding of the fundamental principles of science and geoscience and of the connections between local natural phenomena or human activity and global processes. Several techniques that engage students in inquiry and discovery (as recommended in the National Science Education Standards, NRC 1996, Shaping the Future of Undergraduate Earth Science Education, AGU, 1997) hold promise for helping students make these connections. These include the development of global data sets from local observations (e.g. GLOBE); studying small-scale or local phenomena in the context of global models (e.g. carbon storage in local vegetation and its role in the carbon cycle); or an analysis of local environmental issues in a global context (e.g. a comparison of local flooding to flooding in other countries and analysis in the context of weather, geology and development patterns). Research on learning suggests that data-rich activities linking the local and global have excellent potential for enhancing student learning because 1) students have already developed observations and interpretations of their local environment which can serve as a starting point for constructing new knowledge and 2) this context may motivate learning and develop understanding that can be transferred to other situations. (How People Learn, NRC, 2001). Faculty and teachers at two recent workshops confirm that projects that involve local or global data can engage students in learning by providing real world context, creating student ownership of the learning process, and developing scientific skills applicable to the complex problems that characterize modern science and society. Workshop participants called for increased dissemination of examples of effective practice, evaluation of the impact of data-rich activities on learning, and further development of data access infrastructure and services. (for additional workshop results and discussion see http://serc.carleton.edu/research_education/usingdata)
NASA Technical Reports Server (NTRS)
Shum, C. K.
1999-01-01
The Earth's modern climate change has been characterized by interlinked changes in temperature, CO2, ice sheets and sea level. Global sea level change is a critical indicator for the study of contemporary climate change. Sea level rise appears to have accelerated since the major ice sheet retreats stopped some 5000 years ago, and it is estimated that sea level has risen approximately 15 cm over the last century. Contemporary radar altimeters represent the only technique capable of monitoring global sea level change with an accuracy approaching 1 mm/yr and with a temporal scale of days and a spatial scale of 100 km or longer. This report highlights the major accomplishments of the TOPEX/POSEIDON (T/P) Extended Mission and Jason-1 science investigation. The primary objectives of the investigation include the calibration and improvement of T/P and Jason-1 altimeter data for global sea level change and coastal tide and circulation studies. The scientific objectives of the investigation include: (1) the calibration and improvement of T/P and Jason-1 data as a reference measurement system for accurate cross-linking with other altimeter systems (Seasat, Geosat, ERS-1, ERS-2, GFO-1, and Envisat), (2) the improved determination, with associated uncertainties, of the long-term (15-year) global mean sea level change using multiple altimeters, (3) the characterization of sea level change by analyses of independent data, including tide gauges and sea surface temperature, and (4) the improvement of coastal radar altimetry for studies including coastal ocean tide modeling and coastal circulation. Major accomplishments of the investigation include the development of techniques for low-cost radar altimeter absolute calibration (including the associated GPS-buoy technology), coastal ocean tide modeling, and the linking of multiple altimeter systems with the resulting determination of the 15-year (1985-1999) global mean sea level variations. The current rate of 15-year sea level rise observed by multiple satellite altimetry is +2.3 +/- 1.2 mm/yr, which is in general agreement with the analysis of sparsely distributed tide gauge measurements for the same data span, and represents the first determination of sea level change of its kind.
Precise tracking of remote sensing satellites with the Global Positioning System
NASA Technical Reports Server (NTRS)
Yunck, Thomas P.; Wu, Sien-Chong; Wu, Jiun-Tsong; Thornton, Catherine L.
1990-01-01
The Global Positioning System (GPS) can be applied in a number of ways to track remote sensing satellites at altitudes below 3000 km with accuracies of better than 10 cm. All techniques use a precise global network of GPS ground receivers operating in concert with a receiver aboard the user satellite, and all estimate the user orbit, GPS orbits, and selected ground locations simultaneously. The GPS orbit solutions are always dynamic, relying on the laws of motion, while the user orbit solution can range from purely dynamic to purely kinematic (geometric). Two variations show considerable promise. The first one features an optimal synthesis of dynamics and kinematics in the user solution, while the second introduces a novel gravity model adjustment technique to exploit data from repeat ground tracks. These techniques, to be demonstrated on the Topex/Poseidon mission in 1992, will offer subdecimeter tracking accuracy for dynamically unpredictable satellites down to the lowest orbital altitudes.
NASA Astrophysics Data System (ADS)
Vaz, Miguel; Luersen, Marco A.; Muñoz-Rojas, Pablo A.; Trentin, Robson G.
2016-04-01
Application of optimization techniques to the identification of inelastic material parameters has increased substantially in recent years. The complex stress-strain paths and high nonlinearity typical of this class of problems require the development of robust and efficient inverse-problem techniques able to account for an irregular topography of the fitness surface. Within this framework, this work investigates the application of the gradient-based Sequential Quadratic Programming method, the Nelder-Mead downhill simplex algorithm, Particle Swarm Optimization (PSO), and a global-local PSO-Nelder-Mead hybrid scheme to the identification of inelastic parameters based on a deep drawing operation. The hybrid technique proved to be the best strategy, combining PSO's ability to approach the basin of attraction of the global minimum with the efficiency of the Nelder-Mead algorithm in locating the minimum itself.
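As a hedged illustration of the global-local strategy (a toy two-parameter fitness function stands in for the deep-drawing misfit; none of this is the authors' code), a minimal particle swarm locates the basin of attraction and Nelder-Mead then refines the minimum:

```python
# Illustrative global-local hybrid: a minimal particle swarm to locate the
# basin of attraction, followed by Nelder-Mead to refine the minimum.
import numpy as np
from scipy.optimize import minimize

def fitness(p):
    # Stand-in for the misfit between simulated and measured responses;
    # a multimodal toy function in two "material parameters".
    x, y = p
    return (x - 1.0)**2 + (y - 2.0)**2 + 0.5 * np.sin(5 * x) * np.sin(5 * y)

rng = np.random.default_rng(0)
n_particles, n_iter, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform(-5, 5, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

# Local polish of the PSO result with the downhill simplex algorithm.
result = minimize(fitness, gbest, method="Nelder-Mead")
print(result.x, result.fun)
```

The division of labor mirrors the paper's finding: the swarm is cheap insurance against local minima, while the simplex converges quickly once inside the right basin.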
Chen, Xiaofeng; Song, Qiankun; Li, Zhongshan; Zhao, Zhenjiang; Liu, Yurong
2018-07-01
This paper addresses the problem of stability for continuous-time and discrete-time quaternion-valued neural networks (QVNNs) with linear threshold neurons. Applying the semidiscretization technique to the continuous-time QVNNs, the discrete-time analogs are obtained, which preserve the dynamical characteristics of their continuous-time counterparts. Via the plural decomposition method of quaternion, homeomorphic mapping theorem, as well as Lyapunov theorem, some sufficient conditions on the existence, uniqueness, and global asymptotical stability of the equilibrium point are derived for the continuous-time QVNNs and their discrete-time analogs, respectively. Furthermore, a uniform sufficient condition on the existence, uniqueness, and global asymptotical stability of the equilibrium point is obtained for both continuous-time QVNNs and their discrete-time version. Finally, two numerical examples are provided to substantiate the effectiveness of the proposed results.
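A hedged sketch of the semidiscretization step for a generic real-valued network of this form (the notation is assumed here; the quaternion case in the paper is handled via its decomposition): starting from

$$\dot{x}(t) = -D\,x(t) + A\,f(x(t)) + u,$$

holding the nonlinearity and input constant over each interval $[nh,(n+1)h)$ and integrating the remaining linear ODE exactly yields the discrete-time analog

$$x(n+1) = e^{-Dh}\,x(n) + D^{-1}\left(I - e^{-Dh}\right)\left[A\,f(x(n)) + u\right],$$

which shares its equilibria with the continuous-time system; this is the property that allows stability conditions to be derived for both versions in parallel.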
Boyle, Peter; Autier, Philippe; van Wees, Sibylle Herzig; Sullivan, Richard
2015-01-01
According to the Global Burden of Disease, trauma is now responsible for five million deaths each year. High-income countries have made great strides in reducing trauma-related mortality figures but low–middle-income countries have been left behind with high trauma-related fatality rates, primarily in the younger population. Much of the progress high-income countries have made in managing trauma rests on advances developed in their armed forces. This analysis looks at the recent advances in high-income military trauma systems and the potential transferability of those developments to the civilian health systems particularly in low–middle-income countries. It also evaluates some potential lifesaving trauma management techniques, proven effective in the military, and the barriers preventing these from being implemented in civilian settings. PMID:25792616
NASA Astrophysics Data System (ADS)
Dobbyn, Abigail J.; Knowles, Peter J.
A number of established techniques for obtaining diabatic electronic states in small molecules are critically compared for the example of the X and B states in the water molecule, which contribute to the two lowest-energy conical intersections. Integration of the coupling matrix elements and analysis of configuration mixing coefficients both produce reliable diabatic states globally. Methods relying on diagonalization of dipole moment and angular momentum operators are shown to fail in large regions of coordinate space. However, the use of transition angular momentum matrix elements involving the A state, which is degenerate with B at the conical intersections, is successful globally, provided that an appropriate choice of coordinates is made. Long range damping of non-adiabatic coupling to give correct asymptotic mixing angles is also investigated.
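For orientation, a minimal two-state sketch of the construction (generic notation, not the paper's): diabatic states are obtained by rotating the adiabatic pair through a coordinate-dependent mixing angle,

$$\begin{pmatrix}\phi_1^{d}\\ \phi_2^{d}\end{pmatrix}=\begin{pmatrix}\cos\theta(Q) & \sin\theta(Q)\\ -\sin\theta(Q) & \cos\theta(Q)\end{pmatrix}\begin{pmatrix}\psi_1^{a}\\ \psi_2^{a}\end{pmatrix},\qquad \frac{\partial\theta}{\partial Q} = -\left\langle \psi_1^{a}\left|\frac{\partial}{\partial Q}\right|\psi_2^{a}\right\rangle,$$

with the angle chosen so that the derivative coupling between the rotated states vanishes; "integration of the coupling matrix elements" amounts to integrating this relation along a path in coordinate space.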
Geodetic positioning using a global positioning system of satellites
NASA Technical Reports Server (NTRS)
Fell, P. J.
1980-01-01
Geodetic positioning using range, integrated Doppler, and interferometric observations from a constellation of twenty-four Global Positioning System satellites is analyzed. A summary of the proposals for geodetic positioning and baseline determination is given which includes a description of measurement techniques and comments on rank deficiency and error sources. An analysis of variance comparison of range, Doppler, and interferometric time delay to determine their relative geometric strength for baseline determination is included. An analytic examination of the effect of a priori constraints on positioning using simultaneous observations from two stations is presented. Dynamic point positioning and baseline determination using range and Doppler is examined in detail. Models for the error sources influencing dynamic positioning are developed. Included is a discussion of atomic clock stability, and range and Doppler observation error statistics based on random correlated atomic clock error are derived.
Global Geodesy Using GPS Without Fiducial Sites
NASA Technical Reports Server (NTRS)
Heflin, Michael B.; Blewitt, Geoffrey
1994-01-01
The Global Positioning System (GPS) is used to make global geodetic measurements without the use of fiducial site coordinates. Baseline lengths and geocentric radii for each site are determined without fixing any site coordinates. Given n globally distributed sites, n baseline lengths and n geocentric radii form a polyhedron with a site at each vertex and the geocenter at the intersection of all radii. Geodetic information is derived from the structure of the polyhedron and its change with time. The approach applies to any global geodetic technique.
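A minimal numerical illustration of the construction (made-up coordinates, not the paper's data): from site positions in an Earth-centered frame, geocentric radii and baseline lengths follow directly, with no site held fixed.

```python
# Toy fiducial-free geometry: from n site positions in an Earth-centered
# frame, form the geocentric radii and pairwise baseline lengths that
# define the polyhedron. Coordinates here are illustrative only.
import numpy as np
from itertools import combinations

sites = np.array([            # ECEF-like coordinates in km (illustrative)
    [6378.0, 0.0, 0.0],
    [0.0, 6378.0, 0.0],
    [0.0, 0.0, 6357.0],
])

radii = np.linalg.norm(sites, axis=1)                 # geocentric radii
baselines = {(i, j): np.linalg.norm(sites[i] - sites[j])
             for i, j in combinations(range(len(sites)), 2)}
print(radii)
print(baselines)
```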
Techniques for analyses of trends in GRUAN data
NASA Astrophysics Data System (ADS)
Bodeker, G. E.; Kremser, S.
2015-04-01
The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
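A minimal sketch of the regression-plus-bootstrap recipe described above, on a synthetic monthly series (illustrative only: the paper additionally treats autocorrelation and per-datum measurement uncertainties, which the plain residual bootstrap below ignores):

```python
# Linear trend with seasonal (harmonic) basis functions, trend uncertainty
# estimated by residual bootstrapping. An illustration of the approach,
# not the GRUAN processing code.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240) / 12.0                       # 20 years, in years
y = 0.02 * t + 0.5 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)

# Design matrix: offset, trend, annual sine/cosine.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

# Residual bootstrap for the trend coefficient beta[1].
trends = []
for _ in range(1000):
    y_boot = X @ beta + rng.choice(residuals, size=residuals.size, replace=True)
    b, *_ = np.linalg.lstsq(X, y_boot, rcond=None)
    trends.append(b[1])
print(f"trend = {beta[1]:.4f} +/- {np.std(trends):.4f} per year")
```

A block bootstrap, or weighting by the GRUAN per-datum uncertainty estimates, would be the natural extensions for real soundings.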
Techniques for analyses of trends in GRUAN data
NASA Astrophysics Data System (ADS)
Bodeker, G. E.; Kremser, S.
2014-12-01
The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
Ionospheric research for space weather service support
NASA Astrophysics Data System (ADS)
Stanislawska, Iwona; Gulyaeva, Tamara; Dziak-Jankowska, Beata
2016-07-01
Knowledge of the behavior of the ionosphere is very important for space weather services. A wide variety of existing and future ground-based and satellite systems (communications, radar, surveillance, intelligence gathering, satellite operation, etc.) is affected by the ionosphere. There is a need for reliable and efficient support for such systems against natural hazards and for minimizing the risk of failure. The joint research project on 'Ionospheric Weather' of IZMIRAN and SRC PAS aims to provide online the ionospheric parameters characterizing space weather in the ionosphere. It is devoted to science, techniques, and more application-oriented areas of ionospheric investigation in order to support space weather services. The studies are based on a data-mining philosophy, increasing knowledge of ionospheric physical properties, modelling capabilities, and applications of various procedures in ionospheric monitoring and forecasting. In the framework of the joint project, novel techniques for data analysis, an original system of ionospheric disturbance indices, and their implementation for the ionosphere and ionospheric radio wave propagation have been developed since 1997. Data from ionosonde measurements and results of their forecasting for the ionospheric observatory network, regional and global ionospheric maps of total electron content from navigational satellite system (GNSS) observations, global maps of the F2 layer peak parameters (foF2, hmF2), and the W-index of ionospheric variability are provided at the web pages of SRC PAS and IZMIRAN. The data processing systems include analysis and forecasting of the geomagnetic indices ap and kp and a new eta index applied to ionosphere forecasting. For the first time, new products of W-index map analysis are provided in catalogues of ionospheric storms and sub-storms, and their association with global geomagnetic Dst storms is investigated. The products at the project web sites http://www.cbk.waw.pl/rwc and http://www.izmiran.ru/services/iweather are widely used in scientific investigations and in numerous applications by telecommunication and navigation operators and users, whose numbers are growing substantially from month to month.
Xu, Daolin; Lu, Fangfang
2006-12-01
We address the problem of reconstructing a set of nonlinear differential equations from chaotic time series. A method that combines implicit Adams integration with a structure-selection technique based on an error reduction ratio is proposed for system identification and the corresponding parameter estimation of the model. The structure-selection technique identifies the significant terms from a pool of candidate basis functions and determines the optimal model through orthogonal characteristics of the data. Combined with the Adams integration algorithm, the technique makes reconstruction possible for data sampled at large time intervals. Numerical experiments on the Lorenz and Rössler systems show that the proposed strategy is effective in global vector field reconstruction from noisy time series.
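The flavor of the approach can be conveyed by a simplified stand-in: plain least squares on a monomial dictionary with coefficient thresholding replaces the paper's error-reduction-ratio selection and implicit Adams integration, and clean numerical derivatives replace noisy, sparsely sampled data.

```python
# Fit Lorenz data with a library of polynomial terms, then keep only the
# significant ones. A crude stand-in for structure selection.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0/3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz, (0, 20), [1.0, 1.0, 1.0], t_eval=np.arange(0, 20, 0.01))
S = sol.y.T
dS = np.gradient(S, 0.01, axis=0)              # numerical derivatives

# Candidate dictionary: monomials up to degree 2 in (x, y, z).
x, y, z = S.T
library = np.column_stack([np.ones_like(x), x, y, z,
                           x*x, x*y, x*z, y*y, y*z, z*z])
coefs, *_ = np.linalg.lstsq(library, dS, rcond=None)
coefs[np.abs(coefs) < 0.1] = 0.0               # crude structure selection
print(coefs.round(2))                          # recovers the Lorenz terms
```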
Classification of the Regional Ionospheric Disturbance Based on Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Terzi, Merve Begum; Arikan, Orhan; Karatay, Secil; Arikan, Feza; Gulyaeva, Tamara
2016-08-01
In this study, Total Electron Content (TEC) estimated from GPS receivers is used to model the regional and local variability that differs from global activity, along with solar and geomagnetic indices. For the automated classification of regional disturbances, a classification technique based on a robust machine learning method that has found widespread use, the Support Vector Machine (SVM), is proposed. The performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. By applying the developed classification technique to Global Ionospheric Map (GIM) TEC data, which is provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method for detecting anomalies in TEC variations.
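A minimal sketch of such a classifier with scikit-learn (the features and labels below are synthetic placeholders for the TEC-derived statistics and solar/geomagnetic indices):

```python
# Label feature vectors as quiet/disturbed with a support vector machine.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 400
# Placeholder features, e.g. daily TEC deviation statistics and indices.
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```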
Developing the Second Generation CMORPH: A Prototype
NASA Astrophysics Data System (ADS)
Xie, Pingping; Joyce, Robert
2014-05-01
A prototype system for the second-generation CMORPH is being developed at the NOAA Climate Prediction Center (CPC) to produce global analyses of 30-min precipitation on a 0.05deg lat/lon grid over the entire globe, from pole to pole, through the integration of information from satellite observations as well as numerical model simulations. The second-generation CMORPH is built upon the Kalman Filter based CMORPH algorithm of Joyce and Xie (2011). Inputs to the system include rainfall and snowfall rate retrievals from passive microwave (PMW) measurements aboard all available low earth orbit (LEO) satellites, estimates derived from infrared (IR) observations on geostationary (GEO) as well as LEO platforms, and precipitation simulations from global numerical models. First, precipitation estimates/retrievals from the various sources are mapped onto a global 0.05deg lat/lon grid and calibrated against a common reference field to ensure consistency in their precipitation-rate PDF structures. The motion vectors for precipitating cloud systems are then defined using information from both satellite IR observations and precipitation fields generated by the NCEP Climate Forecast System Reanalysis (CFSR). To this end, motion vectors are first computed from CFSR hourly precipitation fields through cross-correlation analysis of consecutive hourly precipitation fields on the global T382 (~35 km) grid. In a similar manner, separate processing is performed on satellite IR-based precipitation estimates to derive motion vectors from observations. A blended analysis of precipitating cloud motion vectors is then constructed by combining the CFSR- and satellite-derived vectors with an objective analysis technique. Fine-resolution mapped PMW precipitation retrievals are then propagated along the motion vectors from their respective observation times to the target analysis time, in both forward and backward directions. The CMORPH high-resolution precipitation analyses are finally constructed by combining the propagated PMW retrievals with the IR-based estimates for the target analysis time. This Kalman Filter based CMORPH processing is performed separately for rainfall and snowfall fields using the same motion vectors. Experiments have been conducted for two periods of two months each, July-August 2009 and January-February 2010, to explore the development of an optimal algorithm that generates global precipitation for summer and winter situations. Preliminary results demonstrate the technical feasibility of constructing global rainfall and snowfall analyses through the integration of information from multiple sources. More work is underway to refine various technical components of the system for operational applications. Detailed results will be reported at the EGU meeting.
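The core cross-correlation step can be sketched compactly (a toy field with a known shift; the operational system applies this to CFSR hourly precipitation on the T382 grid):

```python
# Derive a motion vector from two consecutive fields by locating the peak
# of their 2-D cross-correlation.
import numpy as np
from scipy.signal import fftconvolve

def motion_vector(field0, field1):
    """Return the (dy, dx) shift that best aligns field0 with field1."""
    f0 = field0 - field0.mean()
    f1 = field1 - field1.mean()
    corr = fftconvolve(f1, f0[::-1, ::-1], mode="same")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2           # zero-lag position
    return np.array(peak) - center

rng = np.random.default_rng(3)
field0 = rng.random((64, 64))
field1 = np.roll(field0, (2, 5), axis=(0, 1))   # known displacement
print(motion_vector(field0, field1))            # ~ [2, 5]
```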
Streak Imaging Flow Cytometer for Rare Cell Analysis.
Balsam, Joshua; Bruck, Hugh Alan; Ossandon, Miguel; Prickril, Ben; Rasooly, Avraham
2017-01-01
There is a need for simple and affordable techniques for cytology for clinical applications, especially for point-of-care (POC) medical diagnostics in resource-poor settings. However, this often requires adapting expensive and complex laboratory-based techniques that demand significant power and are too massive to transport easily. One such technique is flow cytometry, which has great potential for modification due to the simplicity of the principle of optical tracking of cells. However, it is limited in that regard by the flow focusing technique used to isolate cells for optical detection. This technique inherently reduces the flow rate and is therefore unsuitable for rapid detection of rare cells, which require large volumes for analysis. To address these limitations, we developed a low-cost, mobile flow cytometer based on streak imaging. In our new configuration we utilize a simple webcam for optical detection over a large area associated with a wide-field flow cell. The new flow cell is capable of larger volume and higher throughput fluorescence detection of rare cells than the flow cells with hydrodynamic focusing used in conventional flow cytometry. The webcam is an inexpensive, commercially available system, and for fluorescence analysis we use a 1 W 450 nm blue laser to excite Syto-9 stained cells with emission at 535 nm. We were able to detect low concentrations of stained cells at high flow rates of 10 mL/min, which is suitable for rapidly analyzing larger specimen volumes to detect rare cells at appropriate concentration levels. The new rapid detection capabilities, combined with the simplicity and low cost of this device, suggest a potential for clinical POC flow cytometry in resource-poor settings associated with global health.
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
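A toy version of the fusion idea (a conjugate normal-normal update with invented numbers stands in for PDT's full Bayesian model):

```python
# Fuse noisy occupancy evidence from several sources into a posterior that
# retains the input uncertainty, via precision weighting.
import numpy as np

# (mean, std) of ambient occupancy in people/1000 ft^2 for one building
# type, from hypothetical sources: expert judgment, survey, remote sensing.
sources = [(2.0, 1.0), (2.8, 0.5), (2.3, 0.8)]

precision = sum(1.0 / s**2 for _, s in sources)
posterior_mean = sum(m / s**2 for m, s in sources) / precision
posterior_std = np.sqrt(1.0 / precision)
print(f"{posterior_mean:.2f} +/- {posterior_std:.2f} people/1000 ft^2")
```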
Codimension-Two Bifurcation Analysis in DC Microgrids Under Droop Control
NASA Astrophysics Data System (ADS)
Lenz, Eduardo; Pagano, Daniel J.; Tahim, André P. N.
This paper addresses local and global bifurcations that may appear in electrical power systems such as DC microgrids, which have recently attracted interest in the electrical engineering community. Most sources in these networks are voltage-type and operate in parallel. In such a configuration, the basic technique for stabilizing the bus voltage is so-called droop control. The main contribution of this work is a codimension-two bifurcation analysis of a small DC microgrid, considering the droop control gain and the power processed by the load as bifurcation parameters. The codimension-two bifurcation set leads to practical rules for achieving a robust droop control design. Moreover, the bifurcation analysis offers a better understanding of the dynamics involved and how to avoid possible instabilities. Simulation results are presented to illustrate the bifurcation analysis.
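A minimal sketch of why exactly these two bifurcation parameters matter (generic symbols, not the paper's notation): a droop-controlled source obeys

$$v = V_{ref} - k_d\, i,$$

while a tightly regulated load drawing constant power $P$ presents $i = P/v$, whose linearization

$$\left.\frac{\partial i}{\partial v}\right|_{v_0} = -\frac{P}{v_0^{2}}$$

acts as a negative incremental resistance. Increasing either the droop gain $k_d$ or the load power $P$ strengthens this destabilizing feedback, which is why the codimension-two analysis sweeps precisely this pair.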
Error analysis of multipoint flux domain decomposition methods for evolutionary diffusion problems
NASA Astrophysics Data System (ADS)
Arrarás, A.; Portero, L.; Yotov, I.
2014-01-01
We study space and time discretizations for mixed formulations of parabolic problems. The spatial approximation is based on the multipoint flux mixed finite element method, which reduces to an efficient cell-centered pressure system on general grids, including triangles, quadrilaterals, tetrahedra, and hexahedra. The time integration is performed by using a domain decomposition time-splitting technique combined with multiterm fractional step diagonally implicit Runge-Kutta methods. The resulting scheme is unconditionally stable and computationally efficient, as it reduces the global system to a collection of uncoupled subdomain problems that can be solved in parallel without the need for Schwarz-type iteration. Convergence analysis for both the semidiscrete and fully discrete schemes is presented.
ONU Power Saving Scheme for EPON System
NASA Astrophysics Data System (ADS)
Mukai, Hiroaki; Tano, Fumihiko; Tanaka, Masaki; Kozaki, Seiji; Yamanaka, Hideaki
PON (Passive Optical Network) achieves FTTH (Fiber To The Home) economically by sharing an optical fiber among multiple subscribers. Recently, global climate change has been recognized as a serious near-term problem, making power-saving techniques for electronic devices important. In PON systems, an ONU (Optical Network Unit) power saving scheme has been studied and defined for XG-PON. In this paper, we propose an ONU power saving scheme for EPON. We then present an analysis of the power reduction effect and the data transmission delay caused by the ONU power saving scheme. Based on this analysis, we propose an efficient provisioning method for the ONU power saving scheme that is applicable to both XG-PON and EPON.
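A toy estimate of the delay trade-off under stated assumptions (downstream frames arrive uniformly in time and are buffered until wake-up; the ONU sleeps for T_s and listens for T_a in each cycle): a frame lands in a sleep phase with probability $T_s/(T_s+T_a)$ and then waits $T_s/2$ on average, so the added mean delay is roughly

$$\bar{d} \approx \frac{T_s}{T_s+T_a}\cdot\frac{T_s}{2},$$

making the sleep-interval length the main lever trading power reduction against delay.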
Reconstructing biochemical pathways from time course data.
Srividhya, Jeyaraman; Crampin, Edmund J; McSharry, Patrick E; Schnell, Santiago
2007-03-01
Time series data on biochemical reactions reveal transient behavior, away from chemical equilibrium, and contain information on the dynamic interactions among reacting components. However, this information can be difficult to extract using conventional analysis techniques. We present a new method to infer biochemical pathway mechanisms from time course data using a global nonlinear modeling technique to identify the elementary reaction steps which constitute the pathway. The method involves the generation of a complete dictionary of polynomial basis functions based on the law of mass action. Using these basis functions, there are two approaches to model construction, namely the general to specific and the specific to general approach. We demonstrate that our new methodology reconstructs the chemical reaction steps and connectivity of the glycolytic pathway of Lactococcus lactis from time course experimental data.
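As a concrete instance of such a dictionary (a standard single-substrate mechanism, not data from the paper): for the scheme $S + E \rightleftharpoons C \rightarrow E + P$, the law of mass action gives

$$\dot{s} = -k_1 s e + k_{-1} c,\qquad \dot{e} = -k_1 s e + (k_{-1}+k_2)\,c,\qquad \dot{c} = k_1 s e - (k_{-1}+k_2)\,c,\qquad \dot{p} = k_2 c,$$

so every right-hand side is a linear combination of the monomials $se$ and $c$: exactly the kind of polynomial basis functions among which the method selects when inferring elementary reaction steps.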
Preparing Colorful Astronomical Images III: Cosmetic Cleaning
NASA Astrophysics Data System (ADS)
Frattare, L. M.; Levay, Z. G.
2003-12-01
We present cosmetic cleaning techniques for use with mainstream graphics software (Adobe Photoshop) to produce presentation-quality images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope when producing photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to discuss the treatment of various detector-attributed artifacts such as cosmic rays, chip seams, gaps, optical ghosts, diffraction spikes and the like. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to final presentation images. Other pixel-to-pixel applications such as filter smoothing and global noise reduction will be discussed.
Magnetic resonance imaging for diagnosis of early Alzheimer's disease.
Colliot, O; Hamelin, L; Sarazin, M
2013-10-01
A major challenge for neuroimaging is to contribute to the early diagnosis of Alzheimer's disease (AD). In particular, magnetic resonance imaging (MRI) allows detecting different types of structural and functional abnormalities at an early stage of the disease. Anatomical MRI is the most widely used technique and provides local and global measures of atrophy. The recent diagnostic criteria of "mild cognitive impairment due to AD" include hippocampal atrophy, which is considered a marker of neuronal injury. Advanced image analysis techniques generate automatic and reproducible measures both in the hippocampus and throughout the whole brain. Recent modalities such as diffusion-tensor imaging and resting-state functional MRI provide additional measures that could contribute to the early diagnosis but require further validation. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Hamurcu, Fatma; Mamaş, Serhat; Ozdemir, Ummuhan Ozmen; Gündüzalp, Ayla Balaban; Senturk, Ozan Sanlı
2016-08-01
The aromatic/five-membered heteroaromatic butanesulfonylhydrazone derivatives 5-bromosalicylaldehydebutanesulfonylhydrazone (1), 2-hydroxy-1-naphthaldehydebutanesulfonylhydrazone (2), indole-3-carboxaldehydebutanesulfonylhydrazone (3), 2-acetylfurancarboxyaldehydebutanesulfonylhydrazone (4), 2-acetylthiophenecarboxyaldehydebutanesulfonylhydrazone (5) and 2-acetyl-5-chlorothiophenecarboxyaldehydebutanesulfonylhydrazone (6) were synthesized by the reaction of butanesulfonic acid hydrazide with aldehydes/ketones and characterized using elemental analysis, 1H NMR, 13C NMR and FT-IR techniques. Their geometric parameters and electronic properties, including global reactivity descriptors, were also determined by theoretical methods. The electrochemical behavior of the butanesulfonylhydrazones was investigated using cyclic voltammetry (CV), controlled potential electrolysis and chronoamperometry (CA) techniques. The number of electrons transferred (n), the diffusion coefficient (D) and the standard heterogeneous rate constants (ks) were determined by electrochemical methods.
A new parallel-vector finite element analysis software on distributed-memory computers
NASA Technical Reports Server (NTRS)
Qin, Jiangning; Nguyen, Duc T.
1993-01-01
A new parallel-vector finite element analysis software package, MPFEA (Massively Parallel-vector Finite Element Analysis), is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
The Phasor Approach to Fluorescence Lifetime Imaging Analysis
Digman, Michelle A.; Caiolfa, Valeria R.; Zamai, Moreno; Gratton, Enrico
2008-01-01
Changing the data representation from the classical time delay histogram to the phasor representation provides a global view of the fluorescence decay at each pixel of an image. In the phasor representation we can easily recognize the presence of different molecular species in a pixel or the occurrence of fluorescence resonance energy transfer. The analysis of fluorescence lifetime imaging microscopy (FLIM) data in phasor space is done by observing the clustering of pixel values in specific regions of the phasor plot, rather than by fitting the fluorescence decay with exponentials. The analysis is instantaneous since it is not based on calculations or nonlinear fitting. The phasor approach has the potential to simplify the way data are analyzed in FLIM, paving the way for the analysis of large data sets and, in general, making the FLIM technique accessible to the nonexpert in spectroscopy and data analysis. PMID:17981902
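A minimal sketch of the transformation itself (simulated single-exponential decays; the bin width and 80 MHz repetition rate are assumed values):

```python
# Each decay I(t) maps to a phasor point
# (g, s) = (sum I cos wt / sum I, sum I sin wt / sum I).
# Single-exponential decays land on the universal semicircle; mixtures fall
# inside it, on the chord joining their components, which is what makes the
# plot readable without fitting.
import numpy as np

def phasor(decay, dt, f_rep=80e6):
    """Phasor coordinates of a decay histogram at the repetition rate."""
    omega = 2 * np.pi * f_rep
    t = np.arange(decay.size) * dt
    g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
    return g, s

dt = 50e-12                                    # 50 ps time bins
t = np.arange(256) * dt
for tau in (1e-9, 4e-9):                       # 1 ns and 4 ns lifetimes
    g, s = phasor(np.exp(-t / tau), dt)
    # Theory for a single exponential: g = 1/(1+(wt)^2), s = wt/(1+(wt)^2).
    print(tau, g, s)
```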
Making Meaning in a Standards-Based World: Negotiating Tensions in Global Education
ERIC Educational Resources Information Center
Klein, Jennifer D.
2013-01-01
In a largely standards-driven educational climate, educators are challenged to navigate the tensions between standards-based, scholarly pursuits and the more experiential, student-driven techniques of technology-enabled global education. At a time when these tensions are at their zenith, we need to prioritize global competencies and other…
ERIC Educational Resources Information Center
Waks, Leonard J.
The French sociologist Jacques Ellul has had great influence on contemporary thought about the role of science and technology in the emerging global society. His books "The Technological Society" (1954) and "The Technological System" (1980) characterize the new social context as a tightly interlocked global technological system…
Progress in the Determination of the Earth's Gravity Field
NASA Technical Reports Server (NTRS)
Rapp, Richard H. (Editor)
1989-01-01
Topics addressed include: global gravity model development; methods for approximation of the gravity field; gravity field measuring techniques; global gravity field applications and requirements in geophysics and oceanography; and future gravity missions.
Challenges of model transferability to data-scarce regions (Invited)
NASA Astrophysics Data System (ADS)
Samaniego, L. E.
2013-12-01
Developing the ability to globally predict the movement of water on the land surface at spatial scales from 1 to 5 km constitutes one of the grand challenges in land surface modelling. Coping with this grand challenge implies that land surface models (LSMs) should be able to make reliable predictions across locations and/or scales other than those used for parameter estimation. In addition, data scarcity and quality impose further difficulties in attaining reliable predictions of water and energy fluxes at the scales of interest. Current computational limitations also severely restrict exhaustive investigation of the parameter space of LSMs over large domains (e.g., greater than half a million square kilometers). Addressing these challenges requires holistic approaches that integrate the best techniques available for parameter estimation, field measurements and remotely sensed data at their native resolutions. An attempt to systematically address these issues is the multiscale parameterisation technique (MPR), which links high-resolution land surface characteristics with effective model parameters. This technique requires a number of pedo-transfer functions and far fewer global parameters (i.e., coefficients) to be inferred by calibration in gauged basins. The key advantage of this technique is the quasi-scale independence of the global parameters, which enables estimating them at coarser spatial resolutions and then transferring them to (ungauged) areas and scales of interest. In this study we show the ability of this technique to reproduce the observed water fluxes and states over a wide range of climate and land surface conditions, ranging from humid to semiarid and from sparsely to densely forested regions. Results on the transferability of global model parameters in space (from humid to semi-arid basins) and across scales (from coarser to finer) clearly indicate the robustness of this technique. Simulations with coarse data sets (e.g. EOBS forcing 25x25 km2, FAO soil map 1:5000000) using parameters obtained with high-resolution information (REGNIE forcing 1x1 km2, BUEK soil map 1:1000000) in different climatic regions indicate the potential of MPR for prediction in data-scarce regions. In this presentation, we will also discuss how the transferability of global model parameters across scales and locations helps to identify deficiencies in model structure and regionalization functions.
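A cartoon of the MPR chain in code (the transfer function, its coefficients, and the grids are invented for illustration; they are not the actual regionalization functions):

```python
# A pedo-transfer function with a few global coefficients maps a
# high-resolution soil property to a parameter field, which is then
# upscaled to the model grid. Only the global coefficients would be
# calibrated, making them transferable across scales.
import numpy as np

rng = np.random.default_rng(5)
clay = rng.uniform(5, 40, (100, 100))          # % clay at ~100 m resolution
g1, g2 = 0.05, 1.2                             # global parameters (calibrated)

k_fine = g1 * np.exp(-g2 * clay / 100.0)       # hypothetical transfer function

# Upscale 100x100 fine cells to a 10x10 model grid with a harmonic mean,
# often used for conductivity-like parameters.
blocks = k_fine.reshape(10, 10, 10, 10).swapaxes(1, 2).reshape(10, 10, -1)
k_coarse = blocks.shape[-1] / np.sum(1.0 / blocks, axis=-1)
print(k_coarse.shape)
```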
NASA Astrophysics Data System (ADS)
Daryanani, Aditya; Dangi, Shusil; Ben-Zikri, Yehuda Kfir; Linte, Cristian A.
2016-03-01
Magnetic Resonance Imaging (MRI) is a standard-of-care imaging modality for cardiac function assessment and guidance of cardiac interventions thanks to its high image quality and lack of exposure to ionizing radiation. Cardiac health parameters such as left ventricular volume, ejection fraction, myocardial mass, thickness, and strain can be assessed by segmenting the heart from cardiac MRI images. Furthermore, the segmented pre-operative anatomical heart models can be used to precisely identify regions of interest to be treated during minimally invasive therapy. Hence, the use of accurate and computationally efficient segmentation techniques is critical, especially for intra-procedural guidance applications that rely on the peri-operative segmentation of subject-specific datasets without delaying the procedure workflow. Atlas-based segmentation incorporates prior knowledge of the anatomy of interest from expertly annotated image datasets. Typically, the ground truth atlas label is propagated to a test image using a combination of global and local registration. The high computational cost of non-rigid registration motivated us to obtain an initial segmentation using global transformations, based on an atlas of the left ventricle built from a population of patient MRI images, and to refine it using a well-developed technique based on graph cuts. Here we quantitatively compare the segmentations obtained from the global and global-plus-local atlases, refined using graph cut-based techniques, against expert segmentations according to several similarity metrics, including the Dice correlation coefficient, Jaccard coefficient, Hausdorff distance, and mean absolute distance error.
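For binary masks, the comparison metrics named above reduce to a few lines (synthetic disks stand in for the expert and automatic labels; for brevity, Hausdorff distance is computed on all mask pixels rather than extracted boundaries):

```python
# Dice, Jaccard, and Hausdorff distance for two binary segmentation masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def jaccard(a, b):
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

yy, xx = np.mgrid[0:128, 0:128]
expert = (yy - 64)**2 + (xx - 64)**2 < 30**2      # "expert" disk
auto = (yy - 66)**2 + (xx - 63)**2 < 29**2        # slightly shifted result

pts_e = np.argwhere(expert)
pts_a = np.argwhere(auto)
hd = max(directed_hausdorff(pts_e, pts_a)[0],
         directed_hausdorff(pts_a, pts_e)[0])
print(f"Dice={dice(expert, auto):.3f}  Jaccard={jaccard(expert, auto):.3f}  "
      f"Hausdorff={hd:.1f} px")
```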
99mTc-d,l-HMPAO and SPECT of the brain in normal aging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldemar, G.; Hasselbalch, S.G.; Andersen, A.R.
1991-05-01
Single photon emission computed tomography (SPECT) with 99mTc-d,l-hexamethylpropyleneamine oxime (99mTc-d,l-HMPAO) was used to determine global and regional CBF in 53 healthy subjects aged 21-83 years. For the whole group, global CBF normalized to the cerebellum was 86.4% +/- 8.4 (SD). The contribution of age, sex, and atrophy to variations in global CBF was studied using stepwise multiple regression analysis. There was a significant negative correlation of global CBF with subjective ratings of cortical atrophy, but not with ratings of ventricular size, Evans ratio, sex, or age. In a subgroup of 33 subjects, in whom volumetric measurements of atrophy were performed, cortical atrophy was the only significant determinant for global CBF, accounting for 27% of its variance. Mean global CBF as measured with the 133Xe inhalation technique and SPECT was 54 +/- 9 ml/100 g/min and did not correlate significantly with age. There was a preferential decline of CBF in the frontal cortex with advancing age. The side-to-side asymmetry of several regions of interest increased with age. A method was described for estimation of subcortical CBF, which decreased with advancing cortical atrophy. The relative area of the subcortical low-flow region increased with age. These results are useful in distinguishing the effects of age and simple atrophy from disease effects when the 99mTc-d,l-HMPAO method is used.
Lidar Measurements of Tropospheric Wind Profiles with the Double Edge Technique
NASA Technical Reports Server (NTRS)
Gentry, Bruce M.; Li, Steven X.; Korb, C. Laurence; Mathur, Savyasachee; Chen, Huailin
1998-01-01
Research has established the importance of global tropospheric wind measurements for large scale improvements in numerical weather prediction. In addition, global wind measurements provide data that are fundamental to the understanding and prediction of global climate change. These tasks are closely linked with the goals of the NASA Earth Science Enterprise and Global Climate Change programs. NASA Goddard has been actively involved in the development of direct detection Doppler lidar methods and technologies to meet the wind observing needs of the atmospheric science community. A variety of direct detection Doppler wind lidar measurements have recently been reported, indicating the growing interest in this area. Our program at Goddard has concentrated on the development of the edge technique for lidar wind measurements. Implementations of the edge technique using either the aerosol or molecular backscatter for the Doppler wind measurement have been described. The basic principles have been verified in laboratory and atmospheric lidar wind experiments. The lidar measurements were obtained with an aerosol edge technique lidar operating at 1064 nm. These measurements demonstrated high spatial resolution (22 m) and high velocity sensitivity (rms variances of 0.1 m/s) in the planetary boundary layer (PBL). The aerosol backscatter is typically high in the PBL and the effects of the molecular backscatter can often be neglected. However, as was discussed in the original edge technique paper, the molecular contribution to the signal is significant above the boundary layer and a correction for the effects of molecular backscatter is required to make wind measurements. In addition, the molecular signal is a dominant source of noise in regions where the molecular to aerosol ratio is large, since the energy monitor channel used in the single edge technique measures the sum of the aerosol and molecular signals. To extend the operation of the edge technique into the free troposphere we have developed a variation of the edge technique called the double edge technique. In this paper, a ground-based aerosol double edge lidar is described and the first measurements of wind profiles in the free troposphere obtained with this lidar are presented.
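For scale, the standard Doppler-lidar relations behind the technique (textbook results, not specific to this paper): the backscattered return from air moving with radial velocity $v_r$ is shifted by

$$\Delta\nu = \frac{2 v_r}{\lambda},$$

about 1.9 MHz per m/s at $\lambda = 1064$ nm. The edge filter converts this small shift into a measurable change in transmitted signal, with a sensitivity commonly written as $\Theta = (1/T)\,dT/d\nu$ for filter transmission $T(\nu)$, which is why steep filter edges yield high velocity resolution.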
Parton distributions and lattice QCD calculations: A community white paper
NASA Astrophysics Data System (ADS)
Lin, Huey-Wen; Nocera, Emanuele R.; Olness, Fred; Orginos, Kostas; Rojo, Juan; Accardi, Alberto; Alexandrou, Constantia; Bacchetta, Alessandro; Bozzi, Giuseppe; Chen, Jiunn-Wei; Collins, Sara; Cooper-Sarkar, Amanda; Constantinou, Martha; Del Debbio, Luigi; Engelhardt, Michael; Green, Jeremy; Gupta, Rajan; Harland-Lang, Lucian A.; Ishikawa, Tomomi; Kusina, Aleksander; Liu, Keh-Fei; Liuti, Simonetta; Monahan, Christopher; Nadolsky, Pavel; Qiu, Jian-Wei; Schienbein, Ingo; Schierholz, Gerrit; Thorne, Robert S.; Vogelsang, Werner; Wittig, Hartmut; Yuan, C.-P.; Zanotti, James
2018-05-01
In the framework of quantum chromodynamics (QCD), parton distribution functions (PDFs) quantify how the momentum and spin of a hadron are divided among its quark and gluon constituents. Two main approaches exist to determine PDFs. The first approach, based on QCD factorization theorems, realizes a QCD analysis of a suitable set of hard-scattering measurements, often using a variety of hadronic observables. The second approach, based on first-principle operator definitions of PDFs, uses lattice QCD to compute directly some PDF-related quantities, such as their moments. Motivated by recent progress in both approaches, in this document we present an overview of lattice-QCD and global-analysis techniques used to determine unpolarized and polarized proton PDFs and their moments. We provide benchmark numbers to validate present and future lattice-QCD calculations and we illustrate how they could be used to reduce the PDF uncertainties in current unpolarized and polarized global analyses. This document represents a first step towards establishing a common language between the two communities, to foster dialogue and to further improve our knowledge of PDFs.
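The bridge between the two approaches can be stated compactly (a schematic summary in generic notation): global analyses reconstruct $f(x)$ from data via factorization, schematically $\sigma \simeq \hat{\sigma} \otimes f$, while lattice QCD most directly accesses Mellin moments,

$$\langle x^n \rangle_f = \int_0^1 dx\, x^n f(x),$$

which correspond to matrix elements of local twist-two operators; benchmark moments computed both ways are what allow lattice results to constrain the fits.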
NASA Technical Reports Server (NTRS)
Welch, Bryan W.
2016-01-01
NASA is participating in the efforts of the International Committee on Global Navigation Satellite Systems (ICG) towards demonstrating the benefits of a multi-GNSS solution space approach to the space user, from the Earth's surface through the Terrestrial Service Volume (TSV) to the edge of the Space Service Volume (SSV). The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that analysis initiative, which increases in complexity and fidelity with each phase, was recently expanded to compare nadir-facing and zenith-facing hemispherical user antenna coverage with omnidirectional antenna coverage at altitudes of 8,000 km and 36,000 km. This report summarizes performance with these antenna coverage techniques at altitudes ranging from 100 km to 36,000 km, to be all-encompassing, as well as the volumetrically derived system availability metrics.
Simulation and analysis of differential global positioning system for civil helicopter operations
NASA Technical Reports Server (NTRS)
Denaro, R. P.; Cabak, A. R.
1983-01-01
A Differential Global Positioning System (DGPS) computer simulation was developed to provide a versatile tool for assessing DGPS-referenced civil helicopter navigation. The civil helicopter community will probably be an early user of the GPS capability because of its unique mission requirements, which include offshore exploration and low-altitude transport into remote areas not currently served by ground-based navaids. The Monte Carlo simulation provides a sufficiently high fidelity dynamic motion and propagation environment to enable accurate comparisons of alternative differential GPS implementations and navigation filter tradeoffs. The analyst is given the capability to adjust most aspects of the system, the helicopter flight profile, the receiver Kalman filter, and the signal propagation environment to assess differential GPS performance and parameter sensitivities. Preliminary analysis was conducted to evaluate alternative implementations of the differential navigation algorithm in both the position and measurement domains. Results show that significant performance gains are achieved compared with conventional GPS, but that differences among DGPS implementation techniques were small. System performance was relatively insensitive to the update rates of the error correction information.
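The position-domain variant can be sketched in a few lines (all error magnitudes invented; the actual simulation models clock, ephemeris, and propagation errors dynamically):

```python
# Schematic position-domain differential correction: the reference station's
# known survey position exposes the common-mode GPS error, which is then
# subtracted from the helicopter's solution.
import numpy as np

ref_true = np.array([0.0, 0.0, 0.0])            # surveyed reference position
common_error = np.array([3.0, -2.0, 5.0])       # ionosphere, ephemeris, etc.
user_noise = np.array([0.5, -0.3, 0.8])         # user-specific residuals

ref_gps = ref_true + common_error               # reference GPS solution
user_true = np.array([1000.0, 2000.0, 150.0])
user_gps = user_true + common_error + user_noise

correction = ref_gps - ref_true                 # observed common-mode error
user_corrected = user_gps - correction
print(user_corrected - user_true)               # only user_noise remains
```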