Sparse Image Reconstruction on the Sphere: Analysis and Synthesis.
Wallis, Christopher G R; Wiaux, Yves; McEwen, Jason D
2017-11-01
We develop techniques to solve ill-posed inverse problems on the sphere by sparse regularization, exploiting sparsity in both axisymmetric and directional scale-discretized wavelet space. Denoising, inpainting, and deconvolution problems, and combinations thereof, are considered as examples. Inverse problems are solved in both the analysis and synthesis settings, with a number of different sampling schemes. The most effective approach is that with the most restricted solution space, which depends on the interplay between the adopted sampling scheme, the selection of the analysis/synthesis problem, and any weighting of the ℓ1 norm appearing in the regularization problem. More efficient sampling schemes on the sphere improve reconstruction fidelity by restricting the solution space and also by improving sparsity in wavelet space. We apply the technique to denoise Planck 353-GHz observations, improving the ability to extract the structure of Galactic dust emission, which is important for studying Galactic magnetism.
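For readers unfamiliar with sparse regularization, a minimal sketch of the synthesis-setting problem min_x 0.5*||y - A x||^2 + lam*||x||_1 solved by iterative soft thresholding (ISTA) is given below. The operator A here is a plain matrix standing in for the paper's spherical measurement and wavelet-synthesis operators; all names are illustrative.

```python
import numpy as np

def ista(y, A, lam, n_iter=200):
    # Solves min_x 0.5*||y - A x||_2^2 + lam*||x||_1 by proximal gradient.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L        # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x
```

Reweighting the ℓ1 term, as discussed in the abstract, amounts to replacing the scalar lam with a per-coefficient threshold inside the same loop.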
NASA Astrophysics Data System (ADS)
Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.
2015-06-01
Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest, insufficient sample density at important features, or both. A new adaptive sampling technique is presented that directs sample collection in proportion to local information content, adequately capturing short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy based on signal curvature, sample space-filling, variable experimental uncertainty, and iterative improvement. Numerical assessment indicates a reduction in the overall number of samples required to achieve a predefined uncertainty level while improving local accuracy at important features. The potential of the proposed method is further demonstrated with Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
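The core idea of curvature-directed sampling can be illustrated with a toy one-dimensional sketch: new sample locations are drawn from a candidate grid with probability proportional to an estimate of local curvature. This is only a hedged stand-in for the authors' full criterion, which also balances space-filling and measurement uncertainty; all names are illustrative.

```python
import numpy as np

def curvature_weighted_samples(x, f, n_new, rng=None):
    # Draw n_new locations from the candidate grid x with probability
    # proportional to a crude second-derivative estimate of f.
    rng = np.random.default_rng() if rng is None else rng
    d2 = np.abs(np.gradient(np.gradient(f, x), x))
    p = d2 + 1e-12                 # keep all candidates reachable
    p /= p.sum()
    return rng.choice(x, size=n_new, replace=False, p=p)
```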
Two-dimensional T2 distribution mapping in rock core plugs with optimal k-space sampling.
Xiao, Dan; Balcom, Bruce J
2012-07-01
Spin-echo single point imaging has been employed for 1D T2 distribution mapping, but a simple extension to 2D is challenging since the acquisition time increases n-fold, where n is the number of pixels in the second dimension. Nevertheless, 2D T2 mapping in fluid-saturated rock core plugs is highly desirable because the bedding plane structure in rocks often results in different pore properties within the sample. The acquisition time can be improved by undersampling k-space. The cylindrical shape of rock core plugs yields well defined intensity distributions in k-space that may be efficiently determined by the new k-space sampling patterns developed in this work. These patterns acquire 22.2% and 11.7% of the k-space data points. Companion density images may be employed, in a keyhole imaging sense, to improve image quality. T2-weighted images are fit to extract T2 distributions, pixel by pixel, employing an inverse Laplace transform. Images reconstructed with compressed sensing, with similar acceleration factors, are also presented. The results show that restricted k-space sampling, in this application, provides high quality results.
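The pixelwise inverse Laplace transform step can be sketched as a regularized non-negative least-squares fit of a multi-exponential decay. This is a generic illustration, not the authors' exact fitting procedure, and alpha is an assumed regularization weight.

```python
import numpy as np
from scipy.optimize import nnls

def t2_distribution(echo_times, signal, t2_grid, alpha=1e-2):
    # Discretized inverse Laplace transform via Tikhonov-augmented NNLS:
    # min_{f >= 0} ||K f - s||^2 + alpha*||f||^2, K[i, j] = exp(-TE_i / T2_j).
    K = np.exp(-np.outer(echo_times, 1.0 / t2_grid))
    K_aug = np.vstack([K, np.sqrt(alpha) * np.eye(len(t2_grid))])
    s_aug = np.concatenate([signal, np.zeros(len(t2_grid))])
    f, _ = nnls(K_aug, s_aug)
    return f
```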
Odéen, Henrik; Todd, Nick; Diakite, Mahamadou; Minalga, Emilee; Payne, Allison; Parker, Dennis L.
2014-01-01
Purpose: To investigate k-space subsampling strategies to achieve fast, large field-of-view (FOV) temperature monitoring using segmented echo planar imaging (EPI) proton resonance frequency shift thermometry for MR guided high intensity focused ultrasound (MRgHIFU) applications. Methods: Five different k-space sampling approaches were investigated, varying sample spacing (equally vs nonequally spaced within the echo train), sampling density (variable sampling density in zero, one, and two dimensions), and utilizing sequential or centric sampling. Three of the schemes utilized sequential sampling with the sampling density varied in zero, one, and two dimensions, to investigate sampling the k-space center more frequently. Two of the schemes utilized centric sampling to acquire the k-space center with a longer echo time for improved phase measurements, and vary the sampling density in zero and two dimensions, respectively. Phantom experiments and a theoretical point spread function analysis were performed to investigate their performance. Variable density sampling in zero and two dimensions was also implemented in a non-EPI GRE pulse sequence for comparison. All subsampled data were reconstructed with a previously described temporally constrained reconstruction (TCR) algorithm. Results: The accuracy of each sampling strategy in measuring the temperature rise in the HIFU focal spot was measured in terms of the root-mean-square-error (RMSE) compared to fully sampled “truth.” For the schemes utilizing sequential sampling, the accuracy was found to improve with the dimensionality of the variable density sampling, giving values of 0.65 °C, 0.49 °C, and 0.35 °C for density variation in zero, one, and two dimensions, respectively. The schemes utilizing centric sampling were found to underestimate the temperature rise, with RMSE values of 1.05 °C and 1.31 °C, for variable density sampling in zero and two dimensions, respectively. Similar subsampling schemes with variable density sampling implemented in zero and two dimensions in a non-EPI GRE pulse sequence both resulted in accurate temperature measurements (RMSE of 0.70 °C and 0.63 °C, respectively). With sequential sampling in the described EPI implementation, temperature monitoring over a 192 × 144 × 135 mm³ FOV with a temporal resolution of 3.6 s was achieved, while keeping the RMSE compared to fully sampled “truth” below 0.35 °C. Conclusions: When segmented EPI readouts are used in conjunction with k-space subsampling for MR thermometry applications, sampling schemes with sequential sampling, with or without variable density sampling, obtain accurate phase and temperature measurements when using a TCR reconstruction algorithm. Improved temperature measurement accuracy can be achieved with variable density sampling. Centric sampling leads to phase bias, resulting in temperature underestimations. PMID:25186406
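As a hedged illustration of variable-density phase-encode sampling (not a reproduction of the paper's five schemes), the sketch below builds a 2-D k-space mask whose sampling probability decays with distance from the k-space center, so the center is sampled most densely:

```python
import numpy as np

def variable_density_mask(ny, nz, frac, power=2.0, rng=None):
    # Probability of sampling a phase-encode point decays with k-space radius.
    rng = np.random.default_rng() if rng is None else rng
    ky, kz = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nz),
                         indexing="ij")
    r = np.clip(np.sqrt(ky**2 + kz**2), 0.0, 1.0)
    p = (1.0 - r) ** power + 1e-3
    p *= frac * ny * nz / p.sum()          # rescale to the target fraction
    return rng.random((ny, nz)) < np.clip(p, 0.0, 1.0)
```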
MontePython 3: Parameter inference code for cosmology
NASA Astrophysics Data System (ADS)
Brinckmann, Thejs; Lesgourgues, Julien; Audren, Benjamin; Benabed, Karim; Prunet, Simon
2018-05-01
MontePython 3 provides numerous ways to explore parameter space using Markov chain Monte Carlo (MCMC) sampling, including Metropolis-Hastings, Nested Sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time savings in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as importance sampling and the addition of derived parameters.
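A minimal sketch of the Metropolis-Hastings kernel at the core of such MCMC codes follows; MontePython's implementation adds adaptive proposals, convergence diagnostics, and much more, and all names here are illustrative.

```python
import numpy as np

def metropolis_hastings(log_post, x0, step, n_steps, rng=None):
    # Random-walk M-H with an isotropic Gaussian proposal.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.size)  # propose a move
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:        # Metropolis acceptance
            x, lp = prop, lp_prop
        chain.append(x.copy())
    return np.array(chain)
```

For example, log_post = lambda th: -0.5 * np.sum(th**2) samples a unit Gaussian posterior.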
Radial q-space sampling for DSI
Baete, Steven H.; Yutzy, Stephen; Boada, Fernando E.
2015-01-01
Purpose Diffusion Spectrum Imaging (DSI) has been shown to be an effective tool for non-invasively depicting the anatomical details of brain microstructure. Existing implementations of DSI sample the diffusion encoding space using a rectangular grid. Here we present a different implementation of DSI whereby a radially symmetric q-space sampling scheme for DSI (RDSI) is used to improve the angular resolution and accuracy of the reconstructed Orientation Distribution Functions (ODF). Methods Q-space is sampled by acquiring several q-space samples along a number of radial lines. Each of these radial lines in q-space is analytically connected to a value of the ODF at the same angular location by the Fourier slice theorem. Results Computer simulations and in vivo brain results demonstrate that RDSI correctly estimates the ODF when moderately high b-values (4000 s/mm2) and number of q-space samples (236) are used. Conclusion The nominal angular resolution of RDSI depends on the number of radial lines used in the sampling scheme, and only weakly on the maximum b-value. In addition, the radial analytical reconstruction reduces truncation artifacts which affect Cartesian reconstructions. Hence, a radial acquisition of q-space can be favorable for DSI. PMID:26363002
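The projection-slice relationship underlying the radial reconstruction can be illustrated with a toy 1-D sketch: the inverse Fourier transform of the q-space samples taken along one radial line yields the projection of the diffusion propagator onto that line. This is only a schematic of the principle, not the paper's full ODF estimator:

```python
import numpy as np

def radial_projection(signal_q, dq):
    # Symmetrize E(-q) = E(q), then inverse-Fourier-transform the radial
    # line; by the projection-slice theorem this gives the propagator's
    # projection onto that direction.
    full = np.concatenate([signal_q[:0:-1], signal_q])
    proj = np.fft.fftshift(np.abs(np.fft.ifft(np.fft.ifftshift(full))))
    r = np.fft.fftshift(np.fft.fftfreq(full.size, d=dq))
    return r, proj
```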
Virtual k-Space Modulation Optical Microscopy
NASA Astrophysics Data System (ADS)
Kuang, Cuifang; Ma, Ye; Zhou, Renjie; Zheng, Guoan; Fang, Yue; Xu, Yingke; Liu, Xu; So, Peter T. C.
2016-07-01
We report a novel superresolution microscopy approach for imaging fluorescence samples. The reported approach, termed virtual k-space modulation optical microscopy (VIKMOM), is able to improve the lateral resolution by a factor of 2, reduce the background level, improve the optical sectioning effect and correct for unknown optical aberrations. In the acquisition process of VIKMOM, we used a scanning confocal microscope setup with a 2D detector array to capture sample information at each scanned x-y position. In the recovery process of VIKMOM, we first modulated the captured data by virtual k-space coding and then employed a ptychography-inspired procedure to recover the sample information and correct for unknown optical aberrations. We demonstrated the performance of the reported approach by imaging fluorescent beads, fixed bovine pulmonary artery endothelial (BPAE) cells, and living human astrocytes (HA). As the VIKMOM approach is fully compatible with conventional confocal microscope setups, it may provide a turn-key solution for imaging biological samples with ~100 nm lateral resolution, in two or three dimensions, with improved optical sectioning capabilities and aberration correction.
NASA Astrophysics Data System (ADS)
Li, Hongzhi; Min, Donghong; Liu, Yusong; Yang, Wei
2007-09-01
To overcome the possible pseudoergodicity problem, molecular dynamics simulations can be accelerated via the realization of an energy space random walk. To achieve this, a biased free energy function (BFEF) needs to be obtained a priori. Although the quality of the BFEF is essential for sampling efficiency, its generation is usually tedious and nontrivial. In this work, we present an energy space metadynamics algorithm to efficiently and robustly obtain BFEFs. Moreover, in order to deal with the associated diffusive sampling problem caused by the random walk in the total energy space, the idea in the original umbrella sampling method is generalized to a random walk in the essential energy space, which only includes the energy terms determining the conformation of a region of interest. This essential energy space generalization allows the realization of efficient localized enhanced sampling and also offers the possibility of further efficiency gains when high-frequency energy terms irrelevant to the target events are free of activation. The energy space metadynamics method and its generalization in the essential energy space for molecular dynamics acceleration are demonstrated in the simulation of a pentane-like system, the blocked alanine dipeptide model, and the leucine model.
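A hedged sketch of the central bookkeeping in energy-space metadynamics follows: Gaussians deposited at visited energies accumulate into an estimate of the bias that flattens the energy distribution (the negative of the BFEF). The height and width parameters are assumed, not the authors' settings.

```python
import numpy as np

class EnergyMetadynamics:
    # Deposit Gaussians at visited energies; their sum estimates the bias
    # that flattens the energy distribution and drives the random walk.
    def __init__(self, height=0.1, width=50.0):
        self.height, self.width = height, width
        self.centers = []

    def deposit(self, energy):
        self.centers.append(energy)

    def bias(self, energy):
        c = np.asarray(self.centers)
        if c.size == 0:
            return 0.0
        return float(np.sum(self.height *
                            np.exp(-0.5 * ((energy - c) / self.width) ** 2)))
```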
O-space with high resolution readouts outperforms radial imaging.
Wang, Haifeng; Tam, Leo; Kopanoglu, Emre; Peters, Dana C; Constable, R Todd; Galiana, Gigi
2017-04-01
While O-Space imaging is well known to accelerate image acquisition beyond traditional Cartesian sampling, its advantages compared to undersampled radial imaging, the linear trajectory most akin to O-Space imaging, have not been detailed. In addition, previous studies have focused on ultrafast imaging with very high acceleration factors and relatively low resolution. The purpose of this work is to directly compare O-Space and radial imaging in their potential to deliver highly undersampled images of high resolution and minimal artifacts, as needed for diagnostic applications. We report that the greatest advantages to O-Space imaging are observed with extended data acquisition readouts. A sampling strategy that uses high resolution readouts is presented and applied to compare the potential of radial and O-Space sequences to generate high resolution images at high undersampling factors. Simulations and phantom studies were performed to investigate whether use of extended readout windows in O-Space imaging would increase k-space sampling and improve image quality, compared to radial imaging. Experimental O-Space images acquired with high resolution readouts show fewer artifacts and greater sharpness than radial imaging with equivalent scan parameters. Radial images taken with longer readouts show stronger undersampling artifacts, which can cause small or subtle image features to disappear. These features are preserved in a comparable O-Space image. High resolution O-Space imaging yields highly undersampled images of high resolution and minimal artifacts. The additional nonlinear gradient field improves image quality beyond conventional radial imaging.
A real-space approach to the X-ray phase problem
NASA Astrophysics Data System (ADS)
Liu, Xiangan
Over the past few decades, the phase problem of X-ray crystallography has been explored in reciprocal space in the so-called direct methods. Here we investigate the problem using a real-space approach that bypasses the laborious procedure of frequent Fourier synthesis and peak picking. Starting from a completely random structure, we move the atoms around in real space to minimize a cost function. A Monte Carlo method named simulated annealing (SA) is employed to search for the global minimum of the cost function, which can be constructed in either real space or reciprocal space. In the hybrid minimal principle, we combine the dual-space costs together. One part of the cost function monitors the probability distribution of the phase triplets, while the other is a real-space cost function representing the discrepancy between measured and calculated intensities. Compared to the single-space cost functions, the dual-space cost function has a greatly improved landscape and therefore can prevent the system from being trapped in metastable states. Thus, the structures of large molecules such as virginiamycin (C43H49N7O10·3CH3OH), isoleucinomycin (C60H102N6O18) and hexadecaisoleucinomycin (HEXIL) (C80H136N8O24) can now be solved, whereas this would not be possible using a single cost function. When a molecule gets larger, the configurational space becomes larger, and the required CPU time increases exponentially. The improved Monte Carlo sampling method has demonstrated its capability to solve large molecular structures. The atoms are encouraged to sample the high-density regions in space determined by an approximate density map, which in turn is updated and modified by averaging and Fourier synthesis. This type of biased sampling leads to considerable reduction of the configurational space. It greatly improves the algorithm compared to the previous uniform sampling; for instance, 90% of computer run time could be cut in solving the complex structure of isoleucinomycin. Successful trial calculations include larger molecular structures such as HEXIL and a collagen-like peptide (PPG). Moving chemical fragments is proposed to reduce the number of degrees of freedom. Furthermore, stereochemical parameters are considered for geometric constraints and for a cost function related to chemical energy.
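A generic simulated-annealing loop of the kind described, with a Metropolis acceptance rule and geometric cooling, is sketched below; the schedule and move sizes here are illustrative, not the thesis' parameters.

```python
import numpy as np

def simulated_annealing(cost, x0, step, t0=1.0, t_min=1e-3, cooling=0.99):
    rng = np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    c = cost(x)
    t = t0
    while t > t_min:
        cand = x + step * rng.standard_normal(x.shape)     # random atom moves
        cc = cost(cand)
        if cc < c or rng.random() < np.exp((c - cc) / t):  # Metropolis rule
            x, c = cand, cc
        t *= cooling                                       # geometric cooling
    return x, c
```

In the hybrid minimal principle, cost would combine the reciprocal-space triplet term and the real-space intensity-discrepancy term into one scalar.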
Wear resistance of machine tools' bionic linear rolling guides by laser cladding
NASA Astrophysics Data System (ADS)
Wang, Yiqiang; Liu, Botao; Guo, Zhengcai
2017-06-01
In order to improve the rolling wear resistance (RWR) of linear rolling guides (LRG) and prolong the life of machine tools, samples of various shapes with unit spacings ranging from 1 to 5 mm, designed through the observation of desert animals, are manufactured by laser cladding. Wear resistance tests closely reproducing real operational conditions are conducted using a homemade linear reciprocating wear test machine, and wear resistance is evaluated by means of weight loss measurement. Results indicate that the samples with bionic units have better RWR than the untreated one, of which the reticulate-treated sample with a unit spacing of 3 mm presents the best RWR. More specifically, among the punctate-treated samples, the mass loss increases with increasing unit spacing; among the striate-treated samples, the mass loss changes slightly with increasing unit spacing, attaining a minimum at a unit spacing of 4 mm; among the reticulate-treated samples, the mass loss initially decreases with increasing unit spacing, but turns to increase after reaching a minimum at a unit spacing of 3 mm. Additionally, the striate-shaped samples exhibit better wear resistance than the other shape groups on the whole. From the perspective of the ratio of laser-treated area to contact area, it is concluded that samples with ratio values between 0.15 and 0.3 possess better wear resistance.
Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation
NASA Astrophysics Data System (ADS)
Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.
2016-12-01
With the growing impacts of climate change and human activities on the water cycle, an increasing number of studies focus on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model is assigned a weight determined by the model's prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE proceeds by searching the parameter space gradually from low-likelihood to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is therefore attractive to incorporate a robust and efficient sampling algorithm, DREAMzs, into the local sampling of NSE. The comparison results demonstrate that the improved NSE speeds up marginal likelihood estimation significantly. However, both the improved and the original NSE suffer from considerable instability. In addition, the heavy computational cost of the large number of model executions is overcome by using adaptive sparse grid surrogates.
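A minimal nested-sampling sketch for the log-evidence is shown below: the worst live point is repeatedly replaced by a new prior draw subject to the likelihood constraint. The naive rejection step shown here is exactly the bottleneck that the abstract describes replacing with M-H or DREAMzs local sampling; the interfaces are assumptions.

```python
import numpy as np

def nested_sampling(log_like, prior_sample, n_live=100, n_iter=1000):
    rng = np.random.default_rng()
    live = [prior_sample(rng) for _ in range(n_live)]
    ll = np.array([log_like(p) for p in live])
    log_z = -np.inf
    log_shell = np.log(1.0 - np.exp(-1.0 / n_live))  # prior-volume shell width
    for i in range(n_iter):
        worst = int(np.argmin(ll))
        # Accumulate evidence: Z += (X_i - X_{i+1}) * L_worst.
        log_z = np.logaddexp(log_z, log_shell - i / n_live + ll[worst])
        # Replace the worst point by a prior draw above the likelihood floor
        # (naive rejection; an efficient local sampler goes here instead).
        while True:
            cand = prior_sample(rng)
            if log_like(cand) > ll[worst]:
                break
        live[worst], ll[worst] = cand, log_like(cand)
    return log_z
```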
Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling
Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David
2016-01-01
Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
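The measurement construction can be sketched as follows, with a Gaussian fall-off standing in for the paper's receptive-field-like distance-dependent probability; parameters and interfaces are assumptions of this sketch.

```python
import numpy as np

def localized_random_matrix(img_shape, n_meas, radius=3.0, rng=None):
    # Each row: pick a random center pixel, then include nearby pixels with
    # probability decaying in distance from the center.
    rng = np.random.default_rng() if rng is None else rng
    h, w = img_shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.zeros((n_meas, h * w))
    for i in range(n_meas):
        cy, cx = rng.integers(h), rng.integers(w)
        d2 = (yy - cy) ** 2 + (xx - cx) ** 2
        mask = rng.random((h, w)) < np.exp(-d2 / (2.0 * radius**2))
        A[i] = mask.ravel().astype(float)
    return A
```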
NASA Astrophysics Data System (ADS)
Chang, Tianying; Zhang, Xiansheng; Yang, Chuanfa; Sun, Zhonglin; Cui, Hong-Liang
2017-04-01
The complex dielectric properties of non-polar solid polymer materials were measured in the terahertz (THz) band by a free-space technique employing a frequency-extended vector network analyzer (VNA), and by THz time-domain spectroscopy (TDS). Mindful of the unique characteristics of THz waves, the free-space method for measuring material dielectric properties in the microwave band was expanded and improved for application in the THz frequency region. To ascertain the soundness and utility of the proposed method, measurements of the complex dielectric properties of a variety of polymers were carried out, including polytetrafluoroethylene (PTFE, known also by the brand name Teflon), polypropylene (PP), polyethylene (PE), and glass fiber resin (Composite Stone). The free-space method relies on the determination of the electromagnetic scattering parameters (S-parameters) of the sample, with the gated-reflect-line (GRL) calibration technique commonly employed using a VNA. Subsequently, based on the S-parameters, the dielectric constant and loss characteristic of the sample were calculated using a Newtonian iterative algorithm. To verify the calculated results, the THz TDS technique, which produced Fresnel parameters such as reflection and transmission coefficients, was also used to independently determine the dielectric properties of these polymer samples, with results satisfactorily corroborating those obtained by the free-space extended microwave technique.
Rock images classification by using deep convolution neural network
NASA Astrophysics Data System (ADS)
Cheng, Guojian; Guo, Wenhui
2017-08-01
Granularity analysis is one of the most essential issues in rock thin-section authentication under the microscope. To improve the efficiency and accuracy of traditional manual work, a convolutional neural network-based method is proposed for granularity analysis from thin-section images, which extracts features from image samples while building a classifier to recognize the granularity of input image samples. 4800 samples from the Ordos basin are used for experiments in the HSV, YCbCr and RGB colour spaces respectively. On the test dataset, the correct rate in the RGB colour space is 98.5%, and the results in the HSV and YCbCr colour spaces are likewise credible. The results show that the convolutional neural network can classify rock images with high reliability.
Commander Bowersox Tends to Zeolite Crystal Samples Aboard Space Station
NASA Technical Reports Server (NTRS)
2003-01-01
Expedition Six Commander Ken Bowersox spins Zeolite Crystal Growth sample tubes to eliminate bubbles that could affect crystal formation in preparation for a 15-day experiment aboard the International Space Station (ISS). Zeolites are hard as rock, yet are able to absorb liquids and gases like a sponge. By using the ISS microgravity environment to grow better, larger crystals, NASA and its commercial partners hope to improve petroleum manufacturing and other processes.
NASA Astrophysics Data System (ADS)
Sheikholeslami, R.; Hosseini, N.; Razavi, S.
2016-12-01
Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models many times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides an increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; by contrast, PLHS generates a series of smaller sub-sets (also called `slices') such that: (1) each sub-set is a Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive union of sub-sets remains a Latin hypercube; and thus (3) the entire sample set is a Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over the existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by only running the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
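For reference, a plain (one-stage) Latin hypercube generator is sketched below; PLHS additionally arranges points into slices so that every prefix of slices retains the one-dimensional stratification property, bookkeeping that is omitted in this sketch.

```python
import numpy as np

def latin_hypercube(n, dim, rng=None):
    # One point per stratum in every 1-D projection of the unit hypercube.
    rng = np.random.default_rng() if rng is None else rng
    strata = np.column_stack([rng.permutation(n) for _ in range(dim)])
    return (strata + rng.random((n, dim))) / n
```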
Metadynamics for training neural network model chemistries: A competitive assessment
NASA Astrophysics Data System (ADS)
Herr, John E.; Yao, Kun; McIntyre, Ryker; Toth, David W.; Parkhill, John
2018-06-01
Neural network model chemistries (NNMCs) promise to facilitate the accurate exploration of chemical space and simulation of large reactive systems. One important path to improving these models is to add layers of physical detail, especially long-range forces. At short range, however, these models are data driven and data limited. Little is systematically known about how data should be sampled, and "test data" chosen randomly from some sampling techniques can provide poor information about generality. If the sampling method is narrow, "test error" can appear encouragingly tiny while the model fails catastrophically elsewhere. In this manuscript, we competitively evaluate two common sampling methods, molecular dynamics (MD) and normal-mode sampling, and one uncommon alternative, metadynamics (MetaMD), for preparing training geometries. We show that MD is an inefficient sampling method in the sense that additional samples do not improve generality. We also show that MetaMD is easily implemented in any NNMC software package with a cost that scales linearly with the number of atoms in a sample molecule. MetaMD is a black-box way to ensure samples always reach out to new regions of chemical space, while remaining relevant to chemistry near k_BT. It is a cheap tool to address the issue of generalization.
Molecular dynamics in principal component space.
Michielssens, Servaas; van Erp, Titus S; Kutzner, Carsten; Ceulemans, Arnout; de Groot, Bert L
2012-07-26
A molecular dynamics algorithm in principal component space is presented. It is demonstrated that sampling can be improved without changing the ensemble by assigning masses to the principal components proportional to the inverse square root of the eigenvalues. The setup of the simulation requires no prior knowledge of the system; a short initial MD simulation to extract the eigenvectors and eigenvalues suffices. Independent measures indicated 6-7 times faster sampling compared to a regular molecular dynamics simulation.
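The paper's mass assignment reduces to one line; m0 is an assumed overall scale factor, not a value from the paper.

```python
import numpy as np

def pc_masses(eigenvalues, m0=1.0):
    # Give principal component i an effective mass ~ 1/sqrt(lambda_i), so
    # slow, high-amplitude modes become light and are sampled faster.
    return m0 / np.sqrt(np.asarray(eigenvalues, dtype=float))
```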
Sampling and Visualizing Creases with Scale-Space Particles
Kindlmann, Gordon L.; Estépar, Raúl San José; Smith, Stephen M.; Westin, Carl-Fredrik
2010-01-01
Particle systems have gained importance as a methodology for sampling implicit surfaces and segmented objects to improve mesh generation and shape analysis. We propose that particle systems have a significantly more general role in sampling structure from unsegmented data. We describe a particle system that computes samplings of crease features (i.e. ridges and valleys, as lines or surfaces) that effectively represent many anatomical structures in scanned medical data. Because structure naturally exists at a range of sizes relative to the image resolution, computer vision has developed the theory of scale-space, which considers an n-D image as an (n + 1)-D stack of images at different blurring levels. Our scale-space particles move through continuous four-dimensional scale-space according to spatial constraints imposed by the crease features, a particle-image energy that draws particles towards scales of maximal feature strength, and an inter-particle energy that controls sampling density in space and scale. To make scale-space practical for large three-dimensional data, we present a spline-based interpolation across scale from a small number of pre-computed blurrings at optimally selected scales. The configuration of the particle system is visualized with tensor glyphs that display information about the local Hessian of the image, and the scale of the particle. We use scale-space particles to sample the complex three-dimensional branching structure of airways in lung CT, and the major white matter structures in brain DTI. PMID:19834216
Earth as a Tool for Astrobiology—A European Perspective
NASA Astrophysics Data System (ADS)
Martins, Zita; Cottin, Hervé; Kotler, Julia Michelle; Carrasco, Nathalie; Cockell, Charles S.; de la Torre Noetzel, Rosa; Demets, René; de Vera, Jean-Pierre; d'Hendecourt, Louis; Ehrenfreund, Pascale; Elsaesser, Andreas; Foing, Bernard; Onofri, Silvano; Quinn, Richard; Rabbow, Elke; Rettberg, Petra; Ricco, Antonio J.; Slenzka, Klaus; Stalport, Fabien; ten Kate, Inge L.; van Loon, Jack J. W. A.; Westall, Frances
2017-07-01
Scientists use the Earth as a tool for astrobiology by analyzing planetary field analogues (i.e. terrestrial samples and field sites that resemble planetary bodies in our Solar System). In addition, they expose the selected planetary field analogues in simulation chambers to conditions that mimic the ones of planets, moons and Low Earth Orbit (LEO) space conditions, as well as the chemistry occurring in interstellar and cometary ices. This paper reviews the ways the Earth is used by astrobiologists: (i) by conducting planetary field analogue studies to investigate extant life from extreme environments, its metabolisms, adaptation strategies and modern biosignatures; (ii) by conducting planetary field analogue studies to investigate extinct life from the oldest rocks on our planet and its biosignatures; (iii) by exposing terrestrial samples to simulated space or planetary environments and producing a sample analogue to investigate changes in minerals, biosignatures and microorganisms. The European Space Agency (ESA) created a topical team in 2011 to investigate recent activities using the Earth as a tool for astrobiology and to formulate recommendations and scientific needs to improve ground-based astrobiological research. Space is an important tool for astrobiology (see Horneck et al. in Astrobiology, 16:201-243, 2016; Cottin et al., 2017), but access to space is limited. Complementing research on Earth provides fast access, more replications and higher sample throughput. The major conclusions of the topical team and suggestions for the future include more scientifically qualified calls for field campaigns with planetary analogy, and a centralized point of contact at ESA or the EU for the organization of a survey of such expeditions. An improvement of the coordinated logistics, infrastructures and funding system supporting the combination of field work with planetary simulation investigations, as well as an optimization of the scientific return and data processing, data storage and data distribution is also needed. Finally, a coordinated EU or ESA education and outreach program would improve the participation of the public in the astrobiological activities.
Unified method for serial study of body fluid compartments
NASA Technical Reports Server (NTRS)
Spears, C. P.; Hyatt, K. H.; Vogel, J. M.; Langfitt, S. B.
1974-01-01
Methods for the simultaneous determination of the equilibrium space of I-125/RISA (radio-iodinated serum albumin) (plasma volume), Cr-51 red cell mass, Br-82 space (extracellular fluid volume), and tritiated water space (total body water) are described. Determinations were made on two occasions separated by a 1-week interval in 43 healthy young men who were on a strict metabolic diet. Hourly samples were taken for 6 hours after injection of the radionuclides. Correlation of these values to the inscribed exponential disappearance curve was high. In 15 subjects, earlier and more frequent sampling led to no improvement in the accuracy of estimation of the I-125/RISA space. Use of this method gave results in 12 subjects for Br-82 space and in 11 subjects for tritiated water space which were not significantly different from those obtained by correction for urine loss.
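The dilution principle behind such compartment measurements can be sketched numerically: fit the hourly tracer concentrations to a monoexponential disappearance curve, extrapolate to injection time, and divide the injected dose by the extrapolated concentration. This is a hedged sketch; the study's exact fitting protocol may differ.

```python
import numpy as np

def dilution_volume(dose, t_hours, conc):
    # Linear fit of log-concentration vs time (monoexponential decay),
    # extrapolated to t = 0; the distribution space is dose / C(0).
    slope, intercept = np.polyfit(t_hours, np.log(conc), 1)
    return dose / np.exp(intercept)
```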
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
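A toy one-dimensional version of the combination rule (balance-heuristic multiple importance sampling over two strategies) is sketched below; the function and strategy interfaces are assumptions, and the real technique applies this weighting inside the screen-space occlusion integral.

```python
import numpy as np

def balance_heuristic(pdf_this, pdf_other):
    # Weight for a sample drawn from "this" strategy when two are combined.
    return pdf_this / (pdf_this + pdf_other)

def mis_estimate(f, sample_a, pdf_a, sample_b, pdf_b, n, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    total = 0.0
    for _ in range(n):
        xa = sample_a(rng)   # one sample from strategy A (e.g. stratified)
        total += balance_heuristic(pdf_a(xa), pdf_b(xa)) * f(xa) / pdf_a(xa)
        xb = sample_b(rng)   # one sample from strategy B (importance)
        total += balance_heuristic(pdf_b(xb), pdf_a(xb)) * f(xb) / pdf_b(xb)
    return total / n
```

The balance heuristic keeps the combined estimator unbiased while damping the variance spikes either strategy would produce alone.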
Propagative selection of tilted array patterns in directional solidification
NASA Astrophysics Data System (ADS)
Song, Younggil; Akamatsu, Silvère; Bottin-Rousseau, Sabine; Karma, Alain
2018-05-01
We investigate the dynamics of tilted cellular/dendritic array patterns that form during directional solidification of a binary alloy when a preferred-growth crystal axis is misoriented with respect to the temperature gradient. In situ experimental observations and phase-field simulations in thin samples reveal the existence of a propagative source-sink mechanism of array spacing selection that operates on larger space and time scales than the competitive growth at play during the initial solidification transient. For tilted arrays, tertiary branching at the diverging edge of the sample acts as a source of new cells with a spacing that can be significantly larger than the initial average spacing. A spatial domain of large spacing then invades the sample propagatively. It thus yields a uniform spacing everywhere, selected independently of the initial conditions, except in a small region near the converging edge of the sample, which acts as a sink of cells. We propose a discrete geometrical model that describes the large-scale evolution of the spatial spacing profile based on the local dependence of the cell drift velocity on the spacing. We also derive a nonlinear advection equation that predicts the invasion velocity of the large-spacing domain, and sheds light on the fundamental nature of this process. The models also account for more complex spacing modulations produced by an irregular dynamics at the source, in good quantitative agreement with both phase-field simulations and experiments. This basic knowledge provides a theoretical basis to improve the processing of single crystals or textured polycrystals for advanced materials.
Liu, Fei; Zhang, Xi; Jia, Yan
2015-01-01
In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.
NASA Technical Reports Server (NTRS)
Simanonok, K.; Mosely, E.; Charles, J.
1992-01-01
Nine preflight variables related to fluid, electrolyte, and cardiovascular status from 64 first-time Shuttle crewmembers were differentially weighted by discriminant analysis to predict the incidence and severity of each crewmember's space sickness as rated by NASA flight surgeons. The nine variables are serum uric acid, red cell count, environmental temperature at the launch site, serum phosphate, urine osmolality, serum thyroxine, sitting systolic blood pressure, calculated blood volume, and serum chloride. Using two methods of cross-validation on the original samples (jackknife and a stratified random subsample), these variables enable the prediction of space sickness incidence (NONE or SICK) with 80 percent success, and of space sickness severity (NONE, MILD, MODERATE, or SEVERE) with 59 percent success by one method of cross-validation and 67 percent by another. Addition of a tenth variable, hours spent in the Weightlessness Environment Training Facility (WETF), did not improve the prediction of space sickness incidence but did improve the prediction of space sickness severity to 66 percent success by the first method of cross-validation and to 71 percent by the second method. Results to date suggest the presence of predisposing physiologic factors to space sickness that implicate a fluid shift etiology. The data also suggest that prior exposure to fluid shifts during WETF training may produce some circulatory pre-adaptation to fluid shifts in weightlessness that results in a reduction of space sickness severity.
NASA Astrophysics Data System (ADS)
Lv, Chao; Zheng, Lianqing; Yang, Wei
2012-01-01
Molecular dynamics sampling can be enhanced via the promotion of potential energy fluctuations, for instance, based on a Hamiltonian modified with the addition of a potential-energy-dependent biasing term. To overcome the diffusion sampling issue, namely that enlargement of event-irrelevant energy fluctuations may abolish sampling efficiency, the essential energy space random walk (EESRW) approach was proposed earlier. To more effectively accelerate the sampling of solute conformations in aqueous environments, in the current work we generalized the EESRW method to a two-dimensional EESRW (2D-EESRW) strategy. Specifically, the essential internal energy component of a focused region and the essential interaction energy component between the focused region and the environmental region are employed to define the two-dimensional essential energy space. This proposal is motivated by the general observation that in different conformational events, the two essential energy components have distinctive interplays. Model studies on the alanine dipeptide and the aspartate-arginine peptide demonstrate sampling improvement over the original one-dimensional EESRW strategy; at the same biasing level, the present generalization allows more effective acceleration of the sampling of conformational transitions in aqueous solution. The 2D-EESRW generalization is readily extended to higher-dimensional schemes and can be employed in more advanced enhanced-sampling schemes, such as the recent orthogonal space random walk method.
Sampling errors for a nadir viewing instrument on the International Space Station
NASA Astrophysics Data System (ADS)
Berger, H. I.; Pincus, R.; Evans, F.; Santek, D.; Ackerman, S.
2001-12-01
In an effort to improve the observational characterization of ice clouds in the earth's atmosphere, we are developing a sub-millimeter wavelength radiometer which we propose to fly on the International Space Station for two years. Our goal is to accurately measure the ice water path and mass-weighted particle size at the finest possible temporal and spatial resolution. The ISS orbit precesses, sampling through the diurnal cycle every 16 days, but technological constraints limit our instrument to a single pixel viewed near nadir. We discuss sampling errors associated with this instrument/platform configuration. We use as "truth" the ISCCP dataset of pixel-level cloud optical retrievals, which acts as a proxy for ice water path; this dataset is sampled according to the orbital characteristics of the space station, and the statistics computed from the sub-sampled population are compared with those from the full dataset. We explore the tradeoffs in average sampling error as a function of the averaging time and spatial scale, and explore the possibility of resolving the diurnal cycle.
Diagnosing hyperuniformity in two-dimensional, disordered, jammed packings of soft spheres.
Dreyfus, Remi; Xu, Ye; Still, Tim; Hough, L A; Yodh, A G; Torquato, Salvatore
2015-01-01
Hyperuniformity characterizes a state of matter for which (scaled) density fluctuations diminish towards zero at the largest length scales. However, the task of determining whether or not an image of an experimental system is hyperuniform is experimentally challenging due to finite-resolution, noise, and sample-size effects that influence characterization measurements. Here we explore these issues, employing video optical microscopy to study hyperuniformity phenomena in disordered two-dimensional jammed packings of soft spheres. Using a combination of experiment and simulation we characterize the possible adverse effects of particle polydispersity, image noise, and finite-size effects on the assignment of hyperuniformity, and we develop a methodology that permits improved diagnosis of hyperuniformity from real-space measurements. The key to this improvement is a simple packing reconstruction algorithm that incorporates particle polydispersity to minimize the free volume. In addition, simulations show that hyperuniformity in finite-sized samples can be ascertained more accurately in direct space than in reciprocal space. Finally, our experimental colloidal packings of soft polymeric spheres are shown to be effectively hyperuniform.
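A real-space diagnostic of the kind discussed can be sketched as the variance of point counts in randomly placed circular windows: for a hyperuniform 2-D system this variance grows more slowly than the window area. Periodic boundaries and all parameters are assumptions of this sketch, not details of the paper's methodology.

```python
import numpy as np

def number_variance(points, box, radii, n_windows=2000, rng=None):
    # points: (N, 2) coordinates in a periodic square box of side `box`.
    rng = np.random.default_rng() if rng is None else rng
    out = []
    for R in radii:
        centers = rng.random((n_windows, 2)) * box
        d = np.abs(points[None, :, :] - centers[:, None, :])
        d = np.minimum(d, box - d)                 # minimum-image distances
        inside = np.hypot(d[..., 0], d[..., 1]) < R
        out.append(inside.sum(axis=1).var())       # variance of window counts
    return np.array(out)
```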
High performance transcription factor-DNA docking with GPU computing
2012-01-01
Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is very computationally demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that the docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the computation of the docking algorithm, not only to reduce computing time, but also to improve docking quality. Methods In an attempt to accelerate the sampling process and to improve the docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte-Carlo simulation and a simulated annealing method to search through the conformational space. Algorithmic techniques were developed to improve the computation efficiency and scalability on GPU-based high performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrated that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improve the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research would require efforts from two integral aspects: improvement in computation efficiency and energy function design. Conclusions We present a high performance computing approach for improving the prediction accuracy of protein-DNA docking. The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first ad hoc effort of applying GPUs or GPU clusters to the protein-DNA docking problem. PMID:22759575
NASA Astrophysics Data System (ADS)
Muller, Wayne; Scheuermann, Alexander
2016-04-01
Measuring the electrical permittivity of civil engineering materials is important for a range of ground penetrating radar (GPR) and pavement moisture measurement applications. Compacted unbound granular (UBG) pavement materials present a number of preparation and measurement challenges using conventional characterisation techniques. As an alternative to these methods, a modified free-space (MFS) characterisation approach has previously been investigated. This paper describes recent work to optimise and validate the MFS technique. The research included finite difference time domain (FDTD) modelling to better understand the nature of wave propagation within material samples and the test apparatus. This research led to improvements in the test approach and optimisation of sample sizes. The influence of antenna spacing and sample thickness on the permittivity results was investigated by a series of experiments separating antennas and measuring samples of nylon and water. Permittivity measurements of samples of nylon and water approximately 100 mm and 170 mm thick were also compared, showing consistent results. These measurements also agreed well with surface probe measurements of the nylon sample and literature values for water. The results indicate permittivity estimates of acceptable accuracy can be obtained using the proposed approach, apparatus and sample sizes.
NASA Technical Reports Server (NTRS)
Ang, C.-Y.; Lacy, L. L.
1979-01-01
Typical commercial or laboratory-prepared samples of polycrystalline AlSb contain microstructural inhomogeneities of Al- or Sb-rich phases in addition to the primary AlSb grains. The paper reports on gravitational influences, such as density-driven convection or sedimentation, that cause microscopic phase separation and nonequilibrium conditions to exist in earth-based melts of AlSb. A triple-cavity electric furnace is used to homogenize the multiphase AlSb samples in space and on earth. A comparative characterization of identically processed low- and one-gravity samples of commercial AlSb reveals major improvements in the homogeneity of the low-gravity homogenized material.
Adaptive single-pixel imaging with aggregated sampling and continuous differential measurements
NASA Astrophysics Data System (ADS)
Huo, Yaoran; He, Hongjie; Chen, Fan; Tai, Heng-Ming
2018-06-01
This paper proposes an adaptive compressive imaging technique with a single-pixel detector and a single arm. The aggregated sampling (AS) method enables reduction of the resolution of the reconstructed images, aiming to reduce time and space consumption. A target image with a resolution of up to 1024 × 1024 can be reconstructed successfully at a 20% sampling rate. The continuous differential measurement (CDM) method, combined with a ratio factor of significant coefficients (RFSC), improves the imaging quality. Moreover, RFSC reduces human intervention in parameter setting. This technique enhances the practicability of single-pixel imaging through lower time and space consumption, better imaging quality, and less human intervention.
Yin, X X; Ng, B W-H; Ramamohanarao, K; Baghai-Wadji, A; Abbott, D
2012-09-01
It has been shown that magnetic resonance images (MRIs) with a sparse representation in a transformed domain, e.g. spatial finite differences (FD) or the discrete cosine transform (DCT), can be restored from undersampled k-space via current compressive sampling theory. This paper presents a model-based method for the restoration of MRIs. The reduced-order model, in which a full system response is projected onto a subspace of lower dimensionality, has been used to accelerate image reconstruction by reducing the size of the involved linear system. Here, the singular value threshold (SVT) technique is applied as a denoising scheme to reduce and select the model order of the inverse Fourier transform image, and to restore multi-slice breast MRIs that have been compressively sampled in k-space. The restored MRIs with SVT denoising show reduced sampling errors compared to direct MRI restoration via spatial FD or DCT. Compressive sampling is a technique for finding sparse solutions to underdetermined linear systems, and the sparsity implicit in MRIs is exploited to reconstruct images from significantly undersampled k-space. The challenge, however, is that incoherent artifacts resulting from the random undersampling add noise-like interference to the sparsely represented image, and the recovery algorithms in the literature are not capable of fully removing these artifacts; a denoising procedure is therefore needed to improve the quality of image recovery. This paper applies a singular value threshold algorithm to reduce the model order of the image basis functions, which allows further improvement of the quality of image reconstruction with removal of noise artifacts. The principle of the denoising scheme is to reconstruct the sparse MRI matrices optimally with a lower rank by selecting a smaller number of dominant singular values. The singular value threshold algorithm is performed by minimizing the nuclear norm of the difference between the sampled image and the recovered image. It is illustrated that this algorithm improves the ability of previous image reconstruction algorithms to remove noise artifacts while significantly improving the quality of MRI recovery.
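The SVT operator itself is compact: soft-threshold the singular values, which is the proximal operator of the nuclear norm. This is a generic sketch of the operator, not the paper's full reconstruction pipeline.

```python
import numpy as np

def svt(X, tau):
    # Proximal operator of the nuclear norm: soft-threshold singular values.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt
```

Singular values below tau are zeroed, yielding the low-rank approximation the denoising scheme relies on.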
Materials Science Research Rack Onboard the International Space Station
NASA Technical Reports Server (NTRS)
Frazier, Natalie C.; Johnson, Jimmie; Aicher, Winfried
2011-01-01
The Materials Science Research Rack (MSRR) allows for the study of a variety of materials, including metals, ceramics, semiconductor crystals, and glasses, onboard the International Space Station (ISS). MSRR was launched on STS-128 in August 2009 and is currently installed in the U.S. Destiny Laboratory Module. Since that time, MSRR has performed virtually flawlessly, logging more than 550 hours of operating time. Materials science is an integral part of developing new materials for everyday life on Earth. The goal of studying materials processing in space is to develop a better understanding of the chemical and physical mechanisms involved. Materials science research benefits from the microgravity environment of space, where the researcher can better isolate chemical and thermal properties of materials from the effects of gravity. With this knowledge, reliable predictions can be made about the conditions required on Earth to achieve improved materials. MSRR is a highly automated facility containing two furnace inserts in which Sample Cartridge Assemblies (SCAs), each containing one material sample, can be processed at temperatures up to 1400 °C. Once an SCA is installed by a crew member, the experiment can be run by automatic command or conducted via telemetry commands from the ground. Initially, 12 SCAs were processed in the first furnace insert for a team of European and US investigators. The processed samples have been returned to Earth for evaluation and comparison of their properties to samples similarly processed on the ground. A preliminary examination of the samples indicates that the majority of the desired science objectives have been successfully met, leading to significant improvements in the understanding of alloy solidification processes. The second furnace insert will be installed in the facility in January 2011 for processing the remaining SCA currently on orbit. Six SCAs are planned for launch in summer 2011, and additional batches are planned for future processing. This facility is available to support additional materials science investigations through programs such as the US National Laboratory, Technology Development, NASA Research Announcements, ESA application-oriented research programs, and others. The development of the research rack was a cooperative effort between NASA's Marshall Space Flight Center and the European Space Agency (ESA).
Sparse magnetic resonance imaging reconstruction using the Bregman iteration
NASA Astrophysics Data System (ADS)
Lee, Dong-Hoon; Hong, Cheol-Pyo; Lee, Man-Woo
2013-01-01
Magnetic resonance imaging (MRI) reconstruction needs many samples that are sequentially acquired using phase-encoding gradients in an MRI system. This is directly connected to the scan time of the MRI system, which can be long. Therefore, many researchers have studied ways to reduce the scan time, especially compressed sensing (CS), which exploits image sparsity to reconstruct from fewer samples when k-space is not fully sampled. Recently, an iterative technique based on the Bregman method was developed for denoising. The Bregman iteration improves on total variation (TV) regularization by gradually recovering the fine-scale structures that are usually lost in TV regularization. In this study, we investigated sparse-sampling image reconstruction using the Bregman iteration for a low-field MRI system to improve its temporal resolution and to validate its usefulness. Images were obtained with a 0.32 T MRI scanner (Magfinder II, SCIMEDIX, Korea) of a phantom and an in-vivo human brain in a head coil. We applied random k-space sampling, and we determined the sampling ratios by using half the fully sampled k-space. The Bregman iteration was used to generate the final images based on the reduced data. We also calculated root-mean-square error (RMSE) values from error images obtained using various numbers of Bregman iterations. Our reconstructed images using the Bregman iteration for sparsely sampled data showed good results compared with the original images. Moreover, the RMSE values showed that the sparsely reconstructed phantom and human images converged to the original images. We confirmed the feasibility of sparse-sampling image reconstruction using the Bregman iteration with a low-field MRI system and obtained good results. Although our results used half the sampling ratio, this method should help increase the temporal resolution of low-field MRI systems.
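A compact sketch of the Bregman-iterated TV reconstruction loop described here, under stated assumptions: scikit-image's Chambolle TV denoiser stands in for the paper's inner solver, and `weight` and `n_iter` are illustrative settings rather than the study's parameters.

```python
import numpy as np
from numpy.fft import fft2, ifft2
from skimage.restoration import denoise_tv_chambolle

def bregman_tv_recon(y, mask, n_iter=10, weight=0.1):
    """Reconstruct an image from randomly sampled k-space data y (zeros at
    unsampled points; boolean mask marks acquired samples) by Bregman
    iteration: TV-denoise, then add the unexplained residual back."""
    y_k = y.astype(complex).copy()       # Bregman-updated measurements
    u = np.zeros(mask.shape)
    for _ in range(n_iter):
        k = fft2(u)
        k[mask] = y_k[mask]              # impose (updated) measured samples
        u = denoise_tv_chambolle(np.abs(ifft2(k)), weight=weight)
        y_k[mask] += y[mask] - fft2(u)[mask]   # Bregman residual feedback
    return u
```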
Megchelenbrink, Wout; Huynen, Martijn; Marchiori, Elena
2014-01-01
Constraint-based models of metabolic networks are typically underdetermined, because they contain more reactions than metabolites. The solutions to such a system therefore do not consist of unique flux rates for each reaction, but rather of a space of possible flux rates. By uniformly sampling this space, an estimated probability distribution for each reaction's flux in the network can be obtained. However, sampling a high-dimensional network is time-consuming. Furthermore, the constraints imposed on the network give rise to an irregularly shaped solution space, so more tailored, efficient sampling methods are needed. We propose an efficient sampling algorithm (called optGpSampler), which implements the Artificial Centering Hit-and-Run algorithm in a different manner than the sampling algorithm implemented in the COBRA Toolbox for metabolic network analysis, here called gpSampler. Results of extensive experiments on different genome-scale metabolic networks show that optGpSampler is up to 40 times faster than gpSampler. Application of existing convergence diagnostics on small network reconstructions indicates that optGpSampler converges roughly ten times faster than gpSampler towards similar sampling distributions. For networks of higher dimension (i.e., containing more than 500 reactions), we observed significantly better convergence of optGpSampler and a large deviation between the samples generated by the two algorithms. optGpSampler for Matlab and Python is available for non-commercial use at http://cs.ru.nl/~wmegchel/optGpSampler/.
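As a sketch of the underlying sampler family, the basic hit-and-run step on a convex polytope is easy to state; the inequality-only form below omits the steady-state equality constraints S v = 0 that a real flux sampler such as optGpSampler must also respect (e.g., by moving within the null space of S).

```python
import numpy as np

def hit_and_run(A, b, x0, n_samples, seed=0):
    """Uniformly sample the bounded polytope {x : A @ x <= b} by hit-and-run."""
    rng = np.random.default_rng(seed)
    x, out = x0.astype(float).copy(), []
    for _ in range(n_samples):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)                  # random direction on the sphere
        ad, slack = A @ d, b - A @ x            # feasibility: t * (A d) <= slack
        t_hi = np.min(slack[ad > 0] / ad[ad > 0])
        t_lo = np.max(slack[ad < 0] / ad[ad < 0])
        x = x + rng.uniform(t_lo, t_hi) * d     # uniform point on the chord
        out.append(x)
    return np.array(out)

# usage: sample the unit box [0, 1]^3 expressed as inequality constraints
A = np.vstack([np.eye(3), -np.eye(3)])
b = np.concatenate([np.ones(3), np.zeros(3)])
samples = hit_and_run(A, b, x0=np.full(3, 0.5), n_samples=1000)
```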
Space sickness predictors suggest fluid shift involvement and possible countermeasures
NASA Technical Reports Server (NTRS)
Simanonok, K. E.; Moseley, E. C.; Charles, J. B.
1992-01-01
Preflight data from 64 first-time Shuttle crew members were examined retrospectively to predict space sickness severity (NONE, MILD, MODERATE, or SEVERE) by discriminant analysis. From 9 input variables relating to fluid, electrolyte, and cardiovascular status, 8 variables were chosen by discriminant analysis that correctly predicted space sickness severity with 59% success by one method of cross-validation on the original sample and 67% by another method. The 8 variables, in order of their importance for predicting space sickness severity, are sitting systolic blood pressure, serum uric acid, calculated blood volume, serum phosphate, urine osmolality, environmental temperature at the launch site, red cell count, and serum chloride. These results suggest the presence of predisposing physiologic factors to space sickness that implicate a fluid-shift etiology. Addition of a 10th input variable, hours spent in the Weightless Environment Training Facility (WETF), improved the prediction of space sickness severity to 66% success by the first method of cross-validation on the original sample and to 71% by the second method. The data suggest that WETF training may reduce space sickness severity.
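The prediction setup maps naturally onto standard discriminant analysis with cross-validation; the sketch below uses scikit-learn on simulated stand-in data, since the study's actual preflight measurements are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 8))     # 8 predictors per crew member (stand-ins)
y = rng.integers(0, 4, size=64)      # severity class: NONE/MILD/MODERATE/SEVERE

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # cross-validated accuracy
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```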
NASA Technical Reports Server (NTRS)
1975-01-01
The retention of granular catalyst in a metal foam matrix was demonstrated to greatly increase the life capability of hydrazine monopropellant reactors. Since the nickel foam used in previous tests was found to degrade after long-term exposure, the cause of degradation was examined and metal foams of improved durability were developed. The most durable foam developed was a rhodium-coated nickel foam. An all-platinum foam was found to be incompatible with a hot ammonia (hydrazine) environment. It is recommended that the manufacturing process for the improved foam be scaled up to produce samples sufficiently large for Space Shuttle APU gas generator testing.
Simulating and assessing boson sampling experiments with phase-space representations
NASA Astrophysics Data System (ADS)
Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.
2018-04-01
The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.
NASA Astrophysics Data System (ADS)
Arabshahi, P.; Chao, Y.; Chien, S.; Gray, A.; Howe, B. M.; Roy, S.
2008-12-01
In many areas of Earth science, including climate change research, there is a need for near real-time integration of data from heterogeneous and spatially distributed sensors, in particular in-situ and space-based sensors. The data integration, as provided by a smart sensor web, enables numerous improvements, namely: 1) adaptive sampling for more efficient use of expensive space-based sensing assets, 2) higher-fidelity information gathering from data sources through integration of complementary data sets, and 3) improved sensor calibration. The specific purpose of the smart sensor web development presented here is to provide for adaptive sampling and calibration of space-based data via in-situ data. Our ocean-observing smart sensor web is composed of both mobile and fixed underwater in-situ ocean sensing assets and Earth Observing System (EOS) satellite sensors providing larger-scale sensing. An acoustic communications network forms a critical link in the web between the in-situ and space-based sensors and facilitates adaptive sampling and calibration. After an overview of primary design challenges, we report on the development of various elements of the smart sensor web. These include (a) a cable-connected mooring system with a profiler under real-time control with inductive battery charging; (b) a glider with integrated acoustic communications and broadband receiving capability; (c) satellite sensor elements; (d) an integrated acoustic navigation and communication network; and (e) a predictive model via the Regional Ocean Modeling System (ROMS). Results from field experiments, including an upcoming one in Monterey Bay (October 2008) using live data from NASA's EO-1 mission in a semi-closed-loop system, together with ocean models from ROMS, are described. Plans for future adaptive sampling demonstrations using the smart sensor web are also presented.
Evaluation of Techniques Used to Estimate Cortical Feature Maps
Katta, Nalin; Chen, Thomas L.; Watkins, Paul V.; Barbour, Dennis L.
2011-01-01
Functional properties of neurons are often distributed nonrandomly within a cortical area and form topographic maps that reveal insights into neuronal organization and interconnection. Some functional maps, such as in visual cortex, are fairly straightforward to discern with a variety of techniques, while other maps, such as in auditory cortex, have resisted easy characterization. In order to determine appropriate protocols for establishing accurate functional maps in auditory cortex, artificial topographic maps were probed under various conditions, and the accuracy of estimates formed from the actual maps was quantified. Under these conditions, low-complexity maps such as sound frequency can be estimated accurately with as few as 25 total samples (e.g., electrode penetrations or imaging pixels) if neural responses are averaged together. More samples are required to achieve the highest estimation accuracy for higher complexity maps, and averaging improves map estimate accuracy even more than increasing sampling density. Undersampling without averaging can result in misleading map estimates, while undersampling with averaging can lead to the false conclusion of no map when one actually exists. Uniform sample spacing only slightly improves map estimation over nonuniform sample spacing typical of serial electrode penetrations. Tessellation plots commonly used to visualize maps estimated using nonuniform sampling are always inferior to linearly interpolated estimates, although differences are slight at higher sampling densities. Within primary auditory cortex, then, multiunit sampling with at least 100 samples would likely result in reasonable feature map estimates for all but the highest complexity maps and the highest variability that might be expected. PMID:21889537
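The comparison between tessellation-style and linearly interpolated map estimates can be reproduced in miniature; the synthetic feature map, sample count, and grid below are illustrative assumptions, not the study's artificial maps.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
true_map = lambda x, y: np.sin(3 * x) * np.cos(2 * y)   # stand-in feature map

pts = rng.uniform(0, 1, size=(100, 2))                  # nonuniform penetration sites
vals = true_map(pts[:, 0], pts[:, 1])

gx, gy = np.mgrid[0:1:64j, 0:1:64j]
nearest = griddata(pts, vals, (gx, gy), method='nearest')  # tessellation-style estimate
linear = griddata(pts, vals, (gx, gy), method='linear')    # interpolated estimate

print(np.nanmean((nearest - true_map(gx, gy)) ** 2),       # tessellation error
      np.nanmean((linear - true_map(gx, gy)) ** 2))        # usually the smaller error
```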
Long Duration Space Materials Exposure (LDSE)
NASA Technical Reports Server (NTRS)
Allen, David; Schmidt, Robert
1992-01-01
The Center on Materials for Space Structures (CMSS) at Case Western Reserve University is one of seventeen Commercial Centers for the Development of Space. It was founded to: (1) produce and evaluate materials for space structures; (2) develop passive and active facilities for materials exposure and analysis in space; and (3) develop improved material systems for space structures. A major active facility for materials exposure is proposed to be mounted on the exterior truss of the Space Station Freedom (SSF). This Long Duration Space Materials Exposure (LDSE) experiment will be an approximately 6 1/2 ft. x 4 ft. panel facing into the velocity vector (RAM) to provide long-term exposure (up to 30 years) to atomic oxygen, UV, micrometeorites, and other low-Earth-orbit effects. It can expose large or small active (instrumented) or passive samples. These samples may be mounted in a removable Materials Flight Experiment (MFLEX) carrier, which may be periodically brought into the SSF for examination by CMSS's other SSF facility, the Space Materials Evaluation Facility (SMEF), which will contain a Scanning Electron Microscope, a Variable Angle and Scanning Ellipsometer, a Fourier Transform Infrared Spectrometer, and other analysis equipment. These facilities will allow commercial firms to test their materials in space and promptly obtain information on their materials' survivability in the LEO environment.
Leap-dynamics: efficient sampling of conformational space of proteins and peptides in solution.
Kleinjung, J; Bayley, P; Fraternali, F
2000-03-31
A molecular simulation scheme, called Leap-dynamics, that provides efficient sampling of protein conformational space in solution is presented. The scheme is a combined approach using a fast sampling method, imposing conformational 'leaps' to force the system over energy barriers, and molecular dynamics (MD) for refinement. The presence of solvent is approximated by a potential of mean force depending on the solvent-accessible surface area. The method has been successfully applied to N-acetyl-L-alanine-N-methylamide (alanine dipeptide), sampling experimentally observed conformations inaccessible to MD alone under the chosen conditions. The method correctly predicts the increased partial flexibility of the mutant Y35G compared to native bovine pancreatic trypsin inhibitor. In particular, the improvement over MD consists of the detection of conformational flexibility that corresponds closely to slow motions identified by nuclear magnetic resonance techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai
2018-03-01
We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
Zeller, Fabian; Zacharias, Martin
2014-02-11
The accurate calculation of potentials of mean force (PMFs) for ligand-receptor binding is one of the most important applications of molecular simulation techniques. Typically, the separation distance between ligand and receptor is chosen as a reaction coordinate along which a PMF can be calculated with the aid of umbrella sampling (US) techniques. In addition, restraints can be applied to the relative position and orientation of the partner molecules to reduce the accessible phase space. An approach combining such phase-space reduction with flattening of the free energy landscape and configurational exchanges has been developed, which significantly improves the convergence of PMF calculations in comparison with standard umbrella sampling. The free energy surface along the reaction coordinate is smoothened by iteratively adapting biasing potentials corresponding to previously calculated PMFs. Configurations are allowed to exchange between the umbrella simulation windows via the Hamiltonian replica exchange method. The application to a DNA molecule in complex with a minor-groove-binding ligand indicates significantly improved convergence and complete reversibility of the sampling along the pathway. The calculated binding free energy is in excellent agreement with experimental results. In contrast, the application of standard US resulted in large differences between PMFs calculated for association and dissociation pathways. The approach could be a useful alternative to standard US for computational studies on biomolecular recognition processes.
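The configurational-exchange step reduces to a Metropolis criterion on the umbrella bias energies; a minimal sketch, assuming one-dimensional harmonic biases and illustrative parameters, is:

```python
import numpy as np

def exchange_accept(x_i, x_j, c_i, c_j, k_spring, kT, rng=None):
    """Metropolis acceptance for swapping configurations x_i, x_j between two
    umbrella windows with harmonic biases U_w(x) = 0.5 * k * (x - c_w)**2."""
    rng = rng or np.random.default_rng()
    bias = lambda x, c: 0.5 * k_spring * (x - c) ** 2
    # change in total bias energy when each configuration is evaluated
    # under the other window's restraint
    delta = bias(x_i, c_j) + bias(x_j, c_i) - bias(x_i, c_i) - bias(x_j, c_j)
    return rng.random() < np.exp(-delta / kT)
```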
Multiplexed phase-space imaging for 3D fluorescence microscopy.
Liu, Hsiou-Yuan; Zhong, Jingshan; Waller, Laura
2017-06-26
Optical phase-space functions describe spatial and angular information simultaneously; examples include light fields in ray optics and Wigner functions in wave optics. Measurement of phase space enables digital refocusing, aberration removal, and 3D reconstruction. High-resolution capture of 4D phase-space datasets is, however, challenging. Previous scanning approaches are slow, light-inefficient, and do not achieve diffraction-limited resolution. Here, we propose a multiplexed method that solves these problems. We use a spatial light modulator (SLM) in the pupil plane of a microscope to sequentially pattern multiplexed coded apertures while capturing images in real space. We then reconstruct the 3D fluorescence distribution of the sample by solving an inverse problem via regularized least squares with a proximal accelerated gradient descent solver. We experimentally reconstruct a 101-megavoxel 3D volume (1010 × 510 × 500 µm with NA 0.4), demonstrating improved acquisition time, light throughput, and resolution compared to scanning-aperture methods. Our flexible patterning scheme further allows sparsity in the sample to be exploited for reduced data capture.
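For an l1 regularizer, the solver class named here (regularized least squares with a proximal accelerated gradient method) is the familiar FISTA loop; in this sketch a dense matrix A is an assumed stand-in for the coded-aperture forward model.

```python
import numpy as np

def fista(A, b, lam, n_iter=200):
    """FISTA for min_x 0.5 * ||A @ x - b||**2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        g = z - A.T @ (A @ z - b) / L          # gradient step on the smooth part
        x_new = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0)  # proximal (soft) step
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        z = x_new + (t - 1) / t_new * (x_new - x)                # acceleration
        x, t = x_new, t_new
    return x
```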
Yu, Yong-Jie; Wu, Hai-Long; Fu, Hai-Yan; Zhao, Juan; Li, Yuan-Na; Li, Shu-Fang; Kang, Chao; Yu, Ru-Qin
2013-08-09
Chromatographic background drift correction has been an important field of research in chromatographic analysis. In the present work, orthogonal spectral space projection for background drift correction of three-dimensional chromatographic data was described in detail and combined with parallel factor analysis (PARAFAC) to resolve overlapped chromatographic peaks and obtain the second-order advantage. This strategy was verified by simulated chromatographic data and afforded significant improvement in quantitative results. Finally, this strategy was successfully utilized to quantify eleven antibiotics in tap water samples. Compared with the traditional methodology of introducing excessive factors for the PARAFAC model to eliminate the effect of background drift, clear improvement in the quantitative performance of PARAFAC was observed after background drift correction by orthogonal spectral space projection. Copyright © 2013 Elsevier B.V. All rights reserved.
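The core of orthogonal spectral space projection can be sketched in a few lines: estimate the background's dominant spectral directions from blank runs and project the data onto their orthogonal complement. The variable names `X` and `B`, the data layout, and the rank-2 background are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def orthogonal_background_projection(X, B, rank=2):
    """Suppress background drift: X (time x spectral channels) holds sample
    runs, B holds blank background-only spectra; project X away from the
    background's dominant spectral subspace."""
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    V = Vt[:rank].T                 # background spectral subspace
    return X - X @ V @ V.T          # projection onto the orthogonal complement
```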
Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach
NASA Astrophysics Data System (ADS)
Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne
We present a new approach to improve the convergence of Monte Carlo (MC) simulations of molecular systems with complex energy landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for proper convergence in phase-space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model of melt linear polyethylene. We record significant improvements in sampling efficiency, and thus in computational load, while the optimal sets of move frequencies may allow interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.
Movie of phase separation during physics of colloids in space experiment
NASA Technical Reports Server (NTRS)
2002-01-01
Still photographs taken over 16 hours on Nov. 13, 2001, on the International Space Station have been condensed into a few seconds to show the de-mixing -- or phase separation -- process studied by the Experiment on Physics of Colloids in Space. Commanded from the ground, dozens of similar tests have been conducted since the experiment arrived on ISS in 2000. The sample is a mix of polymethylmethacrylate (PMMA or acrylic) colloids, polystyrene polymers and solvents. The circular area in the video is 2 cm (0.8 in.) in diameter. The phase separation process occurs spontaneously after the sample is mechanically mixed. The evolving lighter regions are rich in colloid and have the structure of a liquid. The dark regions are poor in colloids and have the structure of a gas. This behavior cannot be observed on Earth because gravity causes the particles to fall out of solution faster than the phase separation can occur. While similar to a gas-liquid phase transition, the growth rate observed in this test is different from any atomic gas-liquid or liquid-liquid phase transition ever measured experimentally. Ultimately, the sample separates into colloid-poor and colloid-rich areas, just as oil and vinegar separate. The fundamental science of de-mixing in this colloid-polymer sample is the same as that found in the annealing of metal alloys and plastic polymer blends. Improving the understanding of this process may lead to improved processing of these materials on Earth.
Phase separation during the Experiment on Physics of Colloids in Space
NASA Technical Reports Server (NTRS)
2003-01-01
Still photographs taken over 16 hours on Nov. 13, 2001, on the International Space Station have been condensed into a few seconds to show the de-mixing -- or phase separation -- process studied by the Experiment on Physics of Colloids in Space. Commanded from the ground, dozens of similar tests have been conducted since the experiment arrived on ISS in 2000. The sample is a mix of polymethylmethacrylate (PMMA or acrylic) colloids, polystyrene polymers and solvents. The circular area is 2 cm (0.8 in.) in diameter. The phase separation process occurs spontaneously after the sample is mechanically mixed. The evolving lighter regions are rich in colloid and have the structure of a liquid. The dark regions are poor in colloids and have the structure of a gas. This behavior cannot be observed on Earth because gravity causes the particles to fall out of solution faster than the phase separation can occur. While similar to a gas-liquid phase transition, the growth rate observed in this test is different from any atomic gas-liquid or liquid-liquid phase transition ever measured experimentally. Ultimately, the sample separates into colloid-poor and colloid-rich areas, just as oil and vinegar separate. The fundamental science of de-mixing in this colloid-polymer sample is the same as that found in the annealing of metal alloys and plastic polymer blends. Improving the understanding of this process may lead to improved processing of these materials on Earth.
Astromaterials Curation Online Resources for Principal Investigators
NASA Technical Reports Server (NTRS)
Todd, Nancy S.; Zeigler, Ryan A.; Mueller, Lina
2017-01-01
The Astromaterials Acquisition and Curation office at NASA Johnson Space Center curates all of NASA's extraterrestrial samples, the most extensive set of astromaterials samples available to the research community worldwide. The office allocates 1500 individual samples to researchers and students each year and has served the planetary research community for 45+ years. The Astromaterials Curation office provides access to its sample data repository and digital resources to support the research needs of sample investigators and to aid in the selection and request of samples for scientific study. These resources can be found on the Astromaterials Acquisition and Curation website at https://curator.jsc.nasa.gov. To better serve our users, we have engaged in several activities to enhance the data available for astromaterials samples, to improve the accessibility and performance of the website, and to address user feedback. We have also put plans in place for continuing improvements to our existing data products.
Crane cabins' interior space multivariate anthropometric modeling.
Essdai, Ahmed; Spasojević Brkić, Vesna K; Golubović, Tamara; Brkić, Aleksandar; Popović, Vladimir
2018-01-01
Previous research has shown that today's crane cabins fail to meet the needs of a large proportion of operators. The resulting performance losses, financial losses, and effects on safety should not be overlooked. The first aim of this survey is to model the crane cabin interior space using up-to-date crane operator anthropometric data and to compare multivariate and univariate anthropometric models. The second aim is to define the crane cabin interior space dimensions that enable anthropometric convenience. To facilitate the cabin design, the anthropometric dimensions of 64 crane operators in a first sample and 19 more in a second sample were collected in Serbia. Multivariate anthropometric models, spanning 95% of the population on the basis of a set of 8 anthropometric dimensions, have been developed. The percentile method was also applied to the same data. The dimensions of the interior space necessary for accommodating the crane operator are 1174 × 1080 × 1865 mm. The 5th and 95th percentile model results are within the obtained dimensions. The results of this study may prove useful to crane cabin designers in eliminating anthropometric inconsistencies and improving the health of operators, but can also aid in improving the safety, performance, and financial results of the companies where crane cabins operate.
Chen, Changjun; Huang, Yanzhao; Xiao, Yi
2013-01-01
Low sampling efficiency in conformational space is a well-known problem for conventional molecular dynamics. It greatly increases the difficulty for molecules to find the transition path to the native state and costs large amounts of CPU time. To accelerate the sampling, in this paper we re-couple the critical degrees of freedom in the molecule to the environment temperature, such as dihedrals in generalized coordinates or non-hydrogen atoms in Cartesian coordinates. After applying the method to an ALA dipeptide model, we find that this modified molecular dynamics greatly enhances sampling of the conformational space and provides more information about state-to-state transitions, while conventional molecular dynamics fails to do so. Moreover, the results of 16 independent 100 ns simulations with the new method show that trpzip2 reaches the native state in half of all trajectories, a far higher rate than with conventional molecular dynamics. Such an improvement provides a potential way to search the conformational space and predict the most stable states of peptides and proteins.
Avoiding Surgical Skill Decay: A Systematic Review on the Spacing of Training Sessions.
Cecilio-Fernandes, Dario; Cnossen, Fokie; Jaarsma, Debbie A D C; Tio, René A
Spreading training sessions over time instead of training in just one session leads to improved long-term retention of factual knowledge. However, it is not clear whether this also applies to surgical skills. We therefore performed a systematic review to find out whether spacing training sessions improves long-term retention of surgical skills. We searched the Medline, PsycINFO, Embase, Eric, and Web of Science online databases. We only included articles that were randomized trials with a sample of medical trainees acquiring surgical motor skills in which the spacing effect was reported. The quality and bias of the articles were assessed using the Cochrane Collaboration's risk of bias assessment tool. The search retrieved 1955 articles on the spacing effect. After removing duplicates and articles that did not meet the inclusion criteria, 11 articles remained. The overall quality of the experiments was "moderate." Trainees in the spaced condition scored higher on a retention test than trainees in the massed condition. Our systematic review showed evidence that spacing training sessions improves long-term surgical skill retention when compared to massed practice. However, the optimal gap between re-study sessions is unclear. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Nonuniform depth grids in parabolic equation solutions.
Sanders, William M; Collins, Michael D
2013-04-01
The parabolic wave equation is solved using a finite-difference solution in depth that involves a nonuniform grid. The depth operator is discretized using Galerkin's method with asymmetric hat functions. Examples are presented to illustrate that this approach can be used to improve efficiency for problems in ocean acoustics and seismo-acoustics. For shallow water problems, accuracy is sensitive to the precise placement of the ocean bottom interface. This issue is often addressed with the inefficient approach of using a fine grid spacing over all depth. Efficiency may be improved by using a relatively coarse grid with nonuniform sampling to precisely position the interface. Efficiency may also be improved by reducing the sampling in the sediment and in an absorbing layer that is used to truncate the computational domain. Nonuniform sampling may also be used to improve the implementation of a single-scattering approximation for sloping fluid-solid interfaces.
A Time and Place for Everything: Developmental Differences in the Building Blocks of Episodic Memory
Lee, Joshua K.; Wendelken, J. Carter; Bunge, Silvia A.; Ghetti, Simona
2015-01-01
This research investigated whether episodic memory development can be explained by improvements in relational binding processes, which are involved in forming novel associations between events and the context in which they occurred. Memory for item-space, item-time, and item-item relations was assessed in an ethnically diverse sample of 151 children aged 7 to 11 years and 28 young adults. Item-space memory reached adult performance by 9½ years, whereas item-time and item-item memory improved into adulthood. In path analysis, item-space memory, but not item-time memory, best explained item-item memory. Across age groups, relational binding related to source memory and performance on standardized memory assessments. In conclusion, relational binding development depends on relation type, but relational binding overall supports episodic memory development. PMID:26493950
Joint 6D k-q Space Compressed Sensing for Accelerated High Angular Resolution Diffusion MRI.
Cheng, Jian; Shen, Dinggang; Basser, Peter J; Yap, Pew-Thian
2015-01-01
High Angular Resolution Diffusion Imaging (HARDI) avoids the Gaussian diffusion assumption that is inherent in Diffusion Tensor Imaging (DTI) and is capable of characterizing complex white matter microstructure with greater precision. However, HARDI methods such as Diffusion Spectrum Imaging (DSI) typically require significantly more signal measurements than DTI, resulting in prohibitively long scanning times. One of the goals in HARDI research is therefore to improve estimation of quantities such as the Ensemble Average Propagator (EAP) and the Orientation Distribution Function (ODF) with a limited number of diffusion-weighted measurements. A popular approach to this problem, Compressed Sensing (CS), affords highly accurate signal reconstruction using significantly fewer (sub-Nyquist) data points than traditionally required. Existing approaches to CS diffusion MRI (CS-dMRI) mainly focus on applying CS in the q-space of diffusion signal measurements and fail to take into consideration information redundancy in the k-space. In this paper, we propose a framework, called 6-Dimensional Compressed Sensing diffusion MRI (6D-CS-dMRI), for reconstruction of the diffusion signal and the EAP from data sub-sampled in both 3D k-space and 3D q-space. To our knowledge, 6D-CS-dMRI is the first work that applies compressed sensing in the full 6D k-q space and reconstructs the diffusion signal in the full continuous q-space and the EAP in continuous displacement space. Experimental results on synthetic and real data demonstrate that, compared with full DSI sampling in k-q space, 6D-CS-dMRI yields excellent diffusion signal and EAP reconstruction with low root-mean-square error (RMSE) using 11 times fewer samples (3-fold reduction in k-space and 3.7-fold reduction in q-space).
Food and waste management biotechnology for the space shuttle
NASA Technical Reports Server (NTRS)
Murray, R. W.; Schelkopf, J. D.; Hunt, S. R.; Sauer, R. L.
1979-01-01
Space-crew facilities for food preparation, eating, personal hygiene, and waste management are contained in one small area of the Shuttle Orbiter mid-deck, with all the functional systems interconnected. The paper discusses three major systems: (1) the Galley, which includes the personal hygiene station and food packages; (2) the Waste Collector, which includes provisions for male and female users and for urine, feces, and emesis collection in both normal and contingency modes of operation; and (3) Biowaste Monitoring, which includes mass measurement and sampling. The technology improvement continues by assuring that the Orbiter systems have sufficient design flexibility to permit later improvements in operation and in function.
Participation in the Infrared Space Observatory (ISO) Mission
NASA Technical Reports Server (NTRS)
Joseph, Robert D.
2002-01-01
All the Infrared Space Observatory (ISO) data have been transmitted from the ISO Data Centre, reduced, and calibrated. This has been rather labor-intensive as new calibrations for both the ISOPHOT and ISOCAM data have been released and the algorithms for data reduction have improved. We actually discovered errors in the calibration in earlier versions of the software. However the data reduction improvements have now converged and we have a self-consistent, well-calibrated database. It has also been a major effort to obtain the ground-based JHK imaging, 450 micrometer and 850 micrometer imaging and the 1-2.5 micrometer near-infrared spectroscopy for most of the sample galaxies.
Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm
NASA Astrophysics Data System (ADS)
Elahi, Sana; Kaleem, Muhammad; Omer, Hammad
2018-01-01
Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of images from a very limited number of samples in k-space, which significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct MR images from under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI; this algorithm directly cancels the incoherent artifacts produced by the undersampling in k-space. This paper introduces an improved iterative algorithm based on a p-thresholding technique for CS-MRI image reconstruction. The use of a p-thresholding function promotes sparsity in the image, which is a key factor for CS-based image reconstruction. The p-thresholding-based iterative algorithm is a modification of ISTA and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log-thresholding, soft-thresholding, and hard-thresholding techniques at different reduction factors.
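One common generalized p-shrinkage rule, in the spirit of the p-thresholding described above (the paper's exact operator may differ in detail), reduces to soft thresholding at p = 1:

```python
import numpy as np

def p_threshold(x, lam, p):
    """Generalized p-shrinkage: soft thresholding at p = 1, increasingly
    hard-threshold-like as p decreases below 1."""
    mag = np.abs(x)
    with np.errstate(divide='ignore', invalid='ignore'):
        shrink = np.maximum(mag - lam ** (2 - p) * mag ** (p - 1), 0.0)
    return np.sign(x) * shrink

# sanity check: p = 1 recovers the ordinary soft-thresholding operator
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(p_threshold(x, lam=1.0, p=1.0))   # -> [-1. -0.  0.  0.  1.]
```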
Space Network Time Distribution and Synchronization Protocol Development for Mars Proximity Link
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Gao, Jay L.; Mills, David
2010-01-01
Time distribution and synchronization in a deep space network are challenging due to long propagation delays, spacecraft movements, and relativistic effects. Further, the Network Time Protocol (NTP) designed for terrestrial networks may not work properly in space. In this work, we consider a time distribution protocol based on time message exchanges similar to NTP. We present the Proximity-1 Space Link Interleaved Time Synchronization (PITS) algorithm, which can work with the CCSDS Proximity-1 Space Data Link Protocol. The PITS algorithm provides faster time synchronization via two-way time transfer over proximity links, improves scalability as the number of spacecraft increases, lowers the storage space required for collecting time samples, and is robust against packet loss and duplication through underlying protocol mechanisms.
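The two-way time transfer at the heart of such schemes uses the standard NTP timestamp arithmetic; this sketch shows the generic offset/delay computation, not the PITS message format itself.

```python
def offset_and_delay(t1, t2, t3, t4):
    """NTP-style two-way time transfer: t1 = request sent (local clock),
    t2 = request received and t3 = reply sent (remote clock),
    t4 = reply received (local clock)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # remote clock minus local clock
    delay = (t4 - t1) - (t3 - t2)            # round-trip link delay
    return offset, delay

# example: remote clock 5 s ahead across a link with 2 s one-way light time
print(offset_and_delay(0.0, 7.0, 8.0, 5.0))  # -> (5.0, 4.0)
```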
NASA Technical Reports Server (NTRS)
Shahshahani, Behzad M.; Landgrebe, David A.
1992-01-01
The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, combined supervised-unsupervised learning is always superior to supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
Matrix completion-based reconstruction for undersampled magnetic resonance fingerprinting data.
Doneva, Mariya; Amthor, Thomas; Koken, Peter; Sommer, Karsten; Börnert, Peter
2017-09-01
An iterative reconstruction method for undersampled magnetic resonance fingerprinting data is presented. The method performs the reconstruction entirely in k-space and is related to low rank matrix completion methods. A low dimensional data subspace is estimated from a small number of k-space locations fully sampled in the temporal direction and used to reconstruct the missing k-space samples before MRF dictionary matching. Performing the iterations in k-space eliminates the need for applying a forward and an inverse Fourier transform in each iteration required in previously proposed iterative reconstruction methods for undersampled MRF data. A projection onto the low dimensional data subspace is performed as a matrix multiplication instead of a singular value thresholding typically used in low rank matrix completion, further reducing the computational complexity of the reconstruction. The method is theoretically described and validated in phantom and in-vivo experiments. The quality of the parameter maps can be significantly improved compared to direct matching on undersampled data. Copyright © 2017 Elsevier Inc. All rights reserved.
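The k-space-only iteration described above can be sketched directly: the subspace projection is a single matrix multiplication, with measured samples re-imposed each pass. The array shapes, variable names, and iteration count are illustrative assumptions.

```python
import numpy as np

def mrf_kspace_complete(X, mask, V_r, n_iter=20):
    """Fill missing MRF k-space samples. X is (k-space points x time frames)
    with zeros at unsampled entries, boolean mask marks acquired entries, and
    V_r (time frames x r) spans the temporal subspace estimated from the
    fully sampled calibration locations."""
    P = V_r @ V_r.conj().T        # subspace projector: one matrix multiply,
                                  # no singular value thresholding per iteration
    Y = X.copy()
    for _ in range(n_iter):
        Y = Y @ P                 # project every k-space time course
        Y[mask] = X[mask]         # data consistency on measured samples
    return Y
```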
Improved Technique for Finding Vibration Parameters
NASA Technical Reports Server (NTRS)
Andrew, L. V.; Park, C. C.
1986-01-01
Filtering and sample manipulation reduce noise effects. The analysis technique improves the extraction of vibrational frequencies and damping rates from measurements of vibrations of a complicated structure. Structural vibrations are measured by accelerometers, whose outputs are digitized at a frequency high enough to cover all modes of interest. Use of the method on a set of vibrational measurements from the Space Shuttle raised the level of coherence from previous values below 50 percent to values between 90 and 99 percent.
An Improved Thermal Conductivity Polyurethane Composite for a Space Borne 20KV Power Supply
NASA Technical Reports Server (NTRS)
Shapiro, Andrew A.; Haque, Inam
2005-01-01
This effort was designed to find a way to reduce the temperature rise of critical components of a 20 kV High Voltage Power Supply (HVPS) by improving the overall thermal conductivity of the encapsulated modules. Three strategies were evaluated by developing complete procedures, preparing samples, and performing tests. The three strategies were: 1. Improve the thermal conductivity of the polyurethane encapsulant through the addition of thermally conductive powder while minimizing the impact on other characteristics of the encapsulant. 2. Improve the thermal conductivity of the polyurethane-encapsulated assembly by adding a slab of thermally conductive, electrically insulating material to act as a heat spreader. 3. Employ a more thermally conductive substrate (Al2O3) with the existing encapsulation scheme. The materials were chosen based on the following criteria: high dielectric breakdown strength, high thermal conductivity, ease of manufacturing, high compliance, and other standard space-qualified materials properties (low outgassing, etc.). An optimized cure was determined by a statistical design of experiments for both filled and unfilled materials. The materials were characterized for the desired properties, and a complete process was developed and tested. The thermal performance was substantially improved, and the strategies may be used for space flight.
Farr, W. M.; Mandel, I.; Stevens, D.
2015-01-01
Selection among alternative theoretical models given an observed dataset is an important challenge in many areas of physics and astronomy. Reversible-jump Markov chain Monte Carlo (RJMCMC) is an extremely powerful technique for performing Bayesian model selection, but it suffers from a fundamental difficulty: it requires jumps between model parameter spaces, yet cannot efficiently explore both parameter spaces at once. Thus, a naive jump between parameter spaces is unlikely to be accepted in the Markov chain Monte Carlo (MCMC) algorithm, and convergence is correspondingly slow. Here, we demonstrate an interpolation technique that uses samples from single-model MCMCs to propose intermodel jumps from an approximation to the single-model posterior of the target parameter space. The interpolation technique, based on a kD-tree data structure, is adaptive and efficient in modest dimensionality. We show that our technique leads to improved convergence over naive jumps in an RJMCMC, and compare it to other proposals in the literature for improving the convergence of RJMCMCs. We also demonstrate the use of the same interpolation technique as a way to construct efficient 'global' proposal distributions for single-model MCMCs without prior knowledge of the structure of the posterior distribution, and discuss improvements that permit the method to be used efficiently in higher-dimensional spaces. PMID:26543580
Kim, J-J; Joo, S H; Lee, K S; Yoo, J H; Park, M S; Kwak, J S; Lee, Jinho
2017-04-01
The Low Temperature Scanning Tunneling Microscope (LT-STM) is an extremely valuable tool not only in surface science but also in condensed matter physics. For years, numerous new ideas have been adopted to perfect LT-STM performance; the Ultra-Low Vibration (ULV) laboratory and rigid STM head designs are among them. Here, we present three improvements to the design of the ULV laboratory and the LT-STM: a tip treatment stage, a sample cleaving stage, and a vibration isolation system. The improved tip treatment stage enables us to perform field emission for tip treatment in situ without exchanging samples, while our enhanced sample cleaving stage allows us to cleave samples at low temperature in a vacuum, without optical access, by a simple pressing motion. Our newly designed vibration isolation system provides efficient use of space while maintaining vibration isolation capability. These improvements enhance the quality of spectroscopic imaging experiments that can last for many days and increase data yield; we expect them to be indispensable elements in future LT-STM designs.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate, speaks at the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Paquet, Victor; Joseph, Caroline; D'Souza, Clive
2012-01-01
Anthropometric studies typically require a large number of individuals selected so that the demographic characteristics that impact body size and function are proportionally representative of a user population. This sampling approach does not allow for an efficient characterization of the distribution of body sizes and functions of sub-groups within a population, and the demographic characteristics of user populations can often change with time, limiting the application of anthropometric data in design. The objective of this study is to demonstrate how demographically representative user populations can be developed from samples that are not proportionally representative, in order to improve the application of anthropometric data in design. An engineering anthropometry problem of door width and clear floor space width is used to illustrate the value of the approach.
SU-G-IeP1-13: Sub-Nyquist Dynamic MRI Via Prior Rank, Intensity and Sparsity Model (PRISM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, B; Gao, H
Purpose: Accelerated dynamic MRI is important for MRI-guided radiotherapy. Inspired by compressive sensing (CS), sub-Nyquist dynamic MRI, i.e., sparse sampling in k-t space for accelerated dynamic MRI, has been an active research area. This work investigates sub-Nyquist dynamic MRI via a previously developed CS model, namely the Prior Rank, Intensity and Sparsity Model (PRISM). Methods: The proposed method utilizes PRISM with rank minimization and incoherent sampling patterns for sub-Nyquist reconstruction. In PRISM, the low-rank background image, which is automatically calculated by rank minimization, is excluded from the L1 minimization step of the CS reconstruction to further sparsify the residual image, thus allowing for higher acceleration rates. Furthermore, the sampling pattern in k-t space is made more incoherent by sampling a different set of k-space points at different temporal frames. Results: Reconstruction results from the L1-sparsity method and the PRISM method with 30% and 15% undersampled data are compared to demonstrate the power of PRISM for dynamic MRI. Conclusion: A sub-Nyquist MRI reconstruction method based on PRISM is developed, with image quality improved over the L1-sparsity method.
Wu, Zhichao; Medeiros, Felipe A
2018-03-20
Visual field testing is an important endpoint in glaucoma clinical trials, and the testing paradigm used can have a significant impact on sample size requirements. To investigate this, the study included 353 eyes of 247 glaucoma patients seen over a 3-year period, from which real-world visual field rates of change and variability estimates were extracted to provide sample size estimates from computer simulations. The clinical trial scenario assumed that a new treatment was added to one of two groups that were both under routine clinical care, with various treatment effects examined. Three different visual field testing paradigms were evaluated: a) evenly spaced testing; b) the United Kingdom Glaucoma Treatment Study (UKGTS) follow-up scheme, which adds clustered tests at the beginning and end of follow-up in addition to evenly spaced testing; and c) a clustered testing paradigm, with clusters of tests at the beginning and end of the trial period and two intermediary visits. The sample size requirements were reduced by 17-19% and 39-40% using the UKGTS and clustered testing paradigms, respectively, when compared to the evenly spaced approach. These findings highlight how the clustered testing paradigm can substantially reduce sample size requirements and improve the feasibility of future glaucoma clinical trials.
Toward a benchmark material in aerogel development
NASA Astrophysics Data System (ADS)
Sibille, Laurent; Cronise, Raymond J.; Noever, David A.; Hunt, Arlon J.
1996-03-01
Discovered in the thirties, aerogels constitute today the lightest solids known, while exhibiting outstanding thermal and noise insulation properties in air and vacuum. In a far-reaching collaboration, the Space Science Laboratory at NASA Marshall Space Flight Center and the Microstructured Materials Group at Lawrence Berkeley National Laboratory are engaged in a two-fold research effort aimed at characterizing the microstructure of silica aerogels and developing benchmark samples through use of the in-orbit microgravity environment. The absence of density-driven convection flows and sedimentation is sought to produce aerogel samples with a narrow distribution of pore sizes, thus largely improving the transparency of the material in the visible range. Furthermore, highly isotropic distributions of doping materials are attainable even in large gels grown in microgravity. Aerospace companies (cryogenic tank insulation and high-temperature insulation of space vehicles), insulation manufacturers (household and industrial applications), and pharmaceutical companies (biosensors) are potential end-users of this rapidly developing technology.
Correlation ion mobility spectroscopy
Pfeifer, Kent B [Los Lunas, NM; Rohde, Steven B [Corrales, NM
2008-08-26
Correlation ion mobility spectrometry (CIMS) uses gating modulation and correlation signal processing to improve IMS instrument performance. Closely spaced ion peaks can be resolved by adding discriminating codes to the gate and matched-filtering the received ion current signal, thereby improving the sensitivity and resolution of an ion mobility spectrometer. CIMS can be used to improve the signal-to-noise ratio even for transient chemical samples. CIMS is especially advantageous for small-geometry IMS drift tubes, which can otherwise have poor resolution due to their small size.
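The correlation idea can be illustrated numerically: drive the gate with a pseudorandom code, model the detector current as the code convolved with the mobility spectrum, and matched-filter against the code. The code, peak positions, and noise level below are all made up for illustration and are not from the patent.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 512
code = rng.integers(0, 2, n).astype(float)      # stand-in pseudorandom gate code
spectrum = np.zeros(n)
spectrum[[80, 95]] = [1.0, 0.6]                 # two closely spaced ion peaks

# detector current: code circularly convolved with the spectrum, plus noise
current = np.real(np.fft.ifft(np.fft.fft(code) * np.fft.fft(spectrum)))
current += 0.05 * rng.standard_normal(n)

# matched filter: circular cross-correlation with the mean-removed code;
# peaks near indices 80 and 95 re-emerge with improved signal-to-noise ratio
recovered = np.real(np.fft.ifft(
    np.fft.fft(current) * np.conj(np.fft.fft(code - code.mean()))))
```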
NASA Technical Reports Server (NTRS)
Eley, Michael H.; Crews, Lavonne; Johnston, Ben; Lee, David; Colebaugh, James
1995-01-01
The primary objectives of the study were to characterize the solid waste stream for MSFC facilities in Huntsville, Alabama, and to evaluate their present recycling program. The purpose of the study was to determine if improvements could be made in terms of increasing quantities of the present commodities collected, adding more recyclables to the program, and streamlining or improving operational efficiency. In conducting the study, various elements were implemented. These included sampling and sorting representative samples of the waste stream; visually inspecting each refuse bin, recycle bin, and roll-off; interviewing employees and recycling coordinators of other companies; touring local material recycling facilities; contacting experts in the field; and performing a literature search.
NASA Technical Reports Server (NTRS)
1998-01-01
Research with plants in microgravity offers many exciting opportunities to gain new insights and could improve products on Earth ranging from crop production to fragrances and food flavorings. The ASTROCULTURE facility is a lead commercial facility for plant growth and plant research in microgravity and was developed by the Wisconsin Center for Space Automation and Robotics (WSCAR), a NASA Commercial Space Center. On STS-95 it will support research that could help improve crop development leading to plants that are more disease resistant or have a higher yield and provide data on the production of plant essential oils---oils that contain the essence of the plant and provide both fragrance and flavoring. On STS-95, a flowering plant will be grown in ASTROCULTURE and samples taken using a method developed by the industry partner for this investigation. On Earth the samples will be analyzed by gas chromatography/mass spectrometry and the data used to evaluate both the production of fragrant oils in microgravity and in the development of one or more products.
Zheng, Lianqing; Chen, Mengen; Yang, Wei
2009-06-21
To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures. As usually observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly degrade sampling efficiency. This sampling issue is particularly severe when the collective variable is defined on a low-dimensional subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the motion of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW-based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.
NASA Astrophysics Data System (ADS)
Choi, Jinhyeok; Kim, Hyeonjin
2016-12-01
To improve the efficacy of undersampled MRI, a method of designing adaptive sampling functions is proposed that is simple to implement on an MR scanner and yet effectively improves the performance of the sampling functions. An approximation of the energy distribution of an image (E-map) is estimated from highly undersampled k-space data acquired in a prescan and efficiently recycled in the main scan. An adaptive probability density function (PDF) is generated by combining the E-map with a modeled PDF. A set of candidate sampling functions are then prepared from the adaptive PDF, among which the one with maximum energy is selected as the final sampling function. To validate its computational efficiency, the proposed method was implemented on an MR scanner, and its robust performance in Fourier-transform (FT) MRI and compressed sensing (CS) MRI was tested by simulations and in a cherry tomato. The proposed method consistently outperforms the conventional modeled PDF approach for undersampling ratios of 0.2 or higher in both FT-MRI and CS-MRI. To fully benefit from undersampled MRI, it is preferable that the design of adaptive sampling functions be performed online immediately before the main scan. In this way, the proposed method may further improve the efficacy of the undersampled MRI.
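A minimal Python sketch of the selection loop described above, under toy assumptions: a synthetic 1/|k| energy profile stands in for the prescan E-map, a quadratic variable-density function stands in for the modeled PDF, and an equal-weight combination and 32 candidates are arbitrary choices, not the authors' settings.

```python
# Hedged sketch: adaptive-PDF sampling-function design for undersampled MRI.
import numpy as np

rng = np.random.default_rng(1)
N, R = 256, 4                       # k-space lines and acceleration factor
n_keep = N // R

# Stand-in E-map: energy profile of a prescan (synthetic 1/|k| decay).
k = np.abs(np.arange(N) - N // 2) + 1
emap = 1.0 / k

# Modeled variable-density PDF and the adaptive combination.
modeled = (1 - np.abs(np.arange(N) - N // 2) / (N / 2)) ** 2
pdf = 0.5 * emap / emap.sum() + 0.5 * modeled / modeled.sum()
pdf /= pdf.sum()

# Draw candidate sampling functions; keep the one capturing maximum energy.
best_mask, best_energy = None, -np.inf
for _ in range(32):
    lines = rng.choice(N, size=n_keep, replace=False, p=pdf)
    energy = emap[lines].sum()
    if energy > best_energy:
        best_mask = np.zeros(N, bool)
        best_mask[lines] = True
        best_energy = energy

print(f"kept {best_mask.sum()} of {N} lines, captured energy {best_energy:.3f}")
```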
High-Dimensional Function Approximation With Neural Networks for Large Volumes of Data.
Andras, Peter
2018-02-01
Approximation of high-dimensional functions is a challenge for neural networks due to the curse of dimensionality. Often the data for which the approximated function is defined reside on a low-dimensional manifold, and in principle approximating the function over this manifold should improve the approximation performance. It has been shown that projecting the data manifold into a lower-dimensional space, followed by neural network approximation of the function over this space, provides a more precise approximation of the function than neural network approximation in the original data space. However, if the data volume is very large, the projection into the low-dimensional space has to be based on a limited sample of the data. Here, we investigate the nature of the approximation error of neural networks trained over the projection space. We show that such neural networks should have better approximation performance than neural networks trained on high-dimensional data, even if the projection is based on a relatively sparse sample of the data manifold. We also find that it is preferable to use a uniformly distributed sparse sample of the data for generating the low-dimensional projection. We illustrate these results with the practical neural network approximation of a set of functions defined on high-dimensional data, including real-world data.
Resolving structural influences on water-retention properties of alluvial deposits
Winfield, K.A.; Nimmo, J.R.; Izbicki, J.A.; Martin, P.M.
2006-01-01
With the goal of improving property-transfer model (PTM) predictions of unsaturated hydraulic properties, we investigated the influence of sedimentary structure, defined as particle arrangement during deposition, on laboratory-measured water retention (water content vs. potential [θ(ψ)]) of 10 undisturbed core samples from alluvial deposits in the western Mojave Desert, California. The samples were classified as having fluvial or debris-flow structure based on observed stratification and measured spread of particle-size distribution. The θ(ψ) data were fit with the Rossi-Nimmo junction model, representing water retention with three parameters: the maximum water content (θmax), the ψ-scaling parameter (ψo), and the shape parameter (λ). We examined trends between these hydraulic parameters and bulk physical properties, both textural - geometric mean, Mg, and geometric standard deviation, σg, of particle diameter - and structural - bulk density, ρb, the fraction of unfilled pore space at natural saturation, Ae, and a porosity-based randomness index, φs, defined as the excess of total porosity over 0.3. Structural parameters φs and Ae were greater for fluvial samples, indicating greater structural pore space and a possibly broader pore-size distribution associated with a more systematic arrangement of particles. Multiple linear regression analysis and Mallows' Cp statistic identified combinations of textural and structural parameters for the most useful predictive models: for θmax, including Ae, φs, and σg, and for both ψo and λ, including only textural parameters, although use of Ae can somewhat improve ψo predictions. Textural properties can explain most of the sample-to-sample variation in θ(ψ) independent of deposit type, but inclusion of the simple structural indicators Ae and φs can improve PTM predictions, especially for the wettest part of the θ(ψ) curve. © Soil Science Society of America.
State-space modeling of population sizes and trends in Nihoa Finch and Millerbird
Gorresen, P. Marcos; Brinck, Kevin W.; Camp, Richard J.; Farmer, Chris; Plentovich, Sheldon M.; Banko, Paul C.
2016-01-01
Both of the 2 passerines endemic to Nihoa Island, Hawai‘i, USA—the Nihoa Millerbird (Acrocephalus familiaris kingi) and Nihoa Finch (Telespiza ultima)—are listed as endangered by federal and state agencies. Their abundances have been estimated by irregularly implemented fixed-width strip-transect sampling from 1967 to 2012, from which area-based extrapolation of the raw counts produced highly variable abundance estimates for both species. To evaluate an alternative survey method and improve abundance estimates, we conducted variable-distance point-transect sampling between 2010 and 2014. We compared our results to those obtained from strip-transect samples. In addition, we applied state-space models to derive improved estimates of population size and trends from the legacy time series of strip-transect counts. Both species were fairly evenly distributed across Nihoa and occurred in all or nearly all available habitat. Population trends for Nihoa Millerbird were inconclusive because of high within-year variance. Trends for Nihoa Finch were positive, particularly since the early 1990s. Distance-based analysis of point-transect counts produced mean estimates of abundance similar to those from strip-transects but was generally more precise. However, both survey methods produced biologically unrealistic variability between years. State-space modeling of the long-term time series of abundances obtained from strip-transect counts effectively reduced uncertainty in both within- and between-year estimates of population size, and allowed short-term changes in abundance trajectories to be smoothed into a long-term trend.
Characterising Passive Dosemeters for Dosimetry of Biological Experiments in Space (dobies)
NASA Astrophysics Data System (ADS)
Vanhavere, Filip; Spurny, Frantisek; Yukihara, Eduardo; Genicot, Jean-Louis
Introduction: The DOBIES (Dosimetry of biological experiments in space) project focuses on the use of a standard dosimetric method (a combination of different passive techniques) to measure accurately the absorbed doses and equivalent doses in biological samples. Dose measurements on biological samples are of high interest in the fields of radiobiology and exobiology. Radiation doses absorbed by biological samples must be quantified to determine the relationship between observed biological effects and the radiation dose. The radiation field in space is very complex, consisting of protons, neutrons, electrons and high-energy heavy charged particles. It is not straightforward to measure doses in this radiation field, certainly not with only small and light passive dosemeters. The properties of the passive detectors must be tested in radiation fields that are representative of the space radiation. We report on the characterisation of different types of passive detectors in high-energy fields. The results from such characterisation measurements will be applied to recent exposures of detectors on the International Space Station.
Material and methods: The following passive detectors are used:
• thermoluminescent detectors (TLD)
• optically stimulated luminescence detectors (OSLD)
• track etch detectors (TED)
The different groups have participated in the past in the ICCHIBAN series of irradiations, in which protons and other particles of high energy were used to determine the LET dependency of the passive detectors. In the last few months, new irradiations have been carried out at the iThemba labs (100-200 MeV protons), Dubna (145 MeV protons) and the JRC-IRMM (quasi-monoenergetic neutrons up to 19 MeV). All these detectors were also exposed to a simulated space radiation field at CERN (CERF field).
Discussion: The interpretation of the TLD and OSLD results is done using the measured LET spectrum (TED) and the LET-dependency curves of the TLDs and OSLDs. These LET-dependency curves are determined from the different irradiations listed above. We report on the results of the different detectors in these fields. Further information on the LET of the space irradiation can be deduced from the ratio of the different peaks of the TLDs after glow-curve deconvolution, and from the shape of the decay curve of the OSLDs. The results in the CERF field can, on the other hand, be used directly as a calibration for space radiation fields.
Conclusion: Combining different passive detectors leads to improved information on the radiation field, and thus to a better estimation of the absorbed dose to the biological samples. We use the characterisations at high-energy accelerators to improve the estimation of some recent space doses.
Kholmovski, Eugene G; Parker, Dennis L
2005-07-01
There is a considerable similarity between proton density-weighted (PDw) and T2-weighted (T2w) images acquired by dual echo fast spin-echo (FSE) sequences. The similarity manifests itself not only in image space as correspondence between intensities of PDw and T2w images, but also in phase space as consistency between phases of PDw and T2w images. Methods for improving the imaging efficiency and image quality of dual echo FSE sequences based on this feature have been developed. The total scan time of dual echo FSE acquisition may be reduced by as much as 25% by incorporating an estimate of the image phase from a fully sampled PDw image when reconstructing partially sampled T2w images. The quality of T2w images acquired using phased array coils may be significantly improved by using the developed noise reduction reconstruction scheme, which is based on the correspondence between the PDw and T2w image intensities and the consistency between the PDw and T2w image phases. Studies of phantom and human subject MRI data were performed to evaluate the effectiveness of the techniques.
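A toy 1D illustration of the phase-constrained idea: assuming the image phase is known (in practice it would be estimated from the fully sampled PDw image), POCS-style iterations restore a partially sampled T2w acquisition. The object, phase, and 60% sampling fraction are invented, and this is a sketch of the general principle rather than the authors' reconstruction.

```python
# Hedged sketch: phase-constrained partial-Fourier reconstruction via POCS.
import numpy as np

N = 256
x = np.linspace(-1, 1, N)
t2w = (np.abs(x) < 0.6) * np.exp(-np.abs(x))   # toy T2w magnitude
phase = np.exp(1j * 0.8 * np.pi * x)           # slowly varying phase (assumed
                                               # shared with the PDw image)
kfull = np.fft.fft(t2w * phase)
mask = np.fft.ifftshift(np.arange(N) < int(0.6 * N))   # asymmetric 60% sampling
kpart = kfull * mask

est = np.fft.ifft(kpart)
for _ in range(30):
    est = np.abs(est) * phase                  # enforce the known image phase
    k = np.fft.fft(est)
    k[mask] = kpart[mask]                      # restore the measured samples
    est = np.fft.ifft(k)

print("max reconstruction error:", float(np.max(np.abs(np.abs(est) - t2w))))
```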
NASA Astrophysics Data System (ADS)
Yasui, Takeshi
2017-08-01
Optical frequency combs are innovative tools for broadband spectroscopy because a series of comb modes can serve as frequency markers that are traceable to a microwave frequency standard. However, a mode distribution that is too discrete limits the spectral sampling interval to the mode frequency spacing, even though the individual mode linewidth is sufficiently narrow. Here, using a combination of spectral interleaving and dual-comb spectroscopy in the terahertz (THz) region, we achieved a spectral sampling interval equal to the mode linewidth rather than the mode spacing. The spectrally interleaved THz comb was realized by sweeping the laser repetition frequency and interleaving additional frequency marks. In low-pressure gas spectroscopy, we achieved an improved spectral sampling density of 2.5 MHz and an enhanced spectral accuracy of 8.39 × 10⁻⁷ in the THz region. The proposed method is a powerful tool for simultaneously achieving high resolution, high accuracy, and broad spectral coverage in THz spectroscopy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.
We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
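A deliberately simplified 2D sketch of the POF-Darts idea (not the authors' code): candidate darts are rejected inside existing protection spheres whose radii scale as |g|/L for an assumed Lipschitz bound L, and a nearest-neighbor rule stands in for the surrogate model. The limit-state function g and all constants are invented; failure (g < 0) is the unit disk, so the exact POF over [-2, 2]² is π/16.

```python
# Hedged, simplified sketch of disk-packing POF estimation.
import numpy as np

rng = np.random.default_rng(2)
g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0     # toy limit state: failure if < 0
L = 4.0                                       # assumed Lipschitz bound

pts, vals, attempts = [], [], 0
while len(pts) < 50 and attempts < 100_000:   # function-evaluation budget
    attempts += 1
    cand = rng.uniform(-2, 2, 2)
    # reject candidates inside an existing protection sphere of radius |g|/L
    if any(np.linalg.norm(cand - p) < abs(v) / L for p, v in zip(pts, vals)):
        continue
    pts.append(cand)
    vals.append(g(cand))
pts, vals = np.array(pts), np.array(vals)

# Surrogate: nearest-neighbor classification, exhaustively sampled for POF.
mc = rng.uniform(-2, 2, (50_000, 2))
nearest = np.argmin(((mc[:, None, :] - pts[None]) ** 2).sum(-1), axis=1)
pof = float(np.mean(vals[nearest] < 0.0))
print(f"estimated POF {pof:.3f} vs exact {np.pi / 16:.3f}")
```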
Adaptive Landscape Flattening Accelerates Sampling of Alchemical Space in Multisite λ Dynamics.
Hayes, Ryan L; Armacost, Kira A; Vilseck, Jonah Z; Brooks, Charles L
2017-04-20
Multisite λ dynamics (MSλD) is a powerful emerging method in free energy calculation that allows prediction of relative free energies for a large set of compounds from very few simulations. Calculating free energy differences between substituents that constitute large volume or flexibility jumps in chemical space is difficult for free energy methods in general, and for MSλD in particular, due to large free energy barriers in alchemical space. This study demonstrates that a simple biasing potential can flatten these barriers and introduces an algorithm that determines system specific biasing potential coefficients. Two sources of error, deep traps at the end points and solvent disruption by hard-core potentials, are identified. Both scale with the size of the perturbed substituent and are removed by sharp biasing potentials and a new soft-core implementation, respectively. MSλD with landscape flattening is demonstrated on two sets of molecules: derivatives of the heat shock protein 90 inhibitor geldanamycin and derivatives of benzoquinone. In the benzoquinone system, landscape flattening leads to 2 orders of magnitude improvement in transition rates between substituents and robust solvation free energies. Landscape flattening opens up new applications for MSλD by enabling larger chemical perturbations to be sampled with improved precision and accuracy.
Chaudhary, Neha; Tøndel, Kristin; Bhatnagar, Rakesh; dos Santos, Vítor A P Martins; Puchałka, Jacek
2016-03-01
Genome-Scale Metabolic Reconstructions (GSMRs), along with optimization-based methods, predominantly Flux Balance Analysis (FBA) and its derivatives, are widely applied for assessing and predicting the behavior of metabolic networks upon perturbation, thereby enabling identification of potential novel drug targets and biotechnologically relevant pathways. The abundance of alternate flux profiles has led to the evolution of methods to explore the complete solution space, aiming to increase the accuracy of predictions. Herein we present a novel, generic algorithm to characterize the entire flux space of a GSMR upon application of FBA, leading to the optimal value of the objective (the optimal flux space). Our method employs Modified Latin-Hypercube Sampling (LHS) to effectively border the optimal space, followed by Principal Component Analysis (PCA) to identify and explain the major sources of variability within it. The approach was validated with the elementary mode analysis of a smaller network of Saccharomyces cerevisiae and applied to the GSMR of Pseudomonas aeruginosa PAO1 (iMO1086). It is shown to surpass the commonly used Monte Carlo Sampling (MCS) in providing more uniform coverage for a much larger network with fewer samples. Results show that although many fluxes are identified as variable upon fixing the objective value, the majority of the variability can be reduced to several main patterns arising from a few alternative pathways. In iMO1086, the initial variability of 211 reactions could almost entirely be explained by 7 alternative pathway groups. These findings imply that the possibilities to reroute greater portions of flux may be limited within metabolic networks of bacteria. Furthermore, the optimal flux space is subject to change with environmental conditions. Our method may be a useful tool for validating the predictions made by FBA-based tools, by describing the optimal flux space associated with those predictions, and thus for improving them.
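A toy sketch of the sampling-plus-PCA idea on an invented three-reaction "network" in which two alternative pathways share a fixed optimal objective flux; the PCA then collapses the variability onto one dominant pattern. A real application would sample the optimal polytope of a GSMR with an LP solver, which is omitted here.

```python
# Hedged toy sketch: Latin-hypercube sampling of an optimal flux space + PCA.
import numpy as np

rng = np.random.default_rng(3)
n = 200

# Latin-hypercube-style draw for the free pathway flux v1 in [0, 10]:
strata = (np.arange(n) + rng.uniform(0, 1, n)) / n
v1 = 10 * rng.permutation(strata)
v2 = 10 - v1                      # v1 + v2 pinned by the fixed objective value
v3 = rng.uniform(4.9, 5.1, n)     # an essentially invariant reaction

flux = np.column_stack([v1, v2, v3])

# PCA on centered fluxes: one component explains nearly all the variability.
centered = flux - flux.mean(0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("explained variance ratios:", np.round(explained, 4))
print("dominant pattern (loadings):", np.round(vt[0], 3))
```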
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team KuuKulgur waits to begin the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
NASA Astrophysics Data System (ADS)
Castelletti, Davide; Demir, Begüm; Bruzzone, Lorenzo
2014-10-01
This paper presents a novel semisupervised learning (SSL) technique defined in the context of ɛ-insensitive support vector regression (SVR) to estimate biophysical parameters from remotely sensed images. The proposed SSL method aims to mitigate the problems of small-sized biased training sets without collecting any additional samples with reference measures. This is achieved in two consecutive steps. The first step injects additional prior information into the learning phase of the SVR in order to adapt the importance of each training sample according to the distribution of the unlabeled samples. To this end, a weight is initially associated with each training sample based on a novel strategy that assigns higher weights to samples located in high-density regions of the feature space while giving reduced weights to those that fall into low-density regions. Then, in order to exploit different weights for training samples in the learning phase of the SVR, we introduce a weighted SVR (WSVR) algorithm. The second step jointly exploits labeled and informative unlabeled samples to further improve the definition of the WSVR learning function. To this end, the most informative unlabeled samples, whose target values are expected to be accurate, are initially selected according to a novel strategy that relies on the distribution of the unlabeled samples in the feature space and on the WSVR function estimated in the first step. Then, we introduce a restructured WSVR algorithm that jointly uses labeled and unlabeled samples in the learning phase and tunes their importance by different values of regularization parameters. Experimental results obtained for the estimation of single-tree stem volume show the effectiveness of the proposed SSL method.
NASA Astrophysics Data System (ADS)
Wei, Pei; Wei, Zhengying; Chen, Zhen; Du, Jun; He, Yuyang; Li, Junfeng; Zhou, Yatong
2017-06-01
The densification behavior and attendant microstructural characteristics of selective laser melting (SLM)-processed AlSi10Mg alloy, as affected by the processing parameters, were systematically investigated. Single-track samples were produced by SLM to study the influence of laser power and scanning speed on the surface morphologies of scan tracks. Additionally, bulk samples were produced to investigate the influence of laser power, scanning speed, and hatch spacing on the densification level and the resultant microstructure. The experimental results showed that the porosity of the SLM-processed samples was significantly governed by the energy density of the laser beam and the hatch spacing. The tensile properties of SLM-processed samples and the attendant fracture surface can be enhanced by decreasing the porosity. The microstructure of SLM-processed samples consists of a supersaturated Al-rich cellular structure with eutectic Al/Si situated at the cellular boundaries. The Si content at the cellular boundaries increases with increasing laser power and decreasing scanning speed. The hardness of SLM-processed samples was significantly improved by this fine microstructure compared with cast samples. Moreover, the hardness of SLM-processed samples at overlaps was lower than that observed at track cores.
2003-01-22
Still photographs taken over 16 hours on Nov. 13, 2001, on the International Space Station have been condensed into a few seconds to show the de-mixing -- or phase separation -- process studied by the Experiment on Physics of Colloids in Space. Commanded from the ground, dozens of similar tests have been conducted since the experiment arrived on the ISS in 2000. The sample is a mix of polymethylmethacrylate (PMMA or acrylic) colloids, polystyrene polymers and solvents. The circular area is 2 cm (0.8 in.) in diameter. The phase separation process occurs spontaneously after the sample is mechanically mixed. The evolving lighter regions are rich in colloid and have the structure of a liquid. The dark regions are poor in colloids and have the structure of a gas. This behavior cannot be observed on Earth because gravity causes the particles to fall out of solution faster than the phase separation can occur. While similar to a gas-liquid phase transition, the growth rate observed in this test is different from any atomic gas-liquid or liquid-liquid phase transition ever measured experimentally. Ultimately, the sample separates into colloid-poor and colloid-rich areas, just as oil and vinegar separate. The fundamental science of de-mixing in this colloid-polymer sample is the same as that found in the annealing of metal alloys and plastic polymer blends. Improving the understanding of this process may lead to improved processing of these materials on Earth.
Recipes for free energy calculations in biomolecular systems.
Moradi, Mahmoud; Babin, Volodymyr; Sagui, Celeste; Roland, Christopher
2013-01-01
During the last decade, several methods for sampling phase space and calculating various free energies in biomolecular systems have been devised or refined for molecular dynamics (MD) simulations. Thus, state-of-the-art methodology and the ever increasing computer power allow calculations that were forbidden a decade ago. These calculations, however, are not trivial as they require knowledge of the methods, insight into the system under study, and, quite often, an artful combination of different methodologies in order to avoid the various traps inherent in an unknown free energy landscape. In this chapter, we illustrate some of these concepts with two relatively simple systems, a sugar ring and proline oligopeptides, whose free energy landscapes still offer considerable challenges. In order to explore the configurational space of these systems, and to surmount the various free energy barriers, we combine three complementary methods: a nonequilibrium umbrella sampling method (adaptively biased MD, or ABMD), replica-exchange molecular dynamics (REMD), and steered molecular dynamics (SMD). In particular, ABMD is used to compute the free energy surface of a set of collective variables; REMD is used to improve the performance of ABMD, to carry out sampling in space complementary to the collective variables, and to sample equilibrium configurations directly; and SMD is used to study different transition mechanisms.
Improvement of Predictive Ability by Uniform Coverage of the Target Genetic Space
Bustos-Korts, Daniela; Malosetti, Marcos; Chapman, Scott; Biddulph, Ben; van Eeuwijk, Fred
2016-01-01
Genome-enabled prediction provides breeders with the means to increase the number of genotypes that can be evaluated for selection. One of the major challenges in genome-enabled prediction is how to construct a training set of genotypes from a calibration set that represents the target population of genotypes, where the calibration set is composed of a training and validation set. A random sampling protocol of genotypes from the calibration set will lead to low quality coverage of the total genetic space by the training set when the calibration set contains population structure. As a consequence, predictive ability will be affected negatively, because some parts of the genotypic diversity in the target population will be under-represented in the training set, whereas other parts will be over-represented. Therefore, we propose a training set construction method that uniformly samples the genetic space spanned by the target population of genotypes, thereby increasing predictive ability. To evaluate our method, we constructed training sets alongside with the identification of corresponding genomic prediction models for four genotype panels that differed in the amount of population structure they contained (maize Flint, maize Dent, wheat, and rice). Training sets were constructed using uniform sampling, stratified-uniform sampling, stratified sampling and random sampling. We compared these methods with a method that maximizes the generalized coefficient of determination (CD). Several training set sizes were considered. We investigated four genomic prediction models: multi-locus QTL models, GBLUP models, combinations of QTL and GBLUPs, and Reproducing Kernel Hilbert Space (RKHS) models. For the maize and wheat panels, construction of the training set under uniform sampling led to a larger predictive ability than under stratified and random sampling. The results of our methods were similar to those of the CD method. For the rice panel, all training set construction methods led to similar predictive ability, a reflection of the very strong population structure in this panel. PMID:27672112
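One simple way (not necessarily the authors' algorithm) to realize uniform coverage of a genetic space is greedy farthest-point selection in a PCA projection of the marker matrix. The sketch below uses a synthetic marker matrix; the number of PCs and training-set size are arbitrary.

```python
# Hedged sketch: farthest-point sampling for uniform training-set coverage.
import numpy as np

rng = np.random.default_rng(4)
markers = rng.integers(0, 3, (500, 1000)).astype(float)   # 500 genotypes, 0/1/2

# Project genotypes onto a low-dimensional genetic space via PCA.
centered = markers - markers.mean(0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
space = centered @ vt[:5].T                                # first 5 PCs

def farthest_point_sample(x, k, seed=0):
    """Greedily pick k rows of x that spread uniformly over its span."""
    chosen = [seed]
    d = np.linalg.norm(x - x[seed], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))            # point farthest from the chosen set
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(x - x[nxt], axis=1))
    return np.array(chosen)

train = farthest_point_sample(space, 100)
print("training set size:", train.size)
```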
Chodera, John D; Shirts, Michael R
2011-11-21
The widespread popularity of replica exchange and expanded ensemble algorithms for simulating complex molecular systems in chemistry and biophysics has generated much interest in discovering new ways to enhance the phase space mixing of these protocols in order to improve sampling of uncorrelated configurations. Here, we demonstrate how both of these classes of algorithms can be considered as special cases of Gibbs sampling within a Markov chain Monte Carlo framework. Gibbs sampling is a well-studied scheme in the field of statistical inference in which different random variables are alternately updated from conditional distributions. While the update of the conformational degrees of freedom by Metropolis Monte Carlo or molecular dynamics unavoidably generates correlated samples, we show how judicious updating of the thermodynamic state indices--corresponding to thermodynamic parameters such as temperature or alchemical coupling variables--can substantially increase mixing while still sampling from the desired distributions. We show how state update methods in common use can lead to suboptimal mixing, and present some simple, inexpensive alternatives that can increase mixing of the overall Markov chain, reducing simulation times necessary to obtain estimates of the desired precision. These improved schemes are demonstrated for several common applications, including an alchemical expanded ensemble simulation, parallel tempering, and multidimensional replica exchange umbrella sampling.
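A minimal sketch of the central idea: redraw the thermodynamic state index from its full conditional p(k | x) ∝ exp(-βk E(x) + gk) rather than only proposing neighbor swaps. The harmonic toy dynamics and uniform log-weights below are invented for illustration.

```python
# Hedged sketch: Gibbs-style state update in an expanded ensemble simulation.
import numpy as np

rng = np.random.default_rng(5)
betas = 1.0 / np.linspace(1.0, 2.0, 8)   # inverse temperatures of the states
logw = np.zeros(8)                        # expanded-ensemble log weights g_k

def sample_state(energy):
    """Sample a state index from p(k | x) ∝ exp(-beta_k * E(x) + g_k)."""
    logp = -betas * energy + logw
    p = np.exp(logp - logp.max())
    p /= p.sum()
    return int(rng.choice(len(betas), p=p))  # independence ("Gibbs") update

# Toy trajectory: x relaxes in a harmonic well (E = x^2 / 2) at state k.
x, k = 0.0, 0
for _ in range(1000):
    x = 0.9 * x + rng.normal(0, np.sqrt(0.1 / betas[k]))  # crude thermal move
    k = sample_state(0.5 * x * x)                         # update state index
print("final thermodynamic state index:", k)
```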
Nonequilibrium umbrella sampling in spaces of many order parameters
NASA Astrophysics Data System (ADS)
Dickson, Alex; Warmflash, Aryeh; Dinner, Aaron R.
2009-02-01
We recently introduced an umbrella sampling method for obtaining nonequilibrium steady-state probability distributions projected onto an arbitrary number of coordinates that characterize a system (order parameters) [A. Warmflash, P. Bhimalapuram, and A. R. Dinner, J. Chem. Phys. 127, 154112 (2007)]. Here, we show how our algorithm can be combined with the image update procedure from the finite-temperature string method for reversible processes [E. Vanden-Eijnden and M. Venturoli, "Revisiting the finite temperature string method for calculation of reaction tubes and free energies," J. Chem. Phys. (in press)] to enable restricted sampling of a nonequilibrium steady state in the vicinity of a path in a many-dimensional space of order parameters. For the study of transitions between stable states, the adapted algorithm results in improved scaling with the number of order parameters and the ability to progressively refine the regions of enforced sampling. We demonstrate the algorithm by applying it to a two-dimensional model of driven Brownian motion and a coarse-grained (Ising) model for nucleation under shear. It is found that the choice of order parameters can significantly affect the convergence of the simulation; local magnetization variables other than those used previously for sampling transition paths in Ising systems are needed to ensure that the reactive flux is primarily contained within a tube in the space of order parameters. The relation of this method to other algorithms that sample the statistics of path ensembles is discussed.
NASA Technical Reports Server (NTRS)
Fishman, Julianna L.; Mudgett, Paul D.; Packham, Nigel J.; Schultz, John R.; Straub, John E., II
2005-01-01
On August 9, 2003, NASA, with the cooperative support of the Vehicle Office of the International Space Station Program, the Advanced Human Support Technology Program, and the Johnson Space Center Habitability and Environmental Factors Office released a Request for Information, or RFI, to identify next-generation environmental monitoring systems that have demonstrated ability or the potential to meet defined requirements for monitoring air and water quality onboard the International Space Station. This report summarizes the review and analysis of the proposed solutions submitted to meet the water quality monitoring requirements. Proposals were to improve upon the functionality of the existing Space Station Total Organic Carbon Analyzer (TOCA) and monitor additional contaminants in water samples. The TOCA is responsible for in-flight measurement of total organic carbon, total inorganic carbon, total carbon, pH, and conductivity in the Space Station potable water supplies. The current TOCA requires hazardous reagents to accomplish the carbon analyses. NASA is using the request for information process to investigate new technologies that may improve upon existing capabilities, as well as reduce or eliminate the need for hazardous reagents. Ideally, a replacement for the TOCA would be deployed in conjunction with the delivery of the Node 3 water recovery system currently scheduled for November 2007.
Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.
2017-01-01
Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
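A sketch of the sample-then-classify idea: 1D k-means (Lloyd's algorithm) computed on a 2,000-point sample stands in for an optimal classifier such as Fisher-Jenks, and the resulting breaks are applied to the full attribute vector. The bimodal data and all sizes are synthetic.

```python
# Hedged sketch: sampling-based choropleth classification.
import numpy as np

rng = np.random.default_rng(6)
values = np.concatenate([rng.normal(10, 2, 500_000), rng.normal(40, 5, 500_000)])

sample = rng.choice(values, 2_000, replace=False)

# 1D k-means on the sample to find k class centers (Fisher-Jenks stand-in).
k = 5
centers = np.quantile(sample, np.linspace(0.1, 0.9, k))
for _ in range(50):
    labels = np.argmin(np.abs(sample[:, None] - centers[None, :]), axis=1)
    centers = np.array([sample[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])

# Breaks are midpoints between sorted centers; apply them to the full data.
centers.sort()
breaks = (centers[:-1] + centers[1:]) / 2
classes = np.searchsorted(breaks, values)
print("class counts:", np.bincount(classes, minlength=k))
```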
Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness
NASA Astrophysics Data System (ADS)
Hardy, Tyler J.; Cain, Stephen C.
2016-05-01
The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has previously been developed to mitigate the effects of spatial aliasing by testing potential Resident Space Objects (RSOs) against several sub-pixel-shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternate-hypothesis prior probabilities. These probabilities are then implemented in an MHT algorithm, and the algorithm is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
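A toy 1D version of the matched-filter MHT: hypotheses are pixel-integrated PSFs at a grid of sub-pixel shifts, scored by prior-weighted matched-filter statistics. The PSF width, shift grid, noise level, and uniform priors are all invented stand-ins, not values from the paper.

```python
# Hedged sketch: multiple-hypothesis point-source test on an aliased detector.
import numpy as np

rng = np.random.default_rng(7)

def binned_psf(shift, width=1.2, n=9, factor=4):
    """Gaussian PSF sampled on a fine grid, then integrated onto n pixels."""
    fine = np.arange(n * factor) / factor - n / 2
    psf = np.exp(-0.5 * ((fine - shift) / width) ** 2)
    return psf.reshape(n, factor).sum(1)

shifts = np.linspace(-0.5, 0.5, 5)                # sub-pixel hypothesis grid
templates = np.array([binned_psf(s) for s in shifts])
templates /= np.linalg.norm(templates, axis=1, keepdims=True)
priors = np.full(len(shifts), 1 / len(shifts))    # could be informed instead

truth = binned_psf(0.3)                           # true sub-pixel shift: 0.3
data = 0.8 * truth + rng.normal(0, 0.1, truth.size)

scores = templates @ data                         # matched-filter statistics
log_post = np.log(priors) + scores**2 / (2 * 0.1**2)
best = int(np.argmax(log_post))
print(f"detected shift hypothesis: {shifts[best]:+.2f} pixels")
```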
Richardson, Elizabeth A; Shortt, Niamh K; Mitchell, Richard; Pearce, Jamie
2018-02-01
Birthweight is an important determinant of health across the life course. Maternal exposure to natural space has been linked to higher birthweight, but stronger evidence of a causal link is needed. We use a quasi-experimental sibling study design to investigate if change in the mother's exposure to natural space between births was related to birthweight, in urban Scotland. Amount (% area) of total natural space, total accessible (public) natural space, parks, woodlands and open water within 100 m of the mother's postcode was calculated for eligible births (n = 40 194; 1991-2010) in the Scottish Longitudinal Study (a semi-random 5.3% sample of the Scottish population). Associations between natural space and birthweight were estimated, using ordinary least squares and fixed effects models. Birthweight was associated with the total amount of natural space around the mother's home (+8.2 g for interquartile range increase), but was unrelated to specific types of natural space. This whole-sample relationship disappeared in the sibling analysis, indicating residual confounding. The sibling models showed effects for total natural space with births to women who already had children (+20.1 g), and to those with an intermediate level of education (+14.1 g). The importance of total natural space for birthweight suggests that benefits can be experienced near to as well as within natural space. Ensuring expectant mothers have good access to high quality neighbourhood natural space has the potential to improve the infant's start in life, and consequently their health trajectory over the life course. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association
The effect of solidification rate on the formability of nickel aluminide containing iron and boron
NASA Technical Reports Server (NTRS)
Carro, G.; Flanagan, W. F.
1987-01-01
Following reports that rapid solidification improves the ductility of some nickel aluminides, an investigation has been conducted of the possibility of additional improvement in a nickel aluminide containing both Fe and B. Free fall-solidified and free fall/splat-quenched samples similar to those producible under microgravity conditions in space were prepared, and their microstructure was characterized. Attention is given to the preliminary results of tests quantitatively measuring mechanical properties.
Space Weathering Rates in Lunar and Itokawa Samples
NASA Technical Reports Server (NTRS)
Keller, L. P.; Berger, E. L.
2017-01-01
Space weathering alters the chemistry, microstructure, and spectral properties of grains on the surfaces of airless bodies by two major processes: micrometeorite impacts and solar wind interactions. Investigating the nature of space weathering processes both in returned samples and in remote sensing observations provides information fundamental to understanding the evolution of airless body regoliths, improving our ability to determine the surface composition of asteroids, and linking meteorites to specific asteroidal parent bodies. Despite decades of research into space weathering processes and their effects, we still know very little about weathering rates. For example, what is the timescale to alter the reflectance spectrum of an ordinary chondrite meteorite so that it resembles the overall spectral shape and slope of an S-type asteroid? One approach to answering this question has been to determine ages of asteroid families by dynamical modeling and to determine the spectral properties of the daughter fragments. However, large differences exist between inferred space weathering rates and timescales derived from laboratory experiments, analysis of asteroid family spectra, and the space weathering styles; estimated timescales range from 5000 years up to 10(exp 8) years. Vernazza et al. concluded that solar wind interactions dominate asteroid space weathering on rapid timescales of 10(exp 4)-10(exp 6) years. Shestopalov et al. suggested that impact-gardening of regolith particles and asteroid resurfacing counteract the rapid progress of solar wind optical maturation of asteroid surfaces and proposed a space weathering timescale of 10(exp 5)-10(exp 6) years.
Simplified Abrasion Test Methodology for Candidate EVA Glove Lay-Ups
NASA Technical Reports Server (NTRS)
Rabel, Emily; Aitchison, Lindsay
2015-01-01
During the Apollo Program, space suit outer-layer fabrics were badly abraded after performing just a few extravehicular activities (EVAs). For example, the Apollo 12 commander reported abrasive wear on the boots that penetrated the outer-layer fabric into the thermal protection layers after less than 8 hrs of surface operations. Current plans for exploration planetary space suits require the suits to support hundreds of hours of EVA on a lunar or Martian surface, creating a challenge for space suit designers to utilize materials advances made over the last 40 years and improve on the space suit fabrics used in the Apollo Program. Over the past 25 years, the NASA Johnson Space Center Crew and Thermal Systems Division has focused on tumble testing as a means of simulating wear on the outer layer of the space suit fabric. Most recently, in 2009, testing was performed on 4 different candidate outer layers to gather baseline data for future use in the design of planetary space suit outer layers. In support of the High Performance EVA Glove Element of the Next Generation Life Support Project, testing was recently attempted in a new configuration that requires only 10% of the fabric per replicate that was needed in 2009. The smaller fabric samples allowed for a reduced per-sample cost and the flexibility to test small samples from manufacturers without the overhead of a completed production run. Data collected from this iteration were compared to those taken in 2009 to validate the new test method. In addition, the method evaluated the fabrics and fabric lay-ups used in a prototype thermal micrometeoroid garment (TMG) developed for EVA gloves under the NASA High Performance EVA Glove Project. This paper provides a review of previous abrasion studies on space suit fabrics, details the methodology used for abrasion testing in this particular study, and presents the results of the validation study and of the TMG testing.
POF-Darts: Geometric adaptive sampling for probability of failure
Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.; ...
2016-06-18
We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
NASA Astrophysics Data System (ADS)
Dafu, Shen; Leihong, Zhang; Dong, Liang; Bei, Li; Yi, Kang
2017-07-01
The purpose of this study is to improve reconstruction precision and more faithfully reproduce the surface color of spectral images. A new spectral-reflectance reconstruction algorithm based on iterative thresholding combined with a weighted principal component space is presented, in which the principal components weighted by visual features serve as the sparse basis. Different numbers of color cards are selected as training samples, a multispectral image is the testing sample, and the color differences of the reconstructions are compared. The channel response values are obtained with a Mega Vision high-accuracy, multi-channel imaging system. The results show that spectral reconstruction based on the weighted principal component space outperforms that based on the traditional principal component space: the color difference obtained using the compressive-sensing algorithm with weighted principal component analysis is smaller than that obtained with traditional principal component analysis, and the reconstructed color is more consistent with human vision.
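A sketch of iterative soft thresholding (ISTA) in a weighted-PCA basis, in the spirit of the algorithm above; the camera matrix, training reflectances, basis size, weights, and threshold are random or arbitrary stand-ins for measured quantities.

```python
# Hedged sketch: spectral-reflectance reconstruction by ISTA in a weighted-PCA basis.
import numpy as np

rng = np.random.default_rng(8)
n_bands, n_ch = 31, 6

train = np.clip(rng.normal(0.5, 0.2, (200, n_bands)), 0, 1)   # toy reflectances
w = np.exp(-(((np.arange(n_bands) - 15) / 10.0) ** 2))        # visual weights

# Weighted PCA: principal components of weight-scaled training residuals.
mean = train.mean(0)
_, _, vt = np.linalg.svd((train - mean) * w, full_matrices=False)
B = vt[:10].T                                                 # sparse basis

M = np.abs(rng.normal(size=(n_ch, n_bands)))                  # channel sensitivities
r_true = train[0]
c = M @ r_true                                                # measured responses

# ISTA: minimize ||y - A a||^2 + lam * ||a||_1 with r ≈ mean + B a.
A = M @ B
y = c - M @ mean
a = np.zeros(10)
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(500):
    a = a + step * (A.T @ (y - A @ a))                        # gradient step
    a = np.sign(a) * np.maximum(np.abs(a) - 1e-4 * step, 0.0)  # soft threshold

r_hat = mean + B @ a
print("spectral RMSE:", float(np.sqrt(np.mean((r_hat - r_true) ** 2))))
```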
Space Shuttle Corrosion Protection Performance
NASA Technical Reports Server (NTRS)
Curtis, Cris E.
2007-01-01
The reusable manned Space Shuttle has been flying into space and returning to Earth for more than 25 years. The launch pad environment can be corrosive to metallic substrates, and the Space Shuttles are exposed to this environment when preparing for launch. The Orbiter has been in service well past its design life of 10 years or 100 missions. As part of the aging-vehicle assessment, one question under evaluation is how the thermal protection system and aging protective coatings are performing to ensure structural integrity. This assessment costs resources and time. The information is invaluable when minimizing risk to the safety of the Astronauts and Vehicle. This paper outlines a strategic sampling plan and some operational improvements made by the Orbiter Structures team and the Corrosion Control Review Board.
Space processing applications rocket project. SPAR 8
NASA Technical Reports Server (NTRS)
Chassay, R. P. (Editor)
1984-01-01
The Space Processing Applications Rocket Project (SPAR) VIII Final Report contains the engineering report prepared at the Marshall Space Flight Center (MSFC) as well as the three reports from the principal investigators. These reports also describe pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication, and testing, all of which are expected to contribute immeasurably to an improved comprehension of materials processing in space. This technical memorandum is directed entirely to the payload manifest flown in the eighth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled Glass Formation Experiment SPAR 74-42/1R, Glass Fining Experiment in Low-Gravity SPAR 77-13/1, and Dynamics of Liquid Bubbles SPAR Experiment 77-18/2.
1998-10-01
Research with plants in microgravity offers many exciting opportunities to gain new insights and could improve products on Earth ranging from crop production to fragrances and food flavorings. The ASTROCULTURE facility is a lead commercial facility for plant growth and plant research in microgravity and was developed by the Wisconsin Center for Space Automation and Robotics (WCSAR), a NASA Commercial Space Center. On STS-95 it will support research that could help improve crop development leading to plants that are more disease resistant or have a higher yield and provide data on the production of plant essential oils---oils that contain the essence of the plant and provide both fragrance and flavoring. On STS-95, a flowering plant will be grown in ASTROCULTURE and samples taken using a method developed by the industry partner for this investigation. On Earth, the samples will be analyzed by gas chromatography/mass spectrometry and the data used to evaluate both the production of fragrant oils in microgravity and the development of one or more products. The ASTROCULTURE payload uses porous tubes with precise pressure sensing and control for fluid delivery to the plant root tray.
NASA Astrophysics Data System (ADS)
Wang, Dandan; Zhao, Gong-Bo; Wang, Yuting; Percival, Will J.; Ruggeri, Rossana; Zhu, Fangzhou; Tojeiro, Rita; Myers, Adam D.; Chuang, Chia-Hsun; Baumgarten, Falk; Zhao, Cheng; Gil-Marín, Héctor; Ross, Ashley J.; Burtin, Etienne; Zarrouk, Pauline; Bautista, Julian; Brinkmann, Jonathan; Dawson, Kyle; Brownstein, Joel R.; de la Macorra, Axel; Schneider, Donald P.; Shafieloo, Arman
2018-06-01
We present a measurement of the anisotropic and isotropic Baryon Acoustic Oscillations (BAO) from the extended Baryon Oscillation Spectroscopic Survey Data Release 14 quasar sample with optimal redshift weights. Applying the redshift weights improves the constraint on the BAO dilation parameter α(zeff) by 17 per cent. We reconstruct the evolution history of the BAO distance indicators in the redshift range of 0.8 < z < 2.2. This paper is part of a set that analyses the eBOSS DR14 quasar sample.
NASA Launches CubeSat to Study Bacteria in Space
2017-11-08
Ever wonder what would happen if you got sick in space? NASA is sending samples of bacteria into low-Earth orbit to find out. One of the latest small satellite missions from NASA’s Ames Research Center in California’s Silicon Valley is the E. coli Anti-Microbial Satellite, or EcAMSat for short. This CubeSat – a spacecraft the size of a shoebox built from cube-shaped units – will explore how effectively antibiotics can combat E. coli bacteria in the low gravity of space. This information will help us improve how we fight infections, providing safer journeys for astronauts on their future voyages, and offer benefits for medicine here on Earth.
A BRDF statistical model applying to space target materials modeling
NASA Astrophysics Data System (ADS)
Liu, Chenghao; Li, Zhi; Xu, Can; Tian, Qichen
2017-10-01
To address the poor performance of the five-parameter semi-empirical model in fitting densely measured BRDF data, a refined statistical BRDF model suitable for modeling multiple classes of space-target materials is proposed. The refined model improves on the Torrance-Sparrow model while retaining the modeling advantages of the five-parameter model. Compared with the existing empirical model, it contains six simple parameters, which can approximate the roughness distribution of the material surface, the intensity of the Fresnel reflectance phenomenon, and the attenuation of the reflected brightness as the azimuth angle changes. The model achieves fast parameter inversion with no extra loss of accuracy. A genetic algorithm was used to invert the parameters of 11 samples of materials commonly used on space targets; the fitting errors for all materials were below 6%, much lower than those of the five-parameter model. The refined model is further verified by comparing the fitting results for three samples at different incident zenith angles at a 0° azimuth angle. Finally, three-dimensional visualizations of these samples over the upper hemisphere are given, clearly showing the strength of the optical scattering of the different materials and demonstrating the refined model's ability to characterize materials.
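A sketch of the fitting workflow using a classical Torrance-Sparrow-style BRDF as a stand-in (the paper's refined six-parameter form is not reproduced here) and SciPy's differential evolution as the genetic-style global optimizer; the geometry, true parameters, and noise are synthetic.

```python
# Hedged sketch: global-optimizer fit of a Torrance-Sparrow-style BRDF.
import numpy as np
from scipy.optimize import differential_evolution

def brdf(theta_i, theta_r, params):
    kd, ks, m, f0, k_att = params              # diffuse, specular, roughness,
    alpha = (theta_i + theta_r) / 2            # Fresnel, attenuation (toy set)
    d = np.exp(-(np.tan(alpha) / m) ** 2) / (np.cos(alpha) ** 4 * m**2)
    fresnel = f0 + (1 - f0) * (1 - np.cos(theta_i)) ** 5
    spec = ks * d * fresnel / (4 * np.cos(theta_i) * np.cos(theta_r) + 1e-9)
    return kd / np.pi + k_att * spec

rng = np.random.default_rng(9)
theta_i = np.full(200, np.deg2rad(30.0))        # fixed incident zenith angle
theta_r = rng.uniform(0, np.deg2rad(80), 200)   # measured reflection angles
true = (0.1, 0.9, 0.25, 0.05, 1.0)
meas = brdf(theta_i, theta_r, true) * (1 + rng.normal(0, 0.02, 200))

loss = lambda p: np.mean((brdf(theta_i, theta_r, p) - meas) ** 2)
bounds = [(0, 1), (0, 2), (0.05, 1), (0, 1), (0.1, 2)]
fit = differential_evolution(loss, bounds, seed=0, tol=1e-8)
rel_err = np.mean(np.abs(brdf(theta_i, theta_r, fit.x) - meas) / meas.max())
print(f"mean relative fitting error: {100 * rel_err:.2f}%")
```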
Towards a Radiation Hardened Fluxgate Magnetometer for Space Physics Applications
NASA Astrophysics Data System (ADS)
Miles, David M.
Space-based measurements of the Earth's magnetic field are required to understand the plasma processes of the solar-terrestrial connection which energize the Van Allen radiation belts and cause space weather. This thesis describes a fluxgate magnetometer payload developed for the Canadian Space Agency's proposed Outer Radiation Belt Injection, Transport, Acceleration and Loss Satellite (ORBITALS) mission. The instrument can resolve 8 pT on a 65,000 nT field at 900 samples per second with a magnetic noise of less than 10 pT per square-root Hertz at 1 Hz. The design can be manufactured from radiation-tolerant (100 krad) space-grade parts. A novel combination of analog temperature compensation and digital feedback simplifies and miniaturises the instrument while improving the measurement bandwidth and resolution. The prototype instrument was successfully validated at the Natural Resources Canada Geomagnetics Laboratory, and is being considered for future ground, satellite, and sounding rocket applications.
Decades of Data: Extracting Trends from Microgravity Crystallization History
NASA Technical Reports Server (NTRS)
Judge, R. A.; Snell, E. H.; Kephart, R.; vanderWoerd, M.
2004-01-01
The reduced acceleration environment of an orbiting spacecraft has been proposed as an ideal environment for biological crystal growth. Since the first sounding rocket flight in 1981, many crystallization experiments have flown, with some showing improvement and others not. To further explore macromolecule crystal improvement in microgravity, we have accumulated data from published reports and submitted reports covering 63 missions, including the Space Shuttle program, unmanned satellites, the Russian Space Station MIR, and sounding rocket experiments. While this is not yet a comprehensive record of all flight crystallization experimental results, there is sufficient information for emerging trends to be identified. In this study, the effects of the acceleration environment, the techniques of crystallization, sample molecular weight, and the response of individual macromolecules to microgravity crystallization are investigated.
Low illumination color image enhancement based on improved Retinex
NASA Astrophysics Data System (ADS)
Liao, Shujing; Piao, Yan; Li, Bing
2017-11-01
A low-illumination color image usually has low brightness, low contrast, blurred detail, and heavy salt-and-pepper noise, which greatly hamper subsequent image recognition and information extraction. In view of this degradation of night images, an improved version of the traditional Retinex algorithm is proposed. The specific approach is as follows. First, the original low-illumination RGB image is converted to the YUV color space (Y represents brightness; U and V represent color), and the background illumination is estimated from the Y component with a sampling-accelerated guided filter. Then, the reflection component is calculated with the classical Retinex formula, and the brightness-enhancement ratio between the original and enhanced images is computed. Finally, the image is converted back from YUV to RGB, and feedback enhancement of the U and V color components is carried out.
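A sketch of this pipeline with a plain Gaussian blur standing in for the sampling-accelerated guided filter; the illumination-compression exponent, blur width, and synthetic dark image are invented values, not the paper's settings.

```python
# Hedged sketch: Retinex-style low-light enhancement in YUV space.
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance(rgb, eps=1e-3):
    rgb = rgb.astype(float) / 255.0
    # RGB -> YUV (BT.601): Y carries brightness, U/V carry color.
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    u = -0.147 * rgb[..., 0] - 0.289 * rgb[..., 1] + 0.436 * rgb[..., 2]
    v = 0.615 * rgb[..., 0] - 0.515 * rgb[..., 1] - 0.100 * rgb[..., 2]

    illum = np.maximum(gaussian_filter(y, 15), eps)   # illumination estimate
    reflect = y / illum                               # Retinex reflectance
    y_new = np.clip(reflect * illum ** 0.4, 0, 1)     # compress illumination
    gain = (y_new + eps) / (y + eps)                  # brightness-enhancement ratio

    u, v = u * gain, v * gain                         # feed the ratio back to color
    r = y_new + 1.140 * v
    g = y_new - 0.395 * u - 0.581 * v
    b = y_new + 2.032 * u
    return (np.clip(np.stack([r, g, b], -1), 0, 1) * 255).astype(np.uint8)

dark = np.random.default_rng(10).uniform(0, 60, (64, 64, 3)).astype(np.uint8)
print("mean before/after:", dark.mean(), enhance(dark).mean())
```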
NASA Technical Reports Server (NTRS)
Goyet, Catherine; Davis, Daniel; Peltzer, Edward T.; Brewer, Peter G.
1995-01-01
Large-scale ocean observing programs such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE) must face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). We have recently acquired data on ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. We use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep-ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (+/- 4 micromol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating gridded TCO2 fields needed to constrain geochemical models.
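A sketch of the redundancy test on a synthetic profile: decimate, reconstruct by linear interpolation, and flag spacings whose maximum error exceeds the +/- 4 micromol/kg accuracy. The profile shape and depth grid are invented, not the cruise data.

```python
# Hedged sketch: interpolation-based test for sampling redundancy.
import numpy as np

depth = np.arange(1000, 5001, 50.0)                   # "oversampled" depths (m)
tco2 = 2250 + 40 * np.exp(-(depth - 1000) / 400.0)    # smooth synthetic profile

for keep_every in (2, 4, 8, 16):
    sub_d, sub_c = depth[::keep_every], tco2[::keep_every]
    interp = np.interp(depth, sub_d, sub_c)           # linear reconstruction
    err = np.max(np.abs(interp - tco2))
    flag = "OK" if err <= 4.0 else "undersampled"
    print(f"every {keep_every:2d}th sample: max error {err:5.2f} umol/kg  {flag}")
```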
A rapid and robust gradient measurement technique using dynamic single-point imaging.
Jang, Hyungseok; McMillan, Alan B
2017-09-01
We propose a new gradient measurement technique based on dynamic single-point imaging (SPI), which allows simple, rapid, and robust measurement of the k-space trajectory. To enable gradient measurement, we utilize the variable field-of-view (FOV) property of dynamic SPI, which is dependent on gradient shape. First, one-dimensional (1D) dynamic SPI data are acquired from a targeted gradient axis, and then relative FOV scaling factors between 1D images or k-spaces at varying encoding times are found. These relative scaling factors are the relative k-space positions that can be used for image reconstruction. The gradient measurement technique can also be used to estimate the gradient impulse response function for reproducible gradient estimation as a linear time-invariant system. The proposed measurement technique was used to improve reconstructed image quality in 3D ultrashort echo, 2D spiral, and multi-echo bipolar gradient-echo imaging. In multi-echo bipolar gradient-echo imaging, measurement of the k-space trajectory allowed the use of a ramp-sampled trajectory for improved acquisition speed (approximately 30%) and more accurate quantitative fat and water separation in a phantom. The proposed dynamic SPI-based method allows fast k-space trajectory measurement with a simple implementation and no additional hardware for improved image quality. Magn Reson Med 78:950-962, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
POCS-enhanced correction of motion artifacts in parallel MRI.
Samsonov, Alexey A; Velikina, Julia; Jung, Youngkyoo; Kholmovski, Eugene G; Johnson, Chris R; Block, Walter F
2010-04-01
A new method for correction of MRI motion artifacts induced by corrupted k-space data, acquired by multiple receiver coils such as phased arrays, is presented. In our approach, a projections onto convex sets (POCS)-based method for reconstruction of sensitivity encoded MRI data (POCSENSE) is employed to identify corrupted k-space samples. After the erroneous data are discarded from the dataset, artifact-free images are restored from the remaining data using coil sensitivity profiles. The error detection and data restoration are based on the informational redundancy of phased-array data and may be applied to full and reduced datasets. An important advantage of the new POCS-based method is that, in addition to multicoil data redundancy, it can use a priori known properties of the imaged object for improved artifact correction. The use of such information was shown to significantly improve k-space error detection and image artifact correction. The method was validated on data corrupted by simulated and real motion, such as head motion and pulsatile flow.
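For illustration only, a generic POCS restoration loop (POCSENSE itself additionally projects onto coil-sensitivity consistency, which is omitted here); corrupted k-space samples are assumed to have been discarded already, leaving a boolean mask of trusted data:

```python
import numpy as np

def pocs_restore(kspace, trusted, n_iter=50):
    """Alternate two convex projections: (1) enforce an image-domain
    property of the object (here: realness), (2) re-impose the trusted
    k-space measurements. `trusted` is a boolean mask of good samples."""
    k = np.where(trusted, kspace, 0.0)
    for _ in range(n_iter):
        img = np.fft.ifft2(k).real           # project onto real-valued images
        k = np.fft.fft2(img)
        k[trusted] = kspace[trusted]         # project onto measured data
    return np.fft.ifft2(k)
```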
NASA Technical Reports Server (NTRS)
Vaughan, William W.; Anderson, B. Jeffrey
2005-01-01
In modern government and aerospace industry institutions, the necessity of controlling current-year costs often leads to high mobility in the technical workforce, "one-deep" technical capabilities, and minimal mentoring for young engineers. Thus, formal recording, use, and teaching of lessons learned are especially important in the maintenance and improvement of current knowledge and the development of new technologies, regardless of the discipline. The NASA Technical Standards Program Website http://standards.nasa.gov contains a menu item entitled "Lessons Learned/Best Practices" with links to data sets for a large number of engineering and technical disciplines, containing a wealth of lessons-learned information based on past experience. This paper has provided a small sample of lessons learned relative to the atmospheric and space environment. There are many more whose subsequent application has improved our knowledge of the atmosphere and space environment, and the application of this knowledge to engineering and operations for a variety of aerospace programs.
Kotb, N A; Solieman, Ahmed H M; El-Zakla, T; Amer, T Z; Elmeniawi, S; Comsan, M N H
2018-05-01
A neutron irradiation facility consisting of six ²⁴¹Am-Be neutron sources of 30 Ci total activity and 6.6 × 10⁷ n/s total neutron yield is designed. The sources are embedded in a cubic paraffin wax block, which plays a dual role as both moderator and reflector. The sample passage and irradiation channel are represented by a cylindrical path of 5 cm diameter passing through the facility core. The proposed design yields a high degree of spatial symmetry and thermal neutron homogeneity within 98% of the flux distribution throughout the irradiated spherical sample of 5 cm diameter. The obtained thermal neutron flux is 8.0 × 10⁴ n/cm²·s over the sample volume, with thermal-to-fast and thermal-to-epithermal ratios of 1.20 and 3.35, respectively. The design is optimized for maximizing the thermal neutron flux at the sample position using the MCNP-5 code. The irradiation facility is intended principally for neutron activation analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.
Chi, Baofang; Tao, Shiheng; Liu, Yanlin
2015-01-01
Sampling the solution space of genome-scale models is generally conducted to determine the feasible region for metabolic flux distributions. Because actual metabolic states reside in only a small fraction of the entire space, it is necessary to shrink the solution space to improve the predictive power of a model. A common strategy is to constrain models by integrating extra datasets such as high-throughput datasets and 13C-labeled flux datasets. However, studies refining these approaches through meta-analysis of massive experimental metabolic flux measurements, which are closely linked to cellular phenotypes, are limited. In the present study, experimentally identified metabolic flux data from 96 published reports were systematically reviewed. Several strong associations among metabolic flux phenotypes were observed. These phenotype-phenotype associations at the flux level were quantified and integrated into a Saccharomyces cerevisiae genome-scale model as extra physiological constraints. By sampling the shrunken solution space of the model, the metabolic flux fluctuation level, an intrinsic trait of metabolic reactions determined by the network, was estimated and used to explore its relationship to gene expression noise. Although no correlation was observed across all enzyme-coding genes, a relationship between metabolic flux fluctuation and the expression noise of genes associated with enzyme-dosage-sensitive reactions was detected, suggesting that the metabolic network plays a role in shaping gene expression noise. This correlation was mainly attributable to genes corresponding to non-essential reactions rather than essential ones, at least partially due to regulation underlying the flux phenotype-phenotype associations. Altogether, this study proposes a new approach to shrinking the solution space of a genome-scale model, whose sampling provides new insights into gene expression noise.
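As a rough illustration of the shrink-then-sample workflow (not the authors' pipeline), using the cobrapy package with its bundled E. coli "textbook" model as a stand-in for the S. cerevisiae genome-scale model; the reaction ID and bounds below are hypothetical constraints standing in for the quantified phenotype-phenotype associations:

```python
from cobra.io import load_model
from cobra.sampling import sample

model = load_model("textbook")                  # small demo model
# Hypothetical extra physiological constraint standing in for a flux
# phenotype-phenotype association derived from the meta-analysis.
model.reactions.get_by_id("PGI").bounds = (2.0, 8.0)

flux_samples = sample(model, n=1000)            # sample the shrunken space
# Per-reaction flux fluctuation level: dispersion across sampled states
# (reactions with near-zero mean make the ratio unstable; sketch only).
fluctuation = flux_samples.std() / flux_samples.mean().abs()
```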
Mass spectrometer with electron source for reducing space charge effects in sample beam
Houk, Robert S.; Praphairaksit, Narong
2003-10-14
A mass spectrometer includes an ion source that generates a beam including positive ions, and a sampling interface that extracts a portion of the beam to form a sample beam travelling along a path with an excess of positive ions over at least part of the path, which causes space charge effects in the sample beam. An electron source adds electrons to the sample beam to reduce the space charge repulsion between the positive ions, producing a sample beam with reduced space charge effects, and a mass analyzer analyzes this reduced-space-charge sample beam.
GlioLab-a space system for Glioblastoma multiforme cells on orbit behavior study
NASA Astrophysics Data System (ADS)
Cappelletti, Chantal; Twiggs, Robert J.
Microgravity conditions and ionizing radiation pose significant health risks for human life in space. This is a concern for future missions and also for future space tourism flights. Nevertheless, at the same time it is very interesting to study the effects of these conditions on unhealthy organisms, such as biological samples affected by cancer. It is possible that the space environment increases, decreases, or has no effect on cancer cells. In any case the test results give important information about cancer treatment or space tourism flights for people affected by cancer. GlioLab is a joint project between GAUSS-Group of Astrodynamics at the "Sapienza" University of Roma and the Morehead State University (MSU) Space Science Center in Kentucky. The main goal of this project is the design and manufacture of an autonomous space system to investigate potential effects of space environment exposure on a human glioblastoma multiforme cell line derived from a 65-year-old male and on Normal Human Astrocytes (NHA). The samples are glioblastoma multiforme cancer cells in particular because radiotherapy using ionizing radiation is the only treatment after surgery that improves the survival rate on the ground for this very malignant cancer. During a mission on the ISS, GlioLab has to test the in-orbit behavior of glioblastoma cancer cells and healthy neuronal cells, which are extremely fragile and require complex experimentation and testing. In this paper, engineering solutions for the design and manufacture of an autonomous space system that can keep these kinds of cells alive are described. This autonomous system also includes an optical device dedicated to the analysis of cell behavior and microdosimeters for monitoring the space radiation environment.
Multidimensionally encoded magnetic resonance imaging.
Lin, Fa-Hsuan
2013-07-01
Magnetic resonance imaging (MRI) typically achieves spatial encoding by measuring the projection of a q-dimensional object over q-dimensional spatial bases created by linear spatial encoding magnetic fields (SEMs). Recently, imaging strategies using nonlinear SEMs have demonstrated potential advantages for reconstructing images with higher spatiotemporal resolution and reducing peripheral nerve stimulation. In practice, nonlinear SEMs and linear SEMs can be used jointly to further improve the image reconstruction performance. Here, we propose the multidimensionally encoded (MDE) MRI to map a q-dimensional object onto a p-dimensional encoding space where p > q. MDE MRI is a theoretical framework linking imaging strategies using linear and nonlinear SEMs. Using a system of eight surface SEM coils with an eight-channel radiofrequency coil array, we demonstrate the five-dimensional MDE MRI for a two-dimensional object as a further generalization of PatLoc imaging and O-space imaging. We also present a method of optimizing spatial bases in MDE MRI. Results show that MDE MRI with a higher dimensional encoding space can reconstruct images more efficiently and with a smaller reconstruction error when the k-space sampling distribution and the number of samples are controlled. Copyright © 2012 Wiley Periodicals, Inc.
Spectrally interleaved, comb-mode-resolved spectroscopy using swept dual terahertz combs
Hsieh, Yi-Da; Iyonaga, Yuki; Sakaguchi, Yoshiyuki; Yokoyama, Shuko; Inaba, Hajime; Minoshima, Kaoru; Hindle, Francis; Araki, Tsutomu; Yasui, Takeshi
2014-01-01
Optical frequency combs are innovative tools for broadband spectroscopy because a series of comb modes can serve as frequency markers that are traceable to a microwave frequency standard. However, a mode distribution that is too discrete limits the spectral sampling interval to the mode frequency spacing even though individual mode linewidth is sufficiently narrow. Here, using a combination of a spectral interleaving and dual-comb spectroscopy in the terahertz (THz) region, we achieved a spectral sampling interval equal to the mode linewidth rather than the mode spacing. The spectrally interleaved THz comb was realized by sweeping the laser repetition frequency and interleaving additional frequency marks. In low-pressure gas spectroscopy, we achieved an improved spectral sampling density of 2.5 MHz and enhanced spectral accuracy of 8.39 × 10−7 in the THz region. The proposed method is a powerful tool for simultaneously achieving high resolution, high accuracy, and broad spectral coverage in THz spectroscopy. PMID:24448604
Adaptive Sampling of Time Series During Remote Exploration
NASA Technical Reports Server (NTRS)
Thompson, David R.
2012-01-01
This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models are stationary, i.e., the covariance relationships are time-invariant. In such cases, information gain is independent of previously collected data, and the optimal solution can always be computed in advance. Information-optimal sampling of a stationary GP time series thus reduces to even spacing, and such models are not appropriate for tracking localized anomalies. Additionally, GP model inference can be computationally expensive.
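A minimal sketch of the information-driven sample selection described above, assuming a scikit-learn Gaussian process as the probabilistic model. With a stationary kernel the predictive uncertainty (and hence the choice) depends only on sample spacing, which is exactly the limitation that motivates the nonstationary models in this work:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def next_sample_time(t_obs, y_obs, t_candidates):
    """Fit a GP to the time series collected so far and return the
    candidate time with the largest predictive standard deviation,
    a proxy for the expected information gain of sampling there."""
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                  normalize_y=True)
    gp.fit(np.reshape(t_obs, (-1, 1)), y_obs)
    _, std = gp.predict(np.reshape(t_candidates, (-1, 1)), return_std=True)
    return t_candidates[int(np.argmax(std))]

# Example: with a stationary kernel the agent simply fills the largest
# gap in its past observations, independent of the observed values.
t_next = next_sample_time([0.0, 1.0, 4.0], [0.2, 0.1, 0.5],
                          np.linspace(0.0, 5.0, 51))
```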
A Time and Place for Everything: Developmental Differences in the Building Blocks of Episodic Memory
ERIC Educational Resources Information Center
Lee, Joshua K.; Wendelken, Carter; Bunge, Silvia A.; Ghetti, Simona
2016-01-01
This research investigated whether episodic memory development can be explained by improvements in relational binding processes, involved in forming novel associations between events and the context in which they occurred. Memory for item-space, item-time, and item-item relations was assessed in an ethnically diverse sample of 151 children aged…
Dai, Erpeng; Zhang, Zhe; Ma, Xiaodong; Dong, Zijing; Li, Xuesong; Xiong, Yuhui; Yuan, Chun; Guo, Hua
2018-03-23
To study the effects of 2D navigator distortion and noise level on interleaved EPI (iEPI) DWI reconstruction, using either image-space or k-space based methods, the 2D navigator acquisition was adjusted by reducing its echo spacing in the readout direction and undersampling in the phase-encoding direction. A POCS-based reconstruction using the image-space sampling function (IRIS) algorithm (POCSIRIS) was developed to reduce the impact of navigator distortion. POCSIRIS was then compared with the original IRIS algorithm and a SPIRiT-based k-space algorithm under different navigator distortion and noise levels. Reducing the navigator distortion can improve the reconstruction of iEPI DWI. The proposed POCSIRIS and SPIRiT-based algorithms are more tolerant of different navigator distortion levels than the original IRIS algorithm, while SPIRiT may be hindered by low navigator SNR. Multi-shot iEPI DWI reconstruction can thus be improved by reducing 2D navigator distortion, and different reconstruction methods show variable sensitivity to navigator distortion and noise levels. Furthermore, the findings can be valuable in applications such as simultaneous multi-slice accelerated iEPI DWI and multi-slab diffusion imaging. © 2018 International Society for Magnetic Resonance in Medicine.
NASA Technical Reports Server (NTRS)
Merrick, E. B.
1979-01-01
An alternative space suit insulation concept using a monolayer woven pile material is discussed. The material reduces cost and improves the durability of the overgarment, while providing protection similar to that provided by multilayer insulation (MLI). Twelve samples of different configurations were fabricated and tested for compressibility and thermal conductivity as a function of compression loading. Two samples which showed good results in the initial tests were further tested for thermal conductivity with respect to ambient pressure and temperature. Results of these tests were similar to results of the MLI tests, indicating the potential of the monolayer fabric to replace the present MLI. A seaming study illustrated that the fabric can be sewn in a structurally sound seam with minimal heat loss. It is recommended that a prototype thermal meteoroid garment be fabricated.
Response of Heterogeneous and Fractured Carbonate Samples to CO2-Brine Exposure
NASA Astrophysics Data System (ADS)
Smith, M. M.; Mason, H. E.; Hao, Y.; Carroll, S.
2014-12-01
Carbonate rock units are often considered as candidate sites for storage of carbon dioxide (CO2), whether as stand-alone reservoirs or coupled with enhanced oil recovery efforts. In order to accept injected carbon dioxide, carbonate reservoirs must either possess sufficient preexisting connected void space, or react with CO2-acidified fluids to produce more pore space and improve permeability. However, upward migration of CO2 through barrier zones or seal layers must be minimized for effective safe storage. Therefore, prediction of the changes to porosity and permeability in these systems over time is a key component of reservoir management. Towards this goal, we present the results of several experiments on carbonate core samples from the Wellington, Kansas 1-32 well, conducted under reservoir temperature, pressure, and CO2 conditions. These samples were imaged by X-ray computed tomography (XRCT) and analyzed with nuclear magnetic resonance (NMR) spectroscopy both prior to and after reaction with CO2-enriched brines. The carbonate samples each displayed distinct responses to CO2 exposure in terms of permeability change with time and the relative abundance of calcite versus dolomite dissolution. The measured permeability of each sample was also much lower than that estimated by downhole NMR logging, with samples containing larger fractured regions possessing higher permeability values. We also present our modeling approach and preliminary simulation results for a specific sample from the targeted injection zone. The heterogeneous composition as well as the presence of large fractured zones within the rock necessitated the use of a nested three-region approach to represent the range of void space observed via tomography. Currently, the physical response to CO2-brine flow (i.e., pressure decline with time) is reproduced well, but the extent of chemical reaction is overestimated by the model.
NASA Technical Reports Server (NTRS)
Panda, Binayak; Gorti, Sridhar
2013-01-01
A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable and sometimes new information resulting from sample characterization. Instruments include modern scanning electron microscopes equipped with field emission guns providing analytical capabilities that include angstrom-level image resolution of dry, wet and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples, provide variable pressures for wet samples, and include quantitative analysis software to determine phase relations. Advances in solid-state electronics have also facilitated improvements in surface chemical analysis that are successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a magnetic sector Secondary Ion Mass Spectrometry (SIMS) instrument that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, and identifies and quantifies very low-level impurities even at parts-per-billion (ppb) levels. Still other methods available at MSFC include X-ray photoelectron spectroscopy (XPS), which can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials, and scanning Auger electron spectroscopy (SAM), which combines surface sensitivity, lateral spatial resolution (approximately 20 nm), and depth profiling capabilities to describe elemental compositions in near-surface regions and even the chemical state of analyzed atoms. A conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications and an Electron Probe Micro-analyzer (EPMA) for very precise microanalysis are available as needed. Space Station researchers are invited to work with MSFC in analyzing their samples using these techniques.
NASA Astrophysics Data System (ADS)
Zheng, Lianqing; Yang, Wei
2008-07-01
Recently, the accelerated molecular dynamics (AMD) technique was generalized to realize essential-energy-space random walks, so that further sampling enhancement and effective localized enhanced sampling can be achieved. This method is especially meaningful when the essential coordinates of the target events are not known a priori; moreover, the energy-space metadynamics method was also introduced so that biasing free energy functions can be robustly generated. Despite the promising features of this method, the nonequilibrium nature of the metadynamics recursion makes it challenging to rigorously use the data obtained at the recursion stage for equilibrium analysis, such as free energy surface mapping; a large amount of data therefore has to be wasted. To resolve this problem and further improve simulation convergence, as promised in our original paper, we report an alternative approach: the adaptive-length self-healing (ALSH) strategy for AMD simulations, based on a recent self-healing umbrella sampling method. Here, the unit simulation length for each self-healing recursion is increasingly updated based on the Wang-Landau flattening judgment. When the unit simulation length for each update is long enough, all the following unit simulations naturally run into the equilibrium regime. Thereafter, these unit simulations serve the dual purposes of recursion and equilibrium analysis. As demonstrated in our model studies, applying ALSH balances fast recursion against minimal nonequilibrium data waste. As a result, combining all the data obtained from all the unit simulations that are in the equilibrium regime via the weighted histogram analysis method, efficient convergence can be robustly ensured, especially for the purpose of free energy surface mapping.
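To make the recursion-length update concrete, here is a hedged sketch (our reading of the abstract, with an assumed doubling rule and an assumed 0.8 flatness ratio) of a Wang-Landau-style flattening judgment driving the adaptive unit length:

```python
import numpy as np

def is_flat(histogram, ratio=0.8):
    """Wang-Landau-style flatness test: every bin of the energy-space
    histogram must reach a fraction of the mean bin count."""
    h = np.asarray(histogram, dtype=float)
    return h.min() >= ratio * h.mean()

unit_length_ps = 100.0                  # hypothetical starting unit length
hist = np.array([90, 120, 80, 110])     # visits per energy bin (demo values)
if not is_flat(hist):
    unit_length_ps *= 2                 # lengthen the next recursion unit
```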
Statistical Evaluation of Molecular Contamination During Spacecraft Thermal Vacuum Test
NASA Technical Reports Server (NTRS)
Chen, Philip; Hedgeland, Randy; Montoya, Alex; Roman-Velazquez, Juan; Dunn, Jamie; Colony, Joe; Petitto, Joseph
1998-01-01
The purpose of this paper is to evaluate the statistical molecular contamination data with a goal to improve spacecraft contamination control. The statistical data was generated in typical thermal vacuum tests at the National Aeronautics and Space Administration, Goddard Space Flight Center (GSFC). The magnitude of material outgassing was measured using a Quartz Crystal Microbalance (QCM) device during the test. A solvent rinse sample was taken at the conclusion of each test. Then detailed qualitative and quantitative measurements were obtained through chemical analyses. All data used in this study encompassed numerous spacecraft tests in recent years.
Statistical Evaluation of Molecular Contamination During Spacecraft Thermal Vacuum Test
NASA Technical Reports Server (NTRS)
Chen, Philip; Hedgeland, Randy; Montoya, Alex; Roman-Velazquez, Juan; Dunn, Jamie; Colony, Joe; Petitto, Joseph
1999-01-01
The purpose of this paper is to evaluate the statistical molecular contamination data with a goal to improve spacecraft contamination control. The statistical data was generated in typical thermal vacuum tests at the National Aeronautics and Space Administration, Goddard Space Flight Center (GSFC). The magnitude of material outgassing was measured using a Quartz Crystal Microbalance (QCM) device during the test. A solvent rinse sample was taken at the conclusion of each test. Then detailed qualitative and quantitative measurements were obtained through chemical analyses. All data used in this study encompassed numerous spacecraft tests in recent years.
Statistical Evaluation of Molecular Contamination During Spacecraft Thermal Vacuum Test
NASA Technical Reports Server (NTRS)
Chen, Philip; Hedgeland, Randy; Montoya, Alex; Roman-Velazquez, Juan; Dunn, Jamie; Colony, Joe; Petitto, Joseph
1997-01-01
The purpose of this paper is to evaluate the statistical molecular contamination data with a goal to improve spacecraft contamination control. The statistical data was generated in typical thermal vacuum tests at the National Aeronautics and Space Administration, Goddard Space Flight Center (GSFC). The magnitude of material outgassing was measured using a Quartz Crystal Microbalance (QCM) device during the test. A solvent rinse sample was taken at the conclusion of each test. Then detailed qualitative and quantitative measurements were obtained through chemical analyses. All data used in this study encompassed numerous spacecraft tests in recent years.
Bearing Fault Diagnosis Based on Statistical Locally Linear Embedding
Wang, Xiang; Zheng, Yuan; Zhao, Zhenzhou; Wang, Jinping
2015-01-01
Fault diagnosis is essentially a kind of pattern recognition. The measured signal samples are usually distributed on nonlinear low-dimensional manifolds embedded in the high-dimensional signal space, so implementing feature extraction and dimensionality reduction while improving recognition performance is a crucial task. In this paper a novel machinery fault diagnosis approach is proposed, based on a statistical locally linear embedding (S-LLE) algorithm, which extends LLE by exploiting fault class label information. The approach first extracts high-dimensional feature vectors from vibration signals by time-domain, frequency-domain and empirical mode decomposition (EMD) methods, and then translates this complex mode space into a salient low-dimensional feature space using the manifold learning algorithm S-LLE, which outperforms other feature reduction methods such as PCA, LDA and LLE. Finally, pattern classification and fault diagnosis are carried out easily and rapidly by a classifier in the reduced feature space. Rolling bearing fault signals are used to validate the proposed fault diagnosis approach. The results indicate that the proposed approach clearly improves the classification performance of fault pattern recognition and outperforms the other traditional approaches. PMID:26153771
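As an unofficial sketch of the diagnosis pipeline, plain (unsupervised) LLE from scikit-learn stands in for S-LLE, whose class-label weighting is not available off the shelf; random data stands in for the time-domain/frequency-domain/EMD feature vectors:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))            # stand-in vibration feature vectors
y = rng.integers(0, 4, size=200)          # stand-in fault class labels

clf = make_pipeline(
    LocallyLinearEmbedding(n_neighbors=12, n_components=5),  # manifold step
    KNeighborsClassifier(n_neighbors=5),  # classification in reduced space
)
clf.fit(X[:150], y[:150])
print("accuracy:", clf.score(X[150:], y[150:]))
```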
Ultra Low Outgassing silicone performance in a simulated space ionizing radiation environment
NASA Astrophysics Data System (ADS)
Velderrain, M.; Malave, V.; Taylor, E. W.
2010-09-01
The improvement of silicone-based materials used in space and aerospace environments has garnered much attention for several decades. Most recently, an Ultra Low Outgassing™ silicone incorporating innovative reinforcing and functional fillers has shown that silicone elastomers with unique and specific properties can be developed to meet applications with stringent outgassing requirements. This paper will report on the next crucial step in qualifying these materials for spacecraft applications requiring chemical and physical stability in the presence of ionizing radiation. As a first step in this process, selected materials were irradiated with Co-60 gamma rays to simulate the total dose received in near-Earth orbits. The paper will present pre- and post-irradiation response data of Ultra Low Outgassing silicone samples exposed under an ambient air environment, coupled with measurements of collected volatile condensable material (CVCM) and total mass loss (TML) per the standard conditions of ASTM E 595. The data will show an insignificant effect on CVCM and TML after exposure to various doses of gamma radiation. These data may favorably impact new applications for these silicone materials as improved sealants for space solar cell systems, space structures, satellite systems and aerospace systems.
Centre-based restricted nearest feature plane with angle classifier for face recognition
NASA Astrophysics Data System (ADS)
Tang, Linlin; Lu, Huifen; Zhao, Liang; Li, Zuohua
2017-10-01
An improved classifier based on the nearest feature plane (NFP), called the centre-based restricted nearest feature plane with angle (RNFPA) classifier, is proposed for face recognition problems. The well-known NFP classifier uses the geometrical information of samples to increase the effective number of training samples, but it increases computation complexity and suffers from an inaccuracy problem caused by the extended feature plane. To solve these problems, RNFPA exploits a centre-based feature plane and utilizes an angle threshold to restrict the extended feature space. By choosing an appropriate angle threshold, RNFPA can improve performance and decrease computation complexity. Experiments on the AT&T face database, AR face database and FERET face database are used to evaluate the proposed classifier. Compared with the original NFP classifier, the nearest feature line (NFL) classifier, the nearest neighbour (NN) classifier and some other improved NFP classifiers, the proposed classifier achieves competitive performance.
Kim, Ilsoo; Allen, Toby W
2012-04-28
Free energy perturbation, a method for computing the free energy difference between two states, is often combined with non-Boltzmann biased sampling techniques in order to accelerate the convergence of free energy calculations. Here we present a new extension of the Bennett acceptance ratio (BAR) method by combining it with umbrella sampling (US) along a reaction coordinate in configurational space. In this approach, which we call Bennett acceptance ratio with umbrella sampling (BAR-US), the conditional histogram of energy difference (a mapping of the 3N-dimensional configurational space via a reaction coordinate onto 1D energy difference space) is weighted for marginalization with the associated population density along a reaction coordinate computed by US. This procedure produces marginal histograms of energy difference, from forward and backward simulations, with higher overlap in energy difference space, rendering free energy difference estimations using BAR statistically more reliable. In addition to BAR-US, two histogram analysis methods, termed Bennett overlapping histograms with US (BOH-US) and Bennett-Hummer (linear) least square with US (BHLS-US), are employed as consistency and convergence checks for free energy difference estimation by BAR-US. The proposed methods (BAR-US, BOH-US, and BHLS-US) are applied to a 1-dimensional asymmetric model potential, as has been used previously to test free energy calculations from non-equilibrium processes. We then consider the more stringent test of a 1-dimensional strongly (but linearly) shifted harmonic oscillator, which exhibits no overlap between two states when sampled using unbiased Brownian dynamics. We find that the efficiency of the proposed methods is enhanced over the original Bennett's methods (BAR, BOH, and BHLS) through fast uniform sampling of energy difference space via US in configurational space. We apply the proposed methods to the calculation of the electrostatic contribution to the absolute solvation free energy (excess chemical potential) of water. We then address the controversial issue of ion selectivity in the K(+) ion channel, KcsA. We have calculated the relative binding affinity of K(+) over Na(+) within a binding site of the KcsA channel for which different, though adjacent, K(+) and Na(+) configurations exist, ideally suited to these US-enhanced methods. Our studies demonstrate that the significant improvements in free energy calculations obtained using the proposed methods can have serious consequences for elucidating biological mechanisms and for the interpretation of experimental data.
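For orientation, a numerical sketch of the plain BAR estimator that BAR-US builds on (our illustration; the US-weighted marginalization described above is not shown). It solves the self-consistent maximum-likelihood form of Bennett's equation, assuming the sign convention that both work arrays are expressed in the forward (0 to 1) direction:

```python
import numpy as np
from scipy.optimize import brentq

def bar_delta_f(w_f, w_r, beta=1.0):
    """Bennett acceptance ratio estimate of the free energy difference.

    w_f: work values from forward-direction samples (state 0);
    w_r: work values from state-1 samples, sign-flipped so that both
         arrays refer to the 0 -> 1 direction.
    Solves sum_F 1/(1 + M exp(beta (w - dF))) =
           sum_R 1/(1 + exp(-beta (w - dF)) / M), with M = n_F / n_R.
    The left side rises and the right side falls with dF, so the root
    is unique and safely bracketed."""
    w_f, w_r = np.asarray(w_f, float), np.asarray(w_r, float)
    m = len(w_f) / len(w_r)

    def imbalance(df):
        lhs = np.sum(1.0 / (1.0 + m * np.exp(beta * (w_f - df))))
        rhs = np.sum(1.0 / (1.0 + np.exp(-beta * (w_r - df)) / m))
        return lhs - rhs

    lo = min(w_f.min(), w_r.min()) - 50.0 / beta
    hi = max(w_f.max(), w_r.max()) + 50.0 / beta
    return brentq(imbalance, lo, hi)
```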
2017-01-09
Deena Dombrosky (Zin Technologies Engineer) is shown here filling a Procter & Gamble (P & G) sample that will be used in ground-testing as NASA prepares for their experiment on the International Space Station (ISS). The sample particles are the size of the wavelength of light and they are dyed orange/pink to glow when illuminated with the laser light enabling a confocal microscope to produce 3D images. The P & G experiment will improve product stabilizers that extend product shelf life. This has the added advantage of leading to more compact environmentally friendly containers.
Novel Amalgams for In-Space Fabrication of Replacement Parts
NASA Technical Reports Server (NTRS)
Cochran, Calvin T.; Van Hoose, James R.; Grugel, R. N.
2012-01-01
Being able to fabricate replacement parts during extended space flight missions precludes the weight, storage volume, and speculation necessary to accommodate spares. Amalgams, widely used in dentistry, are potential candidates for fabricating parts in microgravity environments as they are moldable, do not require energy for melting, and do not pose fluid handling problems. Unfortunately, amalgams have poor tensile strength and the room temperature liquid component is mercury. To possibly resolve these issues a gallium-indium alloy was substituted for mercury and small steel fibers were mixed in with the commercial alloy powder. Subsequent microscopic examination of the novel amalgam revealed complete bonding of the components, and mechanical testing of comparable samples showed those containing steel fibers to have a significant improvement in strength. Experimental procedures, microstructures, and test results are presented and discussed in view of further improving properties.
Yang, Yang; He, Jinliang; Wu, Guangning; Hu, Jun
2015-01-01
Insulation performance of the dielectrics under extreme conditions always attracts widespread attention in electrical and electronic field. How to improve the high-temperature dielectric properties of insulation materials is one of the key issues in insulation system design of electrical devices. This paper studies the temperature-dependent corona resistance of polyimide (PI)/Al2O3 nanocomposite films under high-frequency square-wave pulse conditions. Extended corona resistant lifetime under high-temperature conditions is experimentally observed in the 2 wt% nanocomposite samples. The “thermal stabilization effect” is proposed to explain this phenomenon which attributes to a new kind of trap band caused by nanoparticles. This effect brings about superior space charge characteristics and corona resistance under high temperature with certain nano-doping concentration. The proposed theory is experimentally demonstrated by space charge analysis and thermally stimulated current (TSC) tests. This discovered effect is of profound significance on improving high-temperature dielectric properties of nanocomposites towards various applications. PMID:26597981
Improved Radio-Frequency Magneto-Optical Trap of SrF Molecules.
Steinecker, Matthew H; McCarron, Daniel J; Zhu, Yuqi; DeMille, David
2016-11-18
We report the production of ultracold, trapped strontium monofluoride (SrF) molecules with number density and phase-space density significantly higher than previously achieved. These improvements are enabled by three distinct changes to our recently demonstrated scheme for radio-frequency magneto-optical trapping of SrF: modification of the slowing laser beam geometry, addition of an optical pumping laser, and incorporation of a compression stage in the magneto-optical trap. With these improvements, we observe a trapped sample of SrF molecules at density 2.5×10⁵ cm⁻³ and phase-space density 6×10⁻¹⁴, each a factor of 4 greater than in previous work. Under different experimental conditions, we observe trapping of up to 10⁴ molecules, a factor of 5 greater than in previous work. Finally, by reducing the intensity of the applied trapping light, we observe molecular temperatures as low as 250 μK. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dynamic Event Tree advancements and control logic improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis, and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid, Factorials, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities undertaken to start the migration of the RAVEN/RELAP-7 control logic system into MOOSE and to develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity was started this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework; a brief explanation of this work is reported here. The second and most important subject of this report is the development of a Dynamic Event Tree sampler named the "Hybrid Dynamic Event Tree" (HDET) and its adaptive variant, the "Adaptive Hybrid Dynamic Event Tree" (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic. The classical Dynamic Event Tree treats the first class (aleatory); the dependence of the probabilistic risk assessment on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo method employs a pre-sampling of the input space characterized by epistemic uncertainties, and the consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method but uses all the forward sampling strategies RAVEN currently employs: the user can combine Latin Hypercube, Grid, Stratified, and Monte Carlo sampling to explore the epistemic space without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for a goal-oriented sampling strategy: the DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only; this report documents how that approach has been extended to consider the epistemic space through interaction with the Hybrid Dynamic Event Tree methodology.
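A schematic sketch of the hybrid nesting (our paraphrase of the approach, not RAVEN code): any forward sampler explores the epistemic space, and each epistemic sample seeds a deterministic event tree that branches over the aleatory outcomes.

```python
import random
from itertools import product

def hybrid_det(sample_epistemic, aleatory_branches, run, n_outer=10):
    """Outer loop: forward sampling (Monte Carlo here, but LHS / grid /
    stratified work the same way) of the epistemic parameters. Inner
    loop: one event-tree leaf per combination of aleatory outcomes."""
    results = []
    for _ in range(n_outer):
        theta = sample_epistemic()
        for path in product(*aleatory_branches):
            results.append((theta, path, run(theta, path)))
    return results

# Toy usage: an uncertain friction coefficient (epistemic) and a valve
# that may stick at its demand time (aleatory branch point).
out = hybrid_det(
    lambda: {"friction": random.uniform(0.1, 0.3)},
    [("valve_opens", "valve_sticks")],
    lambda th, path: 1.0 if "valve_sticks" in path else 0.0,
)
```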
25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection
NASA Technical Reports Server (NTRS)
Packard, Edward
2008-01-01
Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5kw, 20k, Helium Refrigerators; Affect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports; Exhaustive Thresholds and Resistance Checkpoints; Reconfigurable HIL Testing of Earth Satellites; FPGA Control System for the Automated Test of MicroShutters; Ongoing Capabilities and Developments of Re-Entry Plasma Ground Tests at EADS-ASTRIUM; Operationally Responsive Space Standard Bus Battery Thermal Balance Testing and Heat Dissipation Analysis; Galileo - The Serial-Production AIT Challenge; The Space Systems Environmental Test Facility Database (SSETFD), Website Development Status; Simulated Reentry Heating by Torching; Micro-Vibration Measurements on Thermally Loaded Multi-Layer Insulation Samples in Vacuum; High Temperature Life Testing of 80Ni-20Cr Wire in a Simulated Mars Atmosphere for the Sample Analysis at Mars (SAM) Instrument Suit Gas Processing System (GPS) Carbon Dioxide Scrubber; The Planning and Implementation of Test Facility Improvements; and Development of a Silicon Carbide Molecular Beam Nozzle for Simulation Planetary Flybys and Low-Earth Orbit.
Gagnon, Jessica K.; Law, Sean M.; Brooks, Charles L.
2016-01-01
Protein-ligand docking is a commonly used method for lead identification and refinement. While traditional structure-based docking methods represent the receptor as a rigid body, recent developments have been moving toward the inclusion of protein flexibility. Proteins exist in an inter-converting ensemble of conformational states, but effectively and efficiently searching the conformational space available to both the receptor and ligand remains a well-appreciated computational challenge. To this end, we have developed the Flexible CDOCKER method as an extension of the family of complete docking solutions available within CHARMM. This method integrates atomically detailed side chain flexibility with grid-based docking methods, maintaining efficiency while allowing the protein and ligand configurations to explore their conformational space simultaneously. This is in contrast to existing approaches that use induced-fit like sampling, such as Glide or Autodock, where the protein or the ligand space is sampled independently in an iterative fashion. Presented here are developments to the CHARMM docking methodology to incorporate receptor flexibility and improvements to the sampling protocol as demonstrated with re-docking trials on a subset of the CCDC/Astex set. These developments within CDOCKER achieve docking accuracy competitive with or exceeding the performance of other widely utilized docking programs. PMID:26691274
Integrated system for gathering, processing, and reporting data relating to site contamination
Long, D.D.; Goldberg, M.S.; Baker, L.A.
1997-11-11
An integrated screening system comprises an intrusive sampling subsystem, a field mobile laboratory subsystem, a computer assisted design/geographical information subsystem, and a telecommunication linkup subsystem, all integrated to provide synergistically improved data relating to the extent of site soil/groundwater contamination. According to the present invention, data samples related to the soil, groundwater or other contamination of the subsurface material are gathered and analyzed to measure contaminants. Based on the location of origin of the samples in three-dimensional space, the analyzed data are transmitted to a location display. The data from analyzing samples and the data from the locating the origin are managed to project the next probable sample location. The next probable sample location is then forwarded for use as a guide in the placement of ensuing sample location, whereby the number of samples needed to accurately characterize the site is minimized. 10 figs.
Constructing a multidimensional free energy surface like a spider weaving a web.
Chen, Changjun
2017-10-15
A complete free energy surface in the collective variable space provides important information about the reaction mechanisms of a molecule, but sufficient sampling of the collective variable space is not easy: the space expands quickly with the number of collective variables. To solve the problem, many methods utilize artificial biasing potentials to flatten out the original free energy surface of the molecule during the simulation. Their performance is sensitive to the definition of the biasing potential: a fast-growing biasing potential accelerates sampling but decreases the accuracy of the free energy result, while a slow-growing biasing potential gives an optimized result but needs more simulation time. In this article, we propose an alternative method. It adds the biasing potential to a representative point of the molecule in the collective variable space to improve conformational sampling, and the free energy surface is calculated from the free energy gradient in the constrained simulation, rather than given by the negative of the biasing potential as in previous methods. The presented method therefore does not require the biasing potential to remove all the barriers and basins on the free energy surface exactly. Practical applications show that the method is able to produce accurate free energy surfaces for different molecules in a short time, with small free energy errors across a range of biasing potentials. © 2017 Wiley Periodicals, Inc.
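To illustrate the last point, that the surface comes from integrating the free energy gradient rather than from the negative of the biasing potential, here is a one-dimensional toy reconstruction (a sketch using a synthetic mean force, not the paper's method in full):

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

xi = np.linspace(0.0, 1.0, 201)                  # collective variable grid
true_F = 5.0 * np.sin(2.0 * np.pi * xi) ** 2     # synthetic surface (demo)
mean_force = -np.gradient(true_F, xi)            # stand-in for constrained-
                                                 # simulation mean forces
F = -cumulative_trapezoid(mean_force, xi, initial=0.0)
# F recovers true_F up to an additive constant; print the residual:
print(np.max(np.abs((F - F[0]) - (true_F - true_F[0]))))
```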
Experimental Methods in Reduced-gravity Soldering Research
NASA Technical Reports Server (NTRS)
Pettegrew, Richard D.; Struk, Peter M.; Watson, John K.; Haylett, Daniel R.
2002-01-01
The National Center for Microgravity Research, NASA Glenn Research Center, and NASA Johnson Space Center are conducting an experimental program to explore the influence of reduced gravity environments on the soldering process. An improved understanding of the effects of the acceleration environment is important to application of soldering during current and future human space missions. Solder joint characteristics that are being considered include solder fillet geometry, porosity, and microstructural features. Both through-hole and surface mounted devices are being investigated. This paper focuses on the experimental methodology employed in this project and the results of macroscopic sample examination. The specific soldering process, sample configurations, materials, and equipment were selected to be consistent with those currently on-orbit. Other apparatus was incorporated to meet requirements imposed by operation onboard NASA's KC-135 research aircraft and instrumentation was provided to monitor both the atmospheric and acceleration environments. The contingent of test operators was selected to include both highly skilled technicians and less skilled individuals to provide a population cross-section that would be representative of the skill mix that might be encountered in space mission crews.
NASA Astrophysics Data System (ADS)
Marlina, L.; Liliasari; Tjasyono, B.; Hendayana, S.
2018-05-01
Critical thinking skills need to be developed in students. With critical thinking skills, students can understand concepts more deeply, become sensitive to problems that occur, understand and solve problems in their surroundings, and apply concepts in different situations. Earth and Space Science (ESS) material is part of the science curriculum from elementary school through college. This research is a quantitative study of a teacher-training program. It aims to investigate the improvement of students' critical thinking skills achieved by training junior high school science teachers to design learning media for teaching ESS. With a sample of 24 science teachers and 32 grade-7 students, chosen by purposive sampling at a school in Ogan Ilir District, South Sumatra, the average pre-test and post-test scores of students' critical thinking skills were 52.26 and 67.06, with an average N-gain of 0.31. A survey and a critical-thinking-skills test were conducted to collect the data. The results show a positive impact and an increase in students' critical thinking skills on the ESS material.
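The reported gain is consistent with Hake's normalized gain, assuming a maximum score of 100 (an assumption on our part, since the score scale is not stated):

```latex
\langle g \rangle = \frac{\text{post} - \text{pre}}{\text{max} - \text{pre}}
                  = \frac{67.06 - 52.26}{100 - 52.26} \approx 0.31
```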
Bieler, Noah S; Hünenberger, Philippe H
2015-08-15
In a recent article (Bieler et al., J. Chem. Theory Comput. 2014, 10, 3006), we introduced a combination of λ-dynamics and local-elevation umbrella-sampling termed λ-LEUS to calculate free-energy changes associated with alchemical processes using molecular dynamics simulations. This method was suggested to be more efficient than thermodynamic integration (TI), because the dynamical variation of the alchemical variable λ opens up pathways to circumvent barriers in the orthogonal space (defined by the N - 1 degrees of freedom that are not subjected to the sampling enhancement), a feature λ-LEUS shares with Hamiltonian replica-exchange (HR) approaches. However, the mutation considered, hydroquinone to benzene in water, was no real challenge in terms of orthogonal-space properties, which were restricted to solvent-relaxation processes. In the present article, we revisit the comparison between TI and λ-LEUS considering non-trivial mutations of the central residue X of a KXK tripeptide in water (with X = G, E, K, S, F, or Y). Side-chain interactions that may include salt bridges, hydrogen bonds or steric clashes lead to slow relaxation in the orthogonal space, mainly in the two-dimensional subspace spanned by the central φ and ψ dihedral angles of the peptide. The efficiency enhancement afforded by λ-LEUS is confirmed in this more complex test system and can be attributed explicitly to the improved sampling of the orthogonal space. The sensitivity of the results to the nontrivial choices of a mass parameter and of a thermostat coupling time for the alchemical variable is also investigated, resulting in recommended ranges of 50 to 100 u nm² and 0.2 to 0.5 ps, respectively. © 2015 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Wang, Yi; Pant, Kapil; Brenner, Martin J.; Ouellette, Jeffrey A.
2018-01-01
This paper presents a data analysis and modeling framework to tailor and develop a linear parameter-varying (LPV) aeroservoelastic (ASE) model database for flexible aircraft across a broad 2D flight parameter space. The Kriging surrogate model is constructed using ASE models at a fraction of the grid points within the original model database, and the ASE model at any flight condition can then be obtained simply through surrogate-model interpolation. A greedy sampling algorithm is developed to select the next sample point carrying the worst relative error between the surrogate prediction and the benchmark model in the frequency domain among all input-output channels. The process is iterated to incrementally improve surrogate accuracy until a pre-determined tolerance or iteration budget is met. The methodology is applied to the ASE model database of a flexible aircraft currently being tested at NASA/AFRC for flutter suppression and gust load alleviation. Our studies indicate that the proposed method can reduce the number of models in the original database by 67%. Even so, the ASE models obtained through Kriging interpolation match the models in the original database, constructed directly from the physics-based tool, with the worst relative error far below 1%. The interpolated ASE model exhibits continuously varying gains along a set of prescribed flight conditions. More importantly, the selected grid points are distributed non-uniformly in the parameter space, (a) capturing the distinctly different dynamic behavior and its dependence on flight parameters, and (b) reiterating the need and utility of adaptive space sampling techniques for ASE model database compaction. The present framework extends directly to higher-dimensional flight parameter spaces, and can be used to guide ASE model development, model order reduction, robust control synthesis, and novel vehicle design for flexible aircraft.
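A hedged sketch of the greedy selection loop (a scalar response and scikit-learn Kriging stand in for the frequency-domain, multi-channel ASE error metric; the function names are our own):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def greedy_build(X_grid, benchmark, tol=0.01, n_init=4, n_max=50):
    """Grow a Kriging surrogate by repeatedly adding the flight
    condition where its prediction disagrees worst (in relative
    terms) with the benchmark model database."""
    y_all = np.array([benchmark(x) for x in X_grid])
    idx = list(np.linspace(0, len(X_grid) - 1, n_init).astype(int))
    while len(idx) < n_max:
        gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
        gp.fit(X_grid[idx], y_all[idx])
        rel_err = np.abs(gp.predict(X_grid) - y_all) / (np.abs(y_all) + 1e-12)
        rel_err[idx] = 0.0                  # points already in the surrogate
        worst = int(np.argmax(rel_err))
        if rel_err[worst] < tol:
            break                           # tolerance met everywhere
        idx.append(worst)
    return idx
```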
Lan, Gongpu; Li, Guoqiang
2017-03-07
Nonlinear sampling of the interferograms in wavenumber (k) space degrades the depth-dependent signal sensitivity in conventional spectral domain optical coherence tomography (SD-OCT). Here we report a linear-in-wavenumber (k-space) spectrometer for an ultra-broad bandwidth (760 nm-920 nm) SD-OCT, whereby a combination of a grating and a prism serves as the dispersion group. Quantitative ray tracing is applied to optimize the linearity and minimize the optical path differences for the dispersed wavenumbers. Zemax simulation is used to fit the point spread functions to the rectangular shape of the pixels of the line-scan camera and to improve the pixel sampling rates. An experimental SD-OCT is built to test and compare the performance of the k-space spectrometer with that of a conventional one. Design results demonstrate that this k-space spectrometer can reduce the nonlinearity error in k-space from 14.86% to 0.47% (by approximately 30 times) compared to the conventional spectrometer. The 95% confidence interval for RMS spot diameters is 5.48 ± 1.76 μm, significantly smaller than both the pixel size (14 μm × 28 μm) and the Airy disc (25.82 μm in diameter, calculated at the wavenumber of 7.548 μm⁻¹). Test results demonstrate that the fall-off curve from the k-space spectrometer exhibits much less decay (a maximum of -5.20 dB) than that of the conventional spectrometer (a maximum of -16.84 dB) over the whole imaging depth (2.2 mm).
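To illustrate why linear-in-k sampling matters, here is a small numerical sketch, under assumed values, of resampling a wavelength-uniform interferogram onto a uniform k grid before the Fourier transform; a k-space spectrometer achieves this optically and avoids the interpolation step entirely.

```python
import numpy as np

# Interferogram of a single reflector at depth z: I(k) ~ cos(2 k z).
# A conventional spectrometer samples uniformly in wavelength, so the
# samples are nonlinear in k and must be resampled before the FFT.
n, z = 2048, 100e-6                               # samples, reflector depth [m]
lam = np.linspace(760e-9, 920e-9, n)              # uniform wavelength grid
k = 2 * np.pi / lam                               # nonuniform in k (descending)
fringe = np.cos(2 * k * z)

k_lin = np.linspace(k.min(), k.max(), n)          # uniform k grid
fringe_lin = np.interp(k_lin, k[::-1], fringe[::-1])  # np.interp needs ascending x

a_scan = np.abs(np.fft.rfft(fringe_lin * np.hanning(n)))
print("peak depth bin:", a_scan.argmax())         # sharp peak after linearization
```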
Materials Science Research Rack Onboard the International Space Station Hardware and Operations
NASA Technical Reports Server (NTRS)
Lehman, John R.; Frazier, Natalie C.; Johnson, Jimmie
2012-01-01
The Materials Science Research Rack (MSRR) is a research facility developed under a cooperative research agreement between NASA and ESA for materials science investigations on the International Space Station (ISS). MSRR was launched on STS-128 in August 2009 and is currently installed in the U.S. Destiny Laboratory Module. Since that time, MSRR has performed virtually flawlessly, logging more than 620 hours of operating time. The MSRR accommodates advanced investigations in the microgravity environment on the ISS for basic materials science research in areas such as solidification of metals and alloys. The purpose is to advance the scientific understanding of materials processing as affected by microgravity and to gain insight into the physical behavior of materials processing. MSRR allows for the study of a variety of materials including metals, ceramics, semiconductor crystals, and glasses. Materials science research benefits from the microgravity environment of space, where the researcher can better isolate chemical and thermal properties of materials from the effects of gravity. With this knowledge, reliable predictions can be made about the conditions required on Earth to achieve improved materials. MSRR is a highly automated facility with a modular design capable of supporting multiple types of investigations. Currently the NASA-provided Rack Support Subsystem provides services (power, thermal control, vacuum access, and command and data handling) to the ESA-developed Materials Science Laboratory (MSL), which accommodates interchangeable Furnace Inserts (FIs). Two ESA-developed FIs are presently available on the ISS: the Low Gradient Furnace (LGF) and the Solidification and Quenching Furnace (SQF). Sample-Cartridge Assemblies (SCAs), each containing one or more material samples, are installed in the FI by the crew and can be processed at temperatures up to 1400 °C. Once an SCA is installed, the experiment can be run by automatic command, or the science can be conducted via telemetry commands from the ground. Initially, 12 SCAs were processed in the first furnace insert for a team of European and US investigators. After these samples were processed, the Furnace Inserts were exchanged and an additional single sample was processed. The processed samples have been returned to Earth for evaluation and comparison of their properties to samples similarly processed on the ground. A preliminary examination of the samples indicates that the majority of the desired science objectives have been successfully met, leading to significant improvements in the understanding of alloy solidification processes. Six SCAs were launched on Space Shuttle Mission STS-135 in July 2011 for processing during the fall of 2011. Additional batches are planned for future processing. This facility is available to support additional materials science investigations through programs such as the US National Laboratory, Technology Development, NASA Research Announcements, and others.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Z; Pang, J; Tuli, R
Purpose: A recent 4D MRI technique based on 3D radial sampling and self-gating-based k-space sorting has shown promising results in characterizing respiratory motion. However, due to continuous acquisition and potentially drastic k-space undersampling, the resultant images can suffer from low blood-to-tissue contrast and streaking artifacts. In this study, 3D radial sampling with slab-selective excitation (SS) was proposed in an attempt to enhance blood-to-tissue contrast by exploiting the in-flow effect and to suppress the excess signal from peripheral structures, particularly in the superior-inferior direction. The feasibility of improving image quality with this approach was investigated through a comparison with the previously developed non-selective excitation (NS) approach. Methods: The two excitation approaches, SS and NS, were compared in 5 cancer patients (1 lung, 1 liver, 2 pancreas, and 1 esophagus) at 3 Tesla. Image artifact was assessed in all patients on a 4-point scale (0: poor; 3: excellent). Signal-to-noise ratio (SNR) of the blood vessel (aorta) at the center of the field-of-view and of its nearby tissue was measured in 3 of the 5 patients (1 liver, 2 pancreas), and the blood-to-tissue contrast-to-noise ratio (CNR) was then determined. Results: Compared with NS, the image quality of SS was visually improved, with overall higher scores in all patients (2.6±0.55 vs. 3.4±0.55). SS showed an approximately 2-fold increase of SNR in the blood (aorta: 16.39±1.95 vs. 32.19±7.93) and a slight increase in the surrounding tissue (liver/pancreas: 16.91±1.82 vs. 22.31±3.03). As a result, the blood-to-tissue CNR was dramatically higher with the SS method (1.20±1.20 vs. 9.87±6.67). Conclusion: The proposed 3D radial sampling with slab-selective excitation allows for reduced image artifact and improved blood SNR and blood-to-tissue CNR. The success of this technique could potentially benefit patients with cancerous tumors that have invaded the surrounding blood vessels, where radiation therapy is needed to remove tumor from those regions prior to surgical resection. This work is partially supported by NIH R03CA173273 and a CTSI core voucher award.
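As a rough illustration of the self-gating-based sorting step, the sketch below bins radial spokes into respiratory phases using a synthetic k-space-center signal; the signal model, smoothing filter, and bin count are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_spokes = 6000                        # continuously acquired radial spokes
t = np.arange(n_spokes) * 0.005        # ~5 ms per spoke (assumed repetition time)

# synthetic self-gating signal: k-space-center magnitude modulated by breathing
sg = 1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * t) \
         + 0.02 * rng.standard_normal(n_spokes)

# smooth to isolate the respiratory component (moving average as a stand-in
# for a proper band-pass filter around ~0.2-0.3 Hz)
resp = np.convolve(sg, np.ones(51) / 51, mode="same")

# amplitude-bin spokes into 4 respiratory phases for phase-resolved recon
edges = np.quantile(resp, [0.25, 0.5, 0.75])
phase = np.digitize(resp, edges)                       # label 0..3 per spoke
spokes_per_phase = [np.flatnonzero(phase == p) for p in range(4)]
print([len(s) for s in spokes_per_phase])
```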
Compact field color schlieren system for use in microgravity materials processing
NASA Technical Reports Server (NTRS)
Poteet, W. M.; Owen, R. B.
1986-01-01
A compact color schlieren system designed for field measurement of materials processing parameters has been built and tested in a microgravity environment. Improvements in the color filter design and a compact optical arrangement allowed the system described here to retain the traditional advantages of schlieren, such as simplicity, sensitivity, and ease of data interpretation. Testing was accomplished by successfully flying the instrument on a series of parabolic trajectories on the NASA KC-135 microgravity simulation aircraft. A variety of samples of interest in materials processing were examined. Although the present system was designed for aircraft use, the technique is well suited to space flight experimentation. A major goal of this effort was to accommodate the main optical system within a volume approximately equal to that of a Space Shuttle middeck locker. Future plans include the development of an automated space-qualified facility for use on the Shuttle and Space Station.
NASA Astrophysics Data System (ADS)
Cottin, Hervé; Kotler, Julia Michelle; Billi, Daniela; Cockell, Charles; Demets, René; Ehrenfreund, Pascale; Elsaesser, Andreas; d'Hendecourt, Louis; van Loon, Jack J. W. A.; Martins, Zita; Onofri, Silvano; Quinn, Richard C.; Rabbow, Elke; Rettberg, Petra; Ricco, Antonio J.; Slenzka, Klaus; de la Torre, Rosa; de Vera, Jean-Pierre; Westall, Frances; Carrasco, Nathalie; Fresneau, Aurélien; Kawaguchi, Yuko; Kebukawa, Yoko; Nguyen, Dara; Poch, Olivier; Saiagh, Kafila; Stalport, Fabien; Yamagishi, Akihiko; Yano, Hajime; Klamm, Benjamin A.
2017-07-01
The space environment is regularly used for experiments addressing astrobiology research goals. The specific conditions prevailing in Earth orbit and beyond, notably the radiative environment (photons and energetic particles) and the possibility of conducting long-duration measurements, have been the main motivations for developing experimental concepts to expose chemical or biological samples to outer space, or to use the reentry of a spacecraft on Earth to simulate the fall of a meteorite. This paper presents an overview of past and current research in astrobiology conducted in Earth orbit and beyond, with a special focus on ESA missions such as Biopan, STONE (on Russian FOTON capsules) and the EXPOSE facilities (outside the International Space Station). The future of exposure platforms is discussed, notably how they can be improved for better science return, and how to incorporate the use of small satellites such as those built in the cubesat format.
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. Here, the statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The resulting performance improvement is presented under this more realistic assumption, along with some methods that alleviate the additional complexity.
A Simple Application of Compressed Sensing to Further Accelerate Partially Parallel Imaging
Miao, Jun; Guo, Weihong; Narayan, Sreenath; Wilson, David L.
2012-01-01
Compressed Sensing (CS) and partially parallel imaging (PPI) enable fast MR imaging by reducing the amount of k-space data required for reconstruction. Past attempts to combine the two have been limited by the incoherent sampling requirement of CS, since PPI routines typically sample on a regular (coherent) grid. Here, we developed a new method, "CS+GRAPPA," to overcome this limitation. We decomposed sets of equidistant samples into multiple random subsets, reconstructed each subset using CS, and averaged the results to get a final CS k-space reconstruction. We used both a standard CS and an edge- and joint-sparsity-guided CS reconstruction. We tested these intermediate results on both synthetic and real MR phantom data, and performed a human observer experiment to determine the effectiveness of decomposition and to optimize the number of subsets. We then used these CS reconstructions to calibrate the GRAPPA complex coil weights. In vivo parallel MR brain and heart data sets were used. An objective image quality evaluation metric, Case-PDM, was used to quantify image quality. Coherent aliasing and noise artifacts were significantly reduced using two decompositions. More decompositions further reduced coherent aliasing and noise artifacts but introduced blurring. However, the blurring was effectively minimized using our new edge- and joint-sparsity-guided CS with two decompositions. Numerical results on parallel data demonstrated that the combined method greatly improved image quality as compared to standard GRAPPA, on average halving Case-PDM scores across a range of sampling rates. The proposed technique allowed the same Case-PDM scores as standard GRAPPA using about half the number of samples. We conclude that the new method augments GRAPPA by combining it with CS, allowing CS to work even when the k-space sampling pattern is equidistant. PMID:22902065
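A minimal sketch of the decomposition-and-average idea follows, with a zero-filled inverse FFT standing in for the CS solver (the actual method uses a sparsity-regularized reconstruction); the phantom and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "fully sampled" Cartesian k-space of a smooth 256x256 object
obj = np.outer(np.hanning(256), np.hanning(256))
kspace = np.fft.fft2(obj)

sampled_lines = np.arange(0, 256, 2)     # equidistant (coherent) R=2 sampling

# decompose the regular line set into random subsets -> incoherent per subset
subsets = np.array_split(rng.permutation(sampled_lines), 2)

def recon(lines):
    # placeholder for the CS solver: zero-filled inverse FFT of one subset
    k = np.zeros_like(kspace)
    k[lines, :] = kspace[lines, :]
    return np.fft.ifft2(k)

avg = np.mean([recon(s) for s in subsets], axis=0)
# CS+GRAPPA then uses the averaged reconstruction to calibrate the GRAPPA
# complex coil weights
```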
Part Marking and Identification Materials on MISSE
NASA Technical Reports Server (NTRS)
Finckenor, Miria M.; Roxby, Donald L.
2008-01-01
Many different spacecraft materials were flown as part of the Materials on International Space Station Experiment (MISSE), including several materials used in part marking and identification. The experiment contained Data Matrix symbols applied using laser bonding, vacuum arc vapor deposition, gas-assisted laser etch, chemical etch, mechanical dot peening, laser shot peening, and laser-induced surface improvement. The effects of ultraviolet radiation on nickel acetate seal versus hot water seal on sulfuric acid anodized aluminum are discussed. These samples were exposed on the International Space Station to the low Earth orbital environment of atomic oxygen, ultraviolet radiation, thermal cycling, and hard vacuum, though atomic oxygen exposure was very limited for some samples. Results from the one-year exposure on MISSE-3 and MISSE-4 are compared to those from MISSE-1 and MISSE-2, which were exposed for four years. Part marking and identification materials on the current MISSE-6 experiment are also discussed.
NASA Astrophysics Data System (ADS)
Cao, Jin; Jiang, Zhibin; Wang, Kangzhou
2017-07-01
Many nonlinear customer satisfaction-related factors significantly influence future customer demand for service-oriented manufacturing (SOM). To address this issue and enhance prediction accuracy, this article develops a novel customer demand prediction approach for SOM. The approach combines the phase space reconstruction (PSR) technique with an optimized least squares support vector machine (LSSVM). First, the prediction sample space is reconstructed by the PSR to enrich the time-series dynamics of the limited data sample. Then, the generalization and learning ability of the LSSVM are improved by a hybrid polynomial and radial basis function kernel. Finally, the key parameters of the LSSVM are optimized by the particle swarm optimization algorithm. In a real case study, customer demand prediction for an air conditioner compressor is implemented. Furthermore, the effectiveness and validity of the proposed approach are demonstrated by comparison with other classical prediction approaches.
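A minimal sketch of the PSR (time-delay embedding) step is given below, with a toy series and assumed embedding dimension and delay; the LSSVM regression itself is not shown.

```python
import numpy as np

def phase_space_reconstruct(x, dim=3, tau=2):
    """Takens delay embedding: row t is (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}),
    turning a scalar demand series into multivariate regression inputs."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

rng = np.random.default_rng(0)
demand = np.sin(np.linspace(0, 20, 120)) + 0.1 * rng.standard_normal(120)

dim, tau = 3, 2
X = phase_space_reconstruct(demand[:-1], dim, tau)   # inputs for the LSSVM
y = demand[(dim - 1) * tau + 1:]                     # one-step-ahead targets
assert len(X) == len(y)
```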
NASA Technical Reports Server (NTRS)
Wallace, William T.; Limero, Thomas F.; Gazda, Daniel B.; Macatangay, Ariel V.; Dwivedi, Prabha; Fernandez, Facundo M.
2014-01-01
In the history of manned spaceflight, environmental monitoring has relied heavily on archival sampling. For short missions, this type of sample collection was sufficient; returned samples provided a snapshot of the presence of chemical and biological contaminants in the spacecraft air and water. However, with the construction of the International Space Station (ISS) and the subsequent extension of mission durations, soon to be up to one year, the need for enhanced, real-time environmental monitoring became more pressing. The past several years have seen the implementation of several real-time monitors aboard the ISS, complemented by reduced archival sampling. The station air is currently monitored for volatile organic compounds (VOCs) using gas chromatography-differential mobility spectrometry (the Air Quality Monitor [AQM]). The water on ISS is analyzed to measure total organic carbon and biocide concentrations using the Total Organic Carbon Analyzer (TOCA) and the Colorimetric Water Quality Monitoring Kit (CWQMK), respectively. The current air and water monitors provide important data, but the number and size of the different instruments make them impractical for future exploration missions. It is apparent that there is still a need for improvements in environmental monitoring capabilities. One such improvement could be realized by modifying a single instrument to analyze both air and water. As the AQM currently provides quantitative, compound-specific information for target compounds present in air samples, and many of these compounds are also targets for water quality monitoring, this instrument provides a logical starting point for evaluating the feasibility of this approach. In this presentation, we discuss our recent studies aimed at determining an appropriate method for introducing VOCs from water samples into the gas phase, as well as our current work, in which an electro-thermal vaporization unit has been interfaced with the AQM to analyze target analytes at the concentrations at which they are routinely detected in archival water samples from the ISS.
Fast and Adaptive Lossless On-Board Hyperspectral Data Compression System for Space Applications
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh; Bakhshi, Alireza; Keymeulen, Didier; Klimesh, Matthew
2009-01-01
Efficient on-board lossless hyperspectral data compression reduces the data volume necessary to meet NASA and DoD limited downlink capabilities. The technique also improves signature extraction, object recognition and feature classification capabilities by providing exact reconstructed data over constrained downlink resources. At JPL, a novel, adaptive and predictive technique for lossless compression of hyperspectral data was recently developed. This technique uses an adaptive filtering method and achieves a combination of low complexity and compression effectiveness that far exceeds state-of-the-art techniques currently in use. The JPL-developed 'Fast Lossless' algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. It is of low computational complexity and thus well-suited for implementation in hardware, which makes it practical for flight implementations of pushbroom instruments. A prototype of the compressor (and decompressor) of the algorithm is available in software, but this implementation may not meet the speed and real-time requirements of some space applications. Hardware acceleration provides performance improvements of 10x-100x vs. the software implementation (about 1M samples/sec on a Pentium IV machine). This paper describes a hardware implementation of the JPL-developed 'Fast Lossless' compression algorithm on a Field Programmable Gate Array (FPGA). The FPGA implementation targets current state-of-the-art FPGAs (Xilinx Virtex IV and V families) and compresses one sample every clock cycle to provide a fast and practical real-time solution for space applications.
NASA Astrophysics Data System (ADS)
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2017-07-01
This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform; however, two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that, relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve detection completeness by up to 10 percentage points while maintaining the correctness rate.
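The geometry by which each point pair generates a vote is shape-specific and omitted here; the sketch below illustrates only the continuous (discretization-free) voting stage, using synthetic votes in a reduced two-parameter space and the automatic bandwidth selection of a standard KDE.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Synthetic votes in a reduced 2D parameter space (e.g. axis azimuth and
# elevation); in the real pipeline each nearby point pair contributes one
# vote derived from cylinder geometry. 400 votes cluster at the true axis.
votes = np.vstack([
    rng.normal([0.8, 0.3], 0.05, size=(400, 2)),          # true cylinder
    np.column_stack([rng.uniform(0, np.pi, 200),           # clutter votes
                     rng.uniform(0, np.pi / 2, 200)]),
])

kde = gaussian_kde(votes.T)            # automatic (Scott's rule) bandwidth
az, el = np.mgrid[0:np.pi:200j, 0:np.pi / 2:100j]
dens = kde(np.vstack([az.ravel(), el.ravel()]))
best = az.ravel()[dens.argmax()], el.ravel()[dens.argmax()]
print("detected axis parameters ~", best)   # near (0.8, 0.3)
```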
Reconstruction of reflectance data using an interpolation technique.
Abed, Farhad Moghareh; Amirshahi, Seyed Hossein; Abed, Mohammad Reza Moghareh
2009-03-01
A linear interpolation method is applied for the reconstruction of reflectance spectra of Munsell as well as ColorChecker SG color chips from the corresponding colorimetric values under a given set of viewing conditions. Different types of lookup tables (LUTs) have been created to connect the colorimetric and spectrophotometric data as the source and destination spaces in this approach. To optimize the algorithm, different color spaces and light sources have been used to build different types of LUTs. The effects of the applied color datasets as well as the employed color spaces are investigated. Results of recovery are evaluated by the mean and maximum color difference values under other sets of standard light sources. The mean and maximum values of the root mean square (RMS) error between the reconstructed and the actual spectra are also calculated. Since the speed of reflectance reconstruction is a key point in the LUT algorithm, the processing time spent on interpolation of the spectral data has also been measured for each model. Finally, the performance of the suggested interpolation technique is compared with that of the common principal component analysis (PCA) method. According to the results, using the CIEXYZ tristimulus values as the source space shows priority over the CIELAB color space. Besides, the colorimetric position of a desired sample is a key factor in the success of the approach. In fact, because of the nature of the interpolation technique, the colorimetric position of the desired samples should be located inside the color gamut of the available samples in the dataset. The spectra reconstructed by this technique show considerable improvement in terms of RMS error between the actual and reconstructed reflectance spectra, as well as CIELAB color differences under the other light sources, in comparison with those obtained from the standard PCA technique.
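A minimal sketch of such a LUT, assuming piecewise-linear interpolation over a Delaunay triangulation of the training chips; the random stand-in data and interpolator choice are illustrative, and the NaN behavior mirrors the gamut caveat noted above.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)

# Stand-ins for the dataset: tristimulus values (source space, CIEXYZ) paired
# with 31-band reflectance spectra (400-700 nm at 10 nm steps), here random.
xyz_train = rng.uniform(0.1, 0.9, size=(300, 3))
spectra_train = rng.uniform(0.0, 1.0, size=(300, 31))

# the LUT: piecewise-linear interpolation from colorimetric to spectral space
lut = LinearNDInterpolator(xyz_train, spectra_train)

xyz_query = np.array([[0.4, 0.5, 0.3]])
r_hat = lut(xyz_query)          # (1, 31) spectrum; NaN if the query falls
                                # outside the convex hull (gamut) of the chips
print(np.isnan(r_hat).any())
```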
Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods
NASA Technical Reports Server (NTRS)
Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark
2002-01-01
Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.
NASA Technical Reports Server (NTRS)
1976-01-01
Twelve aerothermodynamic space technology needs were identified to reduce the design uncertainties in aerodynamic heating and forces experienced by heavy lift launch vehicles, orbit transfer vehicles, and advanced single stage to orbit vehicles for the space transportation system, and for probes, planetary surface landers, and sample return vehicles for solar system exploration vehicles. Research and technology needs identified include: (1) increasing the fluid dynamics capability by at least two orders of magnitude by developing an advanced computer processor for the solution of fluid dynamic problems with improved software; (2) predicting multi-engine base flow fields for launch vehicles; and (3) developing methods to conserve energy in aerothermodynamic ground test facilities.
Glover, Gary H.
2011-01-01
T2*-weighted Blood Oxygen Level Dependent (BOLD) functional magnetic resonance imaging (fMRI) requires efficient acquisition methods in order to fully sample the brain in a several-second time period. The most widely used approach is Echo Planar Imaging (EPI), which utilizes a Cartesian trajectory to cover k-space. This trajectory is subject to ghosts from off-resonance and gradient imperfections and is intrinsically sensitive to cardiac-induced pulsatile motion from substantial first- and higher-order moments of the gradient waveform near the k-space origin. In addition, only the readout-direction gradient contributes significant energy to the trajectory. By contrast, the Spiral method samples k-space with an Archimedean or similar trajectory that begins at the k-space center and spirals to the edge (Spiral-out), or its reverse, ending at the origin (Spiral-in). Spiral methods have reduced sensitivity to motion, shorter readout times, improved signal recovery in most frontal and parietal brain regions, and exhibit blurring artifacts instead of ghosts or geometric distortion. Methods combining Spiral-in and Spiral-out trajectories have further advantages in terms of diminished susceptibility-induced signal dropout and increased BOLD signal. In measurements of temporal signal-to-noise ratio in 8 subjects, Spiral-in/out exhibited significant increases over EPI in voxel volumes recovered in frontal and whole-brain regions (18% and 10%, respectively). PMID:22036995
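A small sketch of an Archimedean spiral-out trajectory and its Spiral-in and combined in/out variants follows; the sample count and turn count are illustrative, and gradient-hardware constraints are ignored.

```python
import numpy as np

def spiral_out(n_samples=2048, n_turns=32, k_max=1.0):
    """Archimedean spiral: radius grows linearly with angle, starting at the
    k-space origin and ending at the edge."""
    t = np.linspace(0.0, 1.0, n_samples)
    theta = 2 * np.pi * n_turns * t
    return k_max * t * np.exp(1j * theta)     # kx + i*ky as a complex number

k_out = spiral_out()
k_in = k_out[::-1]                            # Spiral-in: reversed, ends at origin
k_inout = np.concatenate([k_in, k_out])       # combined Spiral-in/out readout
```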
Häyrynen, Teppo; Osterkryger, Andreas Dyhl; de Lasson, Jakob Rosenkrantz; Gregersen, Niels
2017-09-01
Recently, an open geometry Fourier modal method based on a new combination of an open boundary condition and a non-uniform k-space discretization was introduced for rotationally symmetric structures, providing a more efficient approach for modeling nanowires and micropillar cavities [J. Opt. Soc. Am. A 33, 1298 (2016), doi:10.1364/JOSAA.33.001298]. Here, we generalize the approach to three-dimensional (3D) Cartesian coordinates, allowing for the modeling of rectangular geometries in open space. The open boundary condition is a consequence of having an infinite computational domain described using basis functions that expand the whole space. The strength of the method lies in discretizing the Fourier integrals using a non-uniform circular "dartboard" sampling of the Fourier k space. We show that our sampling technique leads to a more accurate description of the continuum of the radiation modes that leak out from the structure. We also compare our approach to conventional discretization with direct and inverse factorization rules commonly used in established Fourier modal methods. We apply our method to a variety of optical waveguide structures and demonstrate that it leads to significantly improved convergence, enabling more accurate and efficient modeling of open 3D nanophotonic structures.
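To make the "dartboard" idea concrete, here is a hypothetical generator of non-uniform circular k-space sample points; the quadratic radial spacing rule is an assumption for illustration and is not the distribution used in the paper.

```python
import numpy as np

def dartboard_samples(n_rings=20, pts_per_ring=32, k_max=10.0):
    """Concentric-ring ('dartboard') sampling of 2D k-space. The quadratic
    ring spacing (assumed here) concentrates samples at small k, where the
    radiation-mode continuum varies fastest."""
    radii = k_max * np.linspace(0.0, 1.0, n_rings + 1)[1:] ** 2
    pts = [(0.0, 0.0)]                            # one sample at the origin
    for r in radii:
        ang = np.linspace(0.0, 2 * np.pi, pts_per_ring, endpoint=False)
        pts.extend(zip(r * np.cos(ang), r * np.sin(ang)))
    return np.asarray(pts)

kxy = dartboard_samples()
print(kxy.shape)        # (1 + 20*32, 2) sample coordinates
```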
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Y; Mutic, S; Du, D
Purpose: To evaluate the feasibility of using the weighted hybrid iterative spiral k-space encoded estimation (WHISKEE) technique to improve the spatial resolution of tracking images for onboard MR image-guided radiation therapy (MR-IGRT). Methods: MR tracking images of the abdomen and pelvis had been acquired from healthy volunteers using the ViewRay onboard MR-IGRT system (ViewRay Inc., Oakwood Village, OH) at a spatial resolution of 2.0 mm × 2.0 mm × 5.0 mm. The tracking MR images were acquired using the TrueFISP sequence. The temporal resolution had to be traded off to 2 frames per second (FPS) to achieve the 2.0 mm in-plane spatial resolution. All MR images were imported into MATLAB. K-space data were synthesized through the Fourier transform of the MR images. A mask was created to select the k-space points that corresponded to the undersampled spiral k-space trajectory with an acceleration (or undersampling) factor of 3. The mask was applied to the fully sampled k-space data to synthesize the undersampled k-space data. The WHISKEE method was applied to the synthesized undersampled k-space data to reconstruct tracking MR images at 6 FPS. As a comparison, the undersampled k-space data were also reconstructed using the zero-padding technique. The reconstructed images were compared to the original image. The relative reconstruction error was evaluated as the percentage of the norm of the difference image over the norm of the original image. Results: Compared to the zero-padding technique, the WHISKEE method was able to reconstruct MR images with better image quality. It significantly reduced the relative reconstruction error from 39.5% to 3.1% for the pelvis image and from 41.5% to 4.6% for the abdomen image at an acceleration factor of 3. Conclusion: We demonstrated that it is possible to use the WHISKEE method to expedite MR image acquisition for onboard MR-IGRT systems and achieve good spatial and temporal resolutions simultaneously. Y. Hu and O. Green receive travel reimbursement from ViewRay. S. Mutic has consulting and research agreements with ViewRay. Q. Zeng, R. Nana, J.L. Patrick, S. Shvartsman and J.F. Dempsey are ViewRay employees.
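A minimal sketch of the evaluation pipeline described above: synthesize k-space from an image, apply an undersampling mask, reconstruct by zero-filling, and compute the relative error metric. The WHISKEE reconstruction itself is not reproduced; a variable-density random mask stands in for the actual spiral trajectory, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

img = np.outer(np.hanning(256), np.hanning(256))    # stand-in anatomy image
kspace = np.fft.fft2(img)                           # synthesized k-space

# stand-in for the undersampled spiral trajectory: a variable-density random
# mask keeping a minority of k-space, denser at low spatial frequencies
ky, kx = np.meshgrid(np.fft.fftfreq(256), np.fft.fftfreq(256), indexing="ij")
prob = 0.15 + 0.85 * np.exp(-60.0 * (kx**2 + ky**2))
mask = rng.random((256, 256)) < prob

recon_zf = np.abs(np.fft.ifft2(np.where(mask, kspace, 0)))  # zero-padding recon

# relative reconstruction error as defined in the abstract
rel_err = np.linalg.norm(recon_zf - img) / np.linalg.norm(img)
print(f"relative reconstruction error: {100 * rel_err:.1f}%")
```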
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Mazars, Christian; Brière, Christian; Grat, Sabine; Pichereaux, Carole; Rossignol, Michel; Pereda-Loth, Veronica; Eche, Brigitte; Boucheron-Dubuisson, Elodie; Le Disquet, Isabel; Medina, Francisco-Javier; Graziana, Annick; Carnero-Diaz, Eugénie
2014-01-01
Growing plants in space for use in bioregenerative life support systems during long-term human spaceflights requires improving our knowledge of how plants adapt to space growth conditions. In a previous study performed on board the International Space Station (GENARA A experiment, STS-132), we evaluated the global changes that microgravity can exert on the membrane proteome of Arabidopsis seedlings. Here we report additional data from this space experiment, taking advantage of the availability in the EMCS of a centrifuge to evaluate the effects of cues other than microgravity on the relative distribution of membrane proteins. Among the 1484 membrane proteins quantified, 227 proteins displayed no abundance differences between µg and 1 g in space, while their abundances differed significantly between 1 g in space and 1 g on ground. A majority of these proteins (176) were over-represented in space samples and mainly belong to families involved in protein synthesis, degradation, transport and lipid metabolism, or are ribosomal proteins. In the remaining set of 51 proteins that were under-represented in membranes, aquaporins and chloroplastic proteins are in the majority. These sets of proteins clearly appear as indicators of plant physiological processes affected in space by stressful factors other than microgravity.
Characteristics and Trade-Offs of Doppler Lidar Global Wind Profiling
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Emmitt, G David
2004-01-01
Accurate, global profiling of wind velocity is highly desired by NASA, NOAA, the DOD/DOC/NASA Integrated Program Office (IPO)/NPOESS, DOD, and others for many applications such as validation and improvement of climate models, and improved weather prediction. The most promising technology to deliver this measurement from space is Doppler Wind Lidar (DWL). The NASA/NOAA Global Tropospheric Wind Sounder (GTWS) program is currently in the process of generating the science requirements for a space-based sensor. In order to optimize the process of defining science requirements, it is important for the scientific and user community to understand the nature of the wind measurements that DWL can make. These measurements are very different from those made by passive imaging sensors or by active radar sensors. The purpose of this paper is to convey the sampling characteristics and data product trade-offs of an orbiting DWL.
NASA Technical Reports Server (NTRS)
Hinton, David A.
2001-01-01
A ground-based system has been developed to demonstrate the feasibility of automating the process of collecting relevant weather data, predicting wake vortex behavior from a data base of aircraft, prescribing safe wake vortex spacing criteria, estimating system benefit, and comparing predicted and observed wake vortex behavior. This report describes many of the system algorithms, features, limitations, and lessons learned, as well as suggested system improvements. The system has demonstrated concept feasibility and the potential for airport benefit. Significant opportunities exist however for improved system robustness and optimization. A condensed version of the development lab book is provided along with samples of key input and output file types. This report is intended to document the technical development process and system architecture, and to augment archived internal documents that provide detailed descriptions of software and file formats.
JAXA protein crystallization in space: ongoing improvements for growing high-quality crystals
Takahashi, Sachiko; Ohta, Kazunori; Furubayashi, Naoki; Yan, Bin; Koga, Misako; Wada, Yoshio; Yamada, Mitsugu; Inaka, Koji; Tanaka, Hiroaki; Miyoshi, Hiroshi; Kobayashi, Tomoyuki; Kamigaichi, Shigeki
2013-01-01
The Japan Aerospace Exploration Agency (JAXA) started a high-quality protein crystal growth project, now called JAXA PCG, on the International Space Station (ISS) in 2002. Using the counter-diffusion technique, 14 sessions of experiments have been performed as of 2012 with 580 proteins crystallized in total. Over the course of these experiments, a user-friendly interface framework for high accessibility has been constructed and crystallization techniques improved; devices to maximize the use of the microgravity environment have been designed, resulting in some high-resolution crystal growth. If crystallization conditions were carefully fixed in ground-based experiments, high-quality protein crystals grew in microgravity in many experiments on the ISS, especially when a highly homogeneous protein sample and a viscous crystallization solution were employed. In this article, the current status of JAXA PCG is discussed, and a rational approach to high-quality protein crystal growth in microgravity based on numerical analyses is explained. PMID:24121350
Creating targeted initial populations for genetic product searches in heterogeneous markets
NASA Astrophysics Data System (ADS)
Foster, Garrett; Turner, Callaway; Ferguson, Scott; Donndelinger, Joseph
2014-12-01
Genetic searches often use randomly generated initial populations to maximize diversity and enable a thorough sampling of the design space. While many of these initial configurations perform poorly, the trade-off between population diversity and solution quality is typically acceptable for small-scale problems. Navigating complex design spaces, however, often requires computationally intelligent approaches that improve solution quality. This article draws on research advances in market-based product design and heuristic optimization to strategically construct 'targeted' initial populations. Targeted initial designs are created using respondent-level part-worths estimated from discrete choice models. These designs are then integrated into a traditional genetic search. Two case study problems of differing complexity are presented to illustrate the benefits of this approach. In both problems, targeted populations lead to computational savings and product configurations with improved market share of preferences. Future research efforts to tailor this approach and extend it towards multiple objectives are also discussed.
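A minimal sketch of seeding a genetic search with "targeted" designs scored by respondent-level part-worths follows; the additive utility model, the zero-utility outside option, and all sizes are illustrative assumptions rather than the article's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_resp, n_attrs, n_levels = 300, 5, 4

# respondent-level part-worths from a discrete choice model (random stand-ins)
partworths = rng.normal(size=(n_resp, n_attrs, n_levels))

def share_of_preference(design):
    """Fraction of respondents preferring `design` to a zero-utility outside
    option (a simplification of the market-share objective)."""
    u = partworths[:, np.arange(n_attrs), design].sum(axis=1)
    return (u > 0).mean()

# score random candidate designs and keep the best as the initial population
candidates = rng.integers(0, n_levels, size=(2000, n_attrs))
scores = np.array([share_of_preference(d) for d in candidates])
init_pop = candidates[np.argsort(scores)[-50:]]   # targeted initial GA population
print(f"best seeded share: {scores.max():.2f}")
```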
Compositional Solution Space Quantification for Probabilistic Software Analysis
NASA Technical Reports Server (NTRS)
Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem
2014-01-01
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
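An illustrative toy version of the idea: interval constraint propagation shrinks the domain to a box known to contain all solutions, and Monte Carlo sampling is then focused on that box and rescaled. The constraint and box below are assumptions chosen so the exact answer is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target event: x**2 + y**2 <= 0.25 on the input domain [-1, 1]^2.
# Interval propagation would shrink the search to the box [-0.5, 0.5]^2;
# sampling is focused there and rescaled by the box's volume fraction.
lo, hi = -0.5, 0.5
box_frac = (hi - lo) ** 2 / 4.0               # |box| / |domain| = 0.25
xy = rng.uniform(lo, hi, size=(100_000, 2))   # samples only inside the box
hit = (xy ** 2).sum(axis=1) <= 0.25
p = box_frac * hit.mean()                     # estimate of P(event) on the domain

print(p)   # exact value: pi * 0.5**2 / 4 = 0.19635...; focusing the samples
           # on the propagated box wastes none of them on empty regions
```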
Hazardous Gas Leak Analysis in the Space Shuttle
NASA Technical Reports Server (NTRS)
Barile, Ronald G.
1991-01-01
Helium tests of the Space Shuttle main propulsion system and data on hydrogen leaks are examined. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes that sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially in its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine whether the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were seen visually with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per second by spectral analysis of the MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.
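A small sketch of the kind of spectral analysis described, detecting a 25 Hz concentration wave in a synthetic MS time series; the sampling rate and amplitudes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                                   # assumed MS sampling rate [Hz]
t = np.arange(0.0, 10.0, 1.0 / fs)
conc = 1.0 + 0.05 * np.sin(2 * np.pi * 25.0 * t) \
           + 0.02 * rng.standard_normal(t.size)   # 25 Hz pulsing + noise

spec = np.abs(np.fft.rfft(conc - conc.mean()))    # remove DC, take spectrum
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
print(f"dominant component: {freqs[spec.argmax()]:.1f} Hz")   # ~25 Hz
```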
Laboratory Experiments on Bentonite Samples: FY16 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinnacher, Ruth M.; Tournassat, Christophe; Davis, James A.
2016-08-22
The primary goal of this study is to improve the understanding of U(VI) sorption and diffusion behavior in sodium-montmorillonite in order to support the development of realistic conceptual models describing these processes in performance assessment models while (1) accounting for potential changes in system conditions over time and space, (2) avoiding overly conservative transport predictions, and (3) using a minimum number of fitting parameters.
Sample selection via angular distance in the space of the arguments of an artificial neural network
NASA Astrophysics Data System (ADS)
Fernández Jaramillo, J. M.; Mayerle, R.
2018-05-01
In the construction of an artificial neural network (ANN), a proper split of the available samples plays a major role in the training process. This selection of subsets for training, testing and validation affects the generalization ability of the neural network. The number of samples also has an impact on the time required for the design of the ANN and for training. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients while keeping diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle between two samples is smaller than a defined threshold, only one of them is accepted for training. The accepted inputs are thus scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the outputs are not accurate and depend on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are greatly reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
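A minimal sketch of the angular-distance filter, under an assumed threshold; the dependence of the result on the first accepted sample, noted above, is visible in the scan order.

```python
import numpy as np

def select_by_angle(X, threshold_deg=5.0):
    """Keep a sample only if its angle to every already-accepted input vector
    exceeds the threshold; near-parallel (similar) inputs are discarded.
    The result depends on which sample is scanned first."""
    thr = np.cos(np.radians(threshold_deg))      # accept if cos(angle) < thr
    U = X / np.linalg.norm(X, axis=1, keepdims=True)
    kept = [0]
    for i in range(1, len(U)):
        if all(U[i] @ U[j] < thr for j in kept):
            kept.append(i)
    return kept

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 24))     # e.g. 24-hour windows of tidal levels
train_idx = select_by_angle(X, threshold_deg=5.0)
print(f"kept {len(train_idx)} of {len(X)} samples")
```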
An Advanced Actuator Line Method for Wind Energy Applications and Beyond
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, Matthew J.; Schreck, Scott; Martinez-Tossas, Luis A.
The actuator line method to represent rotor aerodynamics within computational fluid dynamics has been in use for over a decade. This method applies a body force to the flow field along rotating lines corresponding to the individual rotor blades and employs tabular airfoil data to compute the force distribution. The actuator line method is attractive because, compared to blade-resolved simulations, the required mesh is much simpler and the computational cost is lower. This work proposes a higher-fidelity variant of the actuator line method meant to fill the space between current actuator line and blade-resolved simulations. It contains modifications in two key areas. The first is that of freestream velocity vector estimation along the line, which is necessary to compute the lift and drag along the line using tabular airfoil data. Most current methods rely on point sampling, in which the location of sampling is ambiguous. Here we test a velocity sampling method that uses a properly weighted integral over space, removing this ambiguity. The second area of improvement is the function used to project the one-dimensional actuator line force onto the three-dimensional fluid mesh as a body force. We propose and test a projection function that spreads the force over a region that looks something like a real blade, with the hope that it will produce the blade-local and near-wake flow features with more accuracy and higher fidelity. Our goal is that, between these two improvements, not only will the flow field predictions be enhanced, but the spanwise loading will also be made more accurate. We refer to this combination of improvements as the advanced actuator line method. We apply these improvements to two different wind turbine cases. Although there is a strong wind energy motivation in our work, there is no reason these advanced actuator line ideas cannot be used in other applications, such as helicopter rotors.
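For context, a minimal sketch of the classical isotropic Gaussian projection that the advanced method generalizes; the blade-shaped kernel proposed in the work is not reproduced here, and the mesh and forces are toy stand-ins.

```python
import numpy as np

def project_line_force(cells, pts, forces, eps=2.0):
    """Spread point forces along an actuator line onto 3D cell centres as a
    body-force density using the standard normalized Gaussian kernel
    eta(r) = exp(-r^2/eps^2) / (eps^3 * pi^(3/2))."""
    body = np.zeros_like(cells)
    for xp, fp in zip(pts, forces):
        r2 = ((cells - xp) ** 2).sum(axis=1)
        eta = np.exp(-r2 / eps**2) / (eps**3 * np.pi**1.5)
        body += eta[:, None] * fp
    return body

# toy example: 10 actuator points along a blade, uniform unit force in x
cells = np.random.default_rng(0).uniform(-10, 10, size=(5000, 3))  # cell centres
pts = np.column_stack([np.zeros(10), np.linspace(0, 9, 10), np.zeros(10)])
forces = np.tile([1.0, 0.0, 0.0], (10, 1))
body_force = project_line_force(cells, pts, forces, eps=2.0)
```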
NASA Astrophysics Data System (ADS)
Hamza, Karim; Shalaby, Mohamed
2014-09-01
This article presents a framework for simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs at regions of the search space where there is high expectancy of improvement. This article attempts to address one of the limitations of EGO, namely that the generation of infill samples can become a difficult optimization problem in its own right, as well as to allow the generation of multiple samples at a time in order to take advantage of parallel computing in the evaluation of the new samples. The proposed approach is tested on analytical functions and then applied to the vehicle crashworthiness design of a full Geo Metro model undergoing frontal crash conditions.
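For reference, a minimal sketch of the expected-improvement criterion that standard EGO maximizes to place infill samples (minimization convention); the article's multi-sample infill strategy is not shown.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best):
    """EI at candidate points, from the kriging mean `mu` and standard
    deviation `sigma`, for minimization with current best `y_best`."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero variance
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# candidates with high EI combine a promising mean and high uncertainty
mu = np.array([1.0, 0.8, 1.5])
sigma = np.array([0.1, 0.4, 0.9])
print(expected_improvement(mu, sigma, y_best=1.0))
```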
NASA Astrophysics Data System (ADS)
Peng, D. J.; Wu, B.
2012-01-01
With the availability of precise GPS ephemerides and clock solutions, the ionospheric range delay is left as the dominant error source in the post-processing of space-borne GPS data from single-frequency receivers. Thus, the removal of ionospheric effects is a major prerequisite for an improved orbit reconstruction of LEO satellites equipped with low-cost single-frequency GPS receivers. In this paper, the use of Global Ionospheric Maps (GIM) in kinematic and dynamic orbit determination for LEO satellites with single-frequency GPS measurements is discussed first; then, estimating a scale factor of the ionosphere to remove the ionospheric effects in C/A code pseudo-range measurements in both the kinematic and dynamic orbit determination approaches is addressed. Since the ionospheric path delay of space-borne GPS signals is strongly dependent on the orbit altitude of LEO satellites, we selected real space-borne GPS data from the CHAMP, GRACE, TerraSAR-X and SAC-C satellites, with altitudes between 300 km and 800 km, as sample data. It is demonstrated that the approach of eliminating ionospheric effects in space-borne C/A code pseudo-ranges by estimating a scale factor of the ionosphere is highly effective. Employing this approach, the accuracy of both kinematic and dynamic orbits can be improved notably. Among those five LEO satellites, CHAMP, with the lowest orbit altitude, shows the most remarkable orbit accuracy improvements, 55.6% and 47.6% for the kinematic and dynamic approaches, respectively. SAC-C, with the highest orbit altitude, shows the smallest orbit accuracy improvements accordingly, 47.8% and 38.2%, respectively.
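A hypothetical sketch of the scale-factor estimation: a single factor scaling the GIM-derived slant delays is fitted to the pseudo-range residuals by least squares. The linear model, variable names, and synthetic numbers are assumptions for illustration, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic data: GIM-derived slant ionospheric delays [m] and C/A residuals
# containing a scaled ionospheric part, a common bias, and noise
I_gim = rng.uniform(2.0, 15.0, size=500)
res = 1.3 * I_gim + 0.7 + 0.3 * rng.standard_normal(500)

# least-squares estimate of the ionospheric scale factor; the bias term
# absorbs receiver clock and other common offsets
A = np.column_stack([I_gim, np.ones_like(I_gim)])
alpha, bias = np.linalg.lstsq(A, res, rcond=None)[0]

corrected = res - alpha * I_gim        # residuals after ionosphere removal
print(f"alpha = {alpha:.3f}")          # ~1.3 recovered
```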
NASA Space Technology Draft Roadmap Area 13: Ground and Launch Systems Processing
NASA Technical Reports Server (NTRS)
Clements, Greg
2011-01-01
This slide presentation reviews the technology development roadmap for the area of ground and launch systems processing. The scope of this technology area includes: (1) assembly, integration, and processing of the launch vehicle, spacecraft, and payload hardware; (2) supply chain management; (3) transportation of hardware to the launch site; (4) transportation to and operations at the launch pad; (5) launch processing infrastructure and its ability to support future operations; (6) range, personnel, and facility safety capabilities; (7) launch and landing weather; (8) environmental impact mitigations for ground and launch operations; (9) launch control center operations and infrastructure; (10) mission integration and planning; (11) mission training for both ground and flight crew personnel; (12) mission control center operations and infrastructure; (13) telemetry and command processing and archiving; and (14) recovery operations for flight crews, flight hardware, and returned samples. The roadmap also identifies ground, launch and mission technologies that will: (1) dramatically transform future space operations, with significant improvement in life-cycle costs; (2) improve the quality of life on Earth, while exploring in co-existence with the environment; (3) increase reliability and mission availability using low/zero-maintenance materials and systems, comprehensive capabilities to ascertain and forecast system health/configuration, data integration, and the use of advanced/expert software systems; and (4) enhance methods to assess safety and mission risk posture, which would allow for timely and better decision making. Several key technologies are identified, with a couple of slides devoted to one of them (corrosion detection and prevention). Development of these technologies can enhance life on Earth and have a major impact on how we access space, eventually making commercial space access routine; process improvements in areas such as building, manufacturing, and weather forecasting would carry over to our daily lives.
Calculating Free Energies Using Scaled-Force Molecular Dynamics Algorithm
NASA Technical Reports Server (NTRS)
Darve, Eric; Wilson, Micahel A.; Pohorille, Andrew
2000-01-01
One common objective of molecular simulations in chemistry and biology is to calculate the free energy difference between different states of the system of interest. Examples of problems with such an objective are calculations of receptor-ligand or protein-drug interactions, associations of molecules in response to hydrophobic and electrostatic interactions, or the partitioning of molecules between immiscible liquids. Another common objective is to describe the evolution of the system towards a low-energy (possibly the global minimum energy), 'native' state. Perhaps the best example of such a problem is the folding of proteins or short RNA molecules. Both types of problems share the same difficulty. Often, different states of the system are separated by high energy barriers, which implies that transitions between these states are rare events. This, in turn, can greatly impede exploration of phase space. In some instances this can lead to 'quasi non-ergodicity', whereby a part of phase space is inaccessible on the timescales of the simulation. A host of strategies has been developed to improve the efficiency of sampling the phase space. For example, some Monte Carlo techniques involve large steps which move the system between low-energy regions in phase space without the need for sampling the configurations corresponding to energy barriers (J-walking). Most strategies, however, rely on modifying the probabilities of sampling low- and high-energy regions in phase space such that transitions between states of interest are encouraged. Perhaps the simplest implementation of this strategy is to increase the temperature of the system. This approach was successfully used to identify denaturation pathways in several proteins, but it is clearly not applicable to protein folding. It is also not a successful method for determining free energy differences. Finally, the approach is likely to fail for systems with co-existing phases, such as water-membrane systems, because it may lead to spontaneous mixing. A similar difficulty may be encountered in any method relying on global modifications of phase space.
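To make the barrier-crossing argument concrete, a toy Metropolis sampler on a one-dimensional double-well potential shows how raising the temperature multiplies the rate of transitions between the two states; the potential and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
U = lambda x: 5.0 * (x**2 - 1.0) ** 2      # double well: minima at +/-1, barrier 5

def count_crossings(kT, n_steps=100_000, step=0.3):
    """Count transitions between the two wells during a Metropolis walk."""
    x, crossings, side = -1.0, 0, -1
    for _ in range(n_steps):
        xp = x + step * rng.standard_normal()
        if rng.random() < np.exp(-(U(xp) - U(x)) / kT):   # Metropolis criterion
            x = xp
        new_side = 1 if x > 0 else -1
        crossings += new_side != side
        side = new_side
    return crossings

# far more crossings at high temperature, but the sampled ensemble is distorted,
# which is why plain heating does not yield free energy differences
print(count_crossings(kT=0.5), count_crossings(kT=2.0))
```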
NASA Astrophysics Data System (ADS)
Neuland, Maike Brigitte; Allenbach, Marc; Föhn, Martina; Wurz, Peter
2017-04-01
The detection of energetic neutral atoms is a substantial requirement on every space mission mapping particle populations of a planetary magnetosphere or plasma of the interstellar medium. For imaging, neutrals first have to be ionised. Regarding the constraints of weight, volume and power consumption, the technique of surface ionisation complies with all specifications of a space mission. In particular, low energy neutral atoms, which cannot be ionised by passing through a foil, are ionised by scattering on a charge state conversion surface [1]. For more than 30 years, intense research has been devoted to finding and optimising suitable materials for use as charge state conversion surfaces for space application. Crucial parameters are the ionisation efficiency of the surface material and the scattering properties. Regarding these parameters, diamond-like carbon has proven advantageous: while efficiently ionising incoming neutral atoms, diamond stands out for its durability and chemical inertness [2]. In the IBEX-Lo sensor, a diamond-like carbon surface is used for ionisation of neutral atoms. Building on the successes of the IBEX mission [3], the follow-up mission IMAP (Interstellar MApping Probe) will further explore the boundaries of the heliosphere. The IMAP mission is planned to map neutral atoms over a larger energy range and with distinctly better angular resolution and sensitivity than IBEX [4]. The performance targeted for the IMAP sensors also calls for charge state conversion surfaces with improved characteristics. We investigated samples of diamond-like carbon, manufactured by the chemical vapour deposition (CVD) method, regarding their ionisation efficiency, scattering and reflection properties. Experiments were carried out at the ILENA facility at the University of Bern [5] with hydrogen and oxygen atoms, which are the species of main interest in magnetospheric research [1]. We compare the results of earlier investigations of a metallised CVD sample [6] to our latest measurements of a boron-doped CVD diamond sample. We additionally measured the B concentration in the sample to verify our predictions of the B concentration needed to reach sufficient conductivity to keep the sample from charging electrostatically during instrument operation. The results of narrower scattering cones and higher ionisation efficiency show that diamond-like carbon remains the preferred material for charge state conversion surfaces, and that new surface technologies offer improved diamond conversion surfaces with different properties and hence the possibility of improving the performance of neutral atom imaging instruments. References: [1] P. Wurz, Detection of Energetic Neutral Atoms, in The Outer Heliosphere: Beyond the Planets, Copernicus Gesellschaft e.V., Katlenburg-Lindau, Germany, 2000, p. 251-288. [2] P. Wurz, R. Schletti, M.R. Aellig, Surf. Sci. 373 (1997), 56-66. [3] D.J. McComas et al., Geophys. Res. Lett. 38 (2011), L18101. [4] N.A. Schwadron et al., J. Phys.: Conf. Series 767 (2016): 012025. [5] P. Wahlström, J.A. Scheer, A. Riedo, P. Wurz and M. Wieser, J. Spacecr. Rockets 50 (2013): 402-410. [6] M.B. Neuland, J.A. Scheer, A. Riedo and P. Wurz, Appl. Surf. Sci. 313 (2014): 293-303.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot retrieves a sample during a demonstration of the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sample Return Robot Challenge staff members confer before the team Survey robot makes its attempt at the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The team Mountaineers robot is seen after picking up the sample during a rerun of the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
A team KuuKulgur robot approaches the sample as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
NASA Technical Reports Server (NTRS)
Douglas, G. L.; Zwart, S. R.; Young, M.; Kloeris, V.; Crucian, B.; Smith, S. M.; Lorenzi, H.
2018-01-01
Spaceflight impacts human physiology, including well documented immune system dysregulation. Diet, immune function, and the microbiome are interlinked, but diet is the only one of these factors that we have the ability to easily, and significantly, alter on Earth or during flight. As we understand dietary impacts on physiology more thoroughly, we may then improve the spaceflight diet to improve crew health and potentially reduce spaceflight-associated physiological alterations. It is expected that increasing the consumption of fruits and vegetables and bioactive compounds (e.g., omega-3 fatty acids, lycopene, flavonoids) and therefore enhancing overall nutritional intake from the nominal shelf-stable, fully-processed space food system could serve as a countermeasure to improve human immunological profiles, the taxonomic profile of the gut microbiota, and nutritional status, especially where currently dysregulated during spaceflight. This interdisciplinary study will determine the effect of the current shelf-stable spaceflight diet compared to an "enhanced" shelf-stable spaceflight diet (25% more foods rich in omega-3 fatty acids, lycopene, and flavonoids, and more fruits and vegetables in general). The NASA Human Exploration Research Analog (HERA) 2017 missions, consisting of four 45-day missions with closed chamber confinement and realistic mission simulation in a high-fidelity mock space vehicle, will serve as a platform to replicate mission stressors and the effects on crew biochemistry, immunology, and the gut microbiome. Biosampling of crewmembers is scheduled for selected intervals pre- and in-mission. Data collection also includes dietary intake recording. Outcome measures will include immune markers (e.g., peripheral leukocyte distribution, inflammatory cytokine profiles, T cell function), the taxonomic and metatranscriptomic profile of the gut microbiome, and nutritional status biomarkers and metabolites. Statistical evaluations will determine physiological and biochemical shifts in relation to nutrient intake and study phase. To date, sample collection has been completed for 2 crewmembers from the first mission (Campaign 4, Mission 1). Mission 2 was terminated after 22 days due to effects of Hurricane Harvey, and sample collection was not completed. Sample collection will continue for Campaign 4 Missions 3 and 4 prior to comprehensive sample analysis. Beneficial improvements will provide evidence of the impact of diet on crew health and adaptation to this spaceflight analog, and will aid in the design and development of more-efficient targeted dietary interventions for exploration missions.
Protein–protein docking by fast generalized Fourier transforms on 5D rotational manifolds
Padhorny, Dzmitry; Kazennov, Andrey; Zerbe, Brandon S.; Porter, Kathryn A.; Xia, Bing; Mottarella, Scott E.; Kholodov, Yaroslav; Ritchie, David W.; Vajda, Sandor; Kozakov, Dima
2016-01-01
Energy evaluation using fast Fourier transforms (FFTs) enables sampling billions of putative complex structures and hence revolutionized rigid protein–protein docking. However, in current methods, efficient acceleration is achieved only in either the translational or the rotational subspace. Developing an efficient and accurate docking method that expands FFT-based sampling to five rotational coordinates is an extensively studied but still unsolved problem. The algorithm presented here retains the accuracy of earlier methods but yields at least 10-fold speedup. The improvement is due to two innovations. First, the search space is treated as the product manifold SO(3)×(SO(3)∖S1), where SO(3) is the rotation group representing the space of the rotating ligand, and (SO(3)∖S1) is the space spanned by the two Euler angles that define the orientation of the vector from the center of the fixed receptor toward the center of the ligand. This representation enables the use of efficient FFT methods developed for SO(3). Second, we select the centers of highly populated clusters of docked structures, rather than the lowest energy conformations, as predictions of the complex, and hence there is no need for very high accuracy in energy evaluation. Therefore, it is sufficient to use a limited number of spherical basis functions in the Fourier space, which increases the efficiency of sampling while retaining the accuracy of docking results. A major advantage of the method is that, in contrast to classical approaches, increasing the number of correlation function terms is computationally inexpensive, which enables using complex energy functions for scoring. PMID:27412858
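To make the FFT idea concrete, the sketch below implements only the classical translational analog that this work generalizes: scoring every relative shift of two rigid grids at once via the correlation theorem. The grids, sizes, and scoring function are illustrative assumptions; the paper's actual contribution, FFT acceleration over the five rotational coordinates on SO(3)×(SO(3)∖S1), requires spherical-harmonic machinery not shown here.

```python
import numpy as np

def fft_translation_scores(receptor, ligand):
    """Score all translational placements of `ligand` against `receptor`
    (two real-valued grids of equal shape) with one FFT cross-correlation,
    in the spirit of classical FFT docking. Illustrative sketch only."""
    R = np.fft.fftn(receptor)
    L = np.fft.fftn(ligand)
    # Correlation theorem: every shift is scored in O(N log N) total,
    # instead of O(N^2) for an explicit loop over shifts.
    return np.real(np.fft.ifftn(R * np.conj(L)))

# Toy grids standing in for discretized receptor/ligand shape functions.
rng = np.random.default_rng(0)
receptor = rng.random((32, 32, 32))
ligand = rng.random((32, 32, 32))
scores = fft_translation_scores(receptor, ligand)
print("best shift:", np.unravel_index(np.argmax(scores), scores.shape))
```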
Innovations in air sampling to detect plant pathogens
West, JS; Kimber, RBE
2015-01-01
Many innovations in the development and use of air sampling devices have occurred in plant pathology since the first description of the Hirst spore trap. These include improvements in capture efficiency at relatively high air-volume collection rates, methods to enhance the ease of sample processing with downstream diagnostic methods and even full automation of sampling, diagnosis and wireless reporting of results. Other innovations have been to mount air samplers on mobile platforms such as UAVs and ground vehicles to allow sampling at different altitudes and locations in a short space of time to identify potential sources and population structure. Geographical Information Systems, applied to a network of samplers, can allow greater prediction of airborne inoculum and dispersal dynamics. This field of technology is now developing quickly as novel diagnostic methods allow increasingly rapid and accurate quantification of airborne species and genetic traits. Sampling and interpretation of results, particularly action thresholds, are improved by understanding components of air dispersal and dilution processes, and can add greater precision in the application of crop protection products as part of integrated pest and disease management decisions. The applications of air samplers are likely to increase, with much greater adoption by growers or industry support workers to aid in crop protection decisions. The same devices are likely to improve information available for detection of allergens causing hay fever and asthma, or provide valuable metadata for regional plant disease dynamics. PMID:25745191
Lan, Gongpu; Li, Guoqiang
2017-01-01
Nonlinear sampling of the interferograms in wavenumber (k) space degrades the depth-dependent signal sensitivity in conventional spectral domain optical coherence tomography (SD-OCT). Here we report a linear-in-wavenumber (k-space) spectrometer for an ultra-broad bandwidth (760 nm–920 nm) SD-OCT, whereby a combination of a grating and a prism serves as the dispersion group. Quantitative ray tracing is applied to optimize the linearity and minimize the optical path differences for the dispersed wavenumbers. Zemax simulation is used to fit the point spread functions to the rectangular shape of the pixels of the line-scan camera and to improve the pixel sampling rates. An experimental SD-OCT is built to test and compare the performance of the k-space spectrometer with that of a conventional one. Design results demonstrate that this k-space spectrometer can reduce the nonlinearity error in k-space from 14.86% to 0.47% (by approximately 30 times) compared to the conventional spectrometer. The 95% confidence interval for RMS diameters is 5.48 ± 1.76 μm, significantly smaller than both the pixel size (14 μm × 28 μm) and the Airy disc (25.82 μm in diameter, calculated at the wavenumber of 7.548 μm−1). Test results demonstrate that the fall-off curve from the k-space spectrometer exhibits much less decay (maximum of −5.20 dB) than that of the conventional spectrometer (maximum of −16.84 dB) over the whole imaging depth (2.2 mm). PMID:28266502
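As a rough numerical illustration of the nonlinearity being corrected, the sketch below models a conventional spectrometer as sampling uniformly in wavelength over the 760-920 nm band and measures how far the resulting wavenumber samples deviate from a uniform k grid. The pixel count and the error metric are assumptions for illustration, so the number it prints will not reproduce the paper's 14.86% figure, which depends on the authors' specific error definition and real dispersion data.

```python
import numpy as np

n_pixels = 2048
lam = np.linspace(760e-9, 920e-9, n_pixels)  # linear-in-wavelength sampling
k = 2 * np.pi / lam                          # wavenumbers actually sampled

# An ideal linear-in-k spectrometer would sample this grid instead.
k_ideal = np.linspace(k[0], k[-1], n_pixels)

# Peak deviation from linearity, relative to the sampled k range.
nonlinearity = np.max(np.abs(k - k_ideal)) / np.abs(k[-1] - k[0])
print(f"nonlinearity in k: {100 * nonlinearity:.2f}% of the k range")
```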
Patient‐centred improvements in health‐care built environments: perspectives and design indicators
Douglas, Calbert H.; Douglas, Mary R.
2005-01-01
Abstract Objective To explore patients' perceptions of health‐care built environments, to assess how they perceived health‐care built facilities and designs. To develop a set of patient‐centred indicators by which to appraise future health‐care designs. Design Qualitative and quantitative methodologies, including futures group conferencing, autophotographic study, novice-expert exchanges and a questionnaire survey of a representative sample of past patients. Setting and participants The research was carried out at Salford Royal Hospitals NHS Trust (SRHT), Greater Manchester, UK, selected for the study because of planned comprehensive redevelopment based on the new NHS vision for hospital care and service delivery for the 21st century. Participants included 35 patients who took part in an autophotographic study, eight focus groups engaged in futures conferencing, and a sample of past inpatients from the previous 12 months who returned 785 completed postal questionnaires. Results The futures group provided suggestions for radical improvements which were categorized into transport issues; accessibility and mobility; ground and landscape designs; social and public spaces; homeliness and assurance; cultural diversity; safety and security; personal space and access to outside. Patients' autophotographic study centred on: the quality of the ward design, human interactions, the state and quality of personal space, and facilities for recreation and leisure. The novices' suggestions were organized into categories of elemental factors representing patient‐friendly designs. Experts from the architectural and surveying professions and staff at SRHT in turn considered these categories and respective subsets of factors. They agreed with the novices in terms of the headings but differed in prioritizing the elemental factors. The questionnaire survey of past patients provided opinions about ward designs that varied according to where they stayed: single room, bay ward or long open ward. The main concerns were the limited private space around the bed area, which should support privacy and dignity, and ward noise and other disturbances. Conclusions Patients perceived sustainable health‐care environments to be supportive of their health and recovery. The design indicators developed from their perspectives and from their considerations for improvements to the health‐care built environment were based on their visions of the role of the health‐care facilities. These were homely environments that supported normal lifestyle and family functioning, and designs that were supportive of accessibility and travel movements through transitional spaces. PMID:16098156
Fish tracking by combining motion based segmentation and particle filtering
NASA Astrophysics Data System (ADS)
Bichot, E.; Mascarilla, L.; Courtellemont, P.
2006-01-01
In this paper, we suggest a new importance sampling scheme to improve a particle filtering based tracking process. This scheme relies on the exploitation of motion segmentation. More precisely, we propagate hypotheses from particle filtering to blobs whose motion is similar to that of the target. Hence, search is driven toward regions of interest in the state space and prediction is more accurate. We also propose to exploit segmentation to update the target model. Once the moving target has been identified, a representative model is learnt from its spatial support. We refer to this model in the correction step of the tracking process. The importance sampling scheme and the strategy to update the target model improve the performance of particle filtering in complex occlusion situations compared to a simple bootstrap approach, as shown by our experiments on real fish tank sequences.
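For readers unfamiliar with the baseline being improved, here is a minimal bootstrap particle filter for a 1D toy tracking problem. The motion and observation models are assumptions chosen for brevity; the paper's scheme replaces the blind prior proposal in the predict step with an importance density driven by motion segmentation.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_particle_filter(observations, n_particles=500):
    """Minimal bootstrap filter for a 1D random-walk state observed in
    Gaussian noise. The predict step proposes blindly from the motion
    prior, which is exactly what a segmentation-driven importance
    density would replace."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in observations:
        # Predict: propagate particles through the motion (prior) model.
        particles = particles + rng.normal(0.0, 0.5, n_particles)
        # Correct: weight by the observation likelihood p(y | x).
        weights = np.exp(-0.5 * ((y - particles) / 0.3) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # Resample to avoid weight degeneracy (multinomial resampling).
        particles = rng.choice(particles, size=n_particles, p=weights)
    return np.array(estimates)

true_x = np.cumsum(rng.normal(0, 0.5, 50))   # hidden trajectory
obs = true_x + rng.normal(0, 0.3, 50)        # noisy measurements
print(bootstrap_particle_filter(obs)[:5])
```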
Optimization of sampling pattern and the design of Fourier ptychographic illuminator.
Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan
2015-03-09
Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore, the scanning pattern is a uniform grid in the Fourier space. Such a uniform sampling scheme leads to 3 major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold in that: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images 6-fold. The results reported in this paper significantly shorten the acquisition time and improve the quality of FP reconstructions. The approach may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
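A minimal sketch of the non-uniform sampling idea: draw illustrative LED positions in the Fourier plane whose radial density is biased toward low spatial frequencies, where most biological signal energy resides. The LED count matches the paper's 68-image setup, but the density law and radii are hypothetical choices, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(2)
n_leds, max_radius = 68, 1.0   # normalized Fourier-plane radius

# Uniform-area sampling would use r = R * u**0.5 for uniform u in [0, 1];
# a larger exponent biases the positions toward the low-frequency center.
r = max_radius * rng.random(n_leds) ** 1.5
theta = 2 * np.pi * rng.random(n_leds)
kx, ky = r * np.cos(theta), r * np.sin(theta)   # LED scan positions

inner = np.sum(r < 0.5 * max_radius) / n_leds
# Area-uniform sampling would put only 0.25 of the LEDs in this region.
print(f"fraction of LEDs in inner half-radius: {inner:.2f}")
```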
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaikuad, Apirat, E-mail: apirat.chaikuad@sgc.ox.ac.uk; Knapp, Stefan; Johann Wolfgang Goethe-University, Building N240 Room 3.03, Max-von-Laue-Strasse 9, 60438 Frankfurt am Main
An alternative strategy for PEG sampling is suggested through the use of four newly defined PEG smears to enhance chemical space in reduced screens with a benefit towards protein crystallization. The quest for an optimal limited set of effective crystallization conditions remains a challenge in macromolecular crystallography, an issue that is complicated by the large number of chemicals which have been deemed to be suitable for promoting crystal growth. The lack of rational approaches towards the selection of successful chemical space and representative combinations has led to significant overlapping conditions, which are currently present in a multitude of commercially available crystallization screens. Here, an alternative approach to the sampling of widely used PEG precipitants is suggested through the use of PEG smears, which are mixtures of different PEGs with a requirement of either neutral or cooperatively positive effects of each component on crystal growth. Four newly defined smears were classified by molecular-weight groups and enabled the preservation of specific properties related to different polymer sizes. These smears not only allowed a wide coverage of properties of these polymers, but also reduced PEG variables, enabling greater sampling of other parameters such as buffers and additives. The efficiency of the smear-based screens was evaluated on more than 220 diverse recombinant human proteins, which overall revealed a good initial crystallization success rate of nearly 50%. In addition, in several cases successful crystallizations were only obtained using PEG smears, while various commercial screens failed to yield crystals. The defined smears therefore offer an alternative approach towards PEG sampling, which will benefit the design of crystallization screens sampling a wide chemical space of this key precipitant.
Accelerated Dimension-Independent Adaptive Metropolis
Chen, Yuxin; Keyes, David E.; Law, Kody J.; ...
2016-10-27
This work describes improvements from algorithmic and architectural means to black-box Bayesian inference over high-dimensional parameter spaces. The well-known adaptive Metropolis (AM) algorithm [33] is extended herein to scale asymptotically uniformly with respect to the underlying parameter dimension for Gaussian targets, by respecting the variance of the target. The resulting algorithm, referred to as the dimension-independent adaptive Metropolis (DIAM) algorithm, also shows improved performance with respect to adaptive Metropolis on non-Gaussian targets. This algorithm is further improved, and the possibility of probing high-dimensional (with dimension d ≥ 1000) targets is enabled, via GPU-accelerated numerical libraries and periodically synchronized concurrent chains (justified a posteriori). Asymptotically in dimension, this GPU implementation exhibits a factor of four improvement versus a competitive CPU-based Intel MKL parallel version alone. Strong scaling to concurrent chains is exhibited, through a combination of longer time per sample batch (weak scaling) and yet fewer necessary samples to convergence. The algorithm performance is illustrated on several Gaussian and non-Gaussian target examples, in which the dimension may be in excess of one thousand.
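A minimal sketch of the baseline AM algorithm the paper extends, assuming a simple 2D anisotropic Gaussian target: the proposal covariance is learned from the chain's own history with the classic 2.38²/d scaling. The DIAM variant's target-variance rescaling, GPU acceleration, and concurrent chains are not reproduced here.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, n_steps=20000, adapt_start=1000):
    """Minimal adaptive Metropolis (Haario-style) sketch: after a burn-in,
    the Gaussian proposal covariance is re-estimated from the chain's own
    history and scaled by 2.38^2 / d, with a small jitter for stability."""
    d = len(x0)
    rng = np.random.default_rng(3)
    chain = np.empty((n_steps, d))
    x, lp = np.array(x0, float), log_target(x0)
    cov = np.eye(d) * 0.1
    for i in range(n_steps):
        prop = rng.multivariate_normal(x, cov)
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
        if i >= adapt_start:                      # adapt from history
            cov = (2.38**2 / d) * np.cov(chain[:i + 1].T) + 1e-8 * np.eye(d)
    return chain

# Toy target: anisotropic 2D Gaussian with std devs (1, 3).
log_target = lambda x: -0.5 * (x[0]**2 + (x[1] / 3.0)**2)
samples = adaptive_metropolis(log_target, [0.0, 0.0])
print("posterior std estimates:", samples[5000:].std(axis=0))
```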
Embrittlement of MISSE 5 Polymers After 13 Months of Space Exposure
NASA Technical Reports Server (NTRS)
Guo, Aobo; Yi, Grace T.; Ashmead, Claire C.; Mitchell, Gianna G.; deGroh, Kim K.
2012-01-01
Understanding space environment induced degradation of spacecraft materials is essential when designing durable and stable spacecraft components. As a result of space radiation, debris impacts, atomic oxygen interaction, and thermal cycling, the outer surfaces of space materials degrade when exposed to low Earth orbit (LEO). The objective of this study was to measure the embrittlement of 37 thin film polymers after LEO space exposure. The polymers were flown aboard the International Space Station and exposed to the LEO space environment as part of the Materials International Space Station Experiment 5 (MISSE 5). The samples were flown in a nadir-facing position for 13 months and were exposed to thermal cycling along with low doses of atomic oxygen, direct solar radiation and omnidirectional charged particle radiation. The samples were analyzed for space-induced embrittlement using a bend-test procedure in which the strain necessary to induce surface cracking was determined. Bend-testing was conducted using successively smaller mandrels to apply a surface strain to samples placed on a semi-suspended pliable platform. A pristine sample was also tested for each flight sample. Eighteen of the 37 flight samples experienced some degree of surface cracking during bend-testing, while none of the pristine samples experienced any degree of cracking. The results indicate that 49 percent of the MISSE 5 thin film polymers became embrittled in the space environment even though they were exposed to low doses (approximately 2.75 krad (Si) through 127 μm Kapton) of ionizing radiation.
Checinska, Aleksandra; Probst, Alexander J; Vaishampayan, Parag; White, James R; Kumar, Deepika; Stepanov, Victor G; Fox, George E; Nilsson, Henrik R; Pierson, Duane L; Perry, Jay; Venkateswaran, Kasthuri
2015-10-27
The International Space Station (ISS) is a unique built environment due to the effects of microgravity, space radiation, elevated carbon dioxide levels, and especially continuous human habitation. Understanding the composition of the ISS microbial community will facilitate further development of safety and maintenance practices. The primary goal of this study was to characterize the viable microbiome of the ISS built environment. A second objective was to determine if the built environments of Earth-based cleanrooms associated with space exploration are an appropriate model of the ISS environment. Samples collected from the ISS and two cleanrooms at the Jet Propulsion Laboratory (JPL, Pasadena, CA) were analyzed by traditional cultivation, adenosine triphosphate (ATP), and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR) assays to estimate viable microbial populations. 16S rRNA gene Illumina iTag sequencing was used to elucidate microbial diversity and explore differences between the ISS and cleanroom microbiomes. Statistical analyses showed that members of the phyla Actinobacteria, Firmicutes, and Proteobacteria were dominant in the samples examined but varied in abundance. Actinobacteria were predominant in the ISS samples, whereas Proteobacteria, least abundant in the ISS, dominated in the cleanroom samples. The viable bacterial populations seen by PMA treatment were greatly decreased. However, the treatment did not appear to have an effect on the bacterial composition (diversity) associated with each sampling site. The results of this study provide strong evidence that specific human skin-associated microorganisms make a substantial contribution to the ISS microbiome, which is not the case in Earth-based cleanrooms. For example, Corynebacterium and Propionibacterium (Actinobacteria) but not Staphylococcus (Firmicutes) species are dominant on the ISS in terms of viable and total bacterial community composition. The results obtained will facilitate future studies to determine how stable the ISS environment is over time. The present results also demonstrate the value of measuring viable cell diversity and population size at any sampling site. This information can be used to identify sites that can be targeted for more stringent cleaning. Finally, the results will allow comparisons with other built sites and facilitate future improvements on the ISS that will ensure astronaut health.
Low-gravity homogenization and solidification of aluminum antimonide. [Apollo-Soyuz test project
NASA Technical Reports Server (NTRS)
Ang, C.-Y.; Lacy, L. L.
1976-01-01
The III-V semiconducting compound AlSb shows promise as a highly efficient solar cell material, but it has not been commercially exploited because of difficulties in compound synthesis. Liquid state homogenization and solidification of AlSb were carried out in the Apollo-Soyuz Test Project Experiment MA-044 in the hope that compositional homogeneity would be improved by negating the large density difference between the two constituents. Post-flight analysis and comparative characterization of the space-processed and ground-processed samples indicate that there are major homogeneity improvements in the low-gravity solidified material.
Multilevel Mixture Kalman Filter
NASA Astrophysics Data System (ADS)
Guo, Dong; Wang, Xiaodong; Chen, Rong
2004-12-01
The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with a delayed estimation method, such as the delayed-sample method, resulting in the delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.
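To illustrate the multilevel idea on the paper's 16-QAM example, the sketch below performs one hierarchical draw of a symbol indicator: quadrant first, then the symbol within the chosen quadrant, with each level's probability obtained by summing member-symbol likelihoods. The channel model, noise level, and flat (non-sequential) setting are simplifying assumptions; the full filter embeds such draws inside SIS with Kalman marginalization.

```python
import numpy as np

rng = np.random.default_rng(4)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
symbols = np.array([i + 1j * q for i in levels for q in levels])  # 16-QAM

sigma = 0.5
y = symbols[5] + sigma * (rng.normal() + 1j * rng.normal())  # noisy observation

# Per-symbol likelihoods under complex Gaussian noise (constants cancel).
lik = np.exp(-np.abs(y - symbols) ** 2 / (2 * sigma ** 2))

# Level 1: sample a quadrant; its probability is the summed likelihood of
# its member symbols (the lower level is marginalized at this stage).
quad = (symbols.real > 0).astype(int) * 2 + (symbols.imag > 0).astype(int)
quad_prob = np.array([lik[quad == q].sum() for q in range(4)])
q = rng.choice(4, p=quad_prob / quad_prob.sum())

# Level 2: sample the symbol within the chosen quadrant only.
members = np.where(quad == q)[0]
s = rng.choice(members, p=lik[members] / lik[members].sum())
print("observation:", y, "-> sampled symbol:", symbols[s])
```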
Doshi, Urmi; Hamelberg, Donald
2012-11-13
In enhanced sampling techniques, the precision of the reweighted ensemble properties is often decreased due to large variation in statistical weights and reduction in the effective sampling size. To abate this reweighting problem, here, we propose a general accelerated molecular dynamics (aMD) approach in which only the rotatable dihedrals are subjected to aMD (RaMD), unlike the typical implementation wherein all dihedrals are boosted (all-aMD). Nonrotatable and improper dihedrals are marginally important to conformational changes or the different rotameric states. Not accelerating them avoids the sharp increases in the potential energies due to small deviations from their minimum energy conformations and leads to improvement in the precision of RaMD. We present benchmark studies on two model dipeptides, Ace-Ala-Nme and Ace-Trp-Nme, simulated with normal MD, all-aMD, and RaMD. We carry out a systematic comparison between the performances of both forms of aMD using a theory that allows quantitative estimation of the effective number of sampled points and the associated uncertainty. Our results indicate that, for the same level of acceleration and simulation length, as used in all-aMD, RaMD results in significantly less loss in the effective sample size and, hence, increased accuracy in the sampling of φ-ψ space. RaMD yields an accuracy comparable to that of all-aMD, from simulation lengths 5 to 1000 times shorter, depending on the peptide and the acceleration level. Such improvement in speed and accuracy over all-aMD is highly remarkable, suggesting RaMD as a promising method for sampling larger biomolecules.
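A minimal sketch of the boost-and-reweight machinery under discussion, using the standard aMD boost form ΔV = (E − V)²/(α + E − V) applied to toy dihedral energies; the threshold, α, temperature, and energy distribution are illustrative assumptions. The effective-sample-size line is the kind of quantity used to compare all-aMD with RaMD: boosting only the rotatable-dihedral term keeps ΔV, and hence the weight spread, small.

```python
import numpy as np

def amd_boost(V, E, alpha):
    """Standard aMD boost: below the threshold E the potential is raised by
    dV = (E - V)^2 / (alpha + E - V), flattening barriers. In RaMD a boost
    of this form is applied only to the rotatable-dihedral energy term
    (per-term bookkeeping omitted here)."""
    dV = np.zeros_like(V)
    mask = V < E
    dV[mask] = (E - V[mask]) ** 2 / (alpha + E - V[mask])
    return dV

# Toy dihedral-term energies; reweighting to the true (unboosted)
# ensemble uses weights w ~ exp(beta * dV).
rng = np.random.default_rng(5)
V = rng.normal(-10.0, 2.0, 10000)
dV = amd_boost(V, E=-6.0, alpha=2.0)
beta = 1.0 / 0.596          # 1/kT in (kcal/mol)^-1 near 300 K
w = np.exp(beta * dV)

# Effective sample size: a large weight spread (aggressive boost) shrinks it.
n_eff = w.sum() ** 2 / (w ** 2).sum()
print(f"effective sample size: {n_eff:.0f} of {len(V)}")
```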
Effect of an Auxiliary Plate on Passive Heat Dissipation of Carbon Nanotube-Based Materials.
Yu, Wei; Duan, Zheng; Zhang, Guang; Liu, Changhong; Fan, Shoushan
2018-03-14
Carbon nanotubes (CNTs) and other related CNT-based materials with a high thermal conductivity can be used as promising heat dissipation materials. Meanwhile, the miniaturization and high functionality of portable electronics, such as laptops and mobile phones, are achieved at the cost of overheating the high power-density components. The heat removal for hot spots occurring in a relatively narrow space requires simple and effective cooling methods. Here, an auxiliary passive cooling approach with the aid of a flat plate (aluminum-magnesium alloy) is investigated to accommodate heat dissipation in a narrow space. The cooling efficiency can be raised to 43.5%. The cooling performance of several CNT-based samples is compared under such circumstances. Heat dissipation analyses show that, when there is a nearby plate for cooling assistance, heat radiation is weakened and natural convection is largely improved. Thus, improving heat radiation by increasing emissivity without reducing natural convection can effectively enhance the cooling performance. Moreover, the decoration of an auxiliary cooling plate with sprayed CNTs can further improve the cooling performance of the entire setup.
An Improved Nested Sampling Algorithm for Model Selection and Assessment
NASA Astrophysics Data System (ADS)
Zeng, X.; Ye, M.; Wu, J.; WANG, D.
2017-12-01
Multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models. Each alternative conceptual model is assigned a weight which represents the plausibility of that model. In a Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from low likelihood areas to high likelihood areas, and this evolution proceeds iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it could be feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, in order to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
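A minimal nested sampling sketch on a toy 1D problem, to fix ideas. The local step here is naive rejection sampling from the prior under the hard likelihood constraint, which is exactly the component the abstract proposes to strengthen (M-H variants today, DREAMzs as the suggested upgrade); the live-point count, iteration budget, and Gaussian-likelihood/uniform-prior toy problem are assumptions for illustration.

```python
import numpy as np

def nested_sampling(log_lik, prior_sample, n_live=100, n_iter=500, seed=6):
    """Minimal nested sampling estimate of log evidence (marginal
    likelihood). The constrained local sampler is naive prior rejection,
    the weak link that motivates stronger samplers such as DREAMzs."""
    rng = np.random.default_rng(seed)
    live = np.array([prior_sample(rng) for _ in range(n_live)])
    live_ll = np.array([log_lik(x) for x in live])
    log_Z, log_X = -np.inf, 0.0
    for i in range(n_iter):
        worst = int(np.argmin(live_ll))
        log_X_new = -(i + 1) / n_live          # expected prior-mass shrinkage
        log_w = np.log(np.exp(log_X) - np.exp(log_X_new)) + live_ll[worst]
        log_Z = np.logaddexp(log_Z, log_w)     # accumulate evidence
        log_X = log_X_new
        while True:                            # replace the worst live point
            x = prior_sample(rng)
            if log_lik(x) > live_ll[worst]:
                live[worst], live_ll[worst] = x, log_lik(x)
                break
    # Top up with the remaining live points' average likelihood.
    return np.logaddexp(log_Z, log_X + np.log(np.mean(np.exp(live_ll))))

# Toy problem: N(0, 0.1^2) likelihood, U(-1, 1) prior; exact log Z ~ log 0.5.
log_lik = lambda x: -0.5 * (x / 0.1) ** 2 - np.log(0.1 * np.sqrt(2 * np.pi))
prior_sample = lambda rng: rng.uniform(-1.0, 1.0)
print("estimated log Z:", nested_sampling(log_lik, prior_sample))
```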
Estimation of critical behavior from the density of states in classical statistical models
NASA Astrophysics Data System (ADS)
Malakis, A.; Peratzakis, A.; Fytas, N. G.
2004-12-01
We present a simple and efficient approximation scheme which greatly facilitates the extension of Wang-Landau sampling (or similar techniques) to large systems for the estimation of critical behavior. The method, presented in an algorithmic approach, is based on a very simple idea, familiar in statistical mechanics from the notion of thermodynamic equivalence of ensembles and the central limit theorem. It is illustrated that we can predict with high accuracy the critical part of the energy space, and by using this restricted part we can extend our simulations to larger systems and improve the accuracy of critical parameters. It is proposed that the extensions of the finite-size critical part of the energy space, determining the specific heat, satisfy a scaling law involving the thermal critical exponent. The method is applied successfully to the estimation of the scaling behavior of the specific heat of both square and simple cubic Ising lattices. The proposed scaling law is verified by estimating the thermal critical exponent from the finite-size behavior of the critical part of the energy space. The density of states of the zero-field Ising model on these lattices is obtained via multirange Wang-Landau sampling.
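For orientation, a minimal Wang-Landau sketch for the zero-field 2D Ising model is given below; it is the baseline whose restriction to a critical energy window the paper proposes. The lattice size, flatness criterion, and loose final modification factor are toy assumptions to keep the run short; production runs use larger lattices, push log f far smaller, and would add the energy-window restriction.

```python
import numpy as np

def wang_landau_ising(L=4, flatness=0.8, log_f_min=1e-3, seed=7):
    """Minimal Wang-Landau estimate of the density of states g(E) for the
    zero-field 2D Ising model on an L x L periodic lattice (energies in
    units of the coupling J; toy stopping criterion)."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    energies = np.arange(-2 * L * L, 2 * L * L + 1, 4)  # unreachable bins stay empty
    idx = {E: i for i, E in enumerate(energies)}
    log_g = np.zeros(len(energies))
    hist = np.zeros(len(energies))
    E = -int(np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1))))
    log_f = 1.0
    while log_f > log_f_min:
        for _ in range(20000):
            i, j = rng.integers(L, size=2)
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * int(spins[i, j]) * int(nn)
            # Accept flip with min(1, g(E) / g(E + dE)), in log form.
            if np.log(rng.random()) < log_g[idx[E]] - log_g[idx[E + dE]]:
                spins[i, j] *= -1
                E += dE
            log_g[idx[E]] += log_f   # update running density of states
            hist[idx[E]] += 1
        visited = hist > 0
        if hist[visited].min() > flatness * hist[visited].mean():
            hist[:] = 0.0            # flat enough: refine modification factor
            log_f /= 2.0
    return energies, log_g

energies, log_g = wang_landau_ising()
# Normalize so the two ground states carry log g = log 2.
print("log g relative to ground state:", log_g - log_g[0] + np.log(2))
```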
Advances in locally constrained k-space-based parallel MRI.
Samsonov, Alexey A; Block, Walter F; Arunachalam, Arjun; Field, Aaron S
2006-02-01
In this article, several theoretical and methodological developments regarding k-space-based, locally constrained parallel MRI (pMRI) reconstruction are presented. A connection between Parallel MRI with Adaptive Radius in k-Space (PARS) and GRAPPA methods is demonstrated. The analysis provides a basis for unified treatment of both methods. Additionally, a weighted PARS reconstruction is proposed, which can incorporate different weighting strategies for improved image reconstruction. Next, a fast and efficient method for pMRI reconstruction of data sampled on non-Cartesian trajectories is described. In the new technique, the computational burden associated with the numerous matrix inversions in the original PARS method is drastically reduced by limiting direct calculation of reconstruction coefficients to only a few reference points. The rest of the coefficients are found by interpolating between the reference sets, which is possible due to the similar configuration of points participating in reconstruction for highly symmetric trajectories, such as radial and spiral trajectories. As a result, the time requirements are drastically reduced, which makes it practical to use pMRI with non-Cartesian trajectories in many applications. The new technique was demonstrated with simulated and actual data sampled on radial trajectories. Copyright 2006 Wiley-Liss, Inc.
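The k-space heart of these methods can be illustrated in a few lines: fit, by least squares on fully sampled calibration data, linear weights that synthesize a target k-space sample of one coil from acquired neighbors across all coils, then reuse the weights at missing points. The synthetic random "coil" data, neighborhood size, and noise level are stand-in assumptions; PARS additionally restricts neighbors to a local k-space radius, and the paper's speedup comes from computing such weights at a few reference configurations and interpolating between them.

```python
import numpy as np

rng = np.random.default_rng(8)
n_coils, n_fit, n_neighbors = 8, 500, 6

# Calibration: each row of A holds one acquired neighborhood (all coils,
# flattened); b holds the corresponding target sample of the synthesized
# coil. Here both are synthetic complex data standing in for k-space.
A = (rng.normal(size=(n_fit, n_coils * n_neighbors))
     + 1j * rng.normal(size=(n_fit, n_coils * n_neighbors)))
true_w = 0.1 * rng.normal(size=n_coils * n_neighbors)
b = A @ true_w + 0.01 * (rng.normal(size=n_fit) + 1j * rng.normal(size=n_fit))

# Least-squares fit of the reconstruction weights (GRAPPA/PARS core step).
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Reconstruction: apply w to the same neighborhood configuration around
# any missing k-space point with that geometry.
missing_estimate = A[0] @ w
print("fit error at one point:", np.abs(missing_estimate - b[0]))
```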
Space flight effects on antioxidant molecules in dry tardigrades: the TARDIKISS experiment.
Rizzo, Angela Maria; Altiero, Tiziana; Corsetto, Paola Antonia; Montorfano, Gigliola; Guidetti, Roberto; Rebecchi, Lorena
2015-01-01
The TARDIKISS (Tardigrades in Space) experiment was part of the Biokon in Space (BIOKIS) payload, a set of multidisciplinary experiments performed during the DAMA (Dark Matter) mission organized by the Italian Space Agency and the Italian Air Force in 2011. This mission supported the execution of short-duration (16-day) experiments taking advantage of the microgravity environment on board the Space Shuttle Endeavour (on its last mission, STS-134) docked to the International Space Station. TARDIKISS was composed of three sample sets: one flight sample and two ground control samples. These samples provided the biological material used to test how space stressors, including microgravity, affected animal survivability, life cycle, DNA integrity, and pathways of molecules working as antioxidants. In this paper we compared the molecular pathways of some antioxidant molecules, thiobarbituric acid reactive substances, and fatty acid composition between flight and control samples in two tardigrade species, namely, Paramacrobiotus richtersi and Ramazzottius oberhaeuseri. In both species, the activities of ROS scavenging enzymes, the total content of glutathione, and the fatty acid composition showed few significant differences between flight and control samples. The TARDIKISS experiment, together with a previous space experiment (TARSE), further confirms that both desiccated and hydrated tardigrades represent a useful animal model for space research.
2016-12-01
[Extraction-damaged fragment on magnetic resonance spectroscopic imaging; only partial content is recoverable.] "...multiple dimensions (20). Hu et al. employed pseudo-random phase-encoding blips during the EPSI readout to create nonuniform sampling along the spatial..." Surviving section headings: "...resolved MRSI with Nonuniform Undersampling and Compressed Sensing"; "Prior-knowledge Fitting for Metabolite Quantitation"; "Future Directions". The text resumes: "Nonuniform undersampling (NUS) of k-space and subsequent reconstruction using compressed sensing (CS" [truncated].
Products Derived from Thinning Two Hardwood Timber Stands in the Appalachians
E. Paul Craft; John E. Baumgras
1978-01-01
Two sample plots in poletimber-small sawtimber stands of Allegheny hardwoods were thinned to improve crop-tree spacing. Thinning produced nearly 35 tons per acre of wood fiber, including 13 tons of sawable boltwood, 3-1/2 tons of standard sawlogs, 18 tons of pulpwood, and 1 ton of fuelwood. Nearly 3,700 board feet of lumber and cants were produced from the sawbolts and...
Consistent Alignment of Word Embedding Models
2017-03-02
propose a solution that aligns variations of the same model (or different models) in a joint low-dimensional latent space leveraging carefully... representations of linguistic entities, most often referred to as embeddings. This includes techniques that rely on matrix factorization (Levy & Goldberg ...higher, the variation is much higher as well. As we increase the size of the neighborhood, or improve the quality of our sample by only picking the most
Evaluation, development, and characterization of superconducting materials for space applications
NASA Technical Reports Server (NTRS)
Thorpe, Arthur N.
1990-01-01
The anisotropic electromagnetic features of a grain-aligned YBa2Cu3O(x) bulk sample derived from a process of long-time partial melt growth were investigated by measurements of direct current magnetization (at 77 K) and alternating current susceptibility as a function of temperature, with the fields applied parallel and perpendicular to the c axis, respectively. The extended Bean model was further studied and applied to explain the experimental results. Upon comparison of the grain-aligned sample with pure single crystal materials, it is concluded that, because of the existence of more effective pinning sites in the grain-aligned sample, not only is its critical current density perpendicular to the c axis improved, but the one parallel to the c axis is improved even more significantly. The anisotropy in the critical current densities in the grain-aligned sample at 77 K is at least one to two orders of magnitude smaller than in the pure single crystal. The measurement of the anisotropy of alternating current susceptibility as a function of temperature, especially its imaginary part, shows that there are still some residues of interlayer weak links in the grain-aligned samples, but they are quite different from, and far less serious than, the weak links in the sintered sample.
Novignon, Jacob; Nonvignon, Justice
2017-06-12
Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase resources committed to primary healthcare, it is important to understand the nature of inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency and (iii) investigate the efficiency disparities in public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while numbers of personnel, hospital beds, expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers included in the sample was estimated to be 0.51. Also, average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency was improved. We also found that fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private and public facilities. There is a need for primary health facility managers to improve productivity via effective and efficient resource use. Efforts to improve efficiency should focus on training health workers and improving the facility environment, alongside effective monitoring and evaluation exercises.
NASA Astrophysics Data System (ADS)
Cohen, Luchino
Immune functions are altered during space flights. Latent virus reactivation, reduction in the number of immune cells, decreased cell activation and increased sensitivity of astronauts to infections following their return to Earth demonstrate that the immune system is less efficient during space flight. The causes of this immune deficiency are not fully understood, and this dysfunction during long-term missions could result in the appearance of opportunistic infections or a decrease in the immuno-surveillance mechanisms that eradicate cancer cells. Therefore, the immune functions of astronauts will have to be monitored continuously during long-term missions in space, using miniature and semi-automated diagnostic systems. The objectives of this project are to study the causes of space-related immunodeficiency, to develop countermeasures to maintain an optimal immune function and to improve our capacity to detect infectious diseases during space missions through the monitoring of astronauts' immune system. In order to achieve these objectives, an Immune Function Diagnostic System (IFDS) will be designed to perform a set of immunological assays on board spacecraft or on planet-bound bases. Through flow cytometric assays and molecular biology analyses, this diagnostic system could improve medical surveillance of astronauts and could be used to test countermeasures aimed at preventing immune deficiency during space missions. The capacity of the instrument to assess cellular fluorescence and to quantify the presence of soluble molecules in biological samples would support advanced molecular studies in space life sciences. Finally, such a diagnostic system could also be used on Earth in remote areas or in mobile hospitals following natural disasters to fight against infectious diseases and other pathologies.
Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...
2016-02-05
Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through a path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. In summary, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
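To make the path-sampling idea concrete, here is a toy thermodynamic-integration computation of a log marginal likelihood via the identity log Z = ∫₀¹ E_{p_t}[log L] dt, where p_t ∝ prior × likelihood^t. The conjugate Gaussian toy problem is an assumption chosen so each power posterior can be sampled exactly; in the environmental-modeling setting of the paper, each rung of the ladder would instead be sampled by Markov chain Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(9)
y = 1.5                                       # single observation

def log_lik(x):
    return -0.5 * (y - x) ** 2 - 0.5 * np.log(2 * np.pi)  # y | x ~ N(x, 1)

# Power-posterior ladder p_t ~ N(0,1) prior times likelihood^t; for this
# conjugate toy each p_t is Gaussian with known mean and variance.
ts = np.linspace(0.0, 1.0, 21)
means = []
for t in ts:
    mu, var = t * y / (1.0 + t), 1.0 / (1.0 + t)
    x = rng.normal(mu, np.sqrt(var), 20000)   # exact draws from p_t
    means.append(np.mean(log_lik(x)))         # Monte Carlo E_t[log L]

means = np.array(means)
log_Z = np.sum(np.diff(ts) * (means[:-1] + means[1:]) / 2)  # trapezoid rule
exact = -0.5 * np.log(2 * np.pi * 2.0) - y ** 2 / 4.0       # y ~ N(0, 2)
print(f"TI estimate: {log_Z:.4f}  exact: {exact:.4f}")
```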
Gao, Yi Qin
2008-04-07
Here, we introduce a simple self-adaptive computational method to enhance the sampling in energy, configuration, and trajectory spaces. The method makes use of two strategies. It first uses a non-Boltzmann distribution method to enhance the sampling in the phase space, in particular, in the configuration space. The application of this method leads to a broad energy distribution in a large energy range and a quickly converged sampling of molecular configurations. In the second stage of simulations, the configuration space of the system is divided into a number of small regions according to preselected collective coordinates. An enhanced sampling of reactive transition paths is then performed in a self-adaptive fashion to accelerate kinetics calculations.
Pan, Feng; Tao, Guohua
2013-03-07
Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.
Use of microwaves to improve nutritional value of soybeans for future space inhabitants
NASA Technical Reports Server (NTRS)
Singh, G.
1983-01-01
Whole soybeans from four different varieties at different moisture contents were microwaved for varying times to determine the conditions for maximum destruction of trypsin inhibitor and lipoxygenase activities, and optimal growth of chicks. Microwaving 150 g samples of soybeans (at 14 to 28% moisture) for 1.5 min was found optimal for reduction of trypsin inhibitor and lipoxygenase activities. Microwaving 1 kg samples of soybeans for 9 min destroyed 82% of the trypsin inhibitor activity and gave optimal chick growth. It should be pointed out that the microwaving time would vary according to the weight of the sample and the power of the microwave oven. The microwave oven used in the above experiments was rated at 650 W, 2450 MHz.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team KuuKulgur watches as their robots attempt the level one competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sam Ortega, NASA program manager for Centennial Challenges, is seen during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Retrievers team robot is seen as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
United States benefits of improved worldwide wheat crop information from a LANDSAT system
NASA Technical Reports Server (NTRS)
Heiss, K. P.; Sand, F.; Seidel, A.; Warner, D.; Sheflin, N.; Bhattacharyya, R.; Andrews, J.
1975-01-01
The value of worldwide information improvements on wheat crops, promised by LANDSAT, is measured in the context of world wheat markets. These benefits are based on current LANDSAT technical goals and assume that information is made available to all (United States and other countries) at the same time. A detailed empirical sample demonstration of the effect of improved information is given; the history of wheat commodity prices for 1971-72 is reconstructed and the price changes from improved vs. historical information are compared. The improved crop forecasts assumed for a LANDSAT system include wheat crop estimates of 90 percent accuracy for each major wheat producing region. Accurate, objective worldwide wheat crop information using space systems may have a very stabilizing influence on world commodity markets, in part by making possible the establishment of long-term, stable trade relationships.
Integrating diffusion maps with umbrella sampling: Application to alanine dipeptide
NASA Astrophysics Data System (ADS)
Ferguson, Andrew L.; Panagiotopoulos, Athanassios Z.; Debenedetti, Pablo G.; Kevrekidis, Ioannis G.
2011-04-01
Nonlinear dimensionality reduction techniques can be applied to molecular simulation trajectories to systematically extract a small number of variables with which to parametrize the important dynamical motions of the system. For molecular systems exhibiting free energy barriers exceeding a few kBT, inadequate sampling of the barrier regions between stable or metastable basins can lead to a poor global characterization of the free energy landscape. We present an adaptation of a nonlinear dimensionality reduction technique known as the diffusion map that extends its applicability to biased umbrella sampling simulation trajectories in which restraining potentials are employed to drive the system into high free energy regions and improve sampling of phase space. We then propose a bootstrapped approach to iteratively discover good low-dimensional parametrizations by interleaving successive rounds of umbrella sampling and diffusion mapping, and we illustrate the technique through a study of alanine dipeptide in explicit solvent.
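To make the dimensionality-reduction step concrete, the following is a minimal Python sketch of a plain diffusion map; the umbrella-sampling reweighting that is the paper's actual contribution is omitted, and the kernel bandwidth eps and input array X of configurations are illustrative assumptions.

```python
import numpy as np

def diffusion_map(X, eps, n_evecs=2):
    """Plain diffusion map on configurations X (n_samples x n_features).

    The bias-correcting reweighting for umbrella-sampled trajectories
    described in the paper is NOT included; this is the unweighted baseline.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    K = np.exp(-d2 / eps)                                # Gaussian kernel
    q = K.sum(axis=1)
    K = K / np.outer(q, q)                               # density normalization
    P = K / K.sum(axis=1, keepdims=True)                 # Markov transition matrix
    vals, vecs = np.linalg.eig(P)
    idx = np.argsort(-vals.real)[1:n_evecs + 1]          # skip trivial eigenvector
    return vecs[:, idx].real * vals[idx].real            # diffusion coordinates
```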
NASA Astrophysics Data System (ADS)
Shan, Xuchen; Zhang, Bei; Lan, Guoqiang; Wang, Yiqiao; Liu, Shugang
2015-11-01
Measurement of biological and medical samples plays an important role in microscopic optical technology. Optical tweezers offer accurate capture without contaminating the sample. The SPR (surface plasmon resonance) sensor offers many advantages, including high sensitivity, fast measurement, low sample consumption, and label-free detection of biological samples, so SPR sensing has been used for surface topography, biochemical and immune analysis, drug screening, and environmental monitoring. Combined, the two techniques can play an important role in biology, chemistry, and other disciplines. The system we propose uses a multi-axis cage system, employing both reflection and transmission to improve space utilization. The SPR system and the optical tweezers were built and combined into one system. The multi-axis cage system gives full play to its accuracy, simplicity, and flexibility. The size of the system is 20 * 15 * 40 cm3, so the sample can be repositioned to switch between the optical tweezers system and the SPR system within a small space. This means we can obtain the refractive index of the sample and manipulate particles in the same system. To control the rotation stage, acquire images, and store data automatically, we wrote a LabVIEW program. The refractive index of the sample is then calculated from the back-focal-plane data. By changing the slide we can trap particles with the optical tweezers, allowing us to measure and trap the sample at the same time.
Fiber glass pulling [in space]
NASA Technical Reports Server (NTRS)
Workman, Gary L.
1987-01-01
Experiments were conducted to determine the viability of performing containerless glass fiber pulling in space. The optical transmission properties and glass-forming capabilities of the heavy metal fluorides are reviewed and the acoustic characteristics required for a molten glass levitation system are examined. The design limitations of, and necessary modifications to, the acoustic levitation furnace used in the experiments are discussed in detail. Acoustic levitator force measurements were performed and a thermal map of the furnace was generated from thermocouple data. It was determined that the thermal capability of the furnace was inadequate to melt a glass sample in the center. The substitution of a 10 kW carbon monoxide laser for the original furnace heating elements resulted in improved melt heating.
NASA Technical Reports Server (NTRS)
Williams, James G.
1992-01-01
Asteroid families are clusters of asteroids in proper element space which are thought to be fragments from former collisions. Studies of families promise to improve understanding of large collision events, and a large event can open up the interior of a former parent body to view. While a variety of searches for families have found the same heavily populated families, and some searches have found the same families of lower population, there is much apparent disagreement among the lower-population families proposed by different investigations. Indicators of reliability, factors compromising reliability, an illustration of the influence of different data samples, and a discussion of how several investigations perceived families in the same region of proper element space are given.
Space Processing Applications Rocket (SPAR) project: SPAR 10
NASA Technical Reports Server (NTRS)
Poorman, R. (Compiler)
1986-01-01
The Space Processing Applications Rocket Project (SPAR) X Final Report contains the compilation of the post-flight reports from each of the Principal Investigators (PIs) on the four selected science payloads, in addition to the engineering report as documented by the Marshall Space Flight Center (MSFC). This combined effort also describes pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication and testing, all of which are expected to contribute to an improved comprehension of materials processing in space. The SPAR project was coordinated and managed by MSFC as part of the Microgravity Science and Applications (MSA) program of the Office of Space Science and Applications (OSSA) of NASA Headquarters. This technical memorandum is directed entirely to the payload manifest flown in the tenth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled, Containerless Processing Technology, SPAR Experiment 76-20/3; Directional Solidification of Magnetic Composites, SPAR Experiment 76-22/3; Comparative Alloy Solidification, SPAR Experiment 76-36/3; and Foam Copper, SPAR Experiment 77-9/1R.
NASA Technical Reports Server (NTRS)
Todd, N. S.; Evans, C.
2015-01-01
The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (JSC) is the designated facility for curating all of NASA's extraterrestrial samples. The suite of collections includes the lunar samples from the Apollo missions, cosmic dust particles falling into the Earth's atmosphere, meteorites collected in Antarctica, comet and interstellar dust particles from the Stardust mission, asteroid particles from the Japanese Hayabusa mission, and solar wind atoms collected during the Genesis mission. To support planetary science research on these samples, NASA's Astromaterials Curation Office hosts the Astromaterials Curation Digital Repository, which provides descriptions of the missions and collections, and critical information about each individual sample. Our office is implementing several informatics initiatives with the goal of better serving the planetary research community. One of these initiatives aims to increase the availability and discoverability of sample data and images through the use of a newly designed common architecture for Astromaterials Curation databases.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
A sample can be seen on the competition field as the team Survey robot conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Optimizing Design Parameters for Sets of Concentric Tube Robots using Sampling-based Motion Planning
Baykal, Cenk; Torres, Luis G.; Alterovitz, Ron
2015-01-01
Concentric tube robots are tentacle-like medical robots that can bend around anatomical obstacles to access hard-to-reach clinical targets. The component tubes of these robots can be swapped prior to performing a task in order to customize the robot's behavior and reachable workspace. Optimizing a robot's design by appropriately selecting tube parameters can improve the robot's effectiveness on a procedure- and patient-specific basis. In this paper, we present an algorithm that generates sets of concentric tube robot designs that can collectively maximize the reachable percentage of a given goal region in the human body. Our algorithm combines a search in the design space of a concentric tube robot using a global optimization method with a sampling-based motion planner in the robot's configuration space in order to find sets of designs that enable motions to goal regions while avoiding contact with anatomical obstacles. We demonstrate the effectiveness of our algorithm in a simulated scenario based on lung anatomy. PMID:26951790
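As an illustration of the sampling-based planning component, here is a minimal 2-D RRT sketch in Python; the coupling to the design-space optimizer is not reproduced, and the collision checker collision_free (a user-supplied edge test) and the bounds argument are assumptions, not the authors' code.

```python
import math, random

def rrt(start, goal, collision_free, bounds, step=0.05, iters=2000, tol=0.05):
    # nodes grows as the tree expands; parent[i] indexes each node's parent
    nodes, parent = [start], {0: None}
    for _ in range(iters):
        q = tuple(random.uniform(lo, hi) for lo, hi in bounds)  # random sample
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], q))
        n, d = nodes[i], math.dist(nodes[i], q)
        if d == 0.0:
            continue
        new = tuple(a + step * (b - a) / d for a, b in zip(n, q))  # steer
        if collision_free(n, new):                                 # edge test
            parent[len(nodes)] = i
            nodes.append(new)
            if math.dist(new, goal) < tol:                         # reached goal
                path, j = [], len(nodes) - 1
                while j is not None:
                    path.append(nodes[j])
                    j = parent[j]
                return path[::-1]
    return None  # no path found within the iteration budget
```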
Experimental investigation on selective laser melting of 17-4PH stainless steel
NASA Astrophysics Data System (ADS)
Hu, Zhiheng; Zhu, Haihong; Zhang, Hu; Zeng, Xiaoyan
2017-01-01
Selective laser melting (SLM) is an additive manufacturing (AM) technique that uses powders to fabricate 3D parts directly. The objective of this paper is to perform an experimental investigation of selective laser melted 17-4PH stainless steel. The investigation covered the influence of individual processing parameters on density, defects, and microhardness, and the influence of heat treatment on the mechanical properties. The outcomes of this study show that scan velocity and slice thickness have significant effects on the density and pore characteristics of SLMed parts, while the effect of hatch spacing depends on scan velocity. Processing parameters such as scan velocity, hatch spacing, and slice thickness all affect microhardness. Compared to samples with no heat treatment, the yield strength of the heat-treated sample increases significantly and the elongation decreases, owing to the transformation of the microstructure and changes in the precipitation-strengthening phases. Through a combination of compositional changes and precipitation strengthening, microhardness improved.
Cantrell, Keri B; Martin, Jerry H
2012-02-01
The concept of a designer biochar that targets the improvement of a specific soil property imposes the need for production processes to generate biochars with both high consistency and quality. These important production parameters can be affected by variations in process temperature that must be taken into account when controlling the pyrolysis of agricultural residues such as manures and other feedstocks. A novel stochastic state-space temperature regulator was developed to accurately match biochar batch production to a defined temperature input schedule. This was accomplished by describing the system's state-space with five temperature variables--four directly measured and one change in temperature. Relationships were derived between the observed state and the desired, controlled state. When testing the unit at two different temperatures, the actual pyrolytic temperature was within 3 °C of the control with no overshoot. This state-space regulator simultaneously controlled the indirect heat source and sample temperature by employing difficult-to-measure variables such as temperature stability in the description of the pyrolysis system's state-space. These attributes make a state-space controller an optimum control scheme for the production of a predictable, repeatable designer biochar. Published 2011 by John Wiley & Sons, Ltd.
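The paper's five-temperature stochastic model is not given in the abstract, so the following Python sketch only illustrates the generic shape of a discrete state-space regulator; the matrices A and B, the gain K, and the heater saturation are placeholder assumptions, not the authors' controller.

```python
import numpy as np

# Placeholder 2-state discrete model: x = [sample temperature, heater wall temperature].
A = np.array([[0.99, 0.005],
              [0.0,  0.998]])
B = np.array([0.0, 0.01])        # heater input enters through the wall state
K = np.array([4.0, 8.0])         # hand-tuned state-feedback gain

def regulator_step(x, x_ref, u_max=1.0):
    """One control step: state feedback drives x toward the scheduled setpoint."""
    u = float(np.clip(-K @ (x - x_ref), 0.0, u_max))  # saturated heater duty
    return A @ x + B * u, u
```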
NASA Technical Reports Server (NTRS)
James, John T.
2012-01-01
One mini-grab sample container (m-GSC) was returned aboard Space X1 because of the importance of quickly knowing first-entry conditions in this new commercial module. This sample was analyzed alongside samples of the portable clean room (PCR) used in the Space X complex at KSC. The recoveries of C-13-acetone, fluorobenzene, and chlorobenzene from the GSCs averaged 130, 129, and 132 %, respectively.
Evaluation of Space Power Materials Flown on the Passive Optical Sample Assembly
NASA Technical Reports Server (NTRS)
Jaworske, Donald A.; deGroh, Kim K.; Skowronski, Timothy J.; McCollum, Tim; Pippin, Gary; Bungay, Corey
1999-01-01
Evaluating the performance of materials on the exterior of spacecraft is of continuing interest, particularly in anticipation of those applications that will require a long duration in low Earth orbit. The Passive Optical Sample Assembly (POSA) experiment flown on the exterior of Mir as a risk mitigation experiment for the International Space Station was designed to better understand the interaction of materials with the low Earth orbit environment and to better understand the potential contamination threats that may be present in the vicinity of spacecraft. Deterioration in the optical performance of candidate space power materials due to the low Earth orbit environment, the contamination environment, or both, must be evaluated in order to propose measures to mitigate such deterioration. The thirty-two samples of space power materials studied here include solar array blanket materials such as polyimide Kapton H and SiO(x)-coated polyimide Kapton H, front-surface aluminized sapphire, solar dynamic concentrator materials such as silver on spin-coated polyimide and aluminum on spin-coated polyimide, CV 1144 silicone, and the thermal control paint Z-93-P. The physical and optical properties that were evaluated prior to and after the POSA flight include mass; total, diffuse, and specular reflectance; solar absorptance; and infrared emittance. Additional post-flight evaluation included scanning electron microscopy to observe surface features caused by the low Earth orbit environment and the contamination environment, and variable-angle spectroscopic ellipsometry to identify contaminant type and thickness. This paper summarizes the results of pre- and post-flight measurements, identifies the mechanisms responsible for optical property deterioration, and suggests improvements for the durability of materials in future missions.
Jáčová, Jaroslava; Gardlo, Alžběta; Friedecký, David; Adam, Tomáš; Dimandja, Jean-Marie D
2017-08-18
Orthogonality is a key parameter that is used to evaluate the separation power of chromatography-based two-dimensional systems. It is necessary to scale the separation data before the assessment of the orthogonality. Current scaling approaches are sample-dependent, and the extent of the retention space that is converted into a normalized retention space is set according to the retention times of the first and last analytes contained in a unique sample to elute. The presence or absence of a highly retained analyte in a sample can thus significantly influence the amount of information (in terms of the total amount of separation space) contained in the normalized retention space considered for the calculation of the orthogonality. We propose a Whole Separation Space Scaling (WOSEL) approach that accounts for the whole separation space delineated by the analytical method, and not the sample. This approach enables an orthogonality-based evaluation of the efficiency of the analytical system that is independent of the sample selected. The WOSEL method was compared to two currently used orthogonality approaches through the evaluation of in silico-generated chromatograms and real separations of human biofluids and petroleum samples. WOSEL exhibits sample-to-sample stability values of 3.8% on real samples, compared to 7.0% and 10.1% for the two other methods, respectively. Using real analyses, we also demonstrate that some previously developed approaches can provide misleading conclusions on the overall orthogonality of a two-dimensional chromatographic system. Copyright © 2017 Elsevier B.V. All rights reserved.
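The contrast between conventional, sample-dependent scaling and the WOSEL idea can be written in a few lines; this Python sketch assumes retention times in consistent units and that t_start and t_end delimit the separation space defined by the method itself.

```python
def scale_sample_dependent(times):
    """Conventional scaling: spans from the first to the last eluting analyte,
    so one highly retained compound changes every normalized coordinate."""
    t0, t1 = min(times), max(times)
    return [(t - t0) / (t1 - t0) for t in times]

def scale_wosel(times, t_start, t_end):
    """WOSEL-style scaling: spans the whole separation space delineated by
    the analytical method, so the normalization is sample-independent."""
    return [(t - t_start) / (t_end - t_start) for t in times]
```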
Characteristics of speaking style and implications for speech recognition.
Shinozaki, Takahiro; Ostendorf, Mari; Atlas, Les
2009-09-01
Differences in speaking style are associated with more or less spectral variability, as well as different modulation characteristics. The greater variation in some styles (e.g., spontaneous speech and infant-directed speech) poses challenges for recognition but possibly also opportunities for learning more robust models, as evidenced by prior work and motivated by child language acquisition studies. In order to investigate this possibility, this work proposes a new method for characterizing speaking style (the modulation spectrum), examines spontaneous, read, adult-directed, and infant-directed styles in this space, and conducts pilot experiments in style detection and sampling for improved speech recognizer training. Speaking style classification is improved by using the modulation spectrum in combination with standard pitch and energy variation. Speech recognition experiments on a small vocabulary conversational speech recognition task show that sampling methods for training with a small amount of data benefit from the new features.
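One plausible way to compute a modulation spectrum is sketched below in Python with SciPy; the frame sizes, log-energy envelope, and averaging across bands are assumptions rather than the paper's exact recipe.

```python
import numpy as np
from scipy.signal import spectrogram

def modulation_spectrum(x, fs, nperseg=400, noverlap=300):
    """Band-averaged modulation spectrum: FFT, over time, of each
    spectrogram band's log-energy envelope."""
    f, t, S = spectrogram(x, fs, nperseg=nperseg, noverlap=noverlap)
    env = np.log(S + 1e-10)                       # log-energy envelope per band
    env = env - env.mean(axis=1, keepdims=True)   # remove per-band DC offset
    M = np.abs(np.fft.rfft(env, axis=1))          # modulation along time axis
    frame_rate = fs / (nperseg - noverlap)        # frames per second
    mod_freqs = np.fft.rfftfreq(env.shape[1], d=1.0 / frame_rate)
    return mod_freqs, M.mean(axis=0)              # average across bands
```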
Keane, M P; McGee, M; O'Riordan, E G; Kelly, A K; Earley, B
2017-12-01
Accommodating cattle indoors during the winter is widely practiced throughout Europe. There is currently no legislation surrounding the space allowance and floor type that should be provided to cattle during this time; however, concerns have been raised regarding the type of housing systems currently in use. The objective of the study was to investigate the effect of space allowance and floor type on the performance and welfare of finishing beef heifers. Continental crossbred heifers (n=240; mean initial live weight 504 (SD 35.8) kg) were blocked by breed, weight and age and randomly assigned to one of four treatments: (i) 3.0 m2, (ii) 4.5 m2 and (iii) 6.0 m2 space allowance per animal on a fully slatted concrete floor, and (iv) 6.0 m2 space allowance per animal on a straw-bedded floor, for 105 days. Heifers were offered a total mixed ration ad libitum. Dry matter intake was recorded on a pen basis and refusals were weighed back twice weekly. Heifers were weighed, dirt scored and blood sampled every 3 weeks. Whole blood was analysed for complete cell counts and serum samples were assayed for metabolite concentrations. Behaviour was recorded continuously using IR cameras from days 70 to 87. Heifers' hooves were inspected for lesions at the start of the study and again after slaughter. Post-slaughter, carcass weight, conformation and fat scores and hide weight were recorded. Heifers housed at 4.5 m2 had a greater average daily live weight gain (ADG) than those on both of the other concrete slat treatments; however, space allowance had no effect on carcass weight. Heifers accommodated on straw had a greater ADG (0.15 kg) (P<0.05), heavier hides (P<0.01), a better feed conversion ratio (P<0.05), and greater dirt scores (P<0.05) at slaughter than heifers accommodated on concrete slats at 6.0 m2. The number of heifers lying at any one time was greater (P<0.001) on straw than on concrete slats. Space allowance and floor type had no effect on the number of hoof lesions gained or on any of the haematological or metabolic variables measured. It was concluded that increasing space allowance above 3.0 m2/animal on concrete slats was of no benefit to animal performance, although it did improve animal cleanliness. Housing heifers on straw instead of concrete slats improved ADG and increased lying time; however, carcass weight was not affected.
Design, Fabrication and Characterization of Micro Opto-Electro-Mechanical Systems.
1995-12-01
interference problems (see Fig. 3-6). Improvements in the lithography of the MCNC process would allow for grating spaces of less than 2 μm and therefore ... A micro-spectrometer has been fabricated using LIGA, an acronym for lithography, electroforming, and micromolding (the acronym came from the German ... location for test samples and an adjustable mirror. The beams are brought back together to form an interference pattern. At an observation screen the
An efficient genetic algorithm for maximum coverage deployment in wireless sensor networks.
Yoon, Yourim; Kim, Yong-Hyuk
2013-10-01
Sensor networks have a lot of applications such as battlefield surveillance, environmental monitoring, and industrial diagnostics. Coverage is one of the most important performance metrics for sensor networks since it reflects how well a sensor field is monitored. In this paper, we introduce the maximum coverage deployment problem in wireless sensor networks and analyze the properties of the problem and its solution space. Random deployment is the simplest way to deploy sensor nodes but may cause unbalanced deployment; therefore, we need a more intelligent deployment strategy. We found that the phenotype space of the problem is a quotient space of the genotype space in a mathematical view. Based on this property, we propose an efficient genetic algorithm using a novel normalization method. A Monte Carlo method is adopted to design an efficient evaluation function, and its computation time is decreased without loss of solution quality using a method that starts from a small number of random samples and gradually increases the number for subsequent generations. The proposed genetic algorithm can be further improved by combining it with a well-designed local search. The performance of the proposed genetic algorithm is shown by a comparative experimental study. When compared with random deployment and existing methods, our genetic algorithm was not only about twice as fast, but also showed significant performance improvement in quality.
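The Monte Carlo evaluation function can be sketched directly in Python; per the schedule described above, n_samples would start small and grow across generations. The unit rectangular field, single sensing radius, and uniform sampling are illustrative assumptions.

```python
import math, random

def coverage(sensors, radius, field=(1.0, 1.0), n_samples=1000):
    """Monte Carlo estimate of the fraction of the field covered by at
    least one sensor; sensors is a list of (x, y) positions."""
    hits = 0
    for _ in range(n_samples):
        p = (random.random() * field[0], random.random() * field[1])
        if any(math.dist(p, s) <= radius for s in sensors):
            hits += 1
    return hits / n_samples
```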
Infusion of innovative technologies for mission operations
NASA Astrophysics Data System (ADS)
Donati, Alessandro
2010-11-01
The Advanced Mission Concepts and Technologies Office (Mission Technologies Office, MTO for short) at the European Space Operations Centre (ESOC) of ESA is entrusted with research and development of innovative mission operations concepts and systems, and provides operations support to special projects. Visions of future missions and requests for improvements from currently flying missions are the two major sources of inspiration for conceptualizing innovative or improved mission operations processes, including monitoring and diagnostics, planning and scheduling, and resource management and optimization. The newly identified operations concepts are then proven by means of prototypes, built with embedded enabling technology and deployed as shadow applications in mission operations for an extended validation phase. The technology exploited so far spans informatics, artificial intelligence, and operational research. Recent outstanding results include artificial intelligence planning and scheduling applications for Mars Express, an advanced integrated space weather monitoring system for the Integral space telescope, and a suite of growing client applications for MUST (Mission Utilities Support Tools). The research, development and validation activities at the Mission Technologies Office are performed together with a network of research institutes across Europe. The objective is narrowing the gap between enabling, innovative technology and space mission operations. The paper first addresses samples of technology infusion cases with their lessons learnt. The second part is focused on the process and the methodology used at the Mission Technologies Office to fulfill its objectives.
Improved FFT-based numerical inversion of Laplace transforms via fast Hartley transform algorithm
NASA Technical Reports Server (NTRS)
Hwang, Chyi; Lu, Ming-Jeng; Shieh, Leang S.
1991-01-01
The disadvantages of numerical inversion of the Laplace transform via the conventional fast Fourier transform (FFT) are identified and an improved method is presented to remedy them. The improved method is based on introducing a new integration step length Δω = π/(mT) for the trapezoidal-rule approximation of the Bromwich integral, in which a new parameter, m, is introduced for controlling the accuracy of the numerical integration. Naturally, this method leads to multiple sets of complex FFT computations. A new inversion formula is derived such that N equally spaced samples of the inverse Laplace transform function can be obtained by (m/2) + 1 sets of N-point complex FFT computations or by m sets of real fast Hartley transform (FHT) computations.
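The refined trapezoidal rule can be prototyped directly from the quantities named above. For clarity, this Python sketch evaluates the Bromwich sum naively instead of via the (m/2)+1 FFTs or m FHTs of the paper, and the contour abscissa sigma and series truncation are illustrative choices.

```python
import numpy as np

def ilt_trapezoid(F, T, N, m=4, sigma=0.5):
    """Trapezoidal-rule Bromwich inversion with refined step dw = pi/(m*T),
    returning N equally spaced samples of f on [0, T)."""
    t = np.arange(N) * (T / N)
    dw = np.pi / (m * T)                 # the improved integration step
    k = np.arange(1, m * N + 1)          # series truncation (illustrative)
    Fk = F(sigma + 1j * k * dw)          # transform samples on the Bromwich line
    series = 0.5 * np.real(F(sigma + 0j)) + \
        np.real(Fk[None, :] * np.exp(1j * np.outer(t, k * dw))).sum(axis=1)
    return np.exp(sigma * t) * dw / np.pi * series

# Example: F(s) = 1/(s + 1) should invert to f(t) = exp(-t).
f_approx = ilt_trapezoid(lambda s: 1.0 / (s + 1.0), T=10.0, N=64)
```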
Nowroozi, Amin; Shahlaei, Mohsen
2017-02-01
In this study, a computational pipeline was devised to overcome homology modeling (HM) bottlenecks. Coupling HM with molecular dynamics (MD) simulation is useful in that it tackles the sampling deficiency of dynamics simulations by providing good-quality initial guesses for the native structure. HM also relaxes the severe demand placed on force fields to explore the huge conformational space of protein structures. Here, the interaction between the human bombesin receptor subtype-3 and MK-5046 was investigated by integrating HM, molecular docking, and MD simulations. To improve conformational sampling in typical MD simulations of GPCRs, as in other biomolecules, multiple trajectories with different initial conditions can be employed rather than a single long trajectory. Multiple MD simulations of human bombesin receptor subtype-3 with different initial atomic velocities were applied to sample conformations in the vicinity of the structure generated by HM. The backbone-atom conformational space distribution of the replicates was analyzed using principal components analysis. As a result, the averages of structural and dynamic properties over the twenty-one trajectories differ significantly from those obtained from individual trajectories.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The NASA Centennial Challenges prize, level one, is presented to team Mountaineers for successfully completing level one of the NASA 2014 Sample Return Robot Challenge, from left, Ryan Watson, Team Mountaineers; Lucas Behrens, Team Mountaineers; Jarred Strader, Team Mountaineers; Yu Gu, Team Mountaineers; Scott Harper, Team Mountaineers; Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate; Laurie Leshin, Worcester Polytechnic Institute (WPI) President; David Miller, NASA Chief Technologist; Alexander Hypes, Team Mountaineers; Nick Ohi, Team Mountaineers; Marvin Cheng, Team Mountaineers; Sam Ortega, NASA Program Manager for Centennial Challenges; and Tanmay Mandal, Team Mountaineers, Saturday, June 14, 2014, at Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Mountaineers was the only team to complete the level one challenge. During the competition, teams were required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge was to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The NASA Centennial Challenges prize, level one, is presented to team Mountaineers for successfully completing level one of the NASA 2014 Sample Return Robot Challenge, from left, Ken Stafford, WPI Challenge technical advisor; Colleen Shaver, WPI Challenge Manager; Ryan Watson, Team Mountaineers; Marvin Cheng, Team Mountaineers; Alexander Hypes, Team Mountaineers; Jarred Strader, Team Mountaineers; Lucas Behrens, Team Mountaineers; Yu Gu, Team Mountaineers; Nick Ohi, Team Mountaineers; Dorothy Rasco, NASA Deputy Associate Administrator for the Space Technology Mission Directorate; Scott Harper, Team Mountaineers; Tanmay Mandal, Team Mountaineers; David Miller, NASA Chief Technologist; Sam Ortega, NASA Program Manager for Centennial Challenges, Saturday, June 14, 2014, at Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Mountaineers was the only team to complete the level one challenge. During the competition, teams were required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge was to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
NASA Astrophysics Data System (ADS)
Grayver, Alexander V.; Kuvshinov, Alexey V.
2016-05-01
This paper presents a methodology to sample the equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied a state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and that the algorithm is embarrassingly parallel. We formulated the problem using the generalized Gaussian distribution, which enabled us to seamlessly use arbitrary norms for the residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how the performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using the information CMAES provides. This methodology was tested using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
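A compact sketch of the two-stage idea using the open-source cma package is given below; the misfit threshold defining "equivalent" models is an assumption, and the paper's norm choices and the MCMC refinement stage are not reproduced.

```python
import numpy as np
import cma  # pip install cma

def sample_equivalence_domain(misfit, x0, sigma0=0.5, threshold=1.0):
    """Stage 1: CMA-ES drives the search toward low-misfit regions.
    Stage 2 (simplified): every evaluated model under the threshold is
    retained as a member of the equivalence-domain ensemble."""
    es = cma.CMAEvolutionStrategy(x0, sigma0)
    ensemble = []
    while not es.stop():
        X = es.ask()                                   # sample candidate models
        f = [float(misfit(np.asarray(x))) for x in X]
        es.tell(X, f)                                  # adapt mean and covariance
        ensemble += [x for x, fx in zip(X, f) if fx < threshold]
    return es.result.xbest, ensemble
```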
Attitudes towards personal and shared space during the flight.
Ahmadpour, N; Kühne, M; Robert, J-M; Vink, P
2016-07-25
Aircraft passenger comfort experience was previously defined based on its underlying thematic components, representing passengers' perception of environmental elements and their link to passengers' concerns. This paper aims to 1) identify aircraft passengers' attitudes towards their personal and shared space in the cabin environment during the flight, which are linked to their comfort experience, and 2) highlight passenger concerns associated with those attitudes. A study involving 16 participants was conducted, collecting full accounts of their real-time flight experiences aboard commercial aircraft using questionnaires. Four types of attitudes were identified in reaction to participants' personal and shared space during the flight, described as adjust, avoid, approach, and shield. The passenger concerns associated with those attitudes were, respectively: control, privacy, connectedness and tolerance. It is concluded that passenger comfort can be improved once the identified concerns and attitudes are addressed in the design of the aircraft seat and interior. Design recommendations are provided accordingly.
Specimen Sample Preservation for Cell and Tissue Cultures
NASA Technical Reports Server (NTRS)
Meeker, Gabrielle; Ronzana, Karolyn; Schibner, Karen; Evans, Robert
1996-01-01
The era of the International Space Station, with its longer-duration missions, will pose unique challenges to microgravity life sciences research. The Space Station Biological Research Project (SSBRP) is responsible for addressing these challenges and defining the science requirements necessary to conduct life science research on board the International Space Station. Space Station will support a wide range of cell and tissue culture experiments for durations of 1 to 30 days. Space Shuttle flights to bring experimental samples back to Earth for analyses will only occur every 90 days; therefore, samples may have to be retained for periods of up to 60 days. This presents a new challenge in fresh specimen sample storage for cell biology. Fresh specimen samples are defined as samples that are preserved by means other than fixation and cryopreservation. The challenge of long-term storage of fresh specimen samples includes the need to suspend or inhibit proliferation and metabolism pending return to Earth-based laboratories. With this challenge being unique to space research, no ground-based studies have been performed to address this issue. It was decided by SSBRP that experiment support studies were needed to address the following issues: Fixative Solution Management; Media Storage Conditions; Fresh Specimen Sample Storage of Mammalian Cell/Tissue Cultures; Fresh Specimen Sample Storage of Plant Cell/Tissue Cultures; Fresh Specimen Sample Storage of Aquatic Cell/Tissue Cultures; and Fresh Specimen Sample Storage of Microbial Cell/Tissue Cultures. The objective of these studies was to derive a set of conditions and recommendations that can be used in a long-duration microgravity environment such as Space Station to permit extended storage of cell and tissue culture specimens in a state consistent with zero or minimal growth, while at the same time maintaining their stability and viability.
NASA GRC and MSFC Space-Plasma Arc Testing Procedures
NASA Technical Reports Server (NTRS)
Ferguson, Dale C.; Vayner, Boris V.; Galofaro, Joel T.; Hillard, G. Barry; Vaughn, Jason; Schneider, Todd
2007-01-01
Tests of arcing and current collection in simulated space plasma conditions have been performed at the NASA Glenn Research Center (GRC) in Cleveland, Ohio, for over 30 years and at the Marshall Space Flight Center (MSFC) in Huntsville, Alabama, for almost as long. During this period, proper test conditions for accurate and meaningful space simulation have been worked out, comparisons with actual space performance in spaceflight tests and with real operational satellites have been made, and NASA has achieved our own internal standards for test protocols. It is the purpose of this paper to communicate the test conditions, test procedures, and types of analysis used at NASA GRC and MSFC to the space environmental testing community at large, to help with international space-plasma arcing-testing standardization. Discussed herein are neutral gas conditions, plasma densities and uniformity, vacuum chamber sizes, sample sizes and Debye lengths, biasing samples versus self-generated voltages, floating samples versus grounded samples, test electrical conditions, arc detection, preventing sustained discharges during testing, real samples versus idealized samples, validity of LEO tests for GEO samples, extracting arc threshold information from arc rate versus voltage tests, snapover, current collection, and glows at positive sample bias, Kapton pyrolysis, thresholds for trigger arcs, sustained arcs, dielectric breakdown and Paschen discharge, tether arcing and testing in very dense plasmas (i.e. thruster plumes), arc mitigation strategies, charging mitigation strategies, models, and analysis of test results. Finally, the necessity of testing will be emphasized, not to the exclusion of modeling, but as part of a complete strategy for determining when and if arcs will occur, and preventing them from occurring in space.
Resolving the Aerosol Piece of the Global Climate Picture
NASA Astrophysics Data System (ADS)
Kahn, R. A.
2017-12-01
Factors affecting our ability to calculate climate forcing and estimate model predictive skill include direct radiative effects of aerosols and their indirect effects on clouds. Several decades of Earth-observing satellite observations have produced a global aerosol column-amount (AOD) record, but an aerosol microphysical property record required for climate and many air quality applications is lacking. Surface-based photometers offer qualitative aerosol-type classification, and several space-based instruments map aerosol air-mass types under favorable conditions. However, aerosol hygroscopicity, mass extinction efficiency (MEE), and quantitative light absorption, must be obtained from in situ measurements. Completing the aerosol piece of the climate picture requires three elements: (1) continuing global AOD and qualitative type mapping from space-based, multi-angle imagers and aerosol vertical distribution from near-source stereo imaging and downwind lidar, (2) systematic, quantitative in situ observations of particle properties unobtainable from space, and (3) continuing transport modeling to connect observations to sources, and extrapolate limited sampling in space and time. At present, the biggest challenges to producing the needed aerosol data record are: filling gaps in particle property observations, maintaining global observing capabilities, and putting the pieces together. Obtaining the PDFs of key particle properties, adequately sampled, is now the leading observational deficiency. One simplifying factor is that, for a given aerosol source and season, aerosol amounts often vary, but particle properties tend to be repeatable. SAM-CAAM (Systematic Aircraft Measurements to Characterize Aerosol Air Masses), a modest aircraft payload deployed frequently could fill this gap, adding value to the entire satellite data record, improving aerosol property assumptions in retrieval algorithms, and providing MEEs to translate between remote-sensing optical constraints and aerosol mass book-kept in climate models [Kahn et al., BAMS 2017]. This will also improve connections between remote-sensing particle types and those defined in models. The third challenge, maintaining global observing capabilities, requires continued community effort and good budgetary fortune.
Gilbertson, Jan; Stevens, Maryjane; Stiell, Bernadette; Thorogood, Nicki
2006-08-01
This paper reports the results of research carried out as part of the national health impact evaluation of the Warm Front Scheme, a government initiative aimed at alleviating fuel poverty in England. Semi-structured interviews were carried out in a purposive sample of 49 households which received home energy improvements under the Scheme from five urban areas (Birmingham, Liverpool, Manchester, Newcastle, Southampton). Each household had received installation, replacement or refurbishment of the heating system and, in some cases, also insulation of the cavity wall or loft or both, and draught-proofing measures. Most householders reported improved and more controllable warmth and hot water. Many also reported perceptions of improved physical health and comfort, especially of mental health and emotional well-being and, in several cases, the easing of symptoms of chronic illness. There were reports of improved family relations, an expansion of the domestic space used during cold months, greater use of kitchens and improved nutrition, increased privacy, improved social interaction, and an increase in comfort and atmosphere within the home. Greater warmth and comfort also enhanced emotional security, and recipients were more content and at ease in their homes. However there was little evidence of substantially lower heating bills. These results provide evidence that Warm Front home energy improvements are accompanied by appreciable benefits in terms of use of living space, comfort and quality of life, physical and mental well-being, although there is only limited evidence of change in health behaviour.
Hood, Maureen N; Ho, Vincent B; Foo, Thomas K F; Marcos, Hani B; Hess, Sandra L; Choyke, Peter L
2002-09-01
Peripheral magnetic resonance angiography (MRA) is growing in use. However, methods of performing peripheral MRA vary widely and continue to be optimized, especially for improvement in illustration of infrapopliteal arteries. The main purpose of this project was to identify imaging factors that can improve arterial visualization in the lower leg using bolus chase peripheral MRA. Eighteen healthy adults were imaged on a 1.5T MR scanner. The calf was imaged using conventional three-station bolus chase three-dimensional (3D) MRA, two dimensional (2D) time-of-flight (TOF) MRA and single-station Gadolinium (Gd)-enhanced 3D MRA. Observer comparisons of vessel visualization, signal to noise ratios (SNR), contrast to noise ratios (CNR) and spatial resolution comparisons were performed. Arterial SNR and CNR were similar for all three techniques. However, arterial visualization was dramatically improved on dedicated, arterial-phase Gd-enhanced 3D MRA compared with the multi-station bolus chase MRA and 2D TOF MRA. This improvement was related to optimization of Gd-enhanced 3D MRA parameters (fast injection rate of 2 mL/sec, high spatial resolution imaging, the use of dedicated phased array coils, elliptical centric k-space sampling and accurate arterial phase timing for image acquisition). The visualization of the infrapopliteal arteries can be substantially improved in bolus chase peripheral MRA if voxel size, contrast delivery, and central k-space data acquisition for arterial enhancement are optimized. Improvements in peripheral MRA should be directed at these parameters.
1985-03-01
comparison of samples would be difficult. (5) A restrictive random sample allows the sample to be irregularly spaced throughout the auxiliary variable space ... looking or downward-looking probes and the very low background radiation from space contribute to high signal-to-noise ratio and allow the ... sunshine and earthshine, chemiluminescent processes, and radiation to space, in addition to collisional processes, determine the vibrational
METAL RESISTIVITY MEASURING DEVICE
Renken, J. Jr.; Myers, R.G.
1960-12-20
An eddy current device is offered for detecting discontinuities in metal samples. Alternate short and long duration pulses are inductively applied to a metal sample via the outer coil of a probe. The long pulses give a resultant signal from the metal sample responsive to probe-to-sample spacing and discontinuities within the sample, and the short pulses give a resultant signal responsive only to probe-to-sample spacing. The inner coil of the probe detects the two resultant signals and transmits them to a separation network where the two signals are separated. The two separated signals are then transmitted to a compensation network where the detected signals due to the short pulses are used to compensate for variations due to probe-to-sample spacing contained in the detected signals from the long pulses. Thus, a resultant signal is obtained responsive to discontinuities within the sample and independent of probe-to-sample spacing.
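The compensation network's arithmetic reduces to subtracting the spacing-only response from the combined response. In this Python sketch, spacing_gain is a hypothetical calibration constant mapping the short-pulse signal onto the lift-off component of the long-pulse signal.

```python
def compensate(v_long, v_short, spacing_gain):
    """v_long responds to spacing + discontinuities; v_short to spacing
    only. Subtracting the calibrated spacing term leaves a signal
    responsive to discontinuities alone."""
    return [vl - spacing_gain * vs for vl, vs in zip(v_long, v_short)]
```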
Generalized probabilistic scale space for image restoration.
Wong, Alexander; Mishra, Akshaya K
2010-10-01
A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.
Polarization Signals of Common Spacecraft Materials
NASA Technical Reports Server (NTRS)
Gravseth, Ian; Culp, Robert D.; King, Nicole
1996-01-01
This is the final report documenting the results of the polarization testing of near-planar objects with various reflectance properties. The purpose of this investigation was to determine the portion of the reflected signal which is polarized for materials commonly used in space applications. Tests were conducted on several samples, with surface characteristics ranging from highly reflective to relatively dark. The measurements were obtained by suspending the test object in a beam of collimated light. The amount of light falling on the sample was controlled by a circular aperture placed in the light field. The polarized reflectance at various phase angles was then measured. A nonlinear least squares fitting program was used for analysis. For the specular test objects, the reflected signals were measured in one degree increments near the specular point. Otherwise, measurements were taken every five degrees in phase angle. Generally, the more diffuse surfaces had lower polarized reflectances than their more specular counterparts. The reflected signals for the more diffuse surfaces were spread over a larger phase angle range, while the signals from the more specular samples were reflected almost entirely within five degrees of angular deviation from the specular point. The method used to test all the surfaces is presented. The results of this study will be used to support the NASA Orbital Debris Optical Signature Tests. These tests are intended to help better understand the reflectance properties of materials often used in space applications. This data will then be used to improve the capabilities for identification and tracking of space debris.
International Space Station Air Quality Assessed According to Toxicologically-Grouped Compounds
NASA Technical Reports Server (NTRS)
James, John T.; Limero, Thomas F.; Beck, Steve; Cheng, Patti F.; deVera, Vanessa J.; Hand, Jennifer; Macatangay, Ariel
2010-01-01
Scores of compounds are found in the International Space Station (ISS) atmospheric samples that are returned to the Johnson Space Center Toxicology Laboratory for analysis. Spacecraft Maximum Allowable Concentrations (SMACs) are set with the view that each compound is present as if there were no other compounds present. In order to apply SMACs to the interpretation of the analytical data, the toxicologist must employ some method of combining the potential effects of the aggregate of compounds found in the atmospheric samples. The simplest approach is to assume that each quantifiable compound has the potential for some effect in proportion to the applicable SMAC, and then add all the proportions. This simple paradigm disregards the fact that most compounds have potential to adversely affect only a few physiological systems, and their effects would be independent rather than additive. An improved approach to dealing with exposure to mixtures is to add the proportions only for compounds that adversely affect the same physiological system. For example, toxicants that cause respiratory irritation are separated from those that cause neurotoxicity or cardio-toxicity. Herein we analyze ISS air quality data according to toxicological groups with a view that this could be used for understanding any crew symptoms occurring at the time of the sample acquisition. In addition, this approach could be useful in post-flight longitudinal surveys where the flight surgeon may need to identify post-flight, follow-up medical studies because of on-orbit exposures that target specific physiological systems.
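The grouped-additivity rule described above amounts to summing concentration-to-SMAC ratios within each physiological target group. A minimal Python sketch follows, with hypothetical dict inputs keyed by compound name.

```python
def group_exposure_indices(conc, smac, group_of):
    """Sum C_i / SMAC_i only over compounds sharing a physiological
    target; a group index above 1.0 flags that system for follow-up."""
    indices = {}
    for compound, group in group_of.items():
        indices[group] = indices.get(group, 0.0) + conc[compound] / smac[compound]
    return indices

# e.g. group_of = {"acetone": "irritant", "toluene": "CNS", "xylene": "CNS"}
```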
NASA Astrophysics Data System (ADS)
Stall, S.
2016-12-01
The story of a sample starts with a proposal, a data management plan, and funded research. The sample is created, given a unique identifier (IGSN) and properly cared for during its journey to an appropriate storage location. Through its metadata, and publication information, the sample can become well known and shared with other researchers. Ultimately, a valuable sample can tell its entire story through its IGSN, associated ORCIDs, associated publication DOIs, and DOIs of data generated from sample analysis. This journey, or workflow, is in many ways still manual. Tools exist to generate IGSNs for the sample and subsamples. Publishers are committed to making IGSNs machine readable in their journals, but the connection back to the IGSN management system, specifically the System for Earth Sample Registration (SESAR) is not fully complete. Through encouragement of publishers, like AGU, and improved data management practices, such as those promoted by AGU's Data Management Assessment program, the complete lifecycle of a sample can and will be told through the journey it takes from creation, documentation (metadata), analysis, subsamples, publication, and sharing. Publishers and data facilities are using efforts like the Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) to "implement and promote common policies and procedures for the publication and citation of data across Earth Science journals", including IGSNs. As our community improves its data management practices and publishers adopt and enforce machine readable use of unique sample identifiers, the ability to tell the entire story of a sample is close at hand. Better Data Management results in Better Science.
Geological trainings for analogue astronauts: Lessons learned from MARS2013 expedition, Morocco
NASA Astrophysics Data System (ADS)
Orgel, C.; Achorner, I.; Losiak, A.; Gołębiowska, I.; Rampey, M.; Groemer, G.
2013-09-01
The Austrian Space Forum (OeWF) is a national organisation for space professionals and space enthusiasts. In collaboration with partner organisations, the OeWF focuses on Mars analogue research with its space volunteers, organises space-related outreach and education activities, and conducts field tests with the Aouda.X and Aouda.S spacesuit simulators in Mars analogue environments. The main project of the OeWF is called "PolAres" [1]. As a result of lessons learned from the Río Tinto 2011 expedition [4], we started organising geological training sessions for the analogue astronauts. The idea was to give them a basic geological background so they could perform more efficiently in the field. This was done in close imitation of the Apollo astronaut training, which included theoretical lectures (between Jan. 1963 and Nov. 1972) on impact geology, igneous petrology of the Moon, geophysics and geochemistry, as well as several field trips, to make the astronauts capable of collecting useful samples for the geoscientists on Earth [3] [5]. In the last year the OeWF has organised three geoscience workshops for analogue astronauts as part of their "astronaut" training. The aim was to educate the participants so that they understand the fundamentals of geology in theory and in the field (Fig. 1). We proposed the "Geological Experiment Sampling Usefulness" (GESU) experiment for the MARS2013 simulation to improve the efficiency of the geological training. This simulation, a one-month Mars analogue mission, was conducted during February 2013 in the desert of Morocco [2] (Fig. 2).
NASA Astrophysics Data System (ADS)
Francesco, Canganella; Giovanna, Bianconi
2007-09-01
The present work focused mainly on studying the response of representative non-pathogenic microorganisms to the environment inside the space vehicle at different mission stages (10, 56, and 226 days) within the frame of the Italian ENEIDE mission, from February to October 2005. Microorganisms were chosen according to their phylogenetic position and cell structures; they were representatives of the three taxonomic domains and belonged to different ecosystems (food, soil, intestinal tract, plants, deep sea). They were the following: Thermococcus guaymasensis (Domain Archaea); Saccharomyces cerevisiae (Domain Eucarya); Escherichia coli, Bacillus subtilis, Lactobacillus acidophilus, Enterococcus faecium, Pseudomonas fluorescens, and Rhizobium tropici (Domain Bacteria). The main environmental parameters of interest were: a) space radiation; b) microgravity; c) temperature. The response of the microorganisms was investigated in terms of survival rates, cell structure modifications, and genomic damage. The survival of cells was affected by both radiation dose and intrinsic cell features. As expected, only samples kept on the ISS for 226 days showed significant levels of mortality. As far as the effect on cell structures is concerned, these samples also showed remarkable morphological changes, particularly for Escherichia coli, Enterococcus faecium, and Saccharomyces cerevisiae. The data collected provided new insights into the biological traits of microorganisms exposed to the space environment during flight on a spacecraft. Moreover, the results obtained may be important for improving human conditions aboard space vehicles (nutraceuticals for astronauts and disinfection of ISS modules) and also for the potential development of closed systems devoted to vegetable production and organic recycling.
NASA Astrophysics Data System (ADS)
Ngom, Ndèye Fatou; Monga, Olivier; Ould Mohamed, Mohamed Mahmoud; Garnier, Patricia
2012-02-01
This paper focuses on the modeling of soil microstructures using generalized cylinders, with a specific application to pore space. The geometric modeling of these microstructures is a recent area of study, made possible by the improved performance of computed tomography techniques. X-ray scanners provide very-high-resolution 3D volume images (3-5 μm) of soil samples in which pore spaces can be extracted by thresholding. However, in most cases, the pore space defines a complex volume shape that cannot be approximated using simple analytical functions. We propose representing this shape using a compact, stable, and robust piecewise approximation by means of generalized cylinders. This intrinsic shape representation preserves the shape's topological and geometric properties. Our algorithm includes three main processing stages. The first stage consists of describing the volume shape using a minimum number of balls included within the shape, such that their union recovers the shape skeleton. The second stage involves the optimum extraction of simply connected chains of balls. The final stage handles the approximation of each optimal simply connected chain using generalized cylinders: circular generalized cylinders, tori, cylinders, and truncated cones. This technique was applied to several data sets formed by real volume computed tomography soil samples. It was possible to demonstrate that our geometric representation supplied a good approximation of the pore space. We also stress the compactness and robustness of this method with respect to any changes affecting the initial data, as well as its coherence with the intuitive notion of pores. During future studies, this geometric pore space representation will be used to simulate biological dynamics.
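The final approximation stage lends itself to a short illustration. Below is a minimal sketch (not the authors' implementation) of fitting one generalized-cylinder primitive, a truncated cone, to an ordered chain of balls; the function name and toy data are illustrative assumptions.

```python
# Minimal sketch: approximate a simply connected chain of balls -- ordered
# (center, radius) pairs along a pore branch -- by a truncated cone, one of
# the generalized-cylinder primitives mentioned above.
import numpy as np

def fit_truncated_cone(centers, radii):
    """Fit a straight axis plus a linearly varying radius to a ball chain."""
    centers = np.asarray(centers, float)
    radii = np.asarray(radii, float)
    mean = centers.mean(axis=0)
    # Principal direction of the centers = cone axis (least-squares line).
    _, _, vt = np.linalg.svd(centers - mean)
    axis = vt[0]
    # Coordinate of each ball center along the axis.
    t = (centers - mean) @ axis
    # Linear fit radius(t) = r0 + slope*t (truncated-cone profile;
    # slope ~ 0 degenerates to a circular cylinder).
    A = np.column_stack([np.ones_like(t), t])
    (r0, slope), *_ = np.linalg.lstsq(A, radii, rcond=None)
    rms = np.sqrt(np.mean((A @ np.array([r0, slope]) - radii) ** 2))
    return mean, axis, r0, slope, rms

# Toy chain: a pore branch that narrows from radius 5 to 3 voxels.
centers = np.column_stack([np.linspace(0, 20, 10), np.zeros(10), np.zeros(10)])
radii = np.linspace(5.0, 3.0, 10)
print(fit_truncated_cone(centers, radii))
```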
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, Aurelien; Fini, Paul T.; Houser, Kevin W.
We have developed a two-measure system for evaluating light sources' color rendition that builds upon the conceptual progress of numerous researchers over the last two decades. The system quantifies the color fidelity and color gamut (change in object chroma) of a light source in comparison to a reference illuminant. The calculations are based on a newly developed set of reflectance data from real samples uniformly distributed in color space (thereby fairly representing all colors) and in wavelength space (thereby precluding artificial optimization of the color rendition scores by spectral engineering). The color fidelity score Rf is an improved version of the CIE color rendering index. The color gamut score Rg is an improved version of the Gamut Area Index. In combination, they provide two complementary assessments to guide the optimization of future light sources. This method summarizes the findings of the Color Metric Task Group of the Illuminating Engineering Society of North America (IES). It is adopted in the upcoming IES TM-30-2015, and is proposed for consideration by the International Commission on Illumination (CIE).
Research on Robot Pose Control Technology Based on Kinematics Analysis Model
NASA Astrophysics Data System (ADS)
Liu, Dalong; Xu, Lijuan
2018-01-01
To improve the attitude stability of a robot, this paper proposes an attitude control method based on a kinematics analysis model, addressing posture transformation during walking, grasping control, and the kinematic motion-planning problem. In a Cartesian-space analytical model, a three-axis accelerometer, a magnetometer, and a three-axis gyroscope are combined for attitude measurement; the gyroscope data are processed with a Kalman filter, and the robot attitude angles are obtained with the quaternion method. Stable inertia parameters are obtained according to the centroids of the robot's moving parts, and a random-sampling RRT motion-planning method is used to drive the space robot accurately to any commanded position, ensuring that the end effector follows a prescribed trajectory under attitude control. Positioning-accuracy experiments were carried out using the MT-R robot as the test platform. The simulation results show that the proposed method has better robustness and higher positioning accuracy, improving the reliability and safety of robot operation.
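As an illustration of the attitude-estimation step, here is a minimal sketch of propagating a unit quaternion from gyro rates; it is a generic first-order integrator under ideal-gyro assumptions, not the paper's Kalman-filtered pipeline.

```python
# Minimal sketch of quaternion attitude propagation from gyro body rates,
# as used in IMU-based attitude pipelines like the one described above.
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions q and r, stored as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """Propagate unit quaternion q by body rates omega (rad/s) over dt."""
    dq = 0.5 * dt * quat_mult(q, np.array([0.0, *omega]))
    q = q + dq
    return q / np.linalg.norm(q)  # renormalize to stay on the unit sphere

q = np.array([1.0, 0.0, 0.0, 0.0])       # identity attitude
omega = np.array([0.0, 0.0, np.pi / 2])  # 90 deg/s yaw rate
for _ in range(100):                     # 1 s at 100 Hz
    q = integrate_gyro(q, omega, 0.01)
print(q)  # ~[0.707, 0, 0, 0.707]: a 90-degree rotation about z
```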
Exploiting the wavelet structure in compressed sensing MRI.
Chen, Chen; Huang, Junzhou
2014-12-01
Sparsity has been widely utilized in magnetic resonance imaging (MRI) to reduce k-space sampling. According to structured sparsity theories, fewer measurements are required for tree-sparse data than for data with only standard sparsity. Intuitively, more accurate image reconstruction can be achieved with the same number of measurements by exploiting the wavelet tree structure in MRI. A novel algorithm is proposed in this article to reconstruct MR images from undersampled k-space data. In contrast to conventional compressed sensing MRI (CS-MRI), which relies only on the sparsity of MR images in the wavelet or gradient domain, we exploit the wavelet tree structure to improve CS-MRI. The tree-based CS-MRI problem is decomposed into three simpler subproblems, each of which can be efficiently solved by an iterative scheme. Simulations and in vivo experiments demonstrate the significant improvement of the proposed method compared to conventional CS-MRI algorithms, and its feasibility on MR data compared to existing tree-based imaging algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.
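For contrast with the tree-structured approach, below is a minimal sketch of conventional (standard-sparsity) wavelet CS-MRI via iterative soft thresholding; it assumes numpy and PyWavelets, a dyadic image size, and illustrative step/regularization values, and is not the paper's algorithm.

```python
# Minimal ISTA sketch of conventional wavelet-sparsity CS-MRI, the baseline
# the tree-structured method improves on. Assumes a dyadic image size; for
# simplicity the approximation band is thresholded along with the details.
import numpy as np
import pywt

def ista_csmri(y, mask, shape, lam=0.01, step=1.0, iters=50):
    """Recover image x from undersampled k-space y = mask * F(x)."""
    x = np.zeros(shape)
    for _ in range(iters):
        # Gradient step on 0.5*||mask*F(x) - y||^2 (F = unitary 2D FFT).
        resid = mask * np.fft.fft2(x, norm="ortho") - y
        x = x - step * np.real(np.fft.ifft2(resid, norm="ortho"))
        # Soft-threshold wavelet coefficients (standard l1 sparsity).
        coeffs = pywt.wavedec2(x, "db4", level=4)
        arr, slices = pywt.coeffs_to_array(coeffs)
        arr = np.sign(arr) * np.maximum(np.abs(arr) - step * lam, 0.0)
        coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
        x = pywt.waverec2(coeffs, "db4")
    return x

# Usage: y = mask * np.fft.fft2(true_image, norm="ortho")
#        x_hat = ista_csmri(y, mask, true_image.shape)
```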
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans
2015-02-01
The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has previously been made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density-dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this relative improvement decreases with increasing number of sample points and input parameter dimensions. Since the computational time and effort for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is thus recommended.
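The two initial designs compared above are easy to state in code. Below is a minimal sketch, assuming the unit hypercube; the subsequent optimization stage (a space-filling criterion with a specialized optimizer) is not shown.

```python
# Minimal sketch contrasting the two initial designs discussed above:
# random LHS (random points within strata) vs midpoint LHS (stratum centers).
# Either design would then be fed into an OLHS optimizer (not shown).
import numpy as np

def initial_lhs(n, dim, midpoint=True, rng=None):
    rng = np.random.default_rng(rng)
    design = np.empty((n, dim))
    for j in range(dim):
        perm = rng.permutation(n)              # one stratum per sample
        offset = 0.5 if midpoint else rng.random(n)
        design[:, j] = (perm + offset) / n     # map strata into [0, 1)
    return design

print(initial_lhs(5, 2, midpoint=True, rng=0))   # stratum midpoints
print(initial_lhs(5, 2, midpoint=False, rng=0))  # random within strata
```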
Rapid sampling of stochastic displacements in Brownian dynamics simulations
NASA Astrophysics Data System (ADS)
Fiore, Andrew M.; Balboa Usabiaga, Florencio; Donev, Aleksandar; Swan, James W.
2017-03-01
We present a new method for sampling stochastic displacements in Brownian Dynamics (BD) simulations of colloidal-scale particles. The method relies on a new formulation for Ewald summation of the Rotne-Prager-Yamakawa (RPY) tensor, which guarantees that the real-space and wave-space contributions to the tensor are independently symmetric and positive-definite for all possible particle configurations. Brownian displacements are drawn from a superposition of two independent samples: a wave-space (far-field or long-ranged) contribution, computed using techniques from fluctuating hydrodynamics and non-uniform fast Fourier transforms; and a real-space (near-field or short-ranged) correction, computed using a Krylov subspace method. The combined computational complexity of drawing these two independent samples scales linearly with the number of particles. The proposed method circumvents the super-linear scaling exhibited by all known iterative sampling methods applied directly to the RPY tensor, which results from the power-law growth of the condition number of the tensor with the number of particles. For geometrically dense microstructures (fractal dimension equal to three), the performance is independent of volume fraction, while for tenuous microstructures (fractal dimension less than three), such as gels and polymer solutions, the performance improves with decreasing volume fraction. This is in stark contrast with other related linear-scaling methods such as the force coupling method and the fluctuating immersed boundary method, for which performance degrades with decreasing volume fraction. Calculations for hard-sphere dispersions and colloidal gels are illustrated and used to explore the role of microstructure on the performance of the algorithm. In practice, the logarithmic part of the predicted scaling is not observed and the algorithm scales linearly for up to 4 × 10^6 particles, obtaining speed-ups of over an order of magnitude over existing iterative methods, and making the cost of computing Brownian displacements comparable to the cost of computing deterministic displacements in BD simulations. A high-performance implementation employing non-uniform fast Fourier transforms implemented on graphics processing units and integrated with the software package HOOMD-blue is used for benchmarking.
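To make concrete what the algorithm accelerates, here is the dense O(N^3) baseline for drawing Gaussian displacements whose covariance is proportional to a mobility matrix; the toy SPD matrix below stands in for the real RPY tensor, and the paper's Ewald-split O(N) sampler replaces the Cholesky factorization.

```python
# Dense baseline for Brownian displacement sampling: dx ~ N(0, 2*kT*dt*M)
# for a symmetric positive-definite mobility M. The real RPY tensor and the
# linear-scaling Ewald splitting of the paper are beyond this sketch.
import numpy as np

def brownian_displacements(M, kT, dt, rng=None):
    """Sample dx with covariance 2*kT*dt*M via dense Cholesky, O(N^3)."""
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(2.0 * kT * dt * M)
    return L @ rng.standard_normal(M.shape[0])

# Toy SPD stand-in for a mobility matrix (identity + PSD term -> SPD).
n = 30
B = np.random.default_rng(1).standard_normal((n, n))
M = np.eye(n) + 0.1 * (B @ B.T) / n
dx = brownian_displacements(M, kT=1.0, dt=1e-3, rng=2)
print(dx.shape, float(dx.std()))
```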
The astrobiological mission EXPOSE-R on board of the International Space Station
NASA Astrophysics Data System (ADS)
Rabbow, Elke; Rettberg, Petra; Barczyk, Simon; Bohmeier, Maria; Parpart, Andre; Panitz, Corinna; Horneck, Gerda; Burfeindt, Jürgen; Molter, Ferdinand; Jaramillo, Esther; Pereira, Carlos; Weiß, Peter; Willnecker, Rainer; Demets, René; Dettmann, Jan
2015-01-01
EXPOSE-R flew as the second of the European Space Agency (ESA) EXPOSE multi-user facilities on the International Space Station. During the mission on the external URM-D platform of the Zvezda service module, samples of eight international astrobiology experiments selected by ESA and one Russian guest experiment were exposed to low-Earth-orbit space parameters from March 10th, 2009 to January 21st, 2011. EXPOSE-R accommodated a total of 1220 samples for exposure to selected space conditions and combinations thereof, including space vacuum, temperature cycles through 273 K, cosmic radiation, and solar electromagnetic radiation at >110, >170 or >200 nm at various fluences up to GJ m-2. Samples ranged from chemical compounds via unicellular organisms, multicellular mosquito larvae and seeds, to passive radiation dosimeters. Additionally, one active radiation measurement instrument was accommodated on EXPOSE-R and commanded from the ground together with the facility itself. Data on ultraviolet radiation, cosmic radiation and temperature were measured every 10 s and downlinked by telemetry and data carrier every few months. The EXPOSE-R trays and samples returned to Earth on March 9th, 2011 with Shuttle flight Space Transportation System (STS)-133/ULF 5, Discovery, after a successful total mission duration of 27 months in space. The samples were analysed in the individual investigators' laboratories. A parallel Mission Ground Reference experiment was performed on the ground with a parallel set of hardware and samples under simulated space conditions, following the data transmitted from the flight mission.
Reduced aliasing artifacts using shaking projection k-space sampling trajectory
NASA Astrophysics Data System (ADS)
Zhu, Yan-Chun; Du, Jiang; Yang, Wen-Chao; Duan, Chai-Jie; Wang, Hao-Yu; Gao, Song; Bao, Shang-Lian
2014-03-01
Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main factors that degrade radial imaging quality. For a fixed number of k-space projections, the data distributions along the radial and angular directions influence the level of aliasing and streaking artifacts. A conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of the point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory is proposed to reduce aliasing artifacts in MR images. The SP sampling trajectory shifts projections alternately along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on the conventional and SP sampling trajectories were compared with the same number of projections; a significant reduction of aliasing artifacts was observed using the SP sampling trajectory. The two trajectories were also compared at different sampling frequencies: an SP trajectory has the same aliasing character when using half the sampling frequency (or half the data) for reconstruction. SNR comparisons at different white-noise levels show that the two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce aliasing artifacts without decreasing SNR and also provides a route to undersampled reconstruction. Furthermore, the method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts.
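The difference between the conventional and shaking trajectories can be sketched directly; the half-sample shift amplitude below is an illustrative assumption, not the paper's exact parameter.

```python
# Minimal sketch of a conventional radial trajectory vs. a "shaking" variant
# that alternately shifts projections along the readout, in the spirit of
# the SP trajectory described above.
import numpy as np

def radial_trajectory(n_proj, n_samples, shake=False):
    """Return complex k-space coordinates, one row per projection."""
    r = np.linspace(-0.5, 0.5, n_samples)          # readout positions
    traj = np.empty((n_proj, n_samples), dtype=complex)
    dk = 0.5 / n_samples                           # half-sample shift
    for i in range(n_proj):
        theta = np.pi * i / n_proj                 # uniform view angles
        shift = dk * (-1) ** i if shake else 0.0   # alternate +/- shift
        traj[i] = (r + shift) * np.exp(1j * theta)
    return traj

conventional = radial_trajectory(128, 256, shake=False)
shaking = radial_trajectory(128, 256, shake=True)
```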
Li, Shuo; Zhu, Yanchun; Xie, Yaoqin; Gao, Song
2018-01-01
Dynamic magnetic resonance imaging (DMRI) is used to noninvasively trace the movements of organs and the process of drug delivery. The results can provide quantitative or semiquantitative pathology-related parameters, thus giving DMRI great potential for clinical applications. However, conventional DMRI techniques suffer from low temporal resolution and long scan time owing to the limitations of the k-space sampling scheme and image reconstruction algorithm. In this paper, we propose a novel DMRI sampling scheme based on a golden-ratio Cartesian trajectory in combination with a compressed sensing reconstruction algorithm. The results of two simulation experiments, designed according to the two major DMRI techniques, showed that the proposed method can improve the temporal resolution and shorten the scan time and provide high-quality reconstructed images.
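A minimal sketch of golden-ratio Cartesian line ordering is given below; rounding to the nearest phase-encode line is an illustrative choice, not necessarily the authors' exact scheme.

```python
# Minimal sketch of golden-ratio Cartesian phase-encode ordering: each new
# shot takes the line whose normalized position follows the golden-ratio
# sequence, so any contiguous temporal window covers k-space near-uniformly,
# which is what enables flexible retrospective windowing in DMRI.
import numpy as np

def golden_ratio_cartesian(n_lines, n_shots):
    phi = (np.sqrt(5) - 1) / 2              # fractional golden ratio ~0.618
    pos = (np.arange(n_shots) * phi) % 1.0  # low-discrepancy positions
    return np.round(pos * (n_lines - 1)).astype(int)

order = golden_ratio_cartesian(n_lines=128, n_shots=64)
print(order[:10])  # phase-encode line indices for the first 10 shots
```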
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
A team KuuKulgur robot from Estonia is seen on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team KuuKulgur is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Sam Ortega, NASA program manager of Centennial Challenges, watches as robots attempt the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The University of California Santa Cruz Rover Team prepares their rover for the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Worcester Polytechnic Institute (WPI) President Laurie Leshin speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The team AERO robot drives off the starting platform during the level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Team Cephal's robot is seen on the starting platform during a rerun of the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
David Miller, NASA Chief Technologist, speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Oregon State University Mars Rover Team's robot is seen during level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
Jerry Waechter of team Middleman from Dunedin, Florida, works on their robot named Ro-Bear during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Middleman is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
A robot from the Intrepid Systems team is seen during the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
A team KuuKulgur robot is seen as it begins the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The team Mountaineers robot is seen as it attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Members of the Oregon State University Mars Rover Team prepare their robot to attempt the level one competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Stellar Automation Systems team poses for a picture with their robot after attempting the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
All four of team KuuKulgur's robots are seen as they attempt the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Spectators watch as the team Survey robot conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team Middleman's robot, Ro-Bear, is seen as it starts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Two of team KuuKulgur's robots are seen as they attempt a rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
A robot from the University of Waterloo Robotics Team is seen during the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Members of team Survey follow their robot as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The entrance to Institute Park is seen during the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Sam Ortega, NASA Centennial Challenges Program Manager, speaks at a breakfast opening the TouchTomorrow Festival, held in conjunction with the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
James Leopore, of team Fetch, from Alexandria, Virginia, speaks with judges as he prepares for the NASA 2014 Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team Fetch is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen on the starting platform before beginning its attempt at the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Mountaineers team from West Virginia University watches as their robot attempts the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
The team Survey robot is seen as it conducts a demonstration of the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Team Survey's robot is seen as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Improving time-delay cosmography with spatially resolved kinematics
NASA Astrophysics Data System (ADS)
Shajib, Anowar J.; Treu, Tommaso; Agnello, Adriano
2018-01-01
Strongly gravitationally lensed quasars can be used to measure the so-called time-delay distance DΔt, and thus the Hubble constant H0 and other cosmological parameters. Stellar kinematics of the deflector galaxy play an essential role in this measurement by: (i) helping to break the mass-sheet degeneracy; (ii) determining, in principle, the angular diameter distance Dd to the deflector and thus further improving the cosmological constraints. In this paper we simulate observations of lensed quasars with integral-field spectrographs and show that spatially resolved kinematics of the deflector enable further progress by helping to break the mass-anisotropy degeneracy. Furthermore, we use our simulations to obtain realistic error estimates with current and upcoming instruments like OSIRIS on Keck and NIRSPEC on the James Webb Space Telescope for both distances (typically ∼6 per cent on DΔt and ∼10 per cent on Dd). We use the error estimates to compute cosmological forecasts for the sample of nine lenses that currently have well-measured time delays and deep Hubble Space Telescope images, and for a sample of 40 lenses that is projected to be available in a few years through follow-up of candidates found in ongoing wide-field surveys. We find that H0 can be measured with 2 per cent (1 per cent) precision from nine (40) lenses in a flat Λ cold dark matter (ΛCDM) cosmology. We study several other cosmological models beyond flat ΛCDM and find that time-delay lenses with spatially resolved kinematics can greatly improve the precision of the cosmological parameters measured by cosmic microwave background data.
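For reference, the time-delay distance appearing above is the standard combination of angular diameter distances:

```latex
% Time-delay distance: z_d is the deflector redshift; D_d, D_s, D_ds are
% the angular diameter distances to the deflector, to the source, and
% from deflector to source, respectively.
D_{\Delta t} \equiv (1 + z_d)\,\frac{D_d\, D_s}{D_{ds}} \;\propto\; H_0^{-1}
```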
NASA Astrophysics Data System (ADS)
Blum, M. D.; Umbarger, K.
2017-12-01
The Triassic Chinle Formation is a fluvial succession deposited in a backarc setting across the present-day Colorado Plateau of the southwestern United States. Existing studies have proposed various mechanisms responsible for the unique stratigraphic architecture and depositional sequences of the Chinle; however, these studies lack the age control necessary to correlate stratigraphic patterns with contemporaneous mechanisms. This study will collect new samples for detrital zircon analysis, as well as upgrade existing samples (to n = 300) from Dickinson and Gehrels (2008), to improve the resolution of Triassic sediment provenance from source to sink. The improved dataset allows appraisal of the multiple provenance terranes that contributed to the Chinle depositional system, in order to delineate and reconstruct paleodrainage patterns. The additional samples will be collected systematically from the base of the Chinle and vertically throughout the section to capture a regional story of how the continental-scale drainage reorganized through time. U-Pb ages of detrital zircons will be used to provide quantitative fingerprinting to constrain interpretations of the origin and transport history of the Chinle fluvial succession in time and space.
Using isotopes to investigate hydrological flow pathways and sources in a remote Arctic catchment
NASA Astrophysics Data System (ADS)
Lessels, Jason; Tetzlaff, Doerthe; Dinsmore, Kerry; Street, Lorna; Billet, Mike; Baxter, Robert; Subke, Jens-Arne; Wookey, Phillip
2014-05-01
Stable water isotopes allow the identification of flow paths and stream water sources, which is beneficial for improving understanding of catchments with dynamic spatial and temporal sources. Arctic catchments are characterised by strong seasonality, where the dominant flow paths change throughout the short summer season. The identification of stream water sources through time and space is therefore necessary to accurately quantify these dynamics. Stable isotope tracers integrate processes over time and space and are therefore particularly useful for identifying flow pathways and runoff sources at remote sites. This work presents stable isotope data collected from a small (1 km2) catchment in Northwest Canada. The aims of this study are to 1) identify sources of stream water through time and space, and 2) provide information to be incorporated into hydrological and transit-time models. Sampling of snowmelt, surface runoff, ice-wedge polygons, stream water and soil water was undertaken throughout the 2013 summer. The results reveal the dominant flow paths in the catchment and the strong influence of aspect in controlling these processes. After the spring freshet, late-lying snowpacks on north-facing slopes and thawing permafrost on south-facing slopes are the dominant sources of stream water. Progressively through the season, thawing permafrost and precipitation become the largest contributing sources. The depth of the thawing active layer, and consequently its contribution to the stream, is heavily dependent on aspect. The precipitation, soil and stream isotope samples collected throughout the summer period provide valuable information for transit-time estimates. The combination of spatial and temporal sampling of stable isotopes has revealed clear differences between the main stream-water sources in the studied catchment and reinforced the importance of slope aspect in these catchments.
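Source contributions of this kind are commonly quantified with the standard two-component isotope mass balance, shown here for reference (a textbook relation, not a result of this study):

```latex
% Two-component hydrograph separation: f_1 is the fraction of streamflow
% from end-member 1 (e.g. snowmelt); delta is the isotope signature
% (e.g. delta 18-O) of the stream and of each end-member.
f_1 = \frac{\delta_{\mathrm{stream}} - \delta_2}{\delta_1 - \delta_2},
\qquad f_2 = 1 - f_1
```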
Park, Jinil; Shin, Taehoon; Yoon, Soon Ho; Goo, Jin Mo; Park, Jang-Yeon
2016-05-01
The purpose of this work was to develop a 3D radial-sampling strategy that maintains uniform k-space sample density after retrospective respiratory gating, and to demonstrate its feasibility in free-breathing ultrashort-echo-time lung MRI. A multi-shot, interleaved 3D radial sampling function was designed by segmenting a single-shot trajectory of projection views such that each interleaf samples k-space in an incoherent fashion. An optimal segmentation factor for the interleaved acquisition was derived based on an approximate model of respiratory patterns, such that radial interleaves are evenly accepted during the retrospective gating. The optimality of the proposed sampling scheme was tested by numerical simulations and phantom experiments using human respiratory waveforms. Retrospectively respiratory-gated, free-breathing lung MRI with the proposed sampling strategy was performed in healthy subjects. The simulation yielded the most uniform k-space sample density with the optimal segmentation factor, as evidenced by the smallest standard deviation of the number of neighboring samples as well as minimal side-lobe energy in the point spread function. The optimality of the proposed scheme was also confirmed by minimal image artifacts in phantom images. Human lung images showed that the proposed sampling scheme significantly reduced streak and ring artifacts compared with conventional retrospective respiratory gating, while suppressing motion-related blurring compared with full sampling without respiratory gating. In conclusion, the proposed 3D radial-sampling scheme can effectively suppress image artifacts due to non-uniform k-space sample density in retrospectively respiratory-gated lung MRI by uniformly distributing gated radial views across k-space. Copyright © 2016 John Wiley & Sons, Ltd.
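The interplay between interleaving and retrospective gating can be explored with a toy model; the round-robin interleaf assignment, sinusoidal respiration, and 50% acceptance window below are illustrative assumptions, not the paper's optimized design.

```python
# Minimal sketch: assign projection views of a single-shot trajectory to
# interleaves, then simulate retrospective gating with a toy respiratory
# model and report the acceptance rate per interleaf (uniform is better,
# since evenly accepted interleaves preserve k-space sample density).
import numpy as np

def interleave_views(n_views, seg_factor):
    """Split view indices into seg_factor interleaves, round-robin."""
    return [np.arange(k, n_views, seg_factor) for k in range(seg_factor)]

def acceptance_per_interleaf(n_views, seg_factor, tr=0.005,
                             resp_period=4.0, accept_frac=0.5):
    t = np.arange(n_views) * tr                   # view acquisition times
    resp = np.sin(2 * np.pi * t / resp_period)    # toy respiratory signal
    keep = resp < np.quantile(resp, accept_frac)  # gate near end-expiration
    return [keep[leaf].mean()
            for leaf in interleave_views(n_views, seg_factor)]

print(acceptance_per_interleaf(n_views=20000, seg_factor=8))
```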
Plasma Immersion Ion Implantation with Solid Targets for Space and Aerospace Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, R. M.; Goncalves, J. A. N.; Ueda, M.
2009-01-05
This paper describes successful results obtained by a new type of plasma source, named Vaporization of Solid Targets (VAST), for the treatment of materials for space and aerospace applications by means of plasma immersion ion implantation and deposition (PIII&D). Here, the solid element is vaporized in a high-pressure glow discharge, then further ionized and implanted/deposited in a low-pressure cycle with the aid of an extra electrode. The first experiments in VAST were run using lithium as the solid target. Samples of silicon and aluminum alloy 2024 were immersed in highly ionized lithium plasma, whose density was measured by a double Langmuir probe. Measurements performed with scanning electron microscopy (SEM) showed clear modification of the cross-sectioned treated silicon samples. X-ray photoelectron spectroscopy (XPS) analysis revealed that lithium was implanted/deposited into/onto the surface of the silicon. Implantation depth profiles may vary according to the operating condition of VAST. One direct application of this treatment concerns protection against radiation damage for silicon solar cells. For the aluminum alloy, X-ray diffraction analysis indicated the appearance of prominent new peaks. Surface modification of Al 2024 by lithium implantation/deposition can lower the coefficient of friction and improve the fatigue resistance of this alloy. Recently, cadmium was vaporized and ionized in VAST. The main benefit of this element is associated with improved corrosion resistance of metallic substrates. Besides lithium and cadmium, VAST allows performing PIII&D with other species, leading to modification of the near-surface of materials for distinct purposes, including applications in the space and aerospace areas.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Russel Howe of team Survey speaks with Sample Return Robot Challenge staff members after the team's robot failed to leave the starting platform during its attempt at the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Kenneth Stafford, Assistant Director of Robotics Engineering and Director of the Robotics Resource Center at the Worcester Polytechnic Institute (WPI), verifies the location of the target sample during the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Members of the Mountaineers team from West Virginia University celebrate after their robot returned to the starting platform after picking up the sample during a rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Machine Learning Toolkit for Extreme Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-03-31
Support Vector Machines (SVM) is a popular machine learning technique that has been applied to a wide range of domains, such as science, finance, and social networks, for supervised learning. MaTEx undertakes the challenge of designing a scalable parallel SVM training algorithm for large-scale systems, including commodity multi-core machines, tightly connected supercomputers and cloud computing systems. Several techniques are proposed for improved speed and memory usage, including adaptive and aggressive elimination of samples for faster convergence, and sparse-format representation of data samples. Several heuristics, ranging from earliest-possible to lazy elimination of non-contributing samples, are considered in MaTEx. For the many cases where an early sample elimination might result in a false positive, low-overhead mechanisms for reconstruction of key data structures are proposed. The proposed algorithm and heuristics are implemented and evaluated on various publicly available datasets.
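A minimal sketch of the sample-elimination idea on a toy linear SVM is given below; the subgradient solver, thresholds, and schedule are illustrative stand-ins for MaTEx's actual heuristics and reconstruction mechanisms.

```python
# Toy linear SVM trained by subgradient descent with periodic "aggressive"
# elimination: samples far outside the margin are dropped from the working
# set, since they are unlikely to become support vectors.
import numpy as np

def svm_with_elimination(X, y, lam=0.01, lr=0.1, epochs=200,
                         drop_every=25, margin_cut=2.0):
    w, b = np.zeros(X.shape[1]), 0.0
    active = np.arange(len(y))                 # working-set indices
    for epoch in range(1, epochs + 1):
        Xa, ya = X[active], y[active]
        viol = ya * (Xa @ w + b) < 1.0         # hinge-loss violators
        if viol.any():
            gw = lam * w - (ya[viol, None] * Xa[viol]).mean(axis=0)
            gb = -ya[viol].mean()
        else:
            gw, gb = lam * w, 0.0
        w, b = w - lr * gw, b - lr * gb
        if epoch % drop_every == 0:            # eliminate easy samples
            active = active[ya * (Xa @ w + b) < margin_cut]
    return w, b, active                        # 'active' ~ candidate SVs

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.hstack([-np.ones(200), np.ones(200)])
w, b, active = svm_with_elimination(X, y)
print(w, b, len(active))
```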
Compact Microscope Imaging System With Intelligent Controls Improved
NASA Technical Reports Server (NTRS)
McDowell, Mark
2004-01-01
The Compact Microscope Imaging System (CMIS) is a diagnostic microscope analysis tool with intelligent controls for use in space, industrial, medical, and security applications. This compact miniature microscope, which can perform tasks usually reserved for conventional microscopes, has unique advantages in the fields of microscopy, biomedical research, inline process inspection, and space science. Its approach integrates a machine vision technique with an instrumentation and control technique that provides intelligence via adaptive neural networks. The CMIS system was developed at the NASA Glenn Research Center specifically for interface detection in colloid hard-sphere experiments; biological cell detection for patch clamping, cell movement, and tracking; and detection of anode and cathode defects for laboratory samples using microscope technology.
Holcomb, David A; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R
2018-06-25
Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modeled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was ≥90%, ≤10%, or >10% but <90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
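The impairment designation described above reduces to thresholding an exceedance probability. A minimal sketch, assuming a lognormal predictive distribution and an illustrative 400 CFU/100 mL standard (the actual standard and predictive model belong to the study):

```python
# Classify a location from the geostatistical model's predictive mean and
# standard deviation of log10 fecal coliform concentration.
import numpy as np
from scipy.stats import norm

def classify(mu_log10, sd_log10, standard=400.0):
    """Impairment status via the probability of exceeding the standard."""
    p_exceed = 1.0 - norm.cdf(np.log10(standard),
                              loc=mu_log10, scale=sd_log10)
    if p_exceed >= 0.90:
        return "impaired"
    if p_exceed <= 0.10:
        return "unimpaired"
    return "unassessed"

print(classify(mu_log10=2.9, sd_log10=0.3))  # near the standard -> unassessed
```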
Braze Development of Graphite Fiber for Use in Phase Change Material Heat Sinks
NASA Technical Reports Server (NTRS)
Quinn, Gregory; Gleason, Brian; Beringer, Woody; Stephen, Ryan
2010-01-01
Hamilton Sundstrand (HS), together with NASA Johnson Space Center, developed methods to metallurgically join graphite fiber to aluminum. The goal of the effort was to demonstrate improved thermal conductance, tensile strength and manufacturability compared to existing epoxy-bonded techniques. These improvements have the potential to increase the performance and robustness of phase change material heat sinks that use graphite fibers as an interstitial material. Initial work focused on evaluating joining techniques from 4 suppliers, each consisting of a metallization step followed by brazing or soldering of one-inch-square blocks of Fibercore graphite fiber material to aluminum end sheets. Results matched the strength and thermal conductance of the epoxy-bonded control samples, so two suppliers were down-selected for a second round of braze development. The second round of braze samples had up to a 300% increase in strength and up to a 132% increase in thermal conductance over the bonded samples. However, scalability and repeatability proved to be significant hurdles with the metallization approach. An alternative approach was pursued which used nickel and active braze alloys to prepare the carbon fibers for joining with aluminum. This approach was repeatable and scalable, with improved strength and thermal conductance compared with epoxy bonding.
Weighted statistical parameters for irregularly sampled time series
NASA Astrophysics Data System (ADS)
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
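The core idea, down-weighting clumped points and up-weighting isolated ones, can be illustrated in a few lines. This is a sketch in the spirit of the scheme, not the published formula; the half-gap weighting and the noise tempering are assumptions:

```python
import numpy as np

def gap_based_weights(t, sigma=None):
    """Illustrative weights for unevenly sampled time series: each point is
    weighted by half the span to its neighbours (clumped points share
    weight, isolated points gain it), optionally tempered by measurement
    noise. A sketch, not Rimoldini's exact scheme."""
    t = np.asarray(t, dtype=float)
    dt = np.diff(t)
    w = np.empty_like(t)
    w[1:-1] = 0.5 * (dt[:-1] + dt[1:])
    w[0], w[-1] = dt[0], dt[-1]           # edge points get their single gap
    if sigma is not None:                 # de-emphasise noisy measurements
        w = w / np.asarray(sigma) ** 2
    return w / w.sum()

def weighted_moments(x, w):
    """Weighted mean and (reliability-weighted, unbiased) variance."""
    mean = np.sum(w * x)
    var = np.sum(w * (x - mean) ** 2) / (1.0 - np.sum(w ** 2))
    return mean, var
```

With equal spacing the weights reduce to 1/n and the variance estimator to the familiar n-1 form, which is a useful sanity check on the construction.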
Investigation on the Cracking Character of Jointed Rock Mass Beneath TBM Disc Cutter
NASA Astrophysics Data System (ADS)
Yang, Haiqing; Liu, Junfeng; Liu, Bolong
2018-04-01
To investigate the influence of joint dip angle and spacing on TBM rock-breaking efficiency and cracking behaviour, experiments including miniature cutter head tests were carried out on sandstone specimens containing prefabricated joints of different forms. Theoretical analysis was then conducted to improve the calculation models for the fracture work and crack length of rock during TBM excavation. The experimental results indicate that lower rupture angles appear for specimens with joint dip angles between 45° and 60°; rock-breaking efficiency for rock mass with joint dip angles in this interval is also higher. In addition, the fracture pattern transforms from a compressive shear mode to a tensile shear mode as the joint spacing decreases, so specimens with smaller joint spacings fail to a greater extent. These results suggest that a joint dip angle between 45° and 60° and a joint spacing of 1 cm are the optimal rock-breaking conditions for the tested specimens. Combining the present experimental data and taking joint dip angle and spacing into consideration, the calculation model for rock fracture work proposed by previous scholars is improved. Finally, a theoretical solution for the median and side crack lengths is derived based on the elastoplastic indentation fracture analysis for an indenter; the analytical solution is in good agreement with the measured experimental results. The present study provides primary knowledge about rock cracking character and breaking efficiency under different engineering conditions.
NASA Astrophysics Data System (ADS)
Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.
2017-12-01
Upcoming satellite lidar missions, such as GEDI and ICESat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result, lidar metrics derived from these sources will not have complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian Process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and ICESat-2. Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model fitting analysis of the spatial distribution of AGB.
An improved pulse sequence and inversion algorithm of T2 spectrum
NASA Astrophysics Data System (ADS)
Ge, Xinmin; Chen, Hua; Fan, Yiren; Liu, Juntao; Cai, Jianchao; Liu, Jianyu
2017-03-01
The nuclear magnetic resonance transverse relaxation time is widely applied in geological prospecting, in both laboratory and downhole environments. However, current methods for data acquisition and inversion must be reformed to characterize geological samples with complicated relaxation components and pore size distributions, such as tight oil, gas shale, and carbonate samples. We present an improved pulse sequence to collect transverse relaxation signals based on the CPMG (Carr, Purcell, Meiboom, and Gill) pulse sequence. The echo spacing is not constant but varies across different windows, depending on prior knowledge or customer requirements. We use entropy-based truncated singular value decomposition (TSVD) to compress the ill-posed matrix and discard the small singular values that cause inversion instability. A hybrid algorithm combining iterative TSVD with a simultaneous iterative reconstruction technique is implemented to reach global convergence and stability of the inversion. Numerical simulations indicate that the improved pulse sequence leads to the same result as CPMG, but with fewer echoes and less computational time. The proposed method is a promising technique for geophysical prospecting and other related fields in the future.
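The inversion step is the classic ill-posed Laplace inversion, and the TSVD part can be sketched directly. A minimal sketch of the basic idea (the paper's entropy-based truncation and the TSVD/SIRT hybrid refine it; the rcut criterion and synthetic data here are assumptions):

```python
import numpy as np

def t2_spectrum_tsvd(t, y, T2_grid, rcut=1e-3):
    """Discretise the Laplace kernel K[i, j] = exp(-t_i / T2_j) and solve
    K f ~= y by truncated SVD, discarding singular values below
    rcut * s_max that destabilise the inversion."""
    K = np.exp(-t[:, None] / T2_grid[None, :])
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    k = int(np.sum(s > rcut * s[0]))               # truncation level
    f = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])      # pseudo-inverse solution
    return np.clip(f, 0.0, None)                   # amplitudes are nonnegative

# Example: synthetic echo train with two relaxation components
t = np.linspace(2e-4, 2.0, 500)                    # echo times (s)
y = 0.7 * np.exp(-t / 0.05) + 0.3 * np.exp(-t / 0.5)
T2_grid = np.logspace(-3, 1, 128)
spectrum = t2_spectrum_tsvd(t, y, T2_grid)
```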
NASA Astrophysics Data System (ADS)
Dudak, J.; Zemlicka, J.; Karch, J.; Hermanova, Z.; Kvacek, J.; Krejci, F.
2017-01-01
Photon counting detectors of the Timepix family are known for their unique properties enabling X-ray imaging with extremely high contrast-to-noise ratio. Their applicability has recently been further improved by a dedicated technique for assembling large-area Timepix detector arrays. Although the sensitive area of Timepix detectors has been significantly increased, the pixel pitch remains unchanged (55 microns). This value is much larger than that of widely used X-ray imaging cameras utilizing scintillation crystals and CCD-based read-out. On the other hand, photon counting detectors provide a steeper point-spread function (PSF). Therefore, for a given effective pixel size of an acquired radiograph, Timepix detectors provide higher spatial resolution than cameras with scintillation-based devices, unless the image is affected by penumbral blur. In this paper we take advantage of the steep PSF of photon counting detectors and test the possibility of improving the quality of computed tomography reconstruction using finer sampling of the reconstructed voxel space. The achieved results are presented in comparison with data acquired under the same conditions using a commercially available state-of-the-art CCD X-ray camera.
Kim, Ki Hwan; Do, Won-Joon; Park, Sung-Hong
2018-05-04
The routine MRI scan protocol consists of multiple pulse sequences that acquire images of varying contrast. Since high frequency contents such as edges are not significantly affected by image contrast, down-sampled images in one contrast may be improved by high resolution (HR) images acquired in another contrast, reducing the total scan time. In this study, we propose a new deep learning framework that uses HR MR images in one contrast to generate HR MR images from highly down-sampled MR images in another contrast. The proposed convolutional neural network (CNN) framework consists of two CNNs: (a) a reconstruction CNN for generating HR images from the down-sampled images using HR images acquired with a different MRI sequence and (b) a discriminator CNN for improving the perceptual quality of the generated HR images. The proposed method was evaluated using a public brain tumor database and in vivo datasets. The performance of the proposed method was assessed in tumor and no-tumor cases separately, with perceptual image quality being judged by a radiologist. To overcome the challenge of training the network with a small number of available in vivo datasets, the network was pretrained using the public database and then fine-tuned using the small number of in vivo datasets. The performance of the proposed method was also compared to that of several compressed sensing (CS) algorithms. Incorporating HR images of another contrast improved the quantitative assessments of the generated HR image in reference to ground truth. Also, incorporating a discriminator CNN yielded perceptually higher image quality. These results were verified in regions of normal tissue as well as tumors for various MRI sequences from pseudo k-space data generated from the public database. The combination of pretraining with the public database and fine-tuning with the small number of real k-space datasets enhanced the performance of CNNs in in vivo application compared to training CNNs from scratch. The proposed method outperformed the compressed sensing methods. The proposed method can be a good strategy for accelerating routine MRI scanning. © 2018 American Association of Physicists in Medicine.
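The two-network structure can be sketched compactly. The layer counts, channel widths, and residual connection below are assumptions rather than the paper's published architecture; this is a minimal PyTorch sketch of a reconstruction CNN conditioned on a second contrast, paired with a patch discriminator:

```python
import torch
import torch.nn as nn

class ReconCNN(nn.Module):
    """Hypothetical stand-in for the reconstruction network: maps an
    upsampled low-resolution image of one contrast, concatenated with a
    high-resolution image of another contrast, to a restored HR image."""
    def __init__(self, feats=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, 1, 3, padding=1))

    def forward(self, lr_up, hr_ref):
        x = torch.cat([lr_up, hr_ref], dim=1)
        return lr_up + self.net(x)        # residual learning of missing detail

class Discriminator(nn.Module):
    """Patch discriminator trained adversarially to sharpen perceptual quality."""
    def __init__(self, feats=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, feats, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(feats, feats * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(feats * 2, 1, 4, padding=1))

    def forward(self, img):
        return self.net(img)              # real/fake score map
```

Pretraining on the public database and fine-tuning on the small in vivo set, as described above, would then amount to two successive training runs of the same pair of networks.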
2003-09-08
KENNEDY SPACE CENTER, FLA. - The Minus Eighty Lab Freezer for ISS (MELFI), provided as Laboratory Support Equipment by the European Space Agency for the International Space Station, is seen in the Space Station Processing Facility. The lab will provide cooling and storage for reagents, samples and perishable materials in four insulated containers called dewars with independently selectable temperatures of -80°C, -26°C, and +4°C. It also will be used to transport samples to and from the station. The MELFI is planned for launch on the ULF-1 mission.
2003-09-08
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, technicians remove the cover from the Minus Eighty Lab Freezer for ISS (MELFI), provided as Laboratory Support Equipment by the European Space Agency for the International Space Station. The lab will provide cooling and storage for reagents, samples and perishable materials in four insulated containers called dewars with independently selectable temperatures of -80°C, -26°C, and +4°C. It also will be used to transport samples to and from the station. The MELFI is planned for launch on the ULF-1 mission.
A nonvoxel-based dose convolution/superposition algorithm optimized for scalable GPU architectures.
Neylon, J; Sheng, K; Yu, V; Chen, Q; Low, D A; Kupelian, P; Santhanam, A
2014-10-01
Real-time adaptive planning and treatment has been infeasible due in part to its high computational complexity. There have been many recent efforts to utilize graphics processing units (GPUs) to accelerate the computational performance and dose accuracy in radiation therapy. Data structure and memory access patterns are the key GPU factors that determine the computational performance and accuracy. In this paper, the authors present a nonvoxel-based (NVB) approach to maximize computational and memory access efficiency and throughput on the GPU. The proposed algorithm employs a ray-tracing mechanism to restructure the 3D data sets computed from the CT anatomy into a nonvoxel-based framework. In a process that takes only a few milliseconds of computing time, the algorithm restructured the data sets by ray-tracing through precalculated CT volumes to realign the coordinate system along the convolution direction, as defined by zenithal and azimuthal angles. During the ray-tracing step, the data were resampled according to radial sampling and parallel ray-spacing parameters making the algorithm independent of the original CT resolution. The nonvoxel-based algorithm presented in this paper also demonstrated a trade-off in computational performance and dose accuracy for different coordinate system configurations. In order to find the best balance between the computed speedup and the accuracy, the authors employed an exhaustive parameter search on all sampling parameters that defined the coordinate system configuration: zenithal, azimuthal, and radial sampling of the convolution algorithm, as well as the parallel ray spacing during ray tracing. The angular sampling parameters were varied between 4 and 48 discrete angles, while both radial sampling and parallel ray spacing were varied from 0.5 to 10 mm. The gamma distribution analysis method (γ) was used to compare the dose distributions using 2% and 2 mm dose difference and distance-to-agreement criteria, respectively. Accuracy was investigated using three distinct phantoms with varied geometries and heterogeneities and on a series of 14 segmented lung CT data sets. Performance gains were calculated using three 256 mm cube homogenous water phantoms, with isotropic voxel dimensions of 1, 2, and 4 mm. The nonvoxel-based GPU algorithm was independent of the data size and provided significant computational gains over the CPU algorithm for large CT data sizes. The parameter search analysis also showed that the ray combination of 8 zenithal and 8 azimuthal angles along with 1 mm radial sampling and 2 mm parallel ray spacing maintained dose accuracy with greater than 99% of voxels passing the γ test. Combining the acceleration obtained from GPU parallelization with the sampling optimization, the authors achieved a total performance improvement factor of >175 000 when compared to our voxel-based ground truth CPU benchmark and a factor of 20 compared with a voxel-based GPU dose convolution method. The nonvoxel-based convolution method yielded substantial performance improvements over a generic GPU implementation, while maintaining accuracy as compared to a CPU computed ground truth dose distribution. Such an algorithm can be a key contribution toward developing tools for adaptive radiation therapy systems.
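The γ test used above combines a dose-difference and a distance-to-agreement criterion. A minimal 1D sketch of the metric under the 2%/2 mm criteria (global normalization to the reference maximum is assumed; clinical tools work on interpolated 3D volumes):

```python
import numpy as np

def gamma_pass_rate(ref, test, spacing=1.0, dd=0.02, dta=2.0):
    """Brute-force gamma analysis on 1D dose profiles. dd is the dose
    difference criterion relative to the global maximum, dta the
    distance-to-agreement in mm, spacing the sample spacing in mm."""
    x = np.arange(len(ref)) * spacing
    dmax = ref.max()
    gammas = np.empty(len(ref))
    for i in range(len(ref)):
        dose_term = (test - ref[i]) / (dd * dmax)   # vs. every test point
        dist_term = (x - x[i]) / dta
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return np.mean(gammas <= 1.0)                   # fraction passing gamma<=1
```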
NASA Astrophysics Data System (ADS)
Heras-Juaristi, Gemma; Pérez-Coll, Domingo; Mather, Glenn C.
2016-11-01
The effects of sintering temperature and addition of 4 mol.% ZnO as sintering additive on the crystal structure, microstructure and electrical properties of SrZr0.9Y0.1O3-δ are reported. The presence of ZnO as sintering aid brings about high densification at 1300 °C (relative density ∼97%); gas-tightness is not achieved for ZnO-free samples sintered below 1600 °C. Bulk conductivity (σB) is considerably higher in wet and dry O2 on doping with ZnO, but only slight variations of σB with sintering temperature are observed for the Zn-containing phases. Similarly, the apparent grain-boundary conductivities are much greater for the Zn-doped samples. The grain-boundary volume and accompanying resistances are much reduced on sintering at 1500 °C with ZnO addition in comparison to Zn-modified samples sintered below 1500 °C, with only minor changes in grain-boundary relaxation frequency observed. Conversely, in comparison to the undoped sample with sintering temperature of 1600 °C, there is an enormous improvement in the specific grain-boundary conductivity of two orders of magnitude for the ZnO-containing samples. Analysis on the basis of the core space-charge-layer model relates the enhancement of the grain-boundary transport to a higher concentration of charge carriers in the space-charge layer and associated lower potential barrier heights.
Sample-space-based feature extraction and class preserving projection for gene expression data.
Wang, Wenjun
2013-01-01
In order to overcome the problems of high computational complexity and serious matrix singularity in feature extraction using Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) on high-dimensional data, sample-space-based feature extraction is presented. It transforms the computation of feature extraction from gene space to sample space by representing the optimal transformation vector as a weighted sum of samples. The technique is used to implement PCA, LDA, and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and the experimental results on gene expression data demonstrate the effectiveness of the method.
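The sample-space trick is easiest to see for PCA: with n samples of d genes and d >> n, one works with the n x n Gram matrix instead of the d x d covariance. A minimal sketch (the paper's CPP formulation is not reproduced here):

```python
import numpy as np

def sample_space_pca(X, n_components=2):
    """PCA in sample space: the leading eigenvectors of the n x n Gram
    matrix give weights alpha, and each projection direction is the
    weighted sum of samples w = X^T alpha, so the d x d covariance matrix
    is never formed."""
    Xc = X - X.mean(axis=0)                        # center each gene
    G = Xc @ Xc.T                                  # n x n instead of d x d
    evals, evecs = np.linalg.eigh(G)
    order = np.argsort(evals)[::-1][:n_components]
    alpha = evecs[:, order] / np.sqrt(np.maximum(evals[order], 1e-12))
    W = Xc.T @ alpha                               # unit directions in gene space
    return W, Xc @ W                               # loadings and projections
```

The same eigenvalues appear in both spaces (G u = λu implies X^T X (X^T u) = λ X^T u), which is why the detour through sample space is exact, not an approximation.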
Space Environment Effects on Materials at Different Positions and Operational Periods of ISS
NASA Astrophysics Data System (ADS)
Kimoto, Yugo; Ichikawa, Shoichi; Miyazaki, Eiji; Matsumoto, Koji; Ishizawa, Junichiro; Shimamura, Hiroyuki; Yamanaka, Riyo; Suzuki, Mineo
2009-01-01
A space materials exposure experiment was conducted on the exterior of the Russian Service Module (SM) of the International Space Station (ISS) using the Micro-Particles Capturer and Space Environment Exposure Device (MPAC&SEED) of the Japan Aerospace Exploration Agency (JAXA). Results reveal artificial environment effects such as sample contamination, attitude-change effects on atomic oxygen (AO) fluence, and shading effects on UV exposure at the ISS. The sample contamination came from ISS components, and the particles attributed to micrometeoroids and/or debris captured by MPAC might originate from the ISS solar array. Another MPAC&SEED will be aboard the Exposure Facility (EF) of the Japanese Experiment Module KIBO on the ISS. The JEM/MPAC&SEED is attached to the Space Environment Data Acquisition Equipment-Attached Payload (SEDA-AP), an EF payload launched on Space Shuttle flight 2J/A and exposed to space. SEDA-AP carries space environment monitors, such as a high-energy particle monitor, an atomic oxygen monitor, and a plasma monitor, to measure in-situ natural space environment data during JEM/MPAC&SEED exposure. Some exposure samples for JEM/MPAC&SEED are identical to SM/MPAC&SEED samples, so effects on identical materials at different positions and operational periods of the ISS can be evaluated. This report summarizes results from the space environment monitoring samples for atomic oxygen analysis on SM/MPAC&SEED, along with experimental plans for JEM/MPAC&SEED.
Wallace, Joseph M
2015-04-01
Collagen's role in bone is often considered secondary. As increased attention is paid to collagen, understanding the impact of tissue preservation is important in interpreting experimental results. The goal of this study was to test the hypothesis that bone fixation prior to demineralization would maintain its collagen ultrastructure in an undisturbed state when analyzed using Atomic Force Microscopy (AFM). The anterior diaphysis of a pig femur was cut into 6 mm pieces along its length. Samples were mounted, polished and randomly assigned to control or fixation groups (n = 5/group). Fixation samples were fixed for 24 h prior to demineralization. All samples were briefly demineralized to expose collagen, and imaged using AFM. Mouse tail tendons were also analyzed to explore effects of dehydration and fixation. Measurements from each bone sample were averaged and compared using a Mann-Whitney U-test. Tendon sample means were compared using RMANOVA. To investigate differences in D-spacing distributions, Kolmogorov-Smirnov tests were used. Fixation decreased D-spacing variability within and between bone samples and induced or maintained a higher average D-spacing versus control by shifting the D-spacing population upward. Tendon data indicate that fixing and drying samples leaves collagen near its undisturbed and hydrated native state. Fixation in bone prior to demineralization decreased D-spacing variability. D-spacing was shifted upward in fixed samples, indicating that collagen is stretched with mineral present and relaxes upon its removal. The ability to decrease variability in bone suggests that fixation might increase the power to detect changes in collagen due to disease or other pressures.
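The statistical comparisons named above (Mann-Whitney U for group differences, Kolmogorov-Smirnov for distribution shifts) are standard and easy to reproduce. A sketch with placeholder data, not the study's measurements:

```python
import numpy as np
from scipy.stats import mannwhitneyu, ks_2samp

# Hypothetical D-spacing measurements (nm); values and group sizes are
# placeholders chosen only to mimic the reported pattern (fixation gives
# a tighter, upward-shifted distribution).
rng = np.random.default_rng(0)
control = rng.normal(67.0, 1.0, size=200)
fixed = rng.normal(67.5, 0.5, size=200)

u_stat, p_groups = mannwhitneyu(control, fixed)   # location comparison
ks_stat, p_dist = ks_2samp(control, fixed)        # full-distribution comparison
print(f"Mann-Whitney p = {p_groups:.3g}, KS p = {p_dist:.3g}")
```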
Pisharady, Pramod Kumar; Sotiropoulos, Stamatios N; Sapiro, Guillermo; Lenglet, Christophe
2017-09-01
We propose a sparse Bayesian learning algorithm for improved estimation of white matter fiber parameters from compressed (under-sampled q-space) multi-shell diffusion MRI data. The multi-shell data is represented in dictionary form using a non-monoexponential decay model of diffusion, based on a continuous gamma distribution of diffusivities. The fiber volume fractions with predefined orientations, which are the unknown parameters, form the dictionary weights. These unknown parameters are estimated with a linear un-mixing framework, using a sparse Bayesian learning algorithm. A localized learning of hyperparameters at each voxel and for each possible fiber orientation improves the parameter estimation. Our experiments using synthetic data from the ISBI 2012 HARDI reconstruction challenge and in-vivo data from the Human Connectome Project demonstrate the improvements.
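The un-mixing step, recovering sparse nonnegative volume fractions w from y ~= Dw, can be illustrated without the full Bayesian machinery. A simplified stand-in using proximal gradient descent (ISTA) on an L1-regularised problem; the dictionary construction and the λ value are assumptions:

```python
import numpy as np

def sparse_unmix(y, D, lam=0.05, iters=500):
    """Estimate nonnegative fiber volume fractions w in y ~= D w, where
    each column of D holds the predicted signal for one predefined fiber
    orientation. ISTA replaces the paper's sparse Bayesian hyperparameter
    learning; only the sparsity-promoting intent is shared."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = D.T @ (D @ w - y)
        w = w - grad / L                   # gradient step
        w = np.maximum(w - lam / L, 0.0)   # soft-threshold + nonnegativity
    return w
```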
High-Frequency Subband Compressed Sensing MRI Using Quadruplet Sampling
Sung, Kyunghyun; Hargreaves, Brian A
2013-01-01
Purpose: To present and validate a new method that formalizes a direct link between the k-space and wavelet domains, applying separate undersampling and reconstruction to high- and low-spatial-frequency k-space data. Theory and Methods: High- and low-spatial-frequency regions are defined in k-space based on the separation of wavelet subbands, and the conventional compressed sensing (CS) problem is transformed into one of localized k-space estimation. To better exploit wavelet-domain sparsity, CS can be used for high-spatial-frequency regions while parallel imaging can be used for low-spatial-frequency regions. Fourier undersampling is also customized to better accommodate each reconstruction method: random undersampling for CS and regular undersampling for parallel imaging. Results: Examples using the proposed method demonstrate successful reconstruction of both low-spatial-frequency content and fine structures in high-resolution 3D breast imaging with a net acceleration of 11 to 12. Conclusion: The proposed method improves the reconstruction accuracy of high-spatial-frequency signal content and avoids incoherent artifacts in low-spatial-frequency regions. This new formulation also reduces the reconstruction time due to the smaller problem size. PMID:23280540
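The sampling customization, regular undersampling over the wavelet approximation band and random undersampling elsewhere, can be sketched for one k-space dimension. The band width, acceleration, and sampling fraction below are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

def split_band_mask(n=256, levels=2, low_accel=2, high_frac=0.15, seed=0):
    """Build a 1D undersampling mask: the central 1/2**levels of k-space
    (matching the wavelet approximation subband) is undersampled regularly
    for parallel imaging, the outer high-frequency region randomly for CS."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(n, dtype=bool)
    half = n // 2
    low_w = n // 2 ** (levels + 1)                 # half-width of low band
    low = np.arange(half - low_w, half + low_w)
    mask[low[::low_accel]] = True                  # regular undersampling
    high = np.setdiff1d(np.arange(n), low)
    pick = rng.choice(high, size=int(high_frac * high.size), replace=False)
    mask[pick] = True                              # random undersampling
    return mask
```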
In Situ Biodosimetric Experiment for Space Applications
NASA Astrophysics Data System (ADS)
Goldschmidt, Gergely; Kovaliczky, Éva; Szabó, József; Rontó, Györgyi; Bérces, Attila
2012-06-01
This paper presents the principles and application of DNA-based biological UV dosimeters, as developed by the Research Group for Biophysics (RGB). These dosimeters are used for assessing the biological UV hazard to living systems on the Earth's surface and in different waters (rivers, lakes, seas, etc.); the dosimetry system has also been used in space. The dosimeters use a bacterial virus, bacteriophage T7, and polycrystalline uracil thin layers as biological detectors. On the Earth's surface, UV radiation induces dimer formation in phage T7 and in the uracil detector, which is evaluated via the loss of viability of the phage particles and the decrease of the characteristic optical density (OD) of the uracil thin layers. The development of human space activities has increased the need to measure the biological effect of extraterrestrial solar radiation as well. Previously, space samples were evaluated on the ground, so only the starting and final states were taken into account. A new, improved, automated method is presented below which makes data collection more efficient and the dynamics of the process observable.
Treating Sample Covariances for Use in Strongly Coupled Atmosphere-Ocean Data Assimilation
NASA Astrophysics Data System (ADS)
Smith, Polly J.; Lawless, Amos S.; Nichols, Nancy K.
2018-01-01
Strongly coupled data assimilation requires cross-domain forecast error covariances; information from ensembles can be used, but limited sampling means that ensemble-derived error covariances are routinely rank deficient and/or ill-conditioned and marred by noise. Thus, they require modification before they can be incorporated into a standard assimilation framework. Here we compare methods for improving the rank and conditioning of multivariate sample error covariance matrices for coupled atmosphere-ocean data assimilation. The first method, reconditioning, alters the matrix eigenvalues directly; this preserves the correlation structures but does not remove sampling noise. We show that it is better to recondition the correlation matrix rather than the covariance matrix as this prevents small but dynamically important modes from being lost. The second method, model state-space localization via the Schur product, effectively removes sample noise but can dampen small cross-correlation signals. A combination that exploits the merits of each is found to offer an effective alternative.
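Both operations are short in code. A minimal sketch of reconditioning applied via the correlation matrix (as the abstract recommends) and of Schur-product localization; the condition-number cap, the eigenvalue-shift recipe, and the exponential taper standing in for e.g. Gaspari-Cohn are assumptions:

```python
import numpy as np

def recondition_corr(P, kappa_max=100.0):
    """Recondition a sample covariance P through its correlation matrix:
    shift eigenvalues so the condition number is at most kappa_max. This
    preserves correlation structure but does not remove sampling noise."""
    d = np.sqrt(np.diag(P))
    R = P / np.outer(d, d)                         # covariance -> correlation
    s = np.linalg.eigvalsh(R)                      # ascending eigenvalues
    smin, smax = s[0], s[-1]
    if smax / max(smin, 1e-15) > kappa_max:
        delta = (smax - kappa_max * smin) / (kappa_max - 1.0)
        R = (R + delta * np.eye(len(R))) / (1.0 + delta)   # keeps unit diagonal
    return R * np.outer(d, d)                      # back to covariance

def localize(P, dist, length=500.0):
    """Schur-product localization: taper covariances by a correlation
    function of separation, suppressing noisy long-range sample
    correlations (exponential taper used here for brevity)."""
    return P * np.exp(-dist / length)
```

Because the Schur product of a covariance with a valid correlation function is itself positive semidefinite, localization cannot break the matrix, but, as noted above, it can also damp genuine small cross-domain signals.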
Selective Data Acquisition in NMR. The Quantification of Anti-phase Scalar Couplings
NASA Astrophysics Data System (ADS)
Hodgkinson, P.; Holmes, K. J.; Hore, P. J.
Almost all time-domain NMR experiments employ "linear sampling," in which the NMR response is digitized at equally spaced times, with uniform signal averaging. Here, the possibilities of nonlinear sampling are explored using anti-phase doublets in the indirectly detected dimensions of multidimensional COSY-type experiments as an example. The Cramér-Rao lower bounds are used to evaluate and optimize experiments in which the sampling points, or the extent of signal averaging at each point, or both, are varied. The optimal nonlinear sampling for the estimation of the coupling constant J, by model fitting, turns out to involve just a few key time points, for example, at the first node ( t= 1/ J) of the sin(π Jt) modulation. Such sparse sampling patterns can be used to derive more practical strategies, in which the sampling or the signal averaging is distributed around the most significant time points. The improvements in the quantification of NMR parameters can be quite substantial especially when, as is often the case for indirectly detected dimensions, the total number of samples is limited by the time available.
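The Cramér-Rao machinery behind this comparison is compact. A sketch assuming the modulated-decay model s(t) = A sin(πJt) e^{-t/T2} with white Gaussian noise and A, T2 known; the parameter values are illustrative (T2 is set so the first node t = 1/J coincides with the sensitivity maximum):

```python
import numpy as np

def crlb_J(t, J=10.0, A=1.0, T2=0.1, sigma=0.05):
    """Cramer-Rao lower bound on the variance of an estimate of J.
    Sensitivity of the model to J is ds/dJ = A*pi*t*cos(pi*J*t)*e^{-t/T2},
    so the Fisher information is the summed squared sensitivity / sigma^2."""
    ds_dJ = A * np.pi * t * np.cos(np.pi * J * t) * np.exp(-t / T2)
    fisher = np.sum(ds_dJ ** 2) / sigma ** 2
    return 1.0 / fisher

# Same sample budget, two strategies: uniform coverage vs samples
# clustered at the first node t = 1/J, where |ds/dJ| peaks.
t_uniform = np.linspace(0.001, 0.3, 64)
t_node = 1.0 / 10.0 + 0.01 * np.linspace(-1, 1, 64)
print(crlb_J(t_uniform), crlb_J(t_node))
```

Comparing the two bounds for a fixed number of samples reproduces the qualitative conclusion above: concentrating measurements near the most informative time points tightens the achievable precision on J.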
Improvement of the thermal stability of Nb:TiO2-x samples for uncooled infrared detectors
NASA Astrophysics Data System (ADS)
Reddy, Y. Ashok Kumar; Kang, In-Ku; Shin, Young Bong; Lee, Hee Chul
2018-01-01
In order to reduce the sun-burn effect in samples of the bolometric material Nb:TiO2-x, oxygen annealing was carried out. This effect can be examined by comparing thermal stability test results between as-deposited and oxygen-atmosphere-annealed samples under high-temperature exposure conditions. Structural studies confirm the presence of amorphous and rutile phases in the as-deposited and annealed samples, respectively. Composition studies reveal that oxygen-atmosphere annealing offsets the oxygen vacancies in the Nb:TiO2-x samples: oxygen atoms diffused into the films and appear to occupy the vacant sites. As a result, the annealed samples show better thermal stability than the as-deposited samples. The universal bolometric parameter (β) values were slightly decreased in the oxygen-annealed Nb:TiO2-x samples. Although bolometric performance decreased slightly in the oxygen-annealed samples, high thermal stability is the most essential factor for special applications, such as the military and space industries. Finally, these results will be useful for reducing the sun-burn effect in infrared detectors.
Whole Module Offgas Test Report: Space-X1 Dragon Module
NASA Technical Reports Server (NTRS)
James, John T.
2012-01-01
On September 26 and September 28, 2012, a chemist from the JSC Toxicology Group acquired samples of air in 500 ml evacuated canisters from the sealed Space-X1 Dragon Module. One sample was also acquired from the Space-X facility near the module at the start of the test. Samples of the module air were taken in triplicate once the module had been sealed, and then taken again in triplicate 1.98 days later. Of the triplicate samples, the first served as a line purge, and the last two were analyzed. The results of 5 samples are reported.
Insights into Regolith Dynamics from the Irradiation Record Preserved in Hayabusa Samples
NASA Technical Reports Server (NTRS)
Keller, Lindsay P.; Berger, E. L.
2014-01-01
The rates of space weathering processes are poorly constrained for asteroid surfaces, with recent estimates ranging over 5 orders of magnitude. The return of the first surface samples from a space-weathered asteroid by the Hayabusa mission and their laboratory analysis provides "ground truth" to anchor the timescales for space weathering. We determine the rates of space weathering on Itokawa by measuring solar flare track densities and the widths of solar wind damaged rims on grains. These measurements are made possible through novel focused ion beam (FIB) sample preparation methods.
Local Feature Selection for Data Classification.
Armanfard, Narges; Reilly, James P; Komeili, Majid
2016-06-01
Typical feature selection methods choose an optimal global feature subset that is applied over all regions of the sample space. In contrast, in this paper we propose a novel localized feature selection (LFS) approach whereby each region of the sample space is associated with its own distinct optimized feature set, which may vary both in membership and size across the sample space. This allows the feature set to optimally adapt to local variations in the sample space. An associated method for measuring the similarities of a query datum to each of the respective classes is also proposed. The proposed method makes no assumptions about the underlying structure of the samples; hence the method is insensitive to the distribution of the data over the sample space. The method is efficiently formulated as a linear programming optimization problem. Furthermore, we demonstrate the method is robust against the over-fitting problem. Experimental results on eleven synthetic and real-world data sets demonstrate the viability of the formulation and the effectiveness of the proposed algorithm. In addition we show several examples where localized feature selection produces better results than a global feature selection method.
A survey of rapid sample return needs from Space Station Freedom and potential return systems
NASA Technical Reports Server (NTRS)
Mccandless, Ronald S.; Siegel, Bette; Charlton, Kevin
1991-01-01
Results are presented of a survey conducted among potential users of the life sciences and material sciences facilities at Space Station Freedom (SSF) to determine the need for a special rapid sample return (RSR) mission to bring experimental samples from the SSF to earth between Space Shuttle visits. The results show that, while some experimental objectives would benefit from an RSR capability, other available cost- and mission-effective means could be used instead. Potential vehicles for transporting samples from the SSF to earth are examined in the context of the survey results.
Estimating Terrestrial Carbon Exchange from Space: How Often and How Well?
NASA Technical Reports Server (NTRS)
Knox, Robert G.; Hall, Forrest G.; Huemmrich, Karl F.; Gervin, Janette C.
2003-01-01
Data from a new space mission measuring integrated light-use efficiency could provide a breakthrough in understanding of global carbon, water, and energy dynamics, and greatly improve the accuracy of model predictions for terrestrial carbon cycles and climate. Over the past decade, Gamon and others have shown that changes in photo-protective pigments are sensitive indicators of declines in light-use efficiency of plants and plant canopies. The requirements for integrated diurnal measurements from space need to be defined before a space mission can be formulated successfully using this concept. We used tower-based CO2 flux data as idealized proxies for remote measurements, examining their sampling properties. Thousands of half-hourly CO2 flux measurements are needed before their average begins to converge on an average annual net CO2 exchange. Estimates of daily integrated fluxes (i.e., diurnal curves) are more statistically efficient, especially if the spacing between measured days is quasi-regular, rather than random. Using a few measurements per day one can distinguish among days with different net CO2 exchanges. Fluxes sampled between mid-morning and mid-afternoon are more diagnostic than early morning or late afternoon measurements. Similar results (correlation >0.935) were obtained using 2 measurements per day with high accuracy (±5%), 3 measurements per day with medium accuracy (±10%), or 5 measurements per day at lower accuracy (±20%). An observatory in a geosynchronous or near-geosynchronous orbit could provide appropriate observations, as could a multi-satellite constellation in polar orbits, but there is a potential trade-off between the required number of observations per day and the quality of each observation.
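The advantage of quasi-regular over random day selection can be probed with a toy Monte Carlo, not the authors' tower-data analysis; the synthetic flux series and the 36-day budget below are assumptions chosen only to make the comparison concrete:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 365
# Toy daily net exchange: slow random drift plus a seasonal cycle
daily_flux = (rng.normal(0.0, 2.0, n_days).cumsum() * 0.01
              + 2.0 * np.sin(2 * np.pi * np.arange(n_days) / 365))

def annual_estimate(sample_days):
    """Estimate the annual mean from a subset of measured days."""
    return daily_flux[sample_days].mean()

# Same budget of 36 measured days: quasi-regular spacing (regular grid
# with small jitter) vs purely random selection.
quasi = (np.arange(36) * (n_days // 36) + rng.integers(0, 5, 36)) % n_days
rand = rng.choice(n_days, 36, replace=False)
print(annual_estimate(quasi), annual_estimate(rand), daily_flux.mean())
```

Repeating the draw many times and comparing the spread of the two estimators around the true mean illustrates the statistical efficiency argument made above.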
Reichert, Miriam; Morelli, John N; Runge, Val M; Tao, Ai; von Ritschl, Ruediger; von Ritschl, Andreas; Padua, Abraham; Dix, James E; Marra, Michael J; Schoenberg, Stefan O; Attenberger, Ulrike I
2013-01-01
The aim of this study was to compare the detection of brain metastases at 3 T using a 32-channel head coil with 2 different 3-dimensional (3D) contrast-enhanced sequences, a T1-weighted fast spin-echo-based (SPACE; sampling perfection with application-optimized contrasts using different flip angle evolutions) sequence and a conventional magnetization-prepared rapid gradient-echo (MP-RAGE) sequence. Seventeen patients with 161 brain metastases were examined prospectively using both SPACE and MP-RAGE sequences on a 3-T magnetic resonance system. Eight healthy volunteers were similarly examined for determination of signal-to-noise ratio (SNR) values. Parameters were adjusted to equalize acquisition times between the sequences (3 minutes and 30 seconds). The order in which sequences were performed was randomized. Two blinded board-certified neuroradiologists evaluated the number of detectable metastatic lesions with each sequence relative to a criterion standard reading conducted at the Gamma Knife facility by a neuroradiologist with access to all clinical and imaging data. In the volunteer assessment with SPACE and MP-RAGE, SNR (10.3 ± 0.8 vs 7.7 ± 0.7) and contrast-to-noise ratio (0.8 ± 0.2 vs 0.5 ± 0.1) were statistically significantly greater with the SPACE sequence (P < 0.05). Overall, lesion detection was markedly improved with the SPACE sequence (99.1% of lesions for reader 1 and 96.3% of lesions for reader 2) compared with the MP-RAGE sequence (73.6% of lesions for reader 1 and 68.5% of lesions for reader 2; P < 0.01). A 3D T1-weighted fast spin echo sequence (SPACE) improves detection of metastatic lesions relative to 3D T1-weighted gradient-echo-based scan (MP-RAGE) imaging when implemented with a 32-channel head coil at identical scan acquisition times (3 minutes and 30 seconds).
NASA Technical Reports Server (NTRS)
Angart, S.; Lauer, M.; Tewari, S. N.; Grugel, R. N.; Poirier, D. R.
2014-01-01
This article reports research carried out under the aegis of NASA as part of a collaboration between ESA and NASA for solidification experiments on the International Space Station (ISS). The focus has been on the effect of convection on microstructural evolution and macrosegregation in hypoeutectic Al-Si alloys during directional solidification (DS). Terrestrial DS experiments have been carried out at Cleveland State University (CSU) and microgravity experiments on the ISS. The thermal processing history is well defined for both the terrestrially processed and the ISS-processed samples. As of this writing, two dendritic metrics have been measured: primary dendrite arm spacing and primary dendrite trunk diameter. These metrics for two samples grown in the microgravity environment show good agreement with models based on diffusion-controlled growth and diffusion-controlled ripening, respectively. Gravity-driven (i.e., thermosolutal) convection in terrestrially grown samples decreases the primary dendrite arm spacings and causes macrosegregation. Dendrite trunk diameters also differ between the earth- and space-grown samples. In order to process DS samples aboard the ISS, the dendritic seed crystals were partially remelted in a stationary thermal gradient before DS was carried out. Microstructural changes and macrosegregation effects during this period are described and modeled.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Team AERO, from the Worcester Polytechnic Institute (WPI) transports their robot to the competition field for the level one of the competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Robots that will be competing in the Level one competition are seen as they sit in impound prior to the start of competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
Ahti Heinla, left, and Sulo Kallas, right, from Estonia, prepare team KuuKulgur's robot for the rerun of the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Jascha Little of team Survey is seen as he follows the team's robot as it conducts a demonstration of the level two challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The University of California Santa Cruz Rover Team poses for a picture with their robot after attempting the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team is one of eighteen competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-14
The University of California Santa Cruz Rover Team's robot is seen prior to starting its second attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Saturday, June 14, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The Oregon State University Mars Rover Team poses for a picture with their robot following their attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team is one of eighteen competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
The University of Waterloo Robotics Team, from Canada, prepares to place their robot on the start platform during the level one challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The University of Waterloo Robotics Team, from Ontario, Canada, prepares their robot for the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team from the University of Waterloo is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Sam Ortega, NASA program manager for Centennial Challenges, is interviewed by a member of the media before the start of level two competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Jim Rothrock, left, and Carrie Johnson, right, of the Wunderkammer Laboratory team pose for a picture with their robot after attempting the level one competition during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The Oregon State University Mars Rover Team follows their robot on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The Oregon State University Mars Rover Team is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-11
Jerry Waechter of team Middleman from Dunedin, Florida, speaks about his team's robot, Ro-Bear, as it makes its attempt at the level one challenge during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Wednesday, June 11, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
The Oregon State University Mars Rover Team, from Corvallis, Oregon, follows their robot on the practice field during the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. The Oregon State University Mars Rover Team is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Apparatus and method for the spectrochemical analysis of liquids using the laser spark
Cremers, David A.; Radziemski, Leon J.; Loree, Thomas R.
1990-01-01
A method and apparatus for the qualitative and quantitative spectroscopic investigation of elements present in a liquid sample using the laser spark. A series of temporally closely spaced spark pairs is induced in the liquid sample utilizing pulsed electromagnetic radiation from a pair of lasers. The light pulses are not significantly absorbed by the sample so that the sparks occur inside of the liquid. The emitted light from the breakdown events is spectrally and temporally resolved, and the time period between the two laser pulses in each spark pair is adjusted to maximize the signal-to-noise ratio of the emitted signals. In comparison with the single pulse technique, a substantial reduction in the limits of detectability for many elements has been demonstrated. Narrowing of spectral features results in improved discrimination against interfering species.
Apparatus and method for the spectrochemical analysis of liquids using the laser spark
Cremers, D.A.; Radziemski, L.J.; Loree, T.R.
1984-05-01
A method and apparatus are disclosed for the qualitative and quantitative spectroscopic investigation of elements present in a liquid sample using the laser spark. A series of temporally closely spaced spark pairs is induced in the liquid sample utilizing pulsed electromagnetic radiation from a pair of lasers. The light pulses are not significantly absorbed by the sample so that the sparks occur inside of the liquid. The emitted light from the breakdown events is spectrally and temporally resolved, and the time period between the two laser pulses in each spark pair is adjusted to maximize the signal-to-noise ratio of the emitted signals. In comparison with the single pulse technique, a substantial reduction in the limits of detectability for many elements has been demonstrated. Narrowing of spectral features results in improved discrimination against interfering species.
NASA Astrophysics Data System (ADS)
Mezgebo, Biniyam; Nagib, Karim; Fernando, Namal; Kordi, Behzad; Sherif, Sherif
2018-02-01
Swept-source optical coherence tomography (SS-OCT) is an important imaging modality for both medical and industrial diagnostic applications. A cross-sectional SS-OCT image is obtained by applying an inverse discrete Fourier transform (DFT) to axial interferograms measured in the frequency domain (k-space). This inverse DFT is typically implemented as a fast Fourier transform (FFT) that requires the data samples to be equidistant in k-space. As the frequency of light produced by a typical wavelength-swept laser is nonlinear in time, the recorded interferogram samples will not be uniformly spaced in k-space. Many image reconstruction methods have been proposed to overcome this problem. Most such methods rely on oversampling the measured interferogram and then use either hardware, e.g., a Mach-Zehnder interferometer as a frequency clock module, or software, e.g., interpolation in k-space, to obtain equally spaced samples that are suitable for the FFT. To overcome the problem of nonuniform sampling in k-space without any need for interferogram oversampling, an earlier method demonstrated the use of the nonuniform discrete Fourier transform (NDFT) for image reconstruction in SS-OCT. In this paper, we present a more accurate method for SS-OCT image reconstruction from nonuniform samples in k-space using a scaled nonuniform Fourier transform. The result is demonstrated using SS-OCT images of Axolotl salamander eggs.
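The direct NDFT idea referred to above fits in a few lines. The following is a minimal Python sketch, not the authors' scaled transform: it evaluates the depth profile straight from interferogram samples taken at nonuniform wavenumbers, with trapezoidal weights compensating for the uneven k spacing. The function name, the simulated sweep, and all numbers are illustrative assumptions.

    import numpy as np

    def ndft_reconstruct(k, s, z):
        # Direct nonuniform DFT: a(z) = sum_n w_n * s_n * exp(-2j * k_n * z).
        # O(N*M) evaluation -- fine for a sketch; NUFFT libraries exist for speed.
        w = np.gradient(k)                       # quadrature weights for uneven k spacing
        return (w * s) @ np.exp(-2j * np.outer(k, z))

    # Hypothetical sweep that is nonlinear in time, single reflector at 0.5 mm:
    t = np.linspace(0.0, 1.0, 2048)
    k = 2 * np.pi / 1.3e-6 * (1.0 + 0.05 * t + 0.02 * t**2)   # rad/m, illustrative
    z0 = 0.5e-3
    s = np.cos(2 * k * z0)                       # ideal noiseless interferogram
    z = np.linspace(0.0, 1.0e-3, 512)
    profile = np.abs(ndft_reconstruct(k, s, z))  # peaks near z0 despite uneven sampling

Because the transform is evaluated at the measured wavenumbers themselves, no resampling hardware or k-space interpolation step is needed, which is the point the abstract makes.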
Recent Advances in 3D Time-Resolved Contrast-Enhanced MR Angiography
Riederer, Stephen J.; Haider, Clifton R.; Borisch, Eric A.; Weavers, Paul T.; Young, Phillip M.
2015-01-01
Contrast-enhanced MR angiography (CE-MRA) was first introduced for clinical studies approximately 20 years ago. Early work provided 3 to 4 mm spatial resolution with acquisition times in the 30 sec range. Since that time there has been a continuing effort to provide improved spatial resolution with reduced acquisition time, allowing high-resolution three-dimensional (3D) time-resolved studies. The purpose of this work is to describe how this has been accomplished. Specific technical enablers have been: improved gradients allowing reduced repetition times, improved k-space sampling and reconstruction methods, parallel acquisition, particularly in two directions, and improved, higher-count receiver coil arrays. These have collectively made high-resolution time-resolved studies readily available for many anatomic regions. Depending on the application, approximately 1 mm isotropic resolution is now possible with frame times of several seconds. Clinical applications of time-resolved CE-MRA are briefly reviewed. PMID:26032598
Improvement of Productivity in TIG Welding Plant by Equipment Design in Orbit
NASA Astrophysics Data System (ADS)
Gnanavel, C.; Saravanan, R.; Chandrasekaran, M.; Jayakanth, J. J.
2017-03-01
Measurement and improvement are indispensable tasks at all levels of management. Examples include, at the operator level, measuring operating parameters to ensure OEE (Overall Equipment Effectiveness) and measuring quality-component performance to ensure quality; at the supervisory level, measuring operator performance to ensure labour utilization; at the managerial level, production and productivity measurements; and at the top level, capital and capacity utilization. An often-accepted statement is "Improvement is impossible without measurement", and measurements are often referred to as observations. The case study was conducted at a government boiler factory in India, where a scientific approach was followed for identifying non-value-added activities. Purpose-designed new equipment was installed, achieving a productivity improvement of 85% in a day. The new equipment can serve 360° around its axis, which simplifies loading and unloading procedures, reduces their times, and makes effective use of space and time.
NASA Technical Reports Server (NTRS)
Fernandez-Moran, H.; Pritzker, A. N.
1974-01-01
Improved instrumentation and preparation techniques for high-resolution, high-voltage cryo-electron microscopic and diffraction studies on terrestrial and extraterrestrial specimens are reported. Computer-correlated ultrastructural and biochemical work on hydrated and dried cell membranes and related biological systems provided information on membrane organization, ice crystal formation and ordered water, an RNA virus linked to cancer, lunar rock samples, and organometallic superconducting compounds. Apollo 11, 12, 14, and 15 specimens were analyzed.
Non-ECG-gated unenhanced MRA of the carotids: optimization and clinical feasibility.
Raoult, H; Gauvrit, J Y; Schmitt, P; Le Couls, V; Bannier, E
2013-11-01
To optimise and assess the clinical feasibility of a carotid non-ECG-gated unenhanced MRA sequence. Sixteen healthy volunteers and 11 patients presenting with internal carotid artery (ICA) disease underwent large field-of-view balanced steady-state free precession (bSSFP) unenhanced MRA at 3T. Sampling schemes acquiring the k-space centre either early (kCE) or late (kCL) in the acquisition window were evaluated. Signal and image quality was scored in comparison to ECG-gated kCE unenhanced MRA and TOF. For patients, computed tomography angiography was used as the reference. In volunteers, kCE sampling yielded higher image quality than kCL and TOF, with fewer flow artefacts and improved signal homogeneity. kCE unenhanced MRA image quality was higher without ECG-gating. Arterial signal and artery/vein contrast were higher with both bSSFP sampling schemes than with TOF. The kCE sequence allowed correct quantification of ten significant stenoses, and it facilitated the identification of an infrapetrous dysplasia, which was outside of the TOF imaging coverage. Non-ECG-gated bSSFP carotid imaging offers high-quality images and is a promising sequence for carotid disease diagnosis in a short acquisition time with high spatial resolution and a large field of view. • Non-ECG-gated unenhanced bSSFP MRA offers high-quality imaging of the carotid arteries. • Sequences using early acquisition of the k-space centre achieve higher image quality. • Non-ECG-gated unenhanced bSSFP MRA allows quantification of significant carotid stenosis. • Short MR acquisition times and ungated sequences are helpful in clinical practice. • High 3D spatial resolution and a large field of view improve diagnostic performance.
Polarized BRDF for coatings based on three-component assumption
NASA Astrophysics Data System (ADS)
Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong
2017-02-01
A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is given, based on a three-component reflection assumption, in order to improve the polarized scattering simulation capability for space objects. In this model, the specular reflection is described by microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of the specular reflection follows from Fresnel's law, and both multiple reflection and volume scattering are assumed depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, validate that the modeling accuracy of the pBRDF can be significantly improved by the three-component model given in this paper.
Low temperature storage container for transporting perishables to space station
NASA Technical Reports Server (NTRS)
Owen, James W. (Inventor); Dean, William G. (Inventor)
1989-01-01
Two storage containers are disclosed within which food or biological samples may be stored for transfer in a module by the space shuttle to a space station while maintaining the food or samples at very low temperatures. The container is formed in two parts, each part having an inner shell and an outer shell disposed about the inner shell. The space between the shells is filled with a continuous wrap multi-layer insulation and a getter material. The two parts of the container have interlocking members and when connected together are sealed for preventing leakage from the space between the shells. After the two parts are filled with frozen food or samples they are connected together and a vacuum is drawn in the space between the shells and the container is stored in the module. For the extremely low temperature requirements of biological samples, an internal liner having a phase change material charged by a refrigerant coil is disposed in the space between the shells, and the container is formed from glass fiber material including honeycomb structural elements. All surfaces of the glass fiber which face the vacuum space are lined with a metal foil.
International Space Station (ISS) 3D Printer Performance and Material Characterization Methodology
NASA Technical Reports Server (NTRS)
Bean, Q. A.; Cooper, K. G.; Edmunson, J. E.; Johnston, M. M.; Werkheiser, M. J.
2015-01-01
In order for human exploration of the Solar System to be sustainable, manufacturing of necessary items on-demand in space or on planetary surfaces will be a requirement. As a first step towards this goal, the 3D Printing In Zero-G (3D Print) technology demonstration made the first items fabricated in space on the International Space Station. From those items, and comparable prints made on the ground, information about the microgravity effects on the printing process can be determined. Lessons learned from this technology demonstration will be applicable to other in-space manufacturing technologies, and may affect the terrestrial manufacturing industry as well. The flight samples were received at the George C. Marshall Space Flight Center on 6 April 2015. These samples will undergo a series of tests designed to not only thoroughly characterize the samples, but to identify microgravity effects manifested during printing by comparing their results to those of samples printed on the ground. Samples will be visually inspected, photographed, scanned with structured light, and analyzed with scanning electron microscopy. Selected samples will be analyzed with computed tomography; some will be assessed using ASTM standard tests. These tests will provide the information required to determine the effects of microgravity on 3D printing in microgravity.
Development and testing of fiber-reinforced composite space maintainers.
Kulkarni, Gajanan; Lau, Domenic; Hafezi, Sara
2009-01-01
The purpose of this study was to develop a clinically acceptable, cheaper, and more expedient alternative to standard stainless steel band and loop space maintainers. Loops of fiber-reinforced composites were constructed using polyethylene fiber (Ribbond) and glass fiber (Sticktech). The loops were bonded on extracted third molars and tested for flexural strength before and after thermocycling and following repair of the appliances after initial stress failure. Bacterial colonization on the appliances was also compared. Conventional stainless steel band and loop space maintainers cemented with Ketac were controls. Ribbond samples demonstrated higher flexural strength than Sticktech and the control (P<.05). No differences were noted among the other samples and the control. The repaired Ribbond samples were statistically comparable in flexural strength to the initial samples. Thermocycling resulted in decreased flexural strength of both Ribbond and Sticktech (P<.05). Thermocycled Ribbond samples were comparable to the control, but a lower flexural strength was noted for Sticktech samples (P<.05). While all space maintainers allowed some bacterial adhesion, Sticktech showed higher Streptococcus mutans counts than Ribbond (P=.06). Ribbond space maintainers are comparable to stainless steel appliances in terms of physical strength and biofilm formation. Fiber-reinforced composite space maintainers may be a clinically acceptable and expedient alternative to the conventional band-loop appliance.
Ghorai, Santanu; Mukherjee, Anirban; Dutta, Pranab K
2010-06-01
In this brief, we propose multiclass data classification by computationally inexpensive discriminant analysis through vector-valued regularized kernel function approximation (VVRKFA). VVRKFA, an extension of fast regularized kernel function approximation (FRKFA), provides the vector-valued response in a single step. VVRKFA finds a linear operator and a bias vector by using a reduced kernel that maps a pattern from feature space into a low-dimensional label space. The classification of patterns is carried out in this low-dimensional label subspace: a test pattern is classified according to its proximity to the class centroids. The effectiveness of the proposed method is experimentally verified and compared with the multiclass support vector machine (SVM) on several benchmark data sets as well as on gene microarray data for multi-category cancer classification. The results indicate a significant improvement in both training and testing time compared to multiclass SVM, with comparable testing accuracy, principally on large data sets. Experiments in this brief also serve as a comparison of the performance of VVRKFA with stratified random sampling and sub-sampling.
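As a rough illustration of the centroid-proximity step described above, here is a hedged Python sketch. It substitutes a plain linear feature map with a bias term for the paper's reduced-kernel map, so it is not VVRKFA itself; the function names are hypothetical. One-hot labels are regressed onto features with ridge regularization, giving a linear operator and bias vector in one shot, and test patterns are assigned to the nearest class centroid in the low-dimensional label space.

    import numpy as np

    def fit_vvrkfa_like(X, y, n_classes, lam=1e-2):
        # Regress one-hot labels onto (biased) features: one solve, no per-class SVMs.
        Y = np.eye(n_classes)[y]
        Xb = np.hstack([X, np.ones((len(X), 1))])           # append bias column
        W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ Y)
        Z = Xb @ W                                          # training set in label space
        centroids = np.array([Z[y == c].mean(axis=0) for c in range(n_classes)])
        return W, centroids

    def predict_nearest_centroid(W, centroids, X):
        Z = np.hstack([X, np.ones((len(X), 1))]) @ W        # map patterns to label space
        d2 = ((Z[:, None, :] - centroids[None]) ** 2).sum(axis=-1)
        return d2.argmin(axis=1)                            # closest class centroid wins

The single regularized least-squares solve is what makes this family of methods cheap relative to training one SVM per class pair.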
Multiple capillary biochemical analyzer with barrier member
Dovichi, N.J.; Zhang, J.Z.
1996-10-22
A multiple capillary biochemical analyzer is disclosed for sequencing DNA and performing other analyses, in which a set of capillaries extends from wells in a microtiter plate into a cuvette. In the cuvette the capillaries are held on fixed closely spaced centers by passing through a sandwich construction having a pair of metal shims which squeeze between them a rubber gasket, forming a leak proof seal for an interior chamber in which the capillary ends are positioned. Sheath fluid enters the chamber and entrains filament sample streams from the capillaries. The filament sample streams, and sheath fluid, flow through aligned holes in a barrier member spaced close to the capillary ends, into a collection chamber having a lower glass window. The filament streams are illuminated above the barrier member by a laser, causing them to fluoresce. The fluorescence is viewed end-on by a CCD camera chip located below the glass window. The arrangement ensures an equal optical path length from all fluorescing spots to the CCD chip and also blocks scattered fluorescence illumination, providing more uniform results and an improved signal-to-noise ratio. 12 figs.
Multiple capillary biochemical analyzer with barrier member
Dovichi, Norman J.; Zhang, Jian Z.
1996-01-01
A multiple capillary biochemical analyzer for sequencing DNA and performing other analyses, in which a set of capillaries extends from wells in a microtiter plate into a cuvette. In the cuvette the capillaries are held on fixed closely spaced centers by passing through a sandwich construction having a pair of metal shims which squeeze between them a rubber gasket, forming a leak proof seal for an interior chamber in which the capillary ends are positioned. Sheath fluid enters the chamber and entrains filament sample streams from the capillaries. The filament sample streams, and sheath fluid, flow through aligned holes in a barrier member spaced close to the capillary ends, into a collection chamber having a lower glass window. The filament streams are illuminated above the barrier member by a laser, causing them to fluoresce. The fluorescence is viewed end-on by a CCD camera chip located below the glass window. The arrangement ensures an equal optical path length from all fluorescing spots to the CCD chip and also blocks scattered fluorescence illumination, providing more uniform results and an improved signal to noise ratio.
Accurate calibration and control of relative humidity close to 100% by X-raying a DOPC multilayer
Ma, Yicong; Ghosh, Sajal K.; Bera, Sambhunath; ...
2015-01-01
Here in this study, we have designed a compact sample chamber that can achieve accurate and continuous control of the relative humidity (RH) in the vicinity of 100%. A 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC) multilayer can be used as a humidity sensor by measuring its inter-layer repeat distance (d-spacing) via X-ray diffraction. We convert from DOPC d-spacing to RH according to a theory given in the literature and previously measured data of DOPC multilamellar vesicles in polyvinylpyrrolidone (PVP) solutions. This curve can be used for calibration of RH close to 100%, a regime where conventional sensors do not have sufficient accuracy. We demonstrate that this control method can provide RH accuracies of 0.1 to 0.01%, which is a factor of 10–100 improvement compared to existing methods of humidity control. Our method provides fine tuning capability of RH continuously for a single sample, whereas the PVP solution method requires new samples to be made for each PVP concentration. The use of this cell also potentially removes the need for an X-ray or neutron beam to pass through bulk water if one wishes to work close to biologically relevant conditions of nearly 100% RH.
NASA GRC and MSFC Space-Plasma Arc Testing Procedures
NASA Technical Reports Server (NTRS)
Ferguson, Dale C.; Vayner, Boris V.; Galofaro, Joel T.; Hillard, G. Barry; Vaughn, Jason; Schneider, Todd
2005-01-01
Tests of arcing and current collection in simulated space plasma conditions have been performed at the NASA Glenn Research Center (GRC) in Cleveland, Ohio, for over 30 years and at the Marshall Space Flight Center (MSFC) in Huntsville, Alabama, for almost as long. During this period, proper test conditions for accurate and meaningful space simulation have been worked out, comparisons with actual space performance in spaceflight tests and with real operational satellites have been made, and NASA has established its own internal standards for test protocols. It is the purpose of this paper to communicate the test conditions, test procedures, and types of analysis used at NASA GRC and MSFC to the space environmental testing community at large, to help with international space-plasma arcing-testing standardization. To be discussed are: 1. Neutral pressures, neutral gases, and vacuum chamber sizes. 2. Electron and ion densities, plasma uniformity, sample sizes, and Debye lengths. 3. Biasing samples versus self-generated voltages. Floating samples versus grounded. 4. Power supplies and current limits. Isolation of samples from power supplies during arcs. 5. Arc circuits. Capacitance during biased arc-threshold tests. Capacitance during sustained arcing and damage tests. Arc detection. Preventing sustained discharges during testing. 6. Real array or structure samples versus idealized samples. 7. Validity of LEO tests for GEO samples. 8. Extracting arc threshold information from arc rate versus voltage tests. 9. Snapover and current collection at positive sample bias. Glows at positive bias. Kapton (R) pyrolysis. 10. Trigger arc thresholds. Sustained arc thresholds. Paschen discharge during sustained arcing. 11. Testing for Paschen discharge thresholds. Testing for dielectric breakdown thresholds. Testing for tether arcing. 12. Testing in very dense plasmas (i.e., thruster plumes). 13. Arc mitigation strategies. Charging mitigation strategies. Models. 14. Analysis of test results. Finally, the necessity of testing will be emphasized, not to the exclusion of modeling, but as part of a complete strategy for determining when and if arcs will occur, and preventing them from occurring in space.
Braze Development of Graphite Fiber for Use in Phase Change Material Heat Sinks
NASA Technical Reports Server (NTRS)
Quinn, Gregory; Beringer, Woody; Gleason, Brian; Stephan, Ryan
2011-01-01
Hamilton Sundstrand (HS), together with NASA Johnson Space Center, developed methods to metallurgically join graphite fiber to aluminum. The goal of the effort was to demonstrate improved thermal conductance, tensile strength, and manufacturability compared to existing epoxy-bonded techniques. These improvements have the potential to increase the performance and robustness of phase change material heat sinks that use graphite fibers as an interstitial material. Initial work focused on evaluating joining techniques from four suppliers, each consisting of a metallization step followed by brazing or soldering of one-inch-square blocks of Fibercore graphite fiber material to aluminum end sheets. Results matched the strength and thermal conductance of the epoxy-bonded control samples, so two suppliers were down-selected for a second round of braze development. The second round of braze samples had up to a 300% increase in strength and up to a 132% increase in thermal conductance over the bonded samples. However, scalability and repeatability proved to be significant hurdles with the metallization approach. An alternative approach was pursued which used a nickel braze alloy to prepare the carbon fibers for joining with aluminum. Initial results on sample blocks indicate that this approach should be repeatable and scalable, with good strength and thermal conductance when compared with epoxy bonding.
NASA Technical Reports Server (NTRS)
James, John T.
2010-01-01
Reports on the air quality aboard the Space Shuttle (STS-129) and the International Space Station (ULF3). NASA analyzed the grab sample canisters (GSCs) and the formaldehyde badges from both locations, including for carbon monoxide levels. The three surrogates, (sup 13)C-acetone, fluorobenzene, and chlorobenzene, registered 109, 101, and 109% in the Space Shuttle and 81, 87, and 55% in the International Space Station (ISS). From these results, the atmosphere in both the Space Shuttle and the ISS was found to be breathable.
NASA Technical Reports Server (NTRS)
Rabenberg, Ellen; Kaukler, William; Grugel, Richard
2015-01-01
Two sets of epoxy mixtures, both containing the same ionic liquid (IL) based resin but utilizing two different curing agents, were evaluated after spending more than two years of continual space exposure outside the International Space Station on the MISSE-8 sample rack. During this period the samples, positioned on the nadir side, also experienced some 12,500 thermal cycles between approximately -40 °C and +40 °C. Initial examination showed some color change, a minuscule weight variance, and no cracks or de-bonding from the sample substrate. Microscopic examination of the surface revealed some slight deformities and pitting. These observations, and others, are discussed in view of the ground-based control samples. Finally, the impetus of this study in terms of space applications is presented.
Investigation of rice proteomic change in response to microgravity
NASA Astrophysics Data System (ADS)
Sun, Weining
Gravity is one of the environmental factors that control the development and growth of plants. Plant cells that are not part of specialized tissues such as the root columella can also sense gravity. Space environments, such as space shuttle missions, space laboratories, and space stations, provide unique opportunities to study the microgravity response of plants. During the Shenzhou 8 mission in November 2011, we cultured rice calli on the spacecraft and the samples were fixed 4 days after launch. The flown samples in the static position (microgravity, µg) and in the centrifuge, which provided a 1 g force to mimic Earth gravity in space, were recovered and the proteome changes were analyzed by iTRAQ. In total, 4,840 proteins were identified, including 2,085 proteins with function annotation by GO analysis. In the space µg/ground comparison, 431 proteins changed by more than 1.5-fold, including 179 up-regulated and 252 down-regulated proteins. In the space µg/space 1 g comparison, 321 proteins changed by more than 1.5-fold, among which 205 were the same differentially expressed proteins responsive to microgravity. Enrichment analysis of the differentially expressed proteins by GO showed that ARF GTPase activity regulation proteins were enriched when comparing the space µg with the space 1 g sample, whereas nucleic acid binding and DNA damage repair proteins were enriched when comparing the space µg and ground samples. Microscopic comparison of the rice calli showed that the space-grown cells were more uniform in size and proliferation, suggesting that the cell proliferation pattern was changed under space microgravity conditions.
Kuipers performs Water Sample Analysis
2012-05-15
ISS031-E-084619 (15 May 2012) --- After collecting samples from the Water Recovery System (WRS), European Space Agency astronaut Andre Kuipers, Expedition 31 flight engineer, processes the samples for chemical and microbial analysis in the Unity node of the International Space Station.
2009-07-15
ISS020-E-020652 (15 July 2009) --- Canadian Space Agency astronaut Robert Thirsk, Expedition 20 flight engineer, uses the Surface Sample Kit (SSK) to collect microbiology samples from specific sampling locations in the Harmony node and other modules of the International Space Station.
2017-11-03
A video news file (or a collection of raw video and interview clips) about the EcAMSat mission. Ever wonder what would happen if you got sick in space? NASA is sending samples of bacteria into low-Earth orbit to find out. One of the latest small satellite missions from NASA’s Ames Research Center in California’s Silicon Valley is the E. coli Anti-Microbial Satellite, or EcAMSat for short. The CubeSat – a spacecraft the size of a shoebox built from cube-shaped units – will explore how effectively antibiotics can combat E. coli bacteria in the low gravity of space. This information will help us improve how we fight infections, providing safer journeys for astronauts on their future voyages, and offer benefits for medicine here on Earth.
Precise estimation of tropospheric path delays with GPS techniques
NASA Technical Reports Server (NTRS)
Lichten, S. M.
1990-01-01
Tropospheric path delays are a major source of error in deep space tracking. However, the tropospheric-induced delay at tracking sites can be calibrated using measurements of Global Positioning System (GPS) satellites. A series of experiments has demonstrated the high sensitivity of GPS to tropospheric delays. A variety of tests and comparisons indicates that current accuracy of the GPS zenith tropospheric delay estimates is better than 1-cm root-mean-square over many hours, sampled continuously at intervals of six minutes. These results are consistent with expectations from covariance analyses. The covariance analyses also indicate that by the mid-1990s, when the GPS constellation is complete and the Deep Space Network is equipped with advanced GPS receivers, zenith tropospheric delay accuracy with GPS will improve further to 0.5 cm or better.
A geostatistical state-space model of animal densities for stream networks.
Hocking, Daniel J; Thorson, James T; O'Neil, Kyle; Letcher, Benjamin H
2018-06-21
Population dynamics are often correlated in space and time due to correlations in environmental drivers as well as synchrony induced by individual dispersal. Many statistical analyses of populations ignore potential autocorrelations and assume that survey methods (distance and time between samples) eliminate these correlations, allowing samples to be treated independently. If these assumptions are incorrect, results and therefore inference may be biased and uncertainty underestimated. We developed a novel statistical method to account for spatio-temporal correlations within dendritic stream networks, while accounting for imperfect detection in the surveys. Through simulations, we found this model decreased predictive error relative to standard statistical methods when data were spatially correlated based on stream distance and performed similarly when data were not correlated. We found that increasing the number of years surveyed substantially improved the model accuracy when estimating spatial and temporal correlation coefficients, especially from 10 to 15 years. Increasing the number of survey sites within the network improved the performance of the non-spatial model but only marginally improved the density estimates in the spatio-temporal model. We applied this model to Brook Trout data from the West Susquehanna Watershed in Pennsylvania collected over 34 years, from 1981 to 2014. We found that the model including temporal and spatio-temporal autocorrelation best described young-of-the-year (YOY) and adult density patterns. YOY densities were positively related to forest cover and negatively related to spring temperatures, with low temporal autocorrelation and moderately high spatio-temporal correlation. Adult densities were less strongly affected by climatic conditions and less temporally variable than YOY, but with similar spatio-temporal correlation and higher temporal autocorrelation. This article is protected by copyright. All rights reserved.
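A hedged sketch of the core covariance idea behind such models, under the simplifying assumption of an exponential decay with in-network stream distance (the published model adds directional tail-up/tail-down structure, temporal terms, and imperfect detection). All names and parameter values are illustrative.

    import numpy as np

    def stream_cov(dist_km, sigma2=1.0, range_km=10.0):
        # Exponential covariance: correlation halves roughly every 0.69 * range_km.
        return sigma2 * np.exp(-dist_km / range_km)

    # Correlated site effects for 10 sites spaced 5 km apart along one branch:
    sites = np.arange(10) * 5.0
    D = np.abs(np.subtract.outer(sites, sites))      # pairwise stream distances
    C = stream_cov(D) + 1e-9 * np.eye(len(sites))    # jitter for numerical stability
    eps = np.random.default_rng(0).multivariate_normal(np.zeros(len(sites)), C)

Treating nearby sites as independent when eps is in fact correlated like this is precisely what inflates confidence in the naive analysis the abstract warns about.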
Converging Redundant Sensor Network Information for Improved Building Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dale Tiller; D. Phil; Gregor Henze
2007-09-30
This project investigated the development and application of sensor networks to enhance building energy management and security. Commercial, industrial and residential buildings often incorporate systems used to determine occupancy, but current sensor technology and control algorithms limit the effectiveness of these systems. For example, most of these systems rely on single monitoring points to detect occupancy, when more than one monitoring point could improve system performance. Phase I of the project focused on instrumentation and data collection. During the initial project phase, a new occupancy detection system was developed, commissioned and installed in a sample of private offices and open-plan office workstations. Data acquisition systems were developed and deployed to collect data on space occupancy profiles. Phase II of the project demonstrated that a network of several sensors provides a more accurate measure of occupancy than is possible using systems based on single monitoring points. This phase also established that analysis algorithms could be applied to the sensor network data stream to improve the accuracy of system performance in energy management and security applications. In Phase III of the project, the sensor network from Phase I was complemented by a control strategy developed based on the results from the first two project phases: this controller was implemented in a small sample of work areas, and applied to lighting control. Two additional technologies were developed in the course of completing the project. A prototype web-based display that portrays the current status of each detector in a sensor network monitoring building occupancy was designed and implemented. A new capability that enables occupancy sensors in a sensor network to dynamically set the 'time delay' interval based on ongoing occupant behavior in the space was also designed and implemented.
Optical Measurements on Solid Specimens of Solid Rocket Motor Exhaust and Solid Rocket Motor Slag
NASA Technical Reports Server (NTRS)
Roberts, F. E., III
1991-01-01
Samples of aluminum slag were investigated to aid the Earth Science and Applications Division at the Marshall Space Flight Center (MSFC). Alumina from space motor propellant exhaust and space motor propellant slag was examined as a component of space refuse. Thermal emittance and solar absorptivity measurements were taken to support their comparison with reflectance measurements derived from actual debris. To determine the similarity between the samples and space motor exhaust or space motor slag, emittance and absorbance results were correlated with an examination of specimen morphology.
Mobile robot motion estimation using Hough transform
NASA Astrophysics Data System (ADS)
Aldoshkin, D. N.; Yamskikh, T. N.; Tsarev, R. Yu
2018-05-01
This paper proposes an algorithm for estimation of mobile robot motion. The geometry of the surrounding space is described with range scans (samples of distance measurements) taken by the mobile robot's range sensors. A similar sample of the space geometry from an arbitrary preceding moment of time, or the environment map, can be used as a reference. The suggested algorithm is invariant to isotropic scaling of the samples or map, which allows using samples measured in different units and maps made at different scales. The algorithm is based on the Hough transform: it maps from the measurement space to a straight-line parameter space. In the straight-line parameter space, the problems of estimating rotation, scaling, and translation are solved separately, breaking the problem of estimating mobile robot localization into three smaller independent problems. A specific feature of the presented algorithm is its robustness to noise and outliers, inherited from the Hough transform. A prototype of the mobile robot orientation system is described.
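A minimal Python sketch of the rotation-only step, assuming the scan is an ordered sequence of 2D points; the histogram-shift trick is what makes the estimate independent of translation and isotropic scale, as the abstract describes. The function names and bin count are illustrative, not the paper's implementation.

    import numpy as np

    def theta_histogram(scan, n_theta=180):
        # Vote over line angles formed by consecutive scan points (Hough angle axis).
        d = np.diff(scan, axis=0)
        ang = np.mod(np.arctan2(d[:, 1], d[:, 0]), np.pi)    # line angle in [0, pi)
        idx = (ang / np.pi * n_theta).astype(int) % n_theta
        hist = np.zeros(n_theta)
        np.add.at(hist, idx, 1.0)
        return hist

    def estimate_rotation(scan_a, scan_b, n_theta=180):
        # Circular cross-correlation of the two angle histograms; the peak lag
        # is the rotation between the scans, regardless of translation/scale.
        ha, hb = theta_histogram(scan_a, n_theta), theta_histogram(scan_b, n_theta)
        corr = np.fft.irfft(np.fft.rfft(hb) * np.conj(np.fft.rfft(ha)), n=n_theta)
        return np.argmax(corr) * np.pi / n_theta             # radians, modulo pi

    # Demo on an L-shaped "room" scan, rotated, scaled, and translated:
    wall1 = np.column_stack([np.linspace(0, 4, 100), np.zeros(100)])
    wall2 = np.column_stack([4 * np.ones(100), np.linspace(0, 3, 100)])
    a = np.vstack([wall1, wall2])
    phi = 0.4
    R = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
    b = 2.5 * a @ R.T + np.array([1.0, -2.0])
    print(estimate_rotation(a, b))               # close to 0.4 (within one bin)

Once the rotation is fixed, scale and translation can each be estimated by analogous one-dimensional searches, which is the decomposition into three independent subproblems described above.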
NASA Astrophysics Data System (ADS)
Han, Young-Geun; Dong, Xinyong; Lee, Ju Han; Lee, Sang Bae
2006-12-01
We propose and experimentally demonstrate a simple and flexible scheme for a wavelength-spacing-tunable multichannel filter exploiting a sampled chirped fiber Bragg grating based on a symmetrical modification of the chirp ratio. Symmetrical bending along a sampled chirped fiber Bragg grating attached to a flexible cantilever beam induces a variation of the chirp ratio and a reflection chirp bandwidth of the grating without a center wavelength shift. Accordingly, the wavelength spacing of a sampled chirped fiber Bragg grating is continuously controlled by the reflection chirp bandwidth variation of the grating corresponding to the bending direction, which allows for realization of an effective wavelength-spacing-tunable multichannel filter. Based on the proposed technique, we achieve the continuous tunability of the wavelength spacing in a range from 1.51 to 6.11 nm, depending on the bending direction of the cantilever beam.
Space Weather Activities of IONOLAB Group: TEC Mapping
NASA Astrophysics Data System (ADS)
Arikan, F.; Yilmaz, A.; Arikan, O.; Sayin, I.; Gurun, M.; Akdogan, K. E.; Yildirim, S. A.
2009-04-01
Being a key player in space weather, ionospheric variability affects the performance of both communication and navigation systems. To improve the performance of these systems, the ionosphere has to be monitored. Total Electron Content (TEC), the line integral of the electron density along a ray path, is an important parameter for investigating ionospheric variability. A cost-effective way of obtaining TEC is by using dual-frequency GPS receivers. Since these measurements are sparse in space, accurate and robust interpolation techniques are needed to interpolate (or map) the TEC distribution for a given region. However, the TEC data derived from GPS measurements contain measurement noise, model errors, and computational errors. Thus, it is necessary to analyze the interpolation performance of the techniques on synthetic data sets that can represent various ionospheric states. In this way, the interpolation performance of the techniques can be compared over many parameters that can be controlled to represent the desired ionospheric states. In this study, Multiquadrics, Inverse Distance Weighting (IDW), Cubic Splines, Ordinary and Universal Kriging, Random Field Priors (RFP), Multi-Layer Perceptron Neural Network (MLP-NN), and Radial Basis Function Neural Network (RBF-NN) are employed as the spatial interpolation algorithms. These mapping techniques are initially tried on synthetic TEC surfaces for parameter and coefficient optimization and determination of error bounds. The interpolation performance of these methods is compared on synthetic TEC surfaces over the parameters of sampling pattern, number of samples, variability of the surface, and trend type in the TEC surfaces. By examining the performance of the interpolation methods, it is observed that Kriging, RFP, and NN all have important advantages and possible disadvantages depending on the given constraints. It is also observed that the determining parameter in the error performance is the trend in the ionosphere. Optimization of the algorithms in terms of their performance parameters (such as the choice of the semivariogram function for Kriging algorithms and the hidden layer and neuron numbers for MLP-NN) mostly depends on the behavior of the ionosphere at the given time instant for the desired region. The sampling pattern and number of samples are other important parameters that may contribute to higher reconstruction errors. For example, for all of the above-listed algorithms, hexagonal regular sampling of the ionosphere provides the lowest reconstruction error, and the performance significantly degrades as the samples in the region become sparse and clustered. The optimized models and coefficients are applied to regional GPS-TEC mapping using the IONOLAB-TEC data (www.ionolab.org). Kriging combined with Kalman filtering and dynamic modeling of NN are also implemented as first trials of TEC and space weather prediction.
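Of the methods listed, Inverse Distance Weighting is compact enough to sketch. Below is a hedged Python illustration on a synthetic TEC surface; the names, grid, and power parameter are assumptions for the demo, not the IONOLAB implementation.

    import numpy as np

    def idw(xy_obs, tec_obs, xy_grid, power=2.0, eps=1e-12):
        # Each grid point is a weighted mean of the samples, weights 1/d^power.
        d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=-1)
        w = 1.0 / (d ** power + eps)            # eps avoids divide-by-zero at samples
        return (w @ tec_obs) / w.sum(axis=1)

    # Sparse "receiver" locations and a smooth synthetic TEC surface (TECU):
    rng = np.random.default_rng(1)
    xy_obs = rng.uniform(0, 10, size=(40, 2))
    tec_obs = 20 + 5 * np.sin(xy_obs[:, 0] / 3.0)
    gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
    tec_map = idw(xy_obs, tec_obs, np.column_stack([gx.ravel(), gy.ravel()]))

IDW's main weakness, consistent with the study's observations, is that it cannot extrapolate a trend: far from the samples every estimate collapses toward the global mean, so methods that model the trend explicitly (Universal Kriging, RFP) fare better when the ionosphere has strong gradients.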
Low-sample flow secondary electrospray ionization: improving vapor ionization efficiency.
Vidal-de-Miguel, G; Macía, M; Pinacho, P; Blanco, J
2012-10-16
In secondary electrospray ionization (SESI) systems, gaseous analytes exposed to an electrospray plume become ionized after charge is transferred from the charging electrosprayed particles to the sample species. Current SESI systems have shown a certain potential. However, their ionization efficiency is limited by space charge repulsion and by the high sample flows required to prevent vapor dilution. As a result, they have a poor conversion ratio of vapor into ions. We have developed and tested a new SESI configuration, termed low-flow SESI, that permits the reduction of the required sample flows. Although the ion to vapor concentration ratio is limited, the ionic flow to sample vapor flow ratio theoretically is not. The new ionizer is coupled to a planar differential mobility analyzer (DMA) and requires only 0.2 lpm of vapor sample flow to produce 3.5 lpm of ionic flow. The achieved ionization efficiency is 1/700 (one ion for every 700 molecules) for TNT; thus, compared with previous SESI ionizers coupled with atmospheric pressure ionization-mass spectrometry (API-MS) (Mesonero, E.; Sillero, J. A.; Hernández, M.; Fernandez de la Mora, J. Philadelphia, PA, 2009), it has been improved by a large factor of at least 50-100 (our measurements indicate 70). The new ionizer coupled with the planar DMA and a triple quadrupole mass spectrometer (ABSciex API5000) requires only 20 fg (50 million molecules) to produce a discernible signal after mobility and MS(2) analysis.
Xu, Xin; Huang, Zhenhua; Graves, Daniel; Pedrycz, Witold
2014-12-01
In order to deal with the sequential decision problems with large or continuous state spaces, feature representation and function approximation have been a major research topic in reinforcement learning (RL). In this paper, a clustering-based graph Laplacian framework is presented for feature representation and value function approximation (VFA) in RL. By making use of clustering-based techniques, that is, K-means clustering or fuzzy C-means clustering, a graph Laplacian is constructed by subsampling in Markov decision processes (MDPs) with continuous state spaces. The basis functions for VFA can be automatically generated from spectral analysis of the graph Laplacian. The clustering-based graph Laplacian is integrated with a class of approximation policy iteration algorithms called representation policy iteration (RPI) for RL in MDPs with continuous state spaces. Simulation and experimental results show that, compared with previous RPI methods, the proposed approach needs fewer sample points to compute an efficient set of basis functions and the learning control performance can be improved for a variety of parameter settings.
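A hedged sketch of the subsample-then-spectral-basis step described above, using K-means centers and a Gaussian affinity graph; the paper's RPI algorithms then fit value functions in this basis. The cluster count, kernel bandwidth, and function name here are illustrative choices, not the paper's settings.

    import numpy as np
    from sklearn.cluster import KMeans

    def laplacian_basis(states, n_clusters=50, n_basis=10, sigma=1.0):
        # Subsample the continuous state space with K-means cluster centers.
        centers = KMeans(n_clusters=n_clusters, n_init=10,
                         random_state=0).fit(states).cluster_centers_
        # Gaussian affinity graph on the centers.
        d2 = ((centers[:, None, :] - centers[None]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma**2))
        np.fill_diagonal(W, 0.0)
        deg = W.sum(axis=1)
        # Normalized graph Laplacian: L = I - D^{-1/2} W D^{-1/2}.
        L = np.eye(n_clusters) - W / np.sqrt(np.outer(deg, deg))
        evals, evecs = np.linalg.eigh(L)
        return centers, evecs[:, :n_basis]       # smoothest eigenvectors = basis values

    # Example: basis over 5000 random 2D states (illustrative data).
    centers, basis = laplacian_basis(np.random.default_rng(0).uniform(-1, 1, (5000, 2)))

Values of the basis at an arbitrary state can then be read off via its nearest center or smoothed with the same Gaussian kernel, which is how a value function defined on the centers extends to the full continuous space.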
LAWS (Laser Atmospheric Wind Sounder) earth observing system
NASA Technical Reports Server (NTRS)
1988-01-01
Wind profiles can be measured from space using current technology. These wind profiles are essential for answering many of the interdisciplinary scientific questions to be addressed by EOS, the Earth Observing System. This report provides guidance for the development of a spaceborne wind sounder, the Laser Atmospheric Wind Sounder (LAWS), discussing the current state of the technology and reviewing the scientific rationale for the instrument. Whether obtained globally from the EOS polar platform or in the tropics and subtropics from the Space Station, wind profiles from space will provide essential information for advancing the skill of numerical weather prediction, furthering knowledge of large-scale atmospheric circulation and climate dynamics, and improving understanding of the global biogeochemical and hydrologic cycles. The LAWS Instrument Panel recommends that it be given high priority for new instrument development because of the pressing scientific need and the availability of the necessary technology. LAWS is to measure wind profiles with an accuracy of a few meters per second and to sample at intervals of 100 km horizontally for layers km thick.
Damage to metallic samples produced by measured lightning currents
NASA Technical Reports Server (NTRS)
Fisher, Richard J.; Schnetzer, George H.
1991-01-01
A total of 10 sample disks of 2024-T3 aluminum and 4130 ferrous steel were exposed to rocket-triggered lightning currents at the Kennedy Space Center test site. The experimental configuration was arranged so that the samples were not exposed to the preliminary streamer, wire-burn, or following currents that are associated with an upward-initiated rocket-triggered flash but which are atypical of naturally initiated lightning. Return-stroke currents and continuing currents actually attaching to the sample were measured, augmented by close-up video recordings of approximately 3 feet of the channel above the sample and by 16-mm movies with 5-ms resolution. From these data it was possible to correlate individual damage spots with streamer, return-stroke, and continuing currents that produced them. Substantial penetration of 80-mil aluminum was produced by a continuing current of submedian amplitude and duration, and full penetration of a 35-mil steel sample occurred under an eightieth percentile continuing current. The primary purpose of the data acquired in these experiments is for use in improving and quantifying the fidelity of laboratory simulations of lightning burnthrough.
Communications Relay and Human-Assisted Sample Return from the Deep Space Gateway
NASA Astrophysics Data System (ADS)
Cichan, T.; Hopkins, J. B.; Bierhaus, B.; Murrow, D. W.
2018-02-01
The Deep Space Gateway can enable or enhance exploration of the lunar surface through two capabilities: 1. communications relay, opening up access to the lunar farside, and 2. sample return, enhancing the ability to return large sample masses.
Space law information system design, phase 2
NASA Technical Reports Server (NTRS)
Morenoff, J.; Roth, D. L.; Singleton, J. W.
1973-01-01
Design alternatives were defined for the implementation of a Space Law Information System for the Office of the General Counsel, NASA. A thesaurus of space law terms was developed and a selected document sample indexed on the basis of that thesaurus. Abstracts were also prepared for the sample document set.
2018-04-30
iss055e043245 (April 30, 2018) --- NASA astronaut Ricky Arnold transfers frozen biological samples from science freezers aboard the International Space Station to science freezers inside the SpaceX Dragon resupply ship. The research samples were returned to Earth aboard Dragon for retrieval by SpaceX engineers and analysis by NASA scientists.
The MISSE 7 Flexural Stress Effects Experiment After 1.5 Years of Wake Space Exposure
NASA Technical Reports Server (NTRS)
Snow, Kate E.; De Groh, Kim K.; Banks, Bruce A.
2017-01-01
Low Earth orbit space environment conditions, including ultraviolet radiation, thermal cycling, and atomic oxygen exposure, can cause degradation of exterior spacecraft materials over time. Radiation and thermal exposure often result in bond-breaking and embrittlement of polymers, reducing mechanical strength and structural integrity. An experiment called the Flexural Stress Effects Experiment (FSEE) was flown with the objective of determining the role of space environmental exposure in the degradation of polymers under flexural stress. The FSEE samples were flown in the wake orientation on the exterior of the International Space Station for 1.5 years. Twenty-four samples were flown: 12 were bent over a 0.375 in. mandrel and 12 over a 0.25 in. mandrel. This was designed to simulate flight configurations of insulation blankets on spacecraft. The samples consisted of assorted polyimide and fluorinated polymers with various coatings. Half the samples were designated for bend testing and the other half will be tensile tested. A non-standard bend-test procedure was designed to determine the surface strain at which embrittled polymers crack. All ten samples designated for bend testing have been tested. None of the control samples' polymers cracked, even under surface strains up to 19.7%, although one coating cracked. Of the ten flight samples tested, seven show increased embrittlement through bend-test-induced cracking at surface strains from 0.70% to 11.73%. These results show that most of the tested polymers are embrittled due to space exposure when compared to their control samples. Determination of the extent of space-induced embrittlement of polymers is important for designing durable spacecraft.
Development of the IES method for evaluating the color rendition of light sources
David, Aurelien; Fini, Paul T.; Houser, Kevin W.; ...
2015-06-08
We have developed a two-measure system for evaluating light sources' color rendition that builds upon the conceptual progress of numerous researchers over the last two decades. The system quantifies the color fidelity and color gamut (change in object chroma) of a light source in comparison to a reference illuminant. The calculations are based on a newly developed set of reflectance data from real samples uniformly distributed in color space (thereby fairly representing all colors) and in wavelength space (thereby precluding artificial optimization of the color rendition scores by spectral engineering). The color fidelity score Rf is an improved version of the CIE color rendering index. The color gamut score Rg is an improved version of the Gamut Area Index. In combination, they provide two complementary assessments to guide the optimization of future light sources. This method summarizes the findings of the Color Metric Task Group of the Illuminating Engineering Society of North America (IES). It is adopted in the upcoming IES TM-30-2015, and is proposed for consideration by the International Commission on Illumination (CIE).
NASA Astrophysics Data System (ADS)
Atemkeng, M.; Smirnov, O.; Tasse, C.; Foster, G.; Keimpema, A.; Paragi, Z.; Jonas, J.
2018-07-01
Traditional radio interferometric correlators produce regular-gridded samples of the true uv-distribution by averaging the signal over constant, discrete time-frequency intervals. This regular sampling and averaging translates into irregular-gridded samples in uv-space, and results in a baseline-length-dependent loss of amplitude and phase coherence that depends on the distance from the image phase centre. The effect is often referred to as `decorrelation' in uv-space, which is equivalent in the source domain to `smearing'. This work discusses and implements a regular-gridded sampling scheme in uv-space (baseline-dependent sampling) and windowing that allow for data compression, field-of-interest shaping, and source suppression. Baseline-dependent sampling requires irregular-gridded sampling in time-frequency space, i.e. the time-frequency interval becomes baseline dependent. Analytic models and simulations are used to show that decorrelation remains constant across all baselines when applying baseline-dependent sampling and windowing. Simulations using the MeerKAT telescope and the European Very Long Baseline Interferometry Network show that data compression, field-of-interest shaping, and outer field-of-interest suppression are all achieved.
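The amplitude loss described here has a familiar closed form for a fringe of constant rate: averaging over an interval T scales the visibility amplitude by |sinc(fT)|. A small hedged computation (the rates and interval are made-up numbers, not survey parameters) shows why longer baselines, which see faster fringes for the same off-centre source, decorrelate more under a fixed averaging interval:

    import numpy as np

    def averaging_loss(fringe_rate_hz, interval_s):
        # np.sinc is the normalized sinc: sin(pi x) / (pi x).
        return np.abs(np.sinc(fringe_rate_hz * interval_s))

    # Fixed 10 s average; three baselines with increasing fringe rates (illustrative):
    for f in (0.001, 0.01, 0.05):
        print(f"fringe rate {f:5.3f} Hz -> amplitude factor {averaging_loss(f, 10.0):.3f}")

Baseline-dependent sampling holds this factor constant across the array by shrinking the averaging interval as the fringe rate grows, which is the equalization the simulations above demonstrate.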
Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki
2011-01-01
To facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis, urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone, and toluene) at the levels of biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 ℃ in the dark for up to 3 d. The vacuum tube method facilitated on-site urine sample preparation for HS-GC with no significant loss of solvents from the sample and no need for skilled handling, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of whether the solvent was hydrophilic (acetone) or lipophilic (toluene). In a pilot application, the high performance of the vacuum tube method in sealing a sample in an air-tight space confirmed that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves labour in transferring the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.
Consistently Sampled Correlation Filters with Space Anisotropic Regularization for Visual Tracking
Shi, Guokai; Xu, Tingfa; Luo, Jiqiang; Li, Yuankun
2017-01-01
Most existing correlation filter-based tracking algorithms, which use fixed patches and cyclic shifts as training and detection measures, assume that the training samples are reliable and ignore the inconsistencies between training samples and detection samples. We propose to construct and study a consistently sampled correlation filter with space anisotropic regularization (CSSAR) to solve these two problems simultaneously. Our approach constructs a spatiotemporally consistent sample strategy to alleviate the redundancies in training samples caused by the cyclical shifts, eliminate the inconsistencies between training samples and detection samples, and introduce space anisotropic regularization to constrain the correlation filter for alleviating drift caused by occlusion. Moreover, an optimization strategy based on the Gauss-Seidel method was developed for obtaining robust and efficient online learning. Both qualitative and quantitative evaluations demonstrate that our tracker outperforms state-of-the-art trackers in object tracking benchmarks (OTBs). PMID:29231876
Xie, Yibin; Yang, Qi; Xie, Guoxi; Pang, Jianing; Fan, Zhaoyang; Li, Debiao
2015-01-01
Purpose: The purpose of this work is to develop a 3D black-blood imaging method for simultaneously evaluating the carotid and intracranial arterial vessel wall with high spatial resolution and excellent blood suppression, with and without contrast enhancement. Methods: A DANTE preparation module was incorporated into the SPACE sequence to improve blood signal suppression. Simulations and phantom studies were performed to quantify image contrast variations induced by DANTE. DANTE-SPACE, SPACE, and 2D TSE were compared for apparent SNR, CNR, and morphometric measurements in fourteen healthy subjects. Preliminary clinical validation was performed in six symptomatic patients. Results: Apparent residual luminal blood was observed in 5 (pre-CE) and 9 (post-CE) subjects with SPACE, and only 2 (post-CE) subjects with DANTE-SPACE. DANTE-SPACE showed 31% (pre-CE) and 100% (post-CE) improvement in wall-to-blood CNR over SPACE. Vessel wall area measured from SPACE was significantly larger than that from DANTE-SPACE due to possible residual blood signal contamination. In patients, DANTE-SPACE showed the potential to detect vessel wall dissection and identify plaque components. Conclusion: DANTE-SPACE significantly improved arterial and venous blood suppression compared with SPACE. Simultaneous high-resolution carotid and intracranial vessel wall imaging with the potential to identify plaque components was feasible with a scan time under 6 minutes. PMID:26152900
NASA Technical Reports Server (NTRS)
Orta, D.; Mudgett, P. D.; Ding, L.; Drybread, M.; Schultz, J. R.; Sauer, R. L.
1998-01-01
Drinking water and condensate samples collected from the US Space Shuttle and the Russian Mir Space Station are analyzed routinely at the NASA-Johnson Space Center as part of an ongoing effort to verify water quality and monitor the environment of the spacecraft. Water quality monitoring is particularly important for the Mir water supply because approximately half of the water consumed is recovered from humidity condensate. Drinking water on Shuttle is derived from the fuel cells. Because there is little equipment on board the spacecraft for monitoring the water quality, samples collected by the crew are transported to Earth on Shuttle or Soyuz vehicles, and analyzed exhaustively. As part of the test battery, anions and cations are measured by ion chromatography, and carboxylates and amines by capillary electrophoresis. Analytical data from Shuttle water samples collected before and after several missions, and Mir condensate and potable recovered water samples representing several recent missions are presented and discussed. Results show that Shuttle water is of distilled quality, and Mir recovered water contains various levels of minerals imparted during the recovery processes as designed. Organic ions are rarely detected in potable water samples, but were present in humidity condensate samples.
Ikeda, Nayu; Irie, Yuki; Shibuya, Kenji
2013-05-01
To assess how changes in socioeconomic and public health determinants may have contributed to the reduction in stunting prevalence seen among Cambodian children from 2000 to 2010. A nationally representative sample of 10 366 children younger than 5 years was obtained from pooled data of cross-sectional surveys conducted in Cambodia in 2000, 2005, and 2010. The authors used a multivariate hierarchical logistic model to examine the association between the prevalence of childhood stunting over time and certain determinants. They estimated those changes in the prevalence of stunting in 2010 that could have been achieved through further improvements in public health indicators. Child stunting was associated with the child's sex and age, type of birth, maternal height, maternal body mass index, previous birth intervals, number of household members, household wealth index score, access to improved sanitation facilities, presence of diarrhoea, parents' education, maternal tobacco use and mother's birth during the Khmer Rouge famine. The reduction in stunting prevalence during the past decade was attributable to improvements in household wealth, sanitation, parental education, birth spacing and maternal tobacco use. The prevalence of stunting would have been further reduced by scaling up the coverage of improved sanitation facilities, extending birth intervals, and eradicating maternal tobacco use. Child stunting in Cambodia has decreased owing to socioeconomic development and public health improvements. Effective policy interventions for sanitation, birth spacing and maternal tobacco use, as well as equitable economic growth and education, are the keys to further improvement in child nutrition.
Radar Doppler Processing with Nonuniform Sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doerry, Armin W.
2017-07-01
Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing, for the convenience of the processing. More recent enhancements in processor capability allow optimal processing of nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
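As a minimal illustration of Doppler estimation without a uniform-spacing assumption (a generic matched-filter/nonuniform-DFT estimate, not the report's specific algorithm), the sketch below correlates irregularly timed slow-time samples against a bank of candidate Doppler tones; because the sample times are irregular, the spectrum has no exact aliases, which is the ambiguity-mitigation effect mentioned above. All values are hypothetical.

```python
import numpy as np

def doppler_spectrum(t, z, freqs):
    """Matched-filter (nonuniform DFT) Doppler spectrum.
    t: (N,) pulse times in seconds (need not be uniform);
    z: (N,) complex slow-time samples; freqs: (K,) candidate Doppler bins (Hz)."""
    E = np.exp(-2j * np.pi * np.outer(freqs, t))   # (K, N) bank of complex tones
    return np.abs(E @ z) ** 2 / len(t)

# A 1 kHz Doppler tone observed at jittered (nonuniform) pulse times:
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 0.05, 64))
z = np.exp(2j * np.pi * 1000.0 * t)
freqs = np.linspace(-4000.0, 4000.0, 801)
spec = doppler_spectrum(t, z, freqs)
print(freqs[np.argmax(spec)])   # peaks near 1000 Hz; the irregular spacing
                                # suppresses the exact aliases a uniform PRF has
```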
Heidemann, Robin M; Anwander, Alfred; Feiweier, Thorsten; Knösche, Thomas R; Turner, Robert
2012-04-02
There is ongoing debate whether using a higher spatial resolution (sampling k-space) or a higher angular resolution (sampling q-space angles) is the better way to improve diffusion MRI (dMRI) based tractography results in living humans. In both cases, the limiting factor is the signal-to-noise ratio (SNR), due to the restricted acquisition time. One possible way to increase the spatial resolution without sacrificing either SNR or angular resolution is to move to a higher magnetic field strength. Nevertheless, dMRI has not been the preferred application for ultra-high field strength (7 T). This is because single-shot echo-planar imaging (EPI) has been the method of choice for human in vivo dMRI. EPI faces several challenges related to the use of a high resolution at high field strength, for example, distortions and image blurring. These problems can easily compromise the expected SNR gain with field strength. In the current study, we introduce an adapted EPI sequence in conjunction with a combination of ZOOmed imaging and Partially Parallel Acquisition (ZOOPPA). We demonstrate that the method can produce high quality diffusion-weighted images with high spatial and angular resolution at 7 T. We provide examples of in vivo human dMRI with isotropic resolutions of 1 mm and 800 μm. These data sets are particularly suitable for resolving complex and subtle fiber architectures, including fiber crossings in the white matter, anisotropy in the cortex and fibers entering the cortex. Copyright © 2011 Elsevier Inc. All rights reserved.
Cyclic voltammetry study of PEO processing of porous Ti and resulting coatings
NASA Astrophysics Data System (ADS)
Shbeh, Mohammed; Yerokhin, Aleksey; Goodall, Russell
2018-05-01
Ti is one of the most commonly used materials for biomedical applications. However, two issues are associated with its use, namely its bio-inertness and its high elastic modulus compared with that of natural bone. Both hurdles could potentially be overcome by introducing pores into the structure of a Ti implant, to match the properties of the bone and improve the mechanical integration between bone and implant, and by subsequently applying a biologically active ceramic coating to promote chemical integration. Hence, in this study we investigated the use of cyclic voltammetry in PEO treatment of porous Ti parts with different amounts of porosity, produced both by Metal Injection Moulding (MIM) and by MIM in combination with a space holder. It was found that porous samples with higher porosity and open pores develop much thicker surface layers that penetrate the inner structure of the samples, forming a network of surface and subsurface coatings. The results are of potential benefit in producing surface-engineered porous samples for biomedical applications that not only address the stress-shielding problem but also improve chemical integration.
Ollikainen, Noah; Smith, Colin A.; Fraser, James S.; Kortemme, Tanja
2013-01-01
Sampling alternative conformations is key to understanding how proteins work and engineering them for new functions. However, accurately characterizing and modeling protein conformational ensembles remains experimentally and computationally challenging. These challenges must be met before protein conformational heterogeneity can be exploited in protein engineering and design. Here, as a stepping stone, we describe methods to detect alternative conformations in proteins and strategies to model these near-native conformational changes based on backrub-type Monte Carlo moves in Rosetta. We illustrate how Rosetta simulations that apply backrub moves improve modeling of point mutant side chain conformations, native side chain conformational heterogeneity, functional conformational changes, tolerated sequence space, protein interaction specificity, and amino acid co-variation across protein-protein interfaces. We include relevant Rosetta command lines and RosettaScripts to encourage the application of these types of simulations to other systems. Our work highlights that critical scoring and sampling improvements will be necessary to approximate conformational landscapes. Challenges for the future development of these methods include modeling conformational changes that propagate away from designed mutation sites and modulating backbone flexibility to predictively design functionally important conformational heterogeneity. PMID:23422426
Spatial averaging for small molecule diffusion in condensed phase environments
NASA Astrophysics Data System (ADS)
Plattner, Nuria; Doll, J. D.; Meuwly, Markus
2010-07-01
Spatial averaging is a new approach for sampling rare-event problems. The approach modifies the importance function which improves the sampling efficiency while keeping a defined relation to the original statistical distribution. In this work, spatial averaging is applied to multidimensional systems for typical problems arising in physical chemistry. They include (I) a CO molecule diffusing on an amorphous ice surface, (II) a hydrogen molecule probing favorable positions in amorphous ice, and (III) CO migration in myoglobin. The systems encompass a wide range of energy barriers and for all of them spatial averaging is found to outperform conventional Metropolis Monte Carlo. It is also found that optimal simulation parameters are surprisingly similar for the different systems studied, in particular, the radius of the point cloud over which the potential energy function is averaged. For H2 diffusing in amorphous ice it is found that facile migration is possible which is in agreement with previous suggestions from experiment. The free energy barriers involved are typically lower than 1 kcal/mol. Spatial averaging simulations for CO in myoglobin are able to locate all currently characterized metastable states. Overall, it is found that spatial averaging considerably improves the sampling of configurational space.
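The abstract gives no algorithmic detail, but the core idea — Metropolis Monte Carlo driven by a potential averaged over a small point cloud around the current configuration — can be caricatured in one dimension as below. The double-well potential, cloud size, radius, and temperature are all invented for illustration, and the reweighting needed to recover the original distribution exactly is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = 4.0                    # inverse temperature (illustrative)
K, r = 32, 0.3                # cloud size and averaging radius (illustrative)

def U(x):
    """Double-well stand-in for a rugged potential energy surface."""
    return (x**2 - 1.0)**2

def U_avg(x):
    """Potential averaged over a random point cloud of radius r around x."""
    return U(x + rng.uniform(-r, r, K)).mean()

x, crossings = -1.0, 0
for step in range(20000):
    xp = x + rng.normal(0.0, 0.2)                    # Metropolis proposal
    if np.log(rng.uniform()) < -beta * (U_avg(xp) - U_avg(x)):
        if x * xp < 0:
            crossings += 1                           # count barrier crossings
        x = xp
print(crossings)   # the smoothed surface lets the walker hop wells more often
```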
Multivariate Analyses of Quality Metrics for Crystal Structures in the PDB Archive.
Shao, Chenghua; Yang, Huanwang; Westbrook, John D; Young, Jasmine Y; Zardecki, Christine; Burley, Stephen K
2017-03-07
Following deployment of an augmented validation system by the Worldwide Protein Data Bank (wwPDB) partnership, the quality of crystal structures entering the PDB has improved. Of significance are improvements in quality measures now prominently displayed in the wwPDB validation report. Comparisons of PDB depositions made before and after introduction of the new reporting system show improvements in quality measures relating to pairwise atom-atom clashes, side-chain torsion angle rotamers, and local agreement between the atomic coordinate structure model and experimental electron density data. These improvements are largely independent of resolution limit and sample molecular weight. No significant improvement in the quality of associated ligands was observed. Principal component analysis revealed that structure quality could be summarized with three measures (Rfree, real-space R factor Z score, and a combined molecular geometry quality metric), which can in turn be reduced to a single overall quality metric readily interpretable by all PDB archive users. Copyright © 2017 Elsevier Ltd. All rights reserved.
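As a hedged illustration of the dimensionality reduction described above (not the wwPDB's actual pipeline), the snippet below standardizes three per-entry quality measures — Rfree, a real-space R factor Z score, and a combined geometry metric — and collapses them to a single overall score along the first principal component. The numbers are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical per-entry quality metrics: [Rfree, RSR-Z, geometry score]
X = np.array([[0.22, 1.3, 0.8],
              [0.26, 2.1, 1.5],
              [0.19, 0.4, 0.3],
              [0.31, 3.0, 2.2]])
Z = StandardScaler().fit_transform(X)      # put metrics on a common scale
pca = PCA(n_components=1).fit(Z)
overall = pca.transform(Z).ravel()         # single overall quality metric
print(pca.explained_variance_ratio_, overall)
```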
Low-Latency Telerobotic Sample Return and Biomolecular Sequencing for Deep Space Gateway
NASA Astrophysics Data System (ADS)
Lupisella, M.; Bleacher, J.; Lewis, R.; Dworkin, J.; Wright, M.; Burton, A.; Rubins, K.; Wallace, S.; Stahl, S.; John, K.; Archer, D.; Niles, P.; Regberg, A.; Smith, D.; Race, M.; Chiu, C.; Russell, J.; Rampe, E.; Bywaters, K.
2018-02-01
Low-latency telerobotics, crew-assisted sample return, and biomolecular sequencing can be used to acquire and analyze lunar farside and/or Apollo landing site samples. Sequencing can also be used to monitor and study Deep Space Gateway environment and crew health.
Extraterrestrial Samples at JSC
NASA Technical Reports Server (NTRS)
Allen, Carlton C.
2007-01-01
A viewgraph presentation on the curation of extraterrestrial samples at NASA Johnson Space Center is shown. The topics include: 1) Apollo lunar samples; 2) Meteorites from Antarctica; 3) Cosmic dust from the stratosphere; 4) Genesis solar wind ions; 5) Stardust comet and interstellar grains; and 6) Space-Exposed Hardware.
Combining Speed Information Across Space
NASA Technical Reports Server (NTRS)
Verghese, Preeti; Stone, Leland S.
1995-01-01
We used speed discrimination tasks to measure the ability of observers to combine speed information from multiple stimuli distributed across space. We compared speed discrimination thresholds in a classical discrimination paradigm to those in an uncertainty/search paradigm. Thresholds were measured using a temporal two-interval forced-choice design. In the discrimination paradigm, the n gratings in each interval all moved at the same speed and observers were asked to choose the interval with the faster gratings. Discrimination thresholds for this paradigm decreased as the number of gratings increased. This decrease was not due to increasing the effective stimulus area as a control experiment that increased the area of a single grating did not show a similar improvement in thresholds. Adding independent speed noise to each of the n gratings caused thresholds to decrease at a rate similar to the original no-noise case, consistent with observers combining an independent sample of speed from each grating in both the added- and no-noise cases. In the search paradigm, observers were asked to choose the interval in which one of the n gratings moved faster. Thresholds in this case increased with the number of gratings, behavior traditionally attributed to an input bottleneck. However, results from the discrimination paradigm showed that the increase was not due to observers' inability to process these gratings. We have also shown that the opposite trends of the data in the two paradigms can be predicted by a decision theory model that combines independent samples of speed information across space. This demonstrates that models typically used in classical detection and discrimination paradigms are also applicable to search paradigms. As our model does not distinguish between samples in space and time, it predicts that discrimination performance should be the same regardless of whether the gratings are presented in two spatial intervals or two temporal intervals. Our last experiment largely confirmed this prediction.
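A decision-theory model of this kind is easy to simulate. The sketch below (an illustrative reconstruction, not the authors' exact model) draws n independent noisy speed samples per interval and applies an averaging rule for the discrimination paradigm and a max rule for the search paradigm; it reproduces the opposite trends described above. All noise parameters are invented.

```python
import numpy as np
rng = np.random.default_rng(2)

def pct_correct(n, ds, mode, trials=20000, sigma=1.0):
    """Two-interval forced choice with n gratings per interval.
    discrimination: every grating in the faster interval is faster (average rule);
    search: only one grating is faster (max rule)."""
    a = rng.normal(0.0, sigma, (trials, n))          # slower interval
    b = rng.normal(ds, sigma, (trials, n))           # faster interval
    if mode == "search":
        b[:, 1:] = rng.normal(0.0, sigma, (trials, n - 1))  # only grating 0 faster
        return np.mean(b.max(axis=1) > a.max(axis=1))
    return np.mean(b.mean(axis=1) > a.mean(axis=1))

for n in (1, 2, 4, 8):
    print(n, pct_correct(n, 1.0, "discrimination"), pct_correct(n, 1.0, "search"))
# Accuracy climbs with n for discrimination (thresholds fall roughly as 1/sqrt(n))
# and falls with n for search, mirroring the two paradigms' opposite trends.
```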
X-ray simulations method for the large field of view
NASA Astrophysics Data System (ADS)
Schelokov, I. A.; Grigoriev, M. V.; Chukalina, M. V.; Asadchikov, V. E.
2018-03-01
In the standard approach, X-ray simulation is limited by the spatial sampling step required to calculate Fresnel-type convolution integrals. Explicitly, the sampling step is determined by the size of the last Fresnel zone in the beam aperture. In other words, the spatial sampling is determined by the precision of the convolution calculations and is not connected with the spatial resolution of the optical scheme. In the developed approach, the convolution in normal space is replaced by computation of the shear strain of the ambiguity function in phase space. The spatial sampling is then determined by the spatial resolution of the optical scheme. The sampling step can differ in various directions because of source anisotropy. The approach was used to simulate original images in X-ray Talbot interferometry and showed that the simulation can be applied to optimize postprocessing methods.
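For context, the one-dimensional Fresnel convolution integral that drives the standard sampling requirement can be written as below; the step bound follows from resolving the fastest chirp oscillation (the last Fresnel zone) across an aperture. The notation is assumed here, not taken from the paper: A denotes the maximum transverse separation between integration and observation points.

```latex
U_z(x) = \frac{e^{ikz}}{\sqrt{i\lambda z}}
         \int U_0(\xi)\,\exp\!\left[\frac{ik\,(x-\xi)^2}{2z}\right] d\xi,
\qquad
\Delta\xi \;\lesssim\; \frac{\lambda z}{2A}.
```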
NASA Technical Reports Server (NTRS)
Angart, Samuel; Lauer, Mark; Poirier, David; Tewari, Surendra; Rajamure, Ravi; Grugel, Richard
2015-01-01
Samples of directionally solidified Al-7 wt.% Si have been analyzed for primary dendrite arm spacing (lambda) and radial macrosegregation. The alloy was directionally solidified (DS) aboard the ISS to determine the effect of mitigating convection on lambda and macrosegregation. Thermal histories from terrestrial DS experiments are discussed for comparison. In some experiments, lambda was measured in microstructures that developed during the transition from one growth speed to another. To represent DS in the absence of convection, the Hunt-Lu model was used to describe diffusion-controlled growth under steady-state conditions. Lambda was measured on cross-sections taken along the entire length of a solidified sample and compared with values calculated from the model. During steady-state growth there was reasonable agreement between the measured and calculated lambdas in the space-grown samples. In terrestrial samples, the differences between measured and calculated lambdas indicated that the dendritic growth was influenced by convection.
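The Hunt-Lu model itself is numerically involved; as a simpler point of reference, the classical power-law estimate of primary spacing captures the qualitative dependence on gradient and speed. The sketch below uses that scaling only, with a hypothetical prefactor, and should not be read as the model applied in the paper.

```python
def primary_spacing(G, V, A=1.0e-4):
    """Classical power-law estimate of primary dendrite arm spacing (m):
    lambda ~ A * G**-1/2 * V**-1/4.
    G: thermal gradient (K/m); V: growth speed (m/s);
    A: lumped alloy-dependent prefactor (hypothetical, for illustration only)."""
    return A * G**-0.5 * V**-0.25

# A tenfold increase in growth speed at a fixed gradient tightens the
# array by a factor of 10**-0.25, i.e. about 44%:
print(primary_spacing(2200.0, 5e-6))    # slow growth
print(primary_spacing(2200.0, 50e-6))   # fast growth
```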
2003-01-22
On Earth, when scientists melt metals, bubbles that form in the molten material can rise to the surface, pop and disappear. In microgravity -- the near-weightless environment created as the International Space Station orbits Earth -- the lighter bubbles do not rise and disappear. Prior space experiments have shown that bubbles often become trapped in the final metal or crystal sample, similar to the bubbles trapped in this sample. In the solid, these bubbles, or porosity, are defects that diminish both the material's strength and usefulness. The Pore Formation and Mobility Investigation will melt samples of a transparent modeling material, succinonitrile and succinonitrile-water mixtures, shown here in an ampoule being examined by Dr. Richard Grugel, the principal investigator for the experiment at NASA's Marshall Space Flight Center in Huntsville, Ala. As the samples are processed in space, Grugel will be able to observe how bubbles form in the samples and study their movements and interactions.
International Space Station Urine Monitoring System Functional Integration and Science Testing
NASA Technical Reports Server (NTRS)
Rodriquez, Branelle R.; Broyan, James Lee, Jr.
2011-01-01
Exposure to microgravity during human spaceflight needs to be better understood as the human exploration of space requires longer duration missions. It is known that long term exposure to microgravity causes bone loss. Measuring the calcium and other metabolic byproducts in a crew member's urine can evaluate the effectiveness of bone loss countermeasures. The International Space Station (ISS) Urine Monitoring System (UMS) is an automated urine collection device designed to collect urine, separate the urine and air, measure the void volume, and allow for syringe sampling. Accurate measuring and minimal cross-contamination is essential to determine bone loss and the effectiveness of countermeasures. The ISS UMS provides minimal cross-contamination (<0.7 mL urine) and has a volume accuracy of 2% for urine voids between 100 and 1000 mL. Designed to provide a non-invasive means to collect urine samples from crew members, the ISS UMS operates in-line with the Node 3 Waste and Hygiene Compartment (WHC). The ISS UMS has undergone modifications required to interface with the WHC, including material changes, science algorithm improvements, and software platform revisions. Integrated functional testing was performed to determine the pressure drop, air flow rate, and the maximum amount of fluid capable of being discharged from the UMS to the WHC. This paper will detail the results of the science and the functional integration tests.
Application of wavefield compressive sensing in surface wave tomography
NASA Astrophysics Data System (ADS)
Zhan, Zhongwen; Li, Qingyang; Huang, Jianping
2018-06-01
Dense arrays allow sampling of the seismic wavefield without significant aliasing, and surface wave tomography has benefitted from exploiting wavefield coherence among neighbouring stations. However, explicit or implicit assumptions about the wavefield, irregular station spacing, and noise still limit the applicability and resolution of current surface wave methods. Here, we propose to apply the theory of compressive sensing (CS) to seek a sparse representation of the surface wavefield using a plane-wave basis. Then we reconstruct the continuous surface wavefield on a dense regular grid before applying any tomographic methods. Synthetic tests demonstrate that wavefield CS improves robustness and resolution of Helmholtz tomography and wavefield gradiometry, especially when traditional approaches have difficulties due to sub-Nyquist sampling or complexities in wavefield.
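A toy version of the reconstruction idea — sparse plane-wave coding of a single-frequency scalar field recorded at irregular stations, then evaluation of the sparse code on a regular grid — is sketched below with scikit-learn's Lasso. Station geometry, slowness grid, and regularization strength are invented; the paper's actual implementation may differ.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
xy = rng.uniform(0.0, 10.0, (60, 2))   # irregular stations in a 10 km x 10 km box
f = 0.1                                # Hz; single analysis frequency

# Synthetic field: two plane waves with slowness vectors (s/km), plus noise
s_true = np.array([[0.25, 0.05], [-0.10, 0.20]])
amp = np.array([1.0, 0.6])
u = sum(a * np.cos(2 * np.pi * f * (xy @ s)) for a, s in zip(amp, s_true))
u += 0.02 * rng.normal(size=u.shape)

# Dictionary of candidate plane waves on a slowness grid (cosine and sine parts)
sg = np.stack(np.meshgrid(np.linspace(-0.4, 0.4, 17),
                          np.linspace(-0.4, 0.4, 17)), -1).reshape(-1, 2)
phase = 2 * np.pi * f * (xy @ sg.T)
D = np.hstack([np.cos(phase), np.sin(phase)])

coef = Lasso(alpha=1e-3, max_iter=50000).fit(D, u).coef_   # sparse plane-wave code

# Reconstruct the wavefield on a dense regular grid from the sparse code
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
gxy = np.column_stack([gx.ravel(), gy.ravel()])
gphase = 2 * np.pi * f * (gxy @ sg.T)
u_grid = (np.hstack([np.cos(gphase), np.sin(gphase)]) @ coef).reshape(50, 50)
```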
Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert
2010-05-11
A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.
Morrison, Michael D.; Fajardo-Cavazos, Patricia
2017-01-01
Past results have suggested that bacterial antibiotic susceptibility is altered during space flight. To test this notion, Bacillus subtilis cells were cultivated in matched hardware, medium, and environmental conditions either in space flight microgravity on the International Space Station, termed flight (FL) samples, or at Earth-normal gravity, termed ground control (GC) samples. The susceptibilities of FL and GC samples to 72 antibiotics and growth-inhibitory compounds were compared using the Omnilog phenotype microarray (PM) system. Only 9 compounds were identified by PM screening as exhibiting significant differences (P < 0.05, Student's t test) between FL and GC samples: 6-mercaptopurine, cesium chloride, enoxacin, lomefloxacin, manganese(II) chloride, nalidixic acid, penimepicycline, rolitetracycline, and trifluoperazine. Testing of the same compounds by standard broth dilution assay did not reveal statistically significant differences in the 50% inhibitory concentrations (IC50s) between FL and GC samples. The results indicate that the susceptibility of B. subtilis cells to a wide range of antibiotics and growth inhibitors is not dramatically altered by space flight. IMPORTANCE: This study addresses a major concern of mission planners for human space flight, that bacteria accompanying astronauts on long-duration missions might develop a higher level of resistance to antibiotics due to exposure to the space flight environment. The results of this study do not support that notion. PMID:28821547
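The per-compound comparison reduces to a two-sample t test between replicate FL and GC readings. A minimal sketch with SciPy is below; the growth values are invented placeholders, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical PM growth readings (arbitrary units) for one compound,
# flight (FL) vs ground control (GC) replicates -- illustrative values only.
fl = np.array([112.0, 118.0, 109.0, 121.0])
gc = np.array([131.0, 127.0, 135.0, 126.0])
t, p = stats.ttest_ind(fl, gc)
print(f"t = {t:.2f}, p = {p:.4f}")   # flag the compound if p < 0.05
```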
Stress-Induced Subclinical Reactivation of Varicella Zoster Virus in Astronauts
NASA Technical Reports Server (NTRS)
Mehta, Satish K.; Pierson, Duane L.; Forghani, Bagher; Zerbe, Gary; Cohrs, Randall J.; Gilden, Donald H.
2003-01-01
After primary infection, varicella-zoster virus (VZV) becomes latent in ganglia. VZV reactivation occurs primarily in elderly individuals, organ transplant recipients, and patients with cancer and AIDS, correlating with a specific decline in cell-mediated immunity to VZV. VZV can also reactivate after surgical stress. To determine whether VZV can also reactivate after acute non-surgical stress, we examined total DNA extracted from 312 saliva samples of eight astronauts before, during and after space flight for VZV DNA by PCR: 112 samples were obtained 234 to 265 days before flight, 84 samples on days 2 through 13 of space flight, and 116 samples on days 1 through 15 after flight. Before space flight only one of the 112 saliva samples from a single astronaut was positive for VZV DNA. In contrast, during and after space flight, 61 of 200 (30%) saliva samples were positive in all 8 astronauts. No VZV DNA was detected in any of 88 saliva samples from 10 healthy control subjects. These data indicate that VZV can reactivate subclinically in healthy individuals after acute stress.
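The reported counts lend themselves to a simple contingency-table check. The sketch below (one plausible way to analyze them, not the authors' stated statistics) applies Fisher's exact test to the pre-flight versus during/after-flight positive rates taken from the abstract.

```python
from scipy.stats import fisher_exact

# From the abstract: 1 of 112 samples positive pre-flight,
# 61 of 200 positive during and after flight.
table = [[1, 111],
         [61, 139]]
odds, p = fisher_exact(table)
print(f"odds ratio = {odds:.3f}, p = {p:.2e}")  # strong evidence of reactivation
```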
Point defect formation in optical materials exposed to the space environment
NASA Astrophysics Data System (ADS)
Allen, J. L.; Seifert, N.; Yao, Y.; Albridge, R. G.; Barnes, A. V.; Tolk, N. H.; Strauss, A. M.; Linton, Roger C.; Kamenetzky, R. R.; Vaughn, Jason A.
1995-02-01
Point defect formation associated with early stages of optical damage was observed unexpectedly in two, and possibly three, different optical materials subjected to short-duration space exposure. Three calcium fluoride, two lithium fluoride, and three magnesium fluoride samples were flown on Space Shuttle flight STS-46 as part of the Evaluation of Oxygen Interactions with Materials - Third Phase experiment. One each of the calcium and magnesium fluoride samples was held at a fixed temperature of 60 C during the space exposure, while the temperatures of the other samples were allowed to vary with the ambient temperature of the shuttle cargo bay. Pre-flight and post-flight optical absorption measurements were performed on all of the samples. With the possible exception of the magnesium fluoride samples, every sample clearly showed the formation of F-centers in that section of the sample that was exposed to the low earth orbit environment. Solar vacuum ultraviolet radiation is the most probable primary cause of the defect formation; however, the resulting surface metallization may be synergistically altered by the atomic oxygen environment.
Covariant information-density cutoff in curved space-time.
Kempf, Achim
2004-06-04
In information theory, the link between continuous information and discrete information is established through well-known sampling theorems. Sampling theory explains, for example, how frequency-filtered music signals are reconstructible perfectly from discrete samples. In this Letter, sampling theory is generalized to pseudo-Riemannian manifolds. This provides a new set of mathematical tools for the study of space-time at the Planck scale: theories formulated on a differentiable space-time manifold can be equivalent to lattice theories. There is a close connection to generalized uncertainty relations which have appeared in string theory and other studies of quantum gravity.
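The flat, one-dimensional special case referred to here is the classical Shannon sampling theorem: a signal containing no frequencies above Ω can be reconstructed perfectly from samples spaced 1/(2Ω) apart,

```latex
f(t) \;=\; \sum_{n=-\infty}^{\infty} f\!\left(\tfrac{n}{2\Omega}\right)
\operatorname{sinc}\!\left(2\Omega t - n\right),
\qquad
\operatorname{sinc}(x) = \frac{\sin \pi x}{\pi x}.
```

The Letter's contribution is the generalization of this reconstruction from the real line to pseudo-Riemannian manifolds.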
NASA Astrophysics Data System (ADS)
Riess, Adam G.; Rodney, Steven A.; Scolnic, Daniel M.; Shafer, Daniel L.; Strolger, Louis-Gregory; Ferguson, Henry C.; Postman, Marc; Graur, Or; Maoz, Dan; Jha, Saurabh W.; Mobasher, Bahram; Casertano, Stefano; Hayden, Brian; Molino, Alberto; Hjorth, Jens; Garnavich, Peter M.; Jones, David O.; Kirshner, Robert P.; Koekemoer, Anton M.; Grogin, Norman A.; Brammer, Gabriel; Hemmati, Shoubaneh; Dickinson, Mark; Challis, Peter M.; Wolff, Schuyler; Clubb, Kelsey I.; Filippenko, Alexei V.; Nayyeri, Hooshang; U, Vivian; Koo, David C.; Faber, Sandra M.; Kocevski, Dale; Bradley, Larry; Coe, Dan
2018-02-01
We present an analysis of 15 Type Ia supernovae (SNe Ia) at redshift z > 1 (9 at 1.5 < z < 2.3) recently discovered in the CANDELS and CLASH Multi-Cycle Treasury programs using WFC3 on the Hubble Space Telescope. We combine these SNe Ia with a new compilation of ∼1050 SNe Ia, jointly calibrated and corrected for simulated survey biases to produce accurate distance measurements. We present unbiased constraints on the expansion rate at six redshifts in the range 0.07 < z < 1.5 based only on this combined SN Ia sample. The added leverage of our new sample at z > 1.5 leads to a factor of ∼3 improvement in the determination of the expansion rate at z = 1.5, reducing its uncertainty to ∼20%, a measurement of H(z = 1.5)/H_0 = 2.69 (+0.86, -0.52). We then demonstrate that these six derived expansion rate measurements alone provide a nearly identical characterization of dark energy as the full SN sample, making them an efficient compression of the SN Ia data. The new sample of SNe Ia at z > 1.5 usefully distinguishes between alternative cosmological models and unmodeled evolution of the SN Ia distance indicators, placing empirical limits on the latter. Finally, employing a realistic simulation of a potential Wide-Field Infrared Survey Telescope SN survey observing strategy, we forecast optimistic future constraints on the expansion rate from SNe Ia.
The space of ultrametric phylogenetic trees.
Gavryushkin, Alex; Drummond, Alexei J
2016-08-21
The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space and formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce and that the choice between metric spaces requires additional properties to be considered. In particular, the summary tree minimising the square distance to the trees from the sample might differ between parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
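In notation assumed here (not taken from the paper), the summary tree in question is the Fréchet mean of the posterior sample under the chosen metric d:

```latex
T^{\ast} \;=\; \underset{T \in \mathcal{T}}{\arg\min} \; \sum_{i=1}^{N} d\!\left(T, T_i\right)^{2},
```

and the paper's observation is that T* can differ depending on whether d is built from inter-coalescent interval lengths or from absolute divergence times.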
EXPOSE-R2: The Astrobiological ESA Mission on Board of the International Space Station.
Rabbow, Elke; Rettberg, Petra; Parpart, Andre; Panitz, Corinna; Schulte, Wolfgang; Molter, Ferdinand; Jaramillo, Esther; Demets, René; Weiß, Peter; Willnecker, Rainer
2017-01-01
On July 23, 2014, the Progress cargo spacecraft 56P was launched from Baikonur to the International Space Station (ISS), carrying EXPOSE-R2, the third ESA (European Space Agency) EXPOSE facility, the second EXPOSE on the outside platform of the Russian Zvezda module, with four international astrobiological experiments into space. More than 600 biological samples of archaea, bacteria (as biofilms and in planktonic form), lichens, fungi, plant seeds, triops eggs, mosses and 150 samples of organic compounds were exposed to the harsh space environment and to parameters similar to those on the Mars surface. Radiation dosimeters distributed over the whole facility complemented the scientific payload. Three extravehicular activities later, the chemical samples were returned to Earth on March 2, 2016, with Soyuz 44S, having spent 588 days in space. The biological samples arrived back later, on June 18, 2016, with 45S, after a total duration in space of 531 days. The exposure of the samples to Low Earth Orbit vacuum lasted for 531 days and was divided in two parts: protected against solar irradiation during the first 62 days, followed by exposure to solar radiation during the subsequent 469 days. In parallel to the space mission, a Mission Ground Reference (MGR) experiment with flight-identical hardware and a complete flight-identical set of samples was performed at the premises of DLR (German Aerospace Center) in Cologne by MUSC (Microgravity User Support Center), according to the mission data either downloaded from the ISS (temperature data, facility status, inner pressure status) or provided by RedShift Design and Engineering BVBA, Belgium (calculated ultraviolet radiation fluence data). In this paper, the EXPOSE-R2 facility, the experimental samples, mission parameters, environmental parameters, and the overall mission and MGR sequences are described, building the background for the research papers of the individual experiments, their analysis and results.
2003-09-08
KENNEDY SPACE CENTER, FLA. - After removing its cover, technicians look over the Minus Eighty Lab Freezer for ISS (MELFI), provided as Laboratory Support Equipment by the European Space Agency for the International Space Station. The lab will provide cooling and storage for reagents, samples and perishable materials in four insulated containers called dewars with independently selectable temperatures of -80°C, -26°C, and +4°C. It also will be used to transport samples to and from the station. The MELFI is planned for launch on the ULF-1 mission.
Scalzi, Giuliano; Selbmann, Laura; Zucconi, Laura; Rabbow, Elke; Horneck, Gerda; Albertano, Patrizia; Onofri, Silvano
2012-06-01
Desiccated Antarctic rocks colonized by cryptoendolithic communities were exposed on the International Space Station (ISS) to space and simulated Mars conditions (LiFE, the Lichens and Fungi Experiment). After 1.5 years in space, samples were retrieved, rehydrated and spread on different culture media. Colonies of a green alga and a pink-coloured fungus developed on Malt-Agar medium; they were isolated from a sample exposed to simulated Mars conditions beneath a 0.1 % T Suprasil neutral density filter and from a sample exposed to space vacuum without solar radiation exposure, respectively. None of the other flight samples showed any growth after incubation. The two organisms able to grow were identified at the genus level by Small SubUnit (SSU) and Internal Transcribed Spacer (ITS) rDNA sequencing as Stichococcus sp. (green alga) and Acarospora sp. (lichenized fungal genus), respectively. The data in the present study provide experimental information on the possibility of eukaryotic life transfer from one planet to another by means of rocks, and of survival in the Mars environment.
LDEF materials results for spacecraft applications: Executive summary
NASA Astrophysics Data System (ADS)
Whitaker, A. F.; Dooling, D.
1995-03-01
To address the challenges of space environmental effects, NASA designed the Long Duration Exposure Facility (LDEF) for an 18-month mission to expose thousands of samples of candidate materials that might be used on a space station or other orbital spacecraft. LDEF was launched in April 1984 and was to have been returned to Earth in 1985. Changes in mission schedules postponed retrieval until January 1990, after 69 months in orbit. Analyses of the samples recovered from LDEF have provided spacecraft designers and managers with the most extensive data base on space materials phenomena. Many LDEF samples were greatly changed by extended space exposure. Among even the most radically altered samples, NASA and its science teams are finding a wealth of surprising conclusions and tantalizing clues about the effects of space on materials. Many were discussed at the first two LDEF results conferences and subsequent professional papers. The LDEF Materials Results for Spacecraft Applications Conference was convened in Huntsville to discuss implications for spacecraft design. Already, paint and thermal blanket selections for space station and other spacecraft have been affected by LDEF data. This volume synopsizes those results.
EXPOSE-E: an ESA astrobiology mission 1.5 years in space.
Rabbow, Elke; Rettberg, Petra; Barczyk, Simon; Bohmeier, Maria; Parpart, André; Panitz, Corinna; Horneck, Gerda; von Heise-Rotenburg, Ralf; Hoppenbrouwers, Tom; Willnecker, Rainer; Baglioni, Pietro; Demets, René; Dettmann, Jan; Reitz, Guenther
2012-05-01
The multi-user facility EXPOSE-E was designed by the European Space Agency to enable astrobiology research in space (low-Earth orbit). On 7 February 2008, EXPOSE-E was carried to the International Space Station (ISS) on the European Technology Exposure Facility (EuTEF) platform in the cargo bay of Space Shuttle STS-122 Atlantis. The facility was installed at the starboard cone of the Columbus module by extravehicular activity, where it remained in space for 1.5 years. EXPOSE-E was returned to Earth with STS-128 Discovery on 12 September 2009 for subsequent sample analysis. EXPOSE-E provided accommodation in three exposure trays for a variety of astrobiological test samples that were exposed to selected space conditions: either to space vacuum, solar electromagnetic radiation at >110 nm and cosmic radiation (trays 1 and 3) or to simulated martian surface conditions (tray 2). Data on UV radiation, cosmic radiation, and temperature were measured every 10 s and downlinked by telemetry. A parallel mission ground reference (MGR) experiment was performed on ground with a parallel set of hardware and samples under simulated space conditions. EXPOSE-E performed a successful 1.5-year mission in space.
Stratification-Based Outlier Detection over the Deep Web.
Xian, Xuefeng; Zhao, Pengpeng; Sheng, Victor S; Fang, Ligang; Gu, Caidong; Yang, Yuanfeng; Cui, Zhiming
2016-01-01
For many applications, finding rare instances or outliers can be more interesting than finding common patterns. Existing work in outlier detection has never considered the context of the deep web. In this paper, we argue that, for many scenarios, it is more meaningful to detect outliers over the deep web. In the context of the deep web, users must submit queries through a query interface to retrieve corresponding data. Therefore, traditional data mining methods cannot be directly applied. The primary contribution of this paper is to develop a new data mining method for outlier detection over the deep web. In our approach, the query space of a deep web data source is stratified based on a pilot sample. Neighborhood sampling and uncertainty sampling are developed in this paper with the goal of improving recall and precision based on stratification. Finally, a careful performance evaluation of our algorithm confirms that our approach can effectively detect outliers in the deep web.
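The abstract gives only the outline of the method, so the following is a much-simplified sketch of the stratification idea: use a pilot sample to partition the query space, draw per-stratum samples through (here simulated) range queries, and flag outliers against the stratified sample with a robust z-score. Neighborhood and uncertainty sampling are omitted; all data and thresholds are invented.

```python
import numpy as np
rng = np.random.default_rng(4)

population = rng.lognormal(3.0, 0.4, 20000)      # hidden deep-web data source

def range_query(lo, hi, n):
    """Stand-in for a deep-web query interface returning at most n records."""
    hits = population[(population >= lo) & (population < hi)]
    return rng.choice(hits, min(n, len(hits)), replace=False)

# 1. Pilot sample, drawn to learn the shape of the query space
pilot = rng.choice(population, 300, replace=False)
# 2. Stratify the query space on pilot quantiles and sample each stratum
edges = np.quantile(pilot, [0.25, 0.5, 0.75])
bounds = np.concatenate(([0.0], edges, [np.inf]))
sample = np.concatenate([range_query(lo, hi, 100)
                         for lo, hi in zip(bounds[:-1], bounds[1:])])
# 3. Flag outliers with a robust z-score against the stratified sample
med = np.median(sample)
mad = np.median(np.abs(sample - med))
def is_outlier(x, k=5.0):
    return np.abs(x - med) / (1.4826 * mad) > k
print(is_outlier(np.array([20.0, 400.0])))       # typically [False, True]
```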
Photoconductive circuit element reflectometer
Rauscher, Christen
1990-01-01
A photoconductive reflectometer for characterizing semiconductor devices at millimeter wavelength frequencies where a first photoconductive circuit element (PCE) is biased by a direct current voltage source and produces short electrical pulses when excited into conductance by short first laser light pulses. The electrical pulses are electronically conditioned to improve the frequency related amplitude characteristics of the pulses which thereafter propagate along a transmission line to a device under test. Second PCEs are connected along the transmission line to sample the signals on the transmission line when excited into conductance by short second laser light pulses, spaced apart in time a variable period from the first laser light pulses. Electronic filters connected to each of the second PCEs act as low-pass filters and remove parasitic interference from the sampled signals and output the sampled signals in the form of slowed-motion images of the signals on the transmission line.
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-12
Russel Howe of team Survey, center, works on a laptop to prepare the team's robot for a demonstration run after the team's robot failed to leave the starting platform during its attempt at the level two challenge at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Thursday, June 12, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Eighteen teams are competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
2014 NASA Centennial Challenges Sample Return Robot Challenge
2014-06-10
A pair of Worcester Polytechnic Institute (WPI) students walk past a pair of team KuuKulgur's robots on the campus quad, during a final tuneup before the start of competition at the 2014 NASA Centennial Challenges Sample Return Robot Challenge, Tuesday, June 10, 2014, at the Worcester Polytechnic Institute (WPI) in Worcester, Mass. Team KuuKulgur is one of eighteen teams competing for a $1.5 million NASA prize purse. Teams will be required to demonstrate autonomous robots that can locate and collect samples from a wide and varied terrain, operating without human control. The objective of this NASA-WPI Centennial Challenge is to encourage innovations in autonomous navigation and robotics technologies. Innovations stemming from the challenge may improve NASA's capability to explore a variety of destinations in space, as well as enhance the nation's robotic technology for use in industries and applications on Earth. Photo Credit: (NASA/Joel Kowsky)
Spectral Absorption Properties of Aerosol Particles from 350-2500 nm
NASA Technical Reports Server (NTRS)
Martins, J. Vanderlei; Artaxo, Paulo; Kaufman, Yoram J.; Castanho, Andrea D.; Remer, Lorraine A.
2009-01-01
The aerosol spectral absorption efficiency (α_a, in square meters per gram) is measured over an extended wavelength range (350-2500 nm) using an improved calibrated and validated reflectance technique and applied to urban aerosol samples from Sao Paulo, Brazil and from a site in Virginia, Eastern US, that experiences transported urban/industrial aerosol. The average α_a values (approximately 3 square meters per gram at 550 nm) for Sao Paulo samples are 10 times larger than the α_a values obtained for aerosols in Virginia. Sao Paulo aerosols also show evidence of enhanced UV absorption in selected samples, probably associated with organic aerosol components. This extra UV absorption can double the absorption efficiency observed from black carbon alone, thereby reducing surface UV fluxes by up to 50%, with important implications for climate, UV photolysis rates, and remote sensing from space.
Scalable free energy calculation of proteins via multiscale essential sampling
NASA Astrophysics Data System (ADS)
Moritsugu, Kei; Terada, Tohru; Kidera, Akinori
2010-12-01
A multiscale simulation method, "multiscale essential sampling (MSES)," is proposed for calculating free energy surface of proteins in a sizable dimensional space with good scalability. In MSES, the configurational sampling of a full-dimensional model is enhanced by coupling with the accelerated dynamics of the essential degrees of freedom. Applying the Hamiltonian exchange method to MSES can remove the biasing potential from the coupling term, deriving the free energy surface of the essential degrees of freedom. The form of the coupling term ensures good scalability in the Hamiltonian exchange. As a test application, the free energy surface of the folding process of a miniprotein, chignolin, was calculated in the continuum solvent model. Results agreed with the free energy surface derived from the multicanonical simulation. Significantly improved scalability with the MSES method was clearly shown in the free energy calculation of chignolin in explicit solvent, which was achieved without increasing the number of replicas in the Hamiltonian exchange.
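Schematically, and in notation assumed here rather than taken from the paper, the MSES Hamiltonian couples the full-dimensional system to accelerated dynamics of the essential coordinates through a harmonic term,

```latex
H_{\mathrm{MSES}}(x,\tilde{X}) \;=\; H_{\mathrm{FG}}(x) \;+\; H_{\mathrm{ESS}}(\tilde{X})
\;+\; \frac{k}{2}\,\bigl\lVert P\,x - \tilde{X} \bigr\rVert^{2},
```

where P projects the full coordinates x onto the essential subspace and X̃ denotes the accelerated essential variables; running Hamiltonian exchange over a ladder of coupling constants k down to k = 0 removes the bias of the coupling term, as described above.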
Efficient exploration of chemical space by fragment-based screening.
Hall, Richard J; Mortenson, Paul N; Murray, Christopher W
2014-01-01
Screening methods seek to sample a vast chemical space in order to identify starting points for further chemical optimisation. Fragment-based drug discovery exploits the superior sampling of chemical space that can be achieved when the molecular weight is restricted. Here we show that commercially available fragment space is still relatively poorly sampled and argue for highly sensitive screening methods to allow the detection of smaller fragments. We analyse the properties of our fragment library versus the properties of X-ray hits derived from the library. We particularly consider properties related to the degree of planarity of the fragments. Copyright © 2014 Elsevier Ltd. All rights reserved.
Kamath, Ganesh; Kurnikov, Igor; Fain, Boris; Leontyev, Igor; Illarionov, Alexey; Butin, Oleg; Olevanov, Michael; Pereyaslavets, Leonid
2016-11-01
We present the performance of blind predictions of water-cyclohexane distribution coefficients for 53 drug-like compounds in the SAMPL5 challenge by three methods currently in use within our group. Two of them utilize QMPFF3 and ARROW, polarizable force fields of varying complexity, and the third uses the General Amber Force Field (GAFF). The polarizable FFs are implemented in an in-house MD package, Arbalest. We find that when we had time to parametrize the functional groups with care (batch 0), the polarizable force fields outperformed the non-polarizable one. Conversely, on the full set of 53 compounds, GAFF performed better than both QMPFF3 and ARROW. We also describe the torsion-restrain method we used to improve sampling of molecular conformational space and thus the overall accuracy of prediction. The SAMPL5 challenge highlighted several drawbacks of our force fields, such as our significant systematic over-estimation of hydrophobic interactions, specifically for alkanes and aromatic rings.
Importance sampling large deviations in nonequilibrium steady states. I.
Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T
2018-03-28
Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
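In standard notation (assumed here), for a time-integrated observable A_t the quantities being estimated are the scaled cumulant generating function and, via Legendre-Fenchel transform, the rate function:

```latex
\psi(\lambda) \;=\; \lim_{t \to \infty} \frac{1}{t} \ln \bigl\langle e^{\lambda A_t} \bigr\rangle ,
\qquad
I(a) \;=\; \sup_{\lambda} \bigl[ \lambda a - \psi(\lambda) \bigr].
```

The sampling difficulty described above is visible directly in the first expression: the average of e^{λA_t} is dominated by exponentially rare trajectories, which is what importance sampling with guiding functions is meant to address.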
NASA Astrophysics Data System (ADS)
Wang, Ting; Plecháč, Petr
2017-12-01
Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
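As a concrete instance of the kind of bistable network studied here, the sketch below runs a plain Gillespie stochastic simulation of the Schlögl model; the rate constants are illustrative, not the paper's. Long runs dwell near one metastable molecule count and switch only rarely, which is exactly the sampling bottleneck the parallel replica method targets.

```python
import numpy as np
rng = np.random.default_rng(5)

# Schlogl model: 2X -> 3X (k1), 3X -> 2X (k2), 0 -> X (k3), X -> 0 (k4).
k1, k2, k3, k4 = 0.18, 2.5e-4, 2200.0, 37.5      # illustrative rate constants

def propensities(x):
    return np.array([k1 * x * (x - 1) / 2,
                     k2 * x * (x - 1) * (x - 2) / 6,
                     k3,
                     k4 * x])

change = np.array([+1, -1, +1, -1])              # population change per reaction
x, t = 100, 0.0
for _ in range(200000):
    a = propensities(x)
    a0 = a.sum()
    t += rng.exponential(1.0 / a0)               # waiting time to next reaction
    x += change[rng.choice(4, p=a / a0)]         # which reaction fires
print(t, x)   # the chain lingers near one of two metastable counts
```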
Mars rover/sample return mission requirements affecting space station
NASA Technical Reports Server (NTRS)
1988-01-01
The possible interfaces between the Space Station and the Mars Rover/Sample Return (MRSR) mission are defined. In order to constrain the scope of the report, a series of seven design reference missions, divided into three major types, were assumed. These missions were defined to span the probable range of Space Station-MRSR interactions. After the options were reduced, the MRSR sample handling requirements, baseline assumptions about the MRSR hardware, and the key design features and requirements of the Space Station are summarized. Only the aspects of the design reference missions necessary to define the interfaces, hooks and scars, and other provisions on the Space Station are considered. An analysis of each of the three major design reference missions is reported, presenting conceptual designs of key hardware to be mounted on the Space Station, a definition of weights, interfaces, and required hooks and scars.
Bossert, Thomas John; Mitchell, Andrew David
2011-01-01
Health sector decentralization has been widely adopted to improve delivery of health services. While many argue that institutional capacities and mechanisms of accountability required to transform decentralized decision-making into improvements in local health systems are lacking, few empirical studies exist which measure or relate together these concepts. Based on research instruments administered to a sample of 91 health sector decision-makers in 17 districts of Pakistan, this study analyzes relationships between three dimensions of decentralization: decentralized authority (referred to as "decision space"), institutional capacities, and accountability to local officials. Composite quantitative indicators of these three dimensions were constructed within four broad health functions (strategic and operational planning, budgeting, human resources management, and service organization/delivery) and on an overall/cross-function basis. Three main findings emerged. First, district-level respondents report varying degrees of each dimension despite being under a single decentralization regime and facing similar rules across provinces. Second, within dimensions of decentralization-particularly decision space and capacities-synergies exist between levels reported by respondents in one function and those reported in other functions (statistically significant coefficients of correlation ranging from ρ=0.22 to ρ=0.43). Third, synergies exist across dimensions of decentralization, particularly in terms of an overall indicator of institutional capacities (significantly correlated with both overall decision space (ρ=0.39) and accountability (ρ=0.23)). This study demonstrates that decentralization is a varied experience-with some district-level officials making greater use of decision space than others and that those who do so also tend to have more capacity to make decisions and are held more accountable to elected local officials for such choices. These findings suggest that Pakistan's decentralization policy should focus on synergies among dimensions of decentralization to encouraging more use of de jure decision space, work toward more uniform institutional capacity, and encourage greater accountability to local elected officials. Copyright © 2010 Elsevier Ltd. All rights reserved.
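The correlation analysis described above can be reproduced in outline with rank correlations on composite indicators. The sketch below uses synthetic data (the real instrument responses are not public here), with an association built in so the printed rho values resemble the reported range.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 91  # respondents, as in the study

capacity = rng.normal(size=n)                       # composite capacity index
decision_space = 0.4 * capacity + rng.normal(size=n)  # built-in association
accountability = 0.25 * capacity + rng.normal(size=n)

for name, x in [("decision space", decision_space),
                ("accountability", accountability)]:
    rho, p = spearmanr(capacity, x)
    print(f"capacity vs {name}: rho={rho:.2f}, p={p:.3f}")
```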
NASA Technical Reports Server (NTRS)
Ghods, M.; Tewari, S. N.; Lauer, M.; Poirier, D. R.; Grugel, R. N.
2016-01-01
Under a NASA-ESA collaborative research project, three Al-7-weight-percent Si samples (MICAST-6, MICAST-7 and MICAST2-12) were directionally solidified aboard the International Space Station to determine the effect of mitigating convection on the primary dendrite array. The samples were approximately 25-centimeter-long, 7.8-millimeter-diameter cylinders machined from [100]-oriented, terrestrially grown dendritic Al-7Si samples and inserted into alumina ampoules within the Sample Cartridge Assembly (SCA) inserts of the Low Gradient Furnace (LGF). The feed rods were partially remelted in space and directionally solidified to establish the [100] dendrite orientation. MICAST-6 was grown at 5 microns per second for 3.75 centimeters and then at 50 microns per second for the remaining 11.2 centimeters of its length. MICAST-7 was grown at 20 microns per second for 8.5 centimeters and then at 10 microns per second for the remaining 9 centimeters of its length. MICAST2-12 was grown at 40 microns per second for 11 centimeters. The thermal gradient at the liquidus temperature varied from 22 to 14 degrees Kelvin per centimeter during growth of MICAST-6, from 26 to 24 degrees Kelvin per centimeter for MICAST-7, and from 33 to 31 degrees Kelvin per centimeter for MICAST2-12. Microstructures on transverse sections along the sample length were analyzed to determine nearest-neighbor spacings of the primary dendrite arms and trunk diameters of the primary dendrite arrays. This was done along the lengths where steady-state growth prevailed and also during the transients associated with the speed changes. The observed nearest-neighbor spacings during steady-state growth of the MICAST samples show very good agreement with predictions from the Hunt-Lu primary spacing model for diffusion-controlled growth. The observed primary dendrite trunk diameters during steady-state growth of these samples also agree with predictions from a coarsening-based model. The radial macrosegregation and "steepling" caused by thermosolutal convection during terrestrial growth of the Al-7Si was not observed in the space-grown MICAST samples.
NASA Astrophysics Data System (ADS)
Cloninger, Alexander; Czaja, Wojciech; Doster, Timothy
2017-07-01
As the popularity of non-linear manifold learning techniques such as kernel PCA and Laplacian Eigenmaps grows, vast improvements have been seen in many areas of data processing, including heterogeneous data fusion and integration. One problem with the non-linear techniques, however, is the lack of an easily calculable pre-image. The existence of such a pre-image would allow visualization of the fused data not only in the embedded space, but also in the original data space. The ability to make such comparisons can be crucial for data analysts and other subject matter experts who are the end users of novel mathematical algorithms. In this paper, we propose a pre-image algorithm for Laplacian Eigenmaps. Our method offers major improvements over existing techniques, allowing us to address the problem of noisy inputs and the issue of how to calculate the pre-image of a point outside the convex hull of training samples; both of these have been overlooked in previous studies in this field. We conclude by showing that our pre-image algorithm, combined with feature space rotations, allows us to recover occluded pixels of an imaging modality based on knowledge of that image measured by heterogeneous modalities. We demonstrate this data recovery on heterogeneous hyperspectral (HS) cameras, as well as by recovering LIDAR measurements from HS data.
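For orientation, a naive pre-image baseline (not the authors' algorithm, which additionally handles noise and out-of-hull queries) maps an embedded point back to input space as an inverse-distance-weighted average of its nearest training samples. A minimal sketch with toy data:

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(3)
# Toy data: a noisy circle in 2-D, embedded with Laplacian Eigenmaps.
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((400, 2))

emb = SpectralEmbedding(n_components=2, n_neighbors=10)
Y = emb.fit_transform(X)                      # training embedding

def preimage(y_query, Y, X, k=8, eps=1e-12):
    """Inverse-distance-weighted average of the k nearest training points."""
    d = np.linalg.norm(Y - y_query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)
    return (w[:, None] * X[idx]).sum(0) / w.sum()

x_hat = preimage(Y[0], Y, X)
print("original:", X[0], "reconstructed:", x_hat)
```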
Standard solar model. II - g-modes
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Demarque, P.; Pinsonneault, M. H.; Kim, Y.-C.
1992-01-01
The paper presents g-mode oscillation calculations for a set of modern solar models. Each solar model is based on a single modification or improvement to the physics of a reference solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The error in the predicted g-mode periods associated with the uncertainties in the model physics is estimated, and the specific sensitivities of the g-mode periods and their period spacings to the different model structures are described. In addition, these models are compared to a sample of published observations. A remarkably good agreement is found between the 'best' solar model and the observations of Hill and Gu (1990).
NASA Astrophysics Data System (ADS)
Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi
2015-11-01
We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).
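The Gamma -> infinity endpoint of the biased measure is the maximum-entropy distribution matching the M expectation values, which can be computed directly for small N by exact enumeration and gradient ascent on the (concave) dual. The sketch below is an illustrative stand-in: the observables are random +/-1 functions of the configuration, and all sizes are toy values.

```python
import numpy as np

rng = np.random.default_rng(5)
N, M = 8, 12
S = 2**N                                   # number of binary configurations

# M observables drawn uniformly at random, as +/-1 functions of the state.
O = rng.choice([-1.0, 1.0], size=(M, S))

# A random target distribution and the expectations given to the learner.
p_target = rng.dirichlet(np.ones(S))
c_target = O @ p_target

# Max-ent ansatz p(s) ~ exp(sum_mu lam_mu O_mu(s)); gradient ascent on the
# concave dual drives the model expectations onto the targets.
lam = np.zeros(M)
for _ in range(3000):
    logp = lam @ O
    p = np.exp(logp - logp.max())
    p /= p.sum()
    lam += 0.1 * (c_target - O @ p)

print("max |expectation mismatch|:", np.abs(c_target - O @ p).max())
```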
A Survey of Environmental Microbial Flora During Closed Chamber Studies
NASA Technical Reports Server (NTRS)
Ott, C. Mark; Groves, Theron O.; Bell-Robinson, Denetia; Pierson, Duane L.; Paloski, W. H. (Technical Monitor)
1999-01-01
As NASA prepares for long-term missions aboard the International Space Station and the eventual exploration of Mars, closed-environment chambers on Earth have become important test beds for systems evaluations. During 2 separate studies of a self-contained ecosystem containing 4 crewmembers, microbial surveys of samples from 13 surface and 3 air sites were performed. Microbial concentrations in samples from surface sites with frequent water contact (e.g., urinal, sink) were not significantly higher than those from drier areas, though surface cleaning by the crew may have influenced this conclusion. Changes in bacterial diversity at surface sites implied that the number of transient species was high, suggesting movement by crew activities, aerosols, or both. A non-linear relationship between bacterial diversity and enumeration from surface samples indicated that a rapid increase occurred in the number of species as cell concentration increased to 5 CFU/sq cm. Above this concentration, the number of different bacterial species varied between 11 and 16. Airborne bacteria and fungi averaged only 160 and 1 CFU/m3, respectively. Microbial contamination of the potable water system primarily consisted of 3 species of Gram-negative bacteria; however, after 60 days during one study, several species of Bacillus became the dominant flora. This study suggests that under these conditions, microbial contamination in the air and water was suppressed by the life-support systems, though contamination was possible. Conversely, the crew and their activities controlled microbial levels on surfaces. Understanding the factors that affect microbial control will improve the design of microbial testing both during space flight and in analogous Earth-based environments.
NASA Technical Reports Server (NTRS)
Schonfeld, Julie E.
2015-01-01
Wetlab-2 is a research platform for conducting real-time quantitative gene expression analysis aboard the International Space Station. The system enables spaceflight genomic studies involving a wide variety of biospecimen types in the unique microgravity environment of space. Currently, gene expression analyses of space-flown biospecimens must be conducted post-flight, after living cultures or frozen or chemically fixed samples are returned to Earth from the space station. Post-flight analysis is limited for several reasons. First, changes in gene expression can be transient, changing over a timescale of minutes. The delay between sampling in space and analysis on Earth can range from days to months, and RNA may degrade during this period of time, even in fixed or frozen samples. Second, living organisms that return to Earth may quickly re-adapt to terrestrial conditions. Third, forces exerted on samples during reentry and return to Earth may affect results. Lastly, follow-up experiments designed in response to post-flight results must wait for a new flight opportunity to be tested.
NASA Technical Reports Server (NTRS)
Minton, Timothy K.; Moore, Teresa A.
1995-01-01
Mass spectra of products emerging from identical samples of a (13)C-enriched polyimide polymer (chemically equivalent to Kapton) under atomic oxygen bombardment in space and in the laboratory were collected. Reaction products unambiguously detected in space were (13)CO, NO, (12)CO2, and (13)CO2. These reaction products and two others, H2O and (12)CO, were detected in the laboratory, along with inelastically scattered atomic and molecular oxygen. Qualitative agreement was seen in the mass spectra taken in space and in the laboratory; the agreement may be improved by reducing the fraction of O2 in the laboratory molecular beam. Both laboratory and space data indicated that CO and CO2 products come preferentially from reaction with the imide component of the polymer chain, raising the possibility that either component may degrade in part by the 'evaporation' of higher-molecular-weight fragments. Laboratory time-of-flight distributions showed: (1) incomplete energy accommodation of impinging O and O2 species that do not react with the surface; and (2) both hyperthermal and thermal CO and CO2 products, suggesting two distinct reaction mechanisms at the surface.
NASA Technical Reports Server (NTRS)
Dever, Joyce; Miller, Sharon; Messer, Russell; Sechkar, Edward; Tollis, Greg
2002-01-01
Seventy-nine samples of polymer film thermal control (PFTC) materials have been provided by the National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) for exposure to the low Earth orbit environment on the exterior of the International Space Station (ISS) as part of the Materials International Space Station Experiment (MISSE). MISSE is a materials flight experiment sponsored by the Air Force Research Lab/Materials Lab and NASA. This paper will describe background, objectives, and configurations for the GRC PFTC samples for MISSE. These samples include polyimides, fluorinated polyimides, and Teflon fluorinated ethylene propylene (FEP) with and without second-surface metallizing layers and/or surface coatings. Also included are polyphenylene benzobisoxazole (PBO) and a polyarylene ether benzimidazole (TOR-LM). On August 16, 2001, astronauts installed passive experiment carriers (PECs) on the exterior of the ISS in which were located twenty-eight of the GRC PFTC samples for 1-year space exposure. MISSE PECs for 3-year exposure, which will contain fifty-one GRC PFTC samples, will be installed on the ISS at a later date. Once returned from the ISS, MISSE GRC PFTC samples will be examined for changes in optical and mechanical properties and atomic oxygen (AO) erosion. Additional sapphire witness samples located on the AO exposed trays will be examined for deposition of contaminants.
Stress-induced subclinical reactivation of varicella zoster virus in astronauts
NASA Technical Reports Server (NTRS)
Mehta, Satish K.; Cohrs, Randall J.; Forghani, Bagher; Zerbe, Gary; Gilden, Donald H.; Pierson, Duane L.
2004-01-01
Varicella zoster virus (VZV) becomes latent in human ganglia after primary infection. VZV reactivation occurs primarily in elderly individuals, organ transplant recipients, and patients with cancer and AIDS, correlating with a specific decline in cell-mediated immunity to the virus. VZV can also reactivate after surgical stress. The unexpected occurrence of thoracic zoster 2 days before space flight in a 47-year-old healthy astronaut from a pool of 81 physically fit astronauts prompted our search for VZV reactivation during times of stress to determine whether VZV can also reactivate after non-surgical stress. We examined total DNA extracted from 312 saliva samples of eight astronauts before, during, and after space flight for VZV DNA by polymerase chain reaction: 112 samples were obtained 234-265 days before flight, 84 samples on days 2 through 13 of space flight, and 116 samples on days 1 through 15 after flight. Before space flight, only one of the 112 saliva samples from a single astronaut was positive for VZV DNA. In contrast, during and after space flight, 61 of 200 (30%) saliva samples were positive in all eight astronauts. No VZV DNA was detected in any of 88 saliva samples from 10 healthy control subjects. These results indicate that VZV can reactivate subclinically in healthy individuals after non-surgical stress. Copyright 2004 Wiley-Liss, Inc.
Development of a Novel Self-Enclosed Sample Preparation Device for DNA/RNA Isolation in Space
NASA Technical Reports Server (NTRS)
Zhang, Ye; Mehta, Satish K.; Pensinger, Stuart J.; Pickering, Karen D.
2011-01-01
Modern biology techniques offer potential for a wide range of molecular, cellular, and biochemistry applications in space, including detection of infectious pathogens and environmental contamination, monitoring of drug-resistant microbes and dangerous mutations, and identification of new microbial phenotypes and new species. However, one of the major technological obstacles to enabling these technologies in space is the lack of devices for sample preparation in the space environment. To overcome this obstacle, we constructed a prototype of a DNA/RNA isolation device based on our novel designs documented in the NASA New Technology Reporting System (MSC-24811-1/3-1). This device is self-enclosed and pipette-free, purposely designed for use in the absence of gravity. Our design can also be modified easily for preparing samples in space for other applications, such as flow cytometry, immunostaining, cell separation, sample purification and separation according to size and charge, and sample chemical labeling. The prototype of our DNA/RNA isolation device was tested for DNA and RNA isolation efficiency from various cell types for PCR analysis. The purity and integrity of the purified DNA and RNA were determined as well. Results showed that our DNA/RNA isolation device offers efficiency and quality similar to samples prepared using the standard laboratory protocol.
The development of global motion discrimination in school aged children
Bogfjellmo, Lotte-Guri; Bex, Peter J.; Falkenberg, Helle K.
2014-01-01
Global motion perception matures during childhood and involves the detection of local directional signals that are integrated across space. We examine the maturation of local directional selectivity and global motion integration with an equivalent noise paradigm applied to direction discrimination. One hundred and three observers (6–17 years) identified the global direction of motion in a 2AFC task. The 8° central stimuli consisted of 100 dots of 10% Michelson contrast moving at 2.8°/s or 9.8°/s. Local directional selectivity and global sampling efficiency were estimated from direction discrimination thresholds as a function of external directional noise, speed, and age. Direction discrimination thresholds improved gradually until the age of 14 years (linear regression, p < 0.05) for both speeds. This improvement was associated with a gradual increase in sampling efficiency (linear regression, p < 0.05), with no significant change in internal noise. Direction sensitivity was lower for dots moving at 2.8°/s than at 9.8°/s for all ages (paired t test, p < 0.05), mainly due to lower sampling efficiency. Global motion perception improves gradually during development and matures by age 14. There was no change in internal noise after the age of 6, suggesting that local direction selectivity is mature by that age. The improvement in global motion perception is underpinned by a steady increase in the efficiency with which direction signals are pooled, suggesting that global motion pooling processes mature over a longer period and later than local motion processing. PMID:24569985
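The equivalent-noise decomposition used above rests on the standard model threshold^2 = (sigma_int^2 + sigma_ext^2) / n_eff, which separates internal noise from sampling efficiency. A minimal fitting sketch follows; the threshold data are synthetic, for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def eq_noise(sigma_ext, sigma_int, n_eff):
    # Equivalent-noise model: threshold^2 = (sigma_int^2 + sigma_ext^2) / n_eff
    return np.sqrt((sigma_int**2 + sigma_ext**2) / n_eff)

sigma_ext = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])  # external noise (deg)
rng = np.random.default_rng(2)
true_int, true_n = 4.0, 20.0
thresholds = eq_noise(sigma_ext, true_int, true_n) * rng.normal(1, 0.05, sigma_ext.size)

(fit_int, fit_n), _ = curve_fit(eq_noise, sigma_ext, thresholds, p0=(1.0, 1.0))
print(f"internal noise ~ {fit_int:.2f} deg, sampling efficiency n_eff ~ {fit_n:.1f}")
```

Developmental improvement with stable internal noise would show up here as n_eff growing with age while sigma_int stays constant.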
Low-Power SOI CMOS Transceiver
NASA Technical Reports Server (NTRS)
Fujikawa, Gene (Technical Monitor); Cheruiyot, K.; Cothern, J.; Huang, D.; Singh, S.; Zencir, E.; Dogan, N.
2003-01-01
The work aims at developing a low-power Silicon-on-Insulator Complementary Metal Oxide Semiconductor (SOI CMOS) transceiver for deep-space communications. The RF receiver must accomplish the following tasks: (a) select the desired radio channel and reject other radio signals, (b) amplify the desired radio signal and translate it back to baseband, and (c) detect and decode the information with low BER. In order to minimize cost and achieve a high level of integration, the receiver architecture should use the fewest possible external filters and passive components. It should also consume the least power to minimize battery cost, size, and weight. One of the most stringent requirements for deep-space communication is low-power operation. Our study identified two candidate architectures that meet these requirements: (1) the low-IF receiver and (2) the sub-sampling receiver. The low-IF receiver uses a minimum number of external components. Compared to the zero-IF (direct conversion) architecture, it has less severe offset and flicker noise problems. The sub-sampling receiver amplifies the RF signal and samples it using a track-and-hold sub-sampling mixer. These architectures provide a low-power solution for short-range communications missions on Mars. Accomplishments to date include: (1) system-level design and simulation of a double-differential PSK receiver, (2) implementation of the Honeywell SOI CMOS process design kit (PDK) in Cadence design tools, (3) design of test circuits to investigate relationships between layout techniques, geometry, and low-frequency noise in SOI CMOS, (4) model development and verification of on-chip spiral inductors in the SOI CMOS process, (5) design/implementation of a low-power low-noise amplifier (LNA) and mixer for the low-IF receiver, and (6) design/implementation of a high-gain LNA for the sub-sampling receiver. Our initial results show that substantial improvement in power consumption is achieved using SOI CMOS as compared to a standard CMOS process. Potential advantages of SOI CMOS for deep-space communication electronics include: (1) radiation hardness, (2) low-power operation, and (3) System-on-Chip (SOC) solutions.
AVNM: A Voting based Novel Mathematical Rule for Image Classification.
Vidyarthi, Ankit; Mittal, Namita
2016-12-01
In machine learning, the accuracy of a system depends upon its classification results, and classification accuracy plays an imperative role in various domains. Non-parametric classifiers like K-Nearest Neighbor (KNN) are among the most widely used classifiers for pattern analysis. Despite its ease of use, simplicity, and effectiveness, the main problem associated with the KNN classifier is the selection of the number of nearest neighbors, i.e., "k", used in the computation. At present it is hard to find, with any statistical algorithm, an optimal value of "k" that gives high accuracy in terms of a low misclassification error rate. Motivated by this problem, a new sample-space-reduction weighted voting mathematical rule (AVNM) is proposed for classification in machine learning. The proposed AVNM rule is non-parametric in nature, like KNN. AVNM uses a weighted voting mechanism with sample space reduction to learn and examine the predicted class label for an unidentified sample. AVNM is free from any initial selection of a predefined variable or neighbor count, as found in the KNN algorithm. The proposed classifier also reduces the effect of outliers. To verify the performance of the proposed AVNM classifier, experiments were made on 10 standard datasets taken from the UCI database and one manually created dataset. The experimental results show that the proposed AVNM rule outperforms the KNN classifier and its variants. Experimental results based on the confusion-matrix accuracy parameter show higher accuracy for the AVNM rule. The proposed AVNM rule is based on a sample space reduction mechanism for identification of an optimal number of nearest neighbors. AVNM results in better classification accuracy and a lower error rate compared with the state-of-the-art algorithm, KNN, and its variants. The proposed rule automates nearest-neighbor selection and improves the classification rate for the UCI datasets and the manually created dataset. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
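To make the two ingredients concrete, here is a generic distance-weighted voting classifier in the spirit of the description above. It is not the published AVNM rule: as a stand-in for the reduction step it keeps only training points within the mean distance to the query, then lets each surviving point vote with weight 1/distance, so no "k" is chosen.

```python
import numpy as np

def weighted_vote_predict(X_train, y_train, x, eps=1e-12):
    d = np.linalg.norm(X_train - x, axis=1)
    keep = d <= d.mean()              # crude sample-space reduction (assumption)
    votes = {}
    for yi, di in zip(y_train[keep], d[keep]):
        votes[yi] = votes.get(yi, 0.0) + 1.0 / (di + eps)  # inverse-distance vote
    return max(votes, key=votes.get)

rng = np.random.default_rng(4)
X0 = rng.normal([0, 0], 1, (50, 2))
X1 = rng.normal([3, 3], 1, (50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)
print(weighted_vote_predict(X, y, np.array([2.5, 2.8])))  # expected: 1
```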
Fluidized Bed Asbestos Sampler Design and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karen E. Wright; Barry H. O'Brien
A large number of samples are required to characterize a site contaminated with asbestos from previous mine or other industrial operations. Current methods, such as EPA Region 10's glovebox method or the Berman Elutriator method, are time consuming and costly, primarily because the equipment is difficult to decontaminate between samples. EPA desires a shorter and less costly method for characterizing soil samples for asbestos. The objective of this work was to design and test a qualitative asbestos sampler that operates as a fluidized bed. The proposed sampler employs a conical spouted bed to vigorously mix the soil and separate fine particulate, including asbestos fibers, onto filters. The filters are then analyzed using transmission electron microscopy for the presence of asbestos. During initial testing of a glass prototype using ASTM 20/30 sand and clay fines as asbestos surrogates, fine particulate adhered to the sides of the glass vessel and the tubing to the collection filter, presumably due to static charge on the fine particulate. This limited the fines recovery to approximately 5% of the amount added to the sand surrogate. A second prototype was constructed of stainless steel, which improved fines recovery to about 10%. Fines recovery was increased to 15% by either humidifying the inlet air or introducing a voltage probe in the air space above the sample. Since this was not a substantial improvement, testing using the steel prototype proceeded without these techniques. Final testing of the second prototype using asbestos suggests that the fluidized bed is considerably more sensitive than the Berman Elutriator method. Using a sand/tremolite mixture with 0.005% tremolite, the Berman Elutriator did not segregate any asbestos structures while the fluidized bed segregated an average of 11.7. The fluidized bed was also able to segregate structures in samples containing asbestos at a 0.0001% concentration, while the Berman Elutriator method did not detect any fibers at this concentration. Opportunities for improvement with the fluidized bed include improving reproducibility among replicates, increasing mass recovery, and improving the lid gasket seal.
MSRR Rack Materials Science Research Rack
NASA Technical Reports Server (NTRS)
Reagan, Shawn
2017-01-01
The Materials Science Research Rack (MSRR) is a research facility developed under a cooperative research agreement between NASA and the European Space Agency (ESA) for materials science investigations on the International Space Station (ISS). The MSRR is managed at the Marshall Space Flight Center (MSFC) in Huntsville, AL. The MSRR facility subsystems were manufactured by Teledyne Brown Engineering (TBE) and integrated with the ESA/EADS-Astrium developed Materials Science Laboratory (MSL) at the MSFC Space Station Integration and Test Facility (SSITF) as part of the Systems Development Operations Support (SDOS) contract. MSRR was launched on STS-128 in August 2009, and is currently installed in the U.S. Destiny Laboratory Module on the ISS. Materials science is an integral part of developing new, safer, stronger, more durable materials for use throughout everyday life. The goal of studying materials processing in space is to develop a better understanding of the chemical and physical mechanisms involved, and how they differ in the microgravity environment of space. To that end, the MSRR accommodates advanced investigations in the microgravity environment of the ISS for basic materials science research in areas such as solidification of metals and alloys. MSRR allows for the study of a variety of materials including metals, ceramics, semiconductor crystals, and glasses. Materials science research benefits from the microgravity environment of space, where the researcher can better isolate chemical and thermal properties of materials from the effects of gravity. With this knowledge, reliable predictions can be made about the conditions required on Earth to achieve improved materials. MSRR is a highly automated facility with a modular design capable of supporting multiple types of investigations. Currently, the NASA-provided Rack Support Subsystem provides services (power, thermal control, vacuum access, and command and data handling) to the ESA-developed Materials Science Laboratory (MSL), which accommodates interchangeable Furnace Inserts (FI). Two ESA-developed FIs are presently available on the ISS: the Low Gradient Furnace (LGF) and the Solidification and Quenching Furnace (SQF). Sample Cartridge Assemblies (SCAs), each containing one or more material samples, are installed in the FI by the crew and can be processed at temperatures up to 1400 C. Once an SCA is installed, the experiment can be run by automatic command or science conducted via telemetry commands from the ground. This facility is available to support materials science investigations through programs such as the US National Laboratory, Technology Development, NASA Research Announcements, and others. TBE and MSFC are currently developing NASA Sample Cartridge Assemblies (SCAs) with a planned availability for launch in 2017.
NASA Technical Reports Server (NTRS)
2003-01-01
This video presents an overview of the first Tracking and Data Relay Satellite (TDRS-1) in the form of text, computer animations, footage, and an interview with its program manager. Launched by the Space Shuttle Challenger in 1983, TDRS-1 was the first of a network of satellites used for relaying data to and from scientific spacecraft. Most of this short video is silent, and consists of footage and animation of the deployment of TDRS-1, written and animated explanations of what TDRS satellites do, and samples of the astronomical and Earth science data they transmit. The program manager explains in the final segment of the video the improvement TDRS satellites brought to communication with manned space missions, including alleviation of blackout during reentry, and also the role TDRS-1 played in providing telemedicine for a breast cancer patient in Antarctica.
Analysis of the coupling efficiency of a tapered space receiver with a calculus mathematical model
NASA Astrophysics Data System (ADS)
Hu, Qinggui; Mu, Yining
2018-03-01
We established a calculus-based mathematical model to study the coupling characteristics of tapered optical fibers in a space communications system and obtained the coupling efficiency equation, whose solution was then calculated using MATLAB. A sample was subsequently produced by the mature flame-brush technique, and the experimental results were in accordance with the theoretical analysis. This shows that the theoretical analysis was correct and indicates that a tapered structure can improve tolerance to misalignment. Project supported by the National Natural Science Foundation of China (grant no. 61275080), the 2017 Jilin Province Science and Technology Development Plan Science and Technology Innovation Fund for Small and Medium Enterprises (20170308029HJ), and the 2016 '13th Five-Year' science and technology research project of the Department of Education of Jilin (16JK009).
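The abstract does not give its coupling equation, but the standard overlap-integral form eta = |integral E1 E2* dA|^2 / (integral |E1|^2 dA * integral |E2|^2 dA) illustrates how misalignment tolerance is quantified. A minimal numerical sketch, assuming two equal-waist Gaussian modes with a lateral offset (for which eta = exp(-d^2/w^2) in closed form):

```python
import numpy as np

w, d = 5.0, 2.0                                  # mode waist, lateral offset (um)
x = np.linspace(-30, 30, 601)
X, Y = np.meshgrid(x, x)
E1 = np.exp(-(X**2 + Y**2) / w**2)               # on-axis Gaussian field
E2 = np.exp(-((X - d)**2 + Y**2) / w**2)         # laterally offset mode

num = np.abs(np.trapz(np.trapz(E1 * E2, x), x))**2
den = np.trapz(np.trapz(E1**2, x), x) * np.trapz(np.trapz(E2**2, x), x)
print(f"numerical eta = {num/den:.4f}, analytic = {np.exp(-d**2 / w**2):.4f}")
```

Sweeping d shows the efficiency roll-off with misalignment that a tapered structure is intended to soften.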
Many Molecular Properties from One Kernel in Chemical Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Raghunathan; von Lilienfeld, O. Anatole
We introduce property-independent kernels for machine learning modeling of arbitrarily many molecular properties. The kernels encode molecular structures for training sets of varying size, as well as similarity measures sufficiently diffuse in chemical space to sample over all training molecules. With the corresponding molecular reference properties provided, they enable the instantaneous generation of ML models which can systematically be improved through the addition of more data. This idea is exemplified by single-kernel-based modeling of internal energy, enthalpy, free energy, heat capacity, polarizability, electronic spread, zero-point vibrational energy, energies of frontier orbitals, HOMO-LUMO gap, and the highest fundamental vibrational wavenumber. Models of these properties are trained and tested using 112,000 organic molecules of similar size. The resulting models are discussed, as well as the kernels' use for generating and using other property models.
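The "one kernel, many properties" idea can be sketched with kernel ridge regression: a single kernel matrix (and one factorization) is reused to train models for every property column at once. Descriptors, properties, and hyperparameters below are synthetic stand-ins, not the paper's representation.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 20))          # molecular descriptors (synthetic)
Y = np.c_[X @ rng.normal(size=20),      # "energy"
          np.sin(X).sum(1),             # "polarizability"
          (X**2).sum(1)]                # "heat capacity"

def gauss_kernel(A, B, sigma=4.0):
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# One kernel matrix, one Cholesky factorization, all properties at once.
K = gauss_kernel(X, X)
alpha = cho_solve(cho_factor(K + 1e-8 * np.eye(len(X))), Y)

X_test = rng.normal(size=(5, 20))
Y_pred = gauss_kernel(X_test, X) @ alpha
print(Y_pred.shape)   # (5, 3): five molecules, three properties
```

Adding a new property only appends a column to Y; the kernel and its factorization are unchanged, which is the practical payoff of property-independent kernels.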
NASA Technical Reports Server (NTRS)
Rauscher, Bernard J.; Arendt, Richard G.; Fixsen, D. J.; Lander, Matthew; Lindler, Don; Loose, Markus; Moseley, S. H.; Wilson, Donna V.; Xenophontos, Christos
2012-01-01
IRS2 is a Wiener-optimal approach to using all of the reference information that Teledyne's HAWAII-2RG detector arrays provide. Using a new readout pattern, IRS2 regularly interleaves reference pixels with the normal pixels during readout. This differs from conventional clocking, in which the reference pixels are read out infrequently, and only in a few rows and columns around the outside edges of the detector array. During calibration, the data are processed in Fourier space, which is close to the noise's eigenspace. Using IRS2, we have reduced the read noise of the James Webb Space Telescope Near Infrared Spectrograph by 15% compared to conventional readout. We are attempting to achieve further gains by calibrating out recently recognized non-stationary noise that appears at the frame rate.
NASA Astrophysics Data System (ADS)
de Goeij, B. T. G.; Otter, G. C. J.; van Wakeren, J. M. O.; Veefkind, J. P.; Vlemmix, T.; Ge, X.; Levelt, P. F.; Dirks, B. P. F.; Toet, P. M.; van der Wal, L. F.; Jansen, R.
2017-09-01
In recent years TNO has investigated and developed different innovative opto-mechanical designs to realize advanced spectrometers for space applications in a more compact and cost-effective manner. This offers multiple advantages: a compact instrument can be flown on a much smaller platform or as an add-on on a larger platform; a low-cost instrument opens up the possibility to fly multiple instruments in a satellite constellation, improving both global coverage and temporal sampling (e.g. multiple overpasses per day to study diurnal processes); in this way a constellation of low-cost instruments may provide added value to the larger scientific and operational satellite missions (e.g. the Copernicus Sentinel missions); and a small, lightweight spectrometer can easily be mounted on a small aircraft or high-altitude UAV (offering high spatial resolution).
High Temperature Carbonized Grass as a High Performance Sodium Ion Battery Anode.
Zhang, Fang; Yao, Yonggang; Wan, Jiayu; Henderson, Doug; Zhang, Xiaogang; Hu, Liangbing
2017-01-11
Hard carbon is currently considered the most promising anode candidate for room-temperature sodium ion batteries because of its relatively high capacity, low cost, and good scalability. In this work, switchgrass, as a biomass example, was carbonized at an ultrahigh temperature, 2050 °C, induced by Joule heating, to create hard carbon anodes for sodium ion batteries. The switchgrass-derived carbon materials intrinsically inherit its three-dimensional porous hierarchical architecture, with an average interlayer spacing of 0.376 nm. This interlayer spacing, larger than that of graphite, allows for significant Na-ion storage performance. Compared to a sample carbonized at 1000 °C, the switchgrass-derived carbon produced at 2050 °C showed an improved initial Coulombic efficiency. Additionally, excellent rate capability and superior cycling performance are demonstrated for the switchgrass-derived carbon due to the unique high-temperature treatment.
VizieR Online Data Catalog: Atlas of HST STIS spectra of Seyfert galaxies (Spinelli+, 2006)
NASA Astrophysics Data System (ADS)
Spinelli, P. F.; Storchi-Bergmann, T.; Brandt, C. H.; Calzetti, D.
2008-05-01
We present a compilation of spectra of 101 Seyfert galaxies obtained with the Hubble Space Telescope (HST) Space Telescope Imaging Spectrograph (STIS), covering the UV and/or optical spectral range. Information on all the available spectra has been collected in a Mastertable, which is a very useful tool for anyone interested in a quick glance at the existing STIS spectra for Seyfert galaxies in the HST archive, and it can be recovered electronically. Nuclear spectra of the galaxies have been extracted in windows of 0.2" for an optimized sampling (as this is the slit width in most cases) and combined in order to improve the signal-to-noise ratio and provide the widest possible wavelength coverage. These combined spectra are also available electronically, at http://www.if.ufrgs.br/~pat/atlas.htm . (3 data files).
Drug stability analyzer for long duration spaceflights
NASA Astrophysics Data System (ADS)
Shende, Chetan; Smith, Wayne; Brouillette, Carl; Farquharson, Stuart
2014-06-01
Crewmembers of current and future long duration spaceflights require drugs to overcome the deleterious effects of weightlessness, sickness and injuries. Unfortunately, recent studies have shown that some of the drugs currently used may degrade more rapidly in space, losing their potency well before their expiration dates. To complicate matters, the degradation products of some drugs can be toxic. Consequently there is a need for an analyzer that can determine if a drug is safe at the time of use, as well as to monitor and understand space-induced degradation, so that drug types, formulations, and packaging can be improved. Towards this goal we have been investigating the ability of Raman spectroscopy to monitor and quantify drug degradation. Here we present preliminary data by measuring acetaminophen, and its degradation product, p-aminophenol, as pure samples, and during forced degradation reactions.
Optical properties monitor: Experiment definition phase
NASA Technical Reports Server (NTRS)
Wilkes, Donald R.; Bennett, Jean M.; Hummer, Leigh L.; Chipman, Russell A.; Hadaway, James B.; Pezzaniti, Larry
1990-01-01
The stability of materials used in the space environment will continue to be a limiting technology for space missions. The Optical Properties Monitor (OPM) Experiment provides a comprehensive space research program to study the effects of the space environment (both natural and induced) on optical, thermal and space power materials. The OPM Experiment was selected for definition under the NASA/OAST In-Space Technology Experiment Program. The results of the OPM Definition Phase are presented. The OPM Experiment will expose selected materials to the space environment and measure the effects with in-space optical measurements. In-space measurements include total hemispherical reflectance, total integrated scatter, and VUV reflectance/transmittance. The in-space measurements will be augmented with extensive pre- and post-flight sample measurements to determine other optical, mechanical, electrical, chemical or surface effects of space exposure. Environmental monitors will provide the amount and time history of the sample exposure to solar irradiation, atomic oxygen and molecular contamination.
Optical properties monitor: Experiment definition phase
NASA Technical Reports Server (NTRS)
Wilkes, Donald R.; Bennett, Jean M.; Hummer, Leigh L.; Chipman, Russell A.; Hadaway, James B.; Pezzaniti, Larry
1989-01-01
The stability of materials used in the space environment will continue to be a limiting technology for space missions. The Optical Properties Monitor (OPM) Experiment provides a comprehensive space research program to study the effects of the space environment (both natural and induced) on optical, thermal and space power materials. The OPM Experiment was selected for definition under the NASA/OAST In-Space Technology Experiment Program. The results of the OPM Definition Phase are presented. The OPM Experiment will expose selected materials to the space environment and measure the effects with in-space optical measurements. In-space measurements include total hemispherical reflectance, total integrated scatter, and VUV reflectance/transmittance. The in-space measurements will be augmented with extensive pre- and post-flight sample measurements to determine other optical, mechanical, electrical, chemical or surface effects of space exposure. Environmental monitors will provide the amount and time history of the sample exposure to solar irradiation, atomic oxygen and molecular contamination.
NASA Astrophysics Data System (ADS)
Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.
2017-12-01
The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements, and new applications that are not currently possible or likely even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of river reaches effectively sampled on each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics into equivalent continuous daily discharge time series statistics, intended to support hydrologic applications using low-flow and annual flow duration statistics.
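The comparison of sampled versus continuous flow statistics can be illustrated with a flow-duration calculation. The sketch below uses a synthetic lognormal discharge series and a crude random stand-in for SWOT overpass sampling; neither reflects the actual orbit or Ohio River Basin data.

```python
import numpy as np

rng = np.random.default_rng(8)
q_daily = np.exp(rng.normal(4.0, 0.8, 3 * 365))        # 3 years of daily flow

# Crude stand-in for irregular SWOT sampling: ~1 observation per week.
obs_idx = np.sort(rng.choice(q_daily.size, size=3 * 52, replace=False))
q_swot = q_daily[obs_idx]

def exceedance(q, probs=(0.05, 0.5, 0.95)):
    # Flow exceeded a fraction p of the time is the (1 - p) quantile.
    return np.quantile(q, [1 - p for p in probs])

for p, full, sub in zip((5, 50, 95), exceedance(q_daily), exceedance(q_swot)):
    print(f"Q{p}: daily = {full:7.1f}   sampled = {sub:7.1f}")
```

The systematic differences between the two columns are the kind of bias that the transformation relationships in the study are meant to correct.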
NASA Technical Reports Server (NTRS)
Angart, Samuel; Erdman, R. G.; Poirier, David R.; Tewari, S.N.; Grugel, R. N.
2014-01-01
This talk reports research that has been carried out under the aegis of NASA as part of a collaboration between ESA and NASA for solidification experiments on the International Space Station (ISS). The focus has been on the effect of convection on microstructural evolution and macrosegregation in hypoeutectic Al-Si alloys during directional solidification (DS). The DS experiments have been carried out under 1-g at Cleveland State University (CSU) and under low-g on the ISS. The thermal processing history of the experiments is well defined for both the terrestrially processed samples and the ISS-processed samples. We have observed that the primary dendrite arm spacings of two samples grown in the low-g environment of the ISS show good agreement with a dendrite-growth model based on diffusion-controlled growth. Gravity-driven (i.e., thermosolutal) convection in terrestrially grown samples has the effect of decreasing the primary dendrite arm spacings and causes macrosegregation. In order to process DS samples aboard the ISS, dendritic seed crystals had to be partially remelted in a stationary thermal gradient before the DS was carried out. Microstructural changes and macrosegregation effects during this period are described.
2012-04-26
ISS030-E-257690 (26 April 2012) --- European Space Agency astronaut Andre Kuipers, Expedition 30 flight engineer, prepares for IMMUNE venous blood sample draws in the Columbus laboratory of the International Space Station. Following the blood draws, the samples were temporarily stowed in the Minus Eighty Laboratory Freezer for ISS 1 (MELFI-1) and later packed together with saliva samples on the Soyuz TMA-22 for return to Earth for analysis.
Optical Analysis of Transparent Polymeric Material Exposed to Simulated Space Environment
NASA Technical Reports Server (NTRS)
Edwards, David L.; Finckenor, Miria M.
2000-01-01
Many innovations in spacecraft power and propulsion have recently been tested at NASA, particularly in non-chemical propulsion. One improvement in solar array technology is solar concentration using thin polymer film Fresnel lenses. Weight and cost savings were proven with the Solar Concentrator Arrays with Refractive Linear Element Technology (SCARLET)-II array on NASA's Deep Space 1 spacecraft. The Fresnel lens concentrates solar energy onto high-efficiency solar cells, decreasing the area of solar cells needed for power. Continued efficiency of this power system relies on the thin film's durability in the space environment and on maintaining transmission in the 300 - 1000 nm bandwidth. Various polymeric materials have been tested for use in solar concentrators, including Lexan(TM), polyethylene terephthalate (PET), several formulations of Tefzel(TM) and Teflon(TM), and DC 93-500, the material selected for SCARLET-II. Also tested were several innovative materials, including Langley Research Center's CP1 and CP2 polymers and atomic oxygen-resistant polymers developed by Triton Systems, Inc. The Environmental Effects Group of the Marshall Space Flight Center's Materials, Processes, and Manufacturing Department exposed these materials to a simulated space environment and evaluated them for any change in optical transmission. Samples were exposed to a minimum of 1000 equivalent Sun hours of near-UV radiation (250 - 400 nm wavelength). Materials that appeared robust after near-UV exposure were then exposed to charged particle radiation equivalent to a five-year dose in geosynchronous orbit. These exposures were performed in MSFC's Combined Environmental Effects Test Chamber, a unique facility with the capability to expose materials simultaneously or sequentially to protons, low-energy electrons, high-energy electrons, near-UV radiation and vacuum-UV radiation. Reflectance measurements can be made on the samples in vacuum. Prolonged exposure to the space environment will decrease the polymer film's transmission and thus reduce the conversion efficiency. A method was developed to normalize the transmission loss and thus rank the materials according to their tolerance of space environmental exposure. Spectral results and the material ranking according to transmission loss are presented.
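The abstract does not give the normalization formula, but one simple form consistent with its aim is a band-averaged fractional transmission loss over 300 - 1000 nm. A minimal sketch with made-up placeholder spectra (not measured MSFC data):

```python
import numpy as np

wl = np.linspace(300, 1000, 701)          # wavelength grid, nm

def band_avg(t):
    # Band-averaged transmission over 300-1000 nm.
    return np.trapz(t, wl) / (wl[-1] - wl[0])

# Hypothetical pre- and post-exposure transmission curves for two films.
materials = {
    "film A": (0.92 * np.ones_like(wl), 0.90 * np.ones_like(wl)),
    "film B": (0.95 * np.ones_like(wl), 0.80 + 0.1 * (wl - 300) / 700),
}

loss = {name: 1 - band_avg(post) / band_avg(pre)
        for name, (pre, post) in materials.items()}
for name, frac in sorted(loss.items(), key=lambda kv: kv[1]):
    print(f"{name}: normalized transmission loss = {frac:.3f}")
```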
1996-03-24
Astronaut Michael Clifford places a liquid nitrogen Dewar containing frozen protein solutions aboard Russia's space station Mir during a visit by the Space Shuttle (STS-76). The protein samples were flash-frozen on Earth and will be allowed to thaw and crystallize in the microgravity environment on Mir Space Station. A later crew will return the Dewar to Earth for sample analysis. Dr. Alexander McPherson of the University of California at Riverside is the principal investigator. Photo credit: NASA/Johnson Space Center.
1996-09-20
Astronaut Tom Akers places a liquid nitrogen Dewar containing frozen protein solutions aboard Russia's space station Mir during a visit by the Space Shuttle (STS-79). The protein samples were flash-frozen on Earth and will be allowed to thaw and crystallize in the microgravity environment on Mir Space Station. A later crew will return the Dewar to Earth for sample analysis. Dr. Alexander McPherson of the University of California at Riverside is the principal investigator. Photo credit: NASA/Johnson Space Center.
Kundu, Anupam; Sabhapandit, Sanjib; Dhar, Abhishek
2011-03-01
We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics for which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics, one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport, where we find a huge improvement over direct simulation methods.
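The reweighting idea is the same as classical importance sampling by exponential tilting, which the sketch below illustrates on a toy problem (not the paper's transport models): estimating P(S_n >= a) for a sum of standard Gaussian steps by simulating a mean-shifted dynamics and correcting each trajectory by its likelihood ratio.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
n, a, M = 100, 30.0, 20_000
theta = a / n                       # tilt that centers the dynamics on the event

# Modified dynamics: each step gets mean theta instead of 0.
S = (theta + rng.standard_normal((M, n))).sum(axis=1)

# Relative weight of each trajectory under original vs modified dynamics:
# product over steps of phi(x)/phi(x - theta) = exp(-theta*S + n*theta^2/2).
w = np.exp(-theta * S + n * theta**2 / 2)
est = np.mean(w * (S >= a))

exact = norm.sf(a / np.sqrt(n))     # S_n ~ N(0, n) under the original dynamics
print(f"importance sampling: {est:.3e}, exact: {exact:.3e}")
```

With M = 20,000 samples the tilted estimator resolves a ~1e-3 tail accurately, whereas direct simulation would waste nearly all samples away from the event; for rarer events the advantage grows exponentially.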
Love, Jeffrey J.; Finn, Carol
2017-01-01
An examination is made of opportunities and challenges for enhancing global, real-time geomagnetic monitoring that would be beneficial for a variety of operational projects. This enhancement in geomagnetic monitoring can be attained by expanding the geographic distribution of magnetometer stations, improving the quality of magnetometer data, increasing acquisition sampling rates, increasing the promptness of data transmission, and facilitating access to and use of the data. Progress will benefit from new partnerships to leverage existing capacities and harness multisector, cross-disciplinary, and international interests.
NASA Astrophysics Data System (ADS)
Shimobaba, Tomoyoshi; Nagahama, Yuki; Kakue, Takashi; Takada, Naoki; Okada, Naohisa; Endo, Yutaka; Hirayama, Ryuji; Hiyama, Daisuke; Ito, Tomoyoshi
2014-02-01
A calculation reduction method for color digital holography (DH) and computer-generated holograms (CGHs) using color space conversion is reported. Color DH and color CGHs are generally calculated in RGB space. We calculate color DH and CGHs in other color spaces (e.g., YCbCr color space) to accelerate the calculation. In YCbCr color space, an RGB image or RGB hologram is converted to the luminance component (Y), blue-difference chroma (Cb), and red-difference chroma (Cr) components. The human eye readily perceives even small differences in the luminance component but is much less sensitive to differences in the chroma components. In this method, the luminance component is therefore sampled normally and the chroma components are down-sampled. The down-sampling allows us to accelerate the calculation of the color DH and CGHs. We compute diffraction calculations from the components, and then we convert the diffracted results in YCbCr color space back to RGB color space. The proposed method, which in theory can accelerate the calculation by up to a factor of 3, computes the color DH and CGHs more than two times faster than calculation in RGB color space.
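The color-space step itself is easy to sketch. The code below converts RGB to YCbCr (assuming BT.601 full-range coefficients; the paper does not specify its matrix), down-samples the chroma planes by 2, up-samples them back, and measures the round-trip error on a smooth test image. The diffraction calculations themselves are omitted; in the reported method they would run on full-resolution Y and quarter-size Cb/Cr, which is where the speedup comes from.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return np.stack([y, 0.564 * (b - y), 0.713 * (r - y)], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    return np.stack([y + 1.403 * cr,
                     y - 0.344 * cb - 0.714 * cr,
                     y + 1.773 * cb], axis=-1)

# A smooth test image (chroma subsampling assumes smooth chroma).
u, v = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
img = np.stack([u, v, 0.5 * (u + v)], axis=-1)

ycc = rgb_to_ycbcr(img)
for c in (1, 2):                                   # down- then up-sample chroma
    small = ycc[::2, ::2, c]
    ycc[..., c] = np.repeat(np.repeat(small, 2, 0), 2, 1)

err = np.abs(ycbcr_to_rgb(ycc) - img).mean()
print(f"mean round-trip error with 2x chroma subsampling: {err:.5f}")
```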
NASA Advanced Explorations Systems: Advancements in Life Support Systems
NASA Technical Reports Server (NTRS)
Shull, Sarah A.; Schneider, Walter F.
2016-01-01
The NASA Advanced Exploration Systems (AES) Life Support Systems (LSS) project strives to develop reliable, energy-efficient, and low-mass spacecraft systems to provide environmental control and life support systems (ECLSS) critical to enabling long duration human missions beyond low Earth orbit (LEO). Highly reliable, closed-loop life support systems are among the capabilities required for the longer duration human space exploration missions assessed by NASA's Habitability Architecture Team (HAT). The LSS project is focused on four areas: architecture and systems engineering for life support systems, environmental monitoring, air revitalization, and wastewater processing and water management. Starting with the International Space Station (ISS) LSS systems as a point of departure (where applicable), the mission of the LSS project is three-fold: 1. Address discrete LSS technology gaps 2. Improve the reliability of LSS systems 3. Advance LSS systems towards integrated testing on the ISS. This paper summarizes the work being done in the four areas listed above to meet these objectives. Details are given on the following focus areas: Systems Engineering and Architecture- With so many complex systems comprising life support in space, it is important to understand the overall system requirements to define life support system architectures for different space mission classes, ensure that all the components integrate well together, and verify that testing is as representative of destination environments as possible. Environmental Monitoring- In an enclosed spacecraft that is constantly operating complex machinery for its own basic functionality as well as science experiments and technology demonstrations, it is possible for the environment to become compromised. While current environmental monitors aboard the ISS will alert crew members and mission control if there is an emergency, long-duration environmental monitoring cannot be done in orbit, as current methodologies rely largely on sending environmental samples back to Earth. The LSS project is developing onboard analysis capabilities that will replace the need to return air and water samples from space for ground analysis. Air Revitalization- The air revitalization task comprises work in carbon dioxide removal, oxygen generation and recovery, and trace contamination and particulate control. The CO2 removal and associated air drying development efforts under the LSS project are focused both on improving the current SOA technology on the ISS and on assessing and examining the viability of other sorbents and technologies available in academia and industry. The Oxygen Generation and Recovery technology development area encompasses several sub-tasks in an effort to supply O2 to the crew at the required conditions, to recover O2 from metabolic CO2, and to recycle recovered O2 back to the cabin environment. Current state-of-the-art oxygen generation systems aboard the space station are capable of generating or recovering approximately 40% of required oxygen; for exploration missions this percentage needs to be greatly increased. A spacecraft cabin trace contaminant and particulate control system serves to keep the environment below the spacecraft maximum allowable concentration (SMAC) for chemicals and particulates. Both passive (filters) and active (scrubbers) methods contribute to the overall TC & PC design.
Work in the area of trace contamination and particulate control under the LSS project is focused on making improvements to the SOA TC & PC systems on the ISS to improve performance and reduce consumables. Wastewater Processing and Water Management- A major goal of the LSS project is the development of water recovery systems to support long duration human exploration beyond LEO. Current space station wastewater processing and water management systems distill urine and wastewater to recover water from urine and humidity condensate in the spacecraft at an approximately 74% recovery rate. For longer, farther missions into deep space, that recovery rate must be greatly increased so that astronauts can journey for months without resupply cargo ships from Earth.
Materials International Space Station Experiment (MISSE): Overview, Accomplishments and Future Needs
NASA Technical Reports Server (NTRS)
deGroh, Kim K.; Jaworske, Donald A.; Pippin, Gary; Jenkins, Philip P.; Walters, Robert J.; Thibeault, Sheila A.; Palusinski, Iwona; Lorentzen, Justin R.
2014-01-01
Materials and devices used on the exterior of spacecraft in low Earth orbit (LEO) are subjected to environmental threats that can cause degradation in material properties, possibly threatening spacecraft mission success. These threats include: atomic oxygen (AO), ultraviolet and x-ray radiation, charged particle radiation, temperature extremes and thermal cycling, micrometeoroid and debris impacts, and contamination. Space environmental threats vary greatly based on spacecraft materials, thicknesses and stress levels, and the mission environment and duration. For more than a decade the Materials International Space Station Experiment (MISSE) has enabled the study of the long duration environmental durability of spacecraft materials in the LEO environment. The overall objective of MISSE is to test the stability and durability of materials and devices in the space environment in order to gain valuable knowledge on the performance of materials in space, as well as to enable lifetime predictions of new materials that may be used in future space flight. MISSE is a series of materials flight experiments, which are attached to the exterior of the International Space Station (ISS). Individual experiments were loaded onto suitcase-like trays, called Passive Experiment Containers (PECs). The PECs were transported to the ISS in the Space Shuttle cargo bay and attached to, and removed from, the ISS during extravehicular activities (EVAs). The PECs were retrieved after one or more years of space exposure and returned to Earth, enabling post-flight experiment evaluation. MISSE is a multi-organization project with participants from the National Aeronautics and Space Administration (NASA), the Department of Defense (DoD), industry and academia. MISSE has provided a platform for environmental durability studies for thousands of samples and numerous devices, and it has produced many tangible impacts. Ten PECs (and one smaller tray) have been flown, representing MISSE 1 through MISSE 8, yielding long-duration space environmental performance and durability data that enable material validation, processing recertification and space qualification; improved predictions of materials and component lifetimes in space; model verification and development; and correlation factors between space-exposure and ground-facilities enabling more accurate in-space performance predictions based on ground-laboratory testing. A few of the many experiment results and observations, and their impacts, are provided. Those highlighted include examples on improved understanding of atomic oxygen scattering mechanisms, LEO coating durability results, and polymer erosion yields and their impacts on spacecraft design. The MISSE 2 Atomic Oxygen Scattering Chamber Experiment found that the peak flux of scattered AO occurs at 45 deg from normal incidence, rather than following the cosine dependence predicted by models. In addition, the erosion yield (E(sub y)) of Kapton H for AO scattered off oxidized Al is 22% of the E(sub y) for direct AO impingement. These results were used to help determine the degradation mechanism of a cesium iodide detector within the Hubble Space Telescope Cosmic Origins Spectrograph Experiment. The MISSE 6 Indium Tin Oxide (ITO) Degradation Experiment measured the surface electrical resistance of ram and wake ITO-coated samples. The data confirmed that ITO is a stable AO protective coating, and the results validated the durability of ITO conductive coatings for solar arrays for the Atmosphere-Space Transition 2 Explorer program.
The MISSE 2, 6 and 7 Polymer Experiments have provided LEO AO erosion yield (Ey) data on over 120 polymer and composite samples. The flight Ey values range from 3.05 × 10^-26 cm^3/atom for the AO-resistant polymer CORIN to 9.14 × 10^-26 cm^3/atom for polyoxymethylene (POM). In addition, flying the same polymers on different missions has advanced the understanding of the dependence of AO Ey on solar exposure for fluorine-containing polymers. The MISSE polymer results are widely requested and have influenced spacecraft design for WorldView-2 & -3, the Global Precipitation Measurement-Microwave Imager, and other spacecraft. The flight data have enabled the development of an Atomic Oxygen Erosion Predictive Tool that allows erosion predictions for new and not-yet-flown polymers. The data have also been used to develop a new NASA Technical Standards Handbook, "Spacecraft Polymers Atomic Oxygen Durability Handbook." Many intangible benefits have also been derived from MISSE. For example, over 40 students have collaborated on Glenn's MISSE experiments, resulting in more than $80K in student scholarships and awards in national and international science fairs. Students have also given presentations and won poster-competition awards at international space conferences.
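The erosion predictions such a tool supports rest on a simple relation: surface recession depth equals the mission AO fluence multiplied by the material's erosion yield. A minimal Python sketch of that calculation, using the flight Ey values quoted above and an assumed, purely illustrative mission fluence:

```python
# Recession depth from AO exposure: depth [cm] = fluence [atoms/cm^2] * Ey [cm^3/atom].
EY_CM3_PER_ATOM = {      # MISSE flight values quoted in the abstract above
    "CORIN": 3.05e-26,
    "POM": 9.14e-26,
}
FLUENCE = 2.0e21         # atoms/cm^2; illustrative multi-year LEO fluence (assumed)

for name, ey in EY_CM3_PER_ATOM.items():
    depth_um = FLUENCE * ey * 1.0e4   # convert cm to micrometers
    print(f"{name}: {depth_um:.2f} um of surface recession")
```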
50 CFR 679.93 - Amendment 80 Program recordkeeping, permits, monitoring, and catch accounting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED... moved to the fish bin. (6) Sample storage. There is sufficient space to accommodate a minimum of 10 observer sampling baskets. This space must be within or adjacent to the observer sample station. (7) Pre...
Investigating the Efficacy of CubeSats for Asteroid Detection
NASA Technical Reports Server (NTRS)
O'Toole, Conor
2015-01-01
A simulation to examine the potential of a network of CubeSats for detecting Near Earth Objects is discussed, in terms of goals, methods used and initial results obtained. By designing a basic optical system and the orbital parameters of the satellites in this network, their effectiveness for detecting asteroids is examined, with a small sample of cataloged asteroids considered. The conditions to be satisfied for detection cover both the geometrical aspects of astronomy, such as field of view and line of sight, and more technical optics-based conditions, such as the resolution and sensitivity of the telescopes. Of special interest in this work is the region of the sky between 45 deg. and 90 deg. from the Sun, as seen from the Earth. This part of the sky is currently unobservable by ground-based surveys and so provides the primary motivation for considering a space-based one. There exist a number of issues with the simulation which call these results into question, but an effort has been made to remove those results which exceed the possible capabilities of the satellite network, and to identify those aspects of the mission which should be examined in order to provide an in-depth assessment of its performance. With these filters applied to the overall data, a tentative result of 1458 total detections over an 85-year period has been obtained, with 14 of the 22 asteroids in the sample being detected at least once. A number of ways in which the simulation could be improved are also proposed, both in terms of addressing the aforementioned issues and of improving the accuracy of the simulation to capture as many aspects of a space-based optical astronomy mission as possible, with the possible final form of the simulation being a tool for assessing the performance of any space-based optical mission to detect asteroids.
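The 45-90 deg. window referred to above reduces, geometrically, to the solar elongation: the Sun-observer-asteroid angle. A minimal sketch of that check (a hypothetical helper with heliocentric position vectors in AU; not code from the simulation itself):

```python
import numpy as np

def elongation_deg(r_obs, r_ast):
    """Solar elongation: angle Sun-observer-asteroid in degrees.

    r_obs, r_ast are heliocentric position vectors (AU), Sun at the origin.
    """
    r_obs, r_ast = np.asarray(r_obs, float), np.asarray(r_ast, float)
    to_sun, to_ast = -r_obs, r_ast - r_obs
    c = to_sun @ to_ast / (np.linalg.norm(to_sun) * np.linalg.norm(to_ast))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Keep only detections inside the 45-90 deg window targeted by the survey.
e = elongation_deg([1.0, 0.0, 0.0], [0.8, 0.9, 0.0])
print(f"{e:.1f} deg, in window: {45.0 <= e <= 90.0}")
```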
A Sustainable Architecture for Lunar Resource Prospecting from an EML-based Exploration Platform
NASA Astrophysics Data System (ADS)
Klaus, K.; Post, K.; Lawrence, S. J.
2012-12-01
Introduction - We present a point-of-departure architecture for prospecting for lunar resources from an Exploration Platform at the Earth-Moon Lagrange (EML) points. Included in our study are the launch vehicle, cis-lunar transportation architecture, habitat requirements and utilization, lander/rover concepts, and sample return. Different transfer design techniques can be explored by mission designers, testing various propulsive systems, maneuvers, rendezvous, and other in-space and surface operations. Understanding the availability of both high- and low-energy trajectory transfer options opens up the possibility of exploring the human and logistics-support mission design space and deriving solutions never before contemplated. For sample return missions from the lunar surface, low-energy transfers could be utilized between the EML platform and the surface, as well as for return of samples to EML-based spacecraft. Human Habitation at the Exploration Platform - Telerobotic and telepresence capabilities are considered by the agency to be "grand challenges" for space technology. While human visits to the lunar surface provide optimal opportunities for field geologic exploration, on-orbit telerobotics may provide attractive early opportunities for geologic exploration, resource prospecting, and other precursor activities in advance of human exploration campaigns and ISRU processing. The Exploration Platform provides a perfect port for a small lander, which could be refueled and used for multiple missions, including sample return. The EVA and robotic capabilities of the EML Exploration Platform allow the lander to be serviced both internally and externally, based on operational requirements. The placement of the platform at an EML point allows the lander to access any site on the lunar surface, thus providing the global lunar surface access that is commonly understood to be required for a robust lunar exploration program. Designing the sample return lander for low-energy trajectories would reduce the overall mass and potentially increase the sample return mass. The Initial Lunar Mission - Building upon Apollo sample investigations, the recent results of LRO/LCROSS, international missions such as Chandrayaan-1, and legacy missions including Lunar Prospector and Clementine, among the most important science and exploration goals are surface prospecting for lunar resources and providing ground truth for orbital observations. Being able to constrain resource production potential will allow us to estimate the prospects for reducing the size of payloads launched from Earth for Solar System exploration. Flight opportunities for an instrument suite like NASA's RESOLVE, sent to areas of high science and exploration interest, could be used to refine and improve future exploration architectures, reducing the outlays required for cis-lunar operations. Summary - EML points are excellent locations for a semi-permanent, human-tended Exploration Platform in the near term, providing important infrastructure and deep-space experience that can be built upon to gradually increase long-term operational capabilities.
Scarduelli, Lucia; Giacchini, Roberto; Parenti, Paolo; Migliorati, Sonia; Di Brisco, Agnese Maria; Vighi, Marco
2017-11-01
Biomarkers are widely used in ecotoxicology as indicators of exposure to toxicants. However, their ability to provide ecologically relevant information remains controversial. One of the major problems is understanding whether the measured responses are determined by stress factors or lie within the natural variability range. In a previous work, the natural variability of enzymatic levels in invertebrates sampled in pristine rivers was shown to be relevant across both space and time. In the present study, the experimental design was improved by considering different life stages of the selected taxa and by measuring more environmental parameters. The design comprised sampling sites in 2 different rivers, 8 sampling dates covering the whole seasonal cycle, 4 species from 3 different taxonomic groups (Plecoptera, Perla grandis; Ephemeroptera, Baetis alpinus and Epeorus alpicula; Trichoptera, Hydropsyche pellucidula), different life stages for each species, and 4 enzymes (acetylcholinesterase, glutathione S-transferase, alkaline phosphatase, and catalase). Biomarker levels were related to environmental (physicochemical) parameters to test for any dependence. Data were analyzed statistically using hierarchical multilevel Bayesian models. Natural variability was found to be relevant across both space and time. The results of the present study show that care should be taken when interpreting biomarker results. Further research is needed to better understand the dependence of the natural variability on environmental parameters. Environ Toxicol Chem 2017;36:3158-3167. © 2017 SETAC.
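The hierarchical structure described here (biomarker levels varying by river, related to environmental covariates) can be sketched generically as a varying-intercept regression. The sketch below (Python/PyMC, synthetic data) illustrates only the multilevel idea; the authors' actual likelihood, covariates, and grouping structure across species, life stages and seasons are not reproduced here:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n = 120
site = rng.integers(0, 2, n)                    # 2 rivers (hypothetical data)
temp = rng.normal(0.0, 1.0, n)                  # standardized water temperature
y = rng.normal(0.3 * site + 0.5 * temp, 0.4)    # synthetic enzyme activity

with pm.Model() as model:
    mu_site = pm.Normal("mu_site", 0.0, 1.0)    # shared mean of river intercepts
    sd_site = pm.HalfNormal("sd_site", 1.0)
    a = pm.Normal("a", mu_site, sd_site, shape=2)   # river-level intercepts
    b = pm.Normal("b", 0.0, 1.0)                    # environmental slope
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", a[site] + b * temp, sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```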
Barber, Jared; Tanase, Roxana; Yotov, Ivan
2016-06-01
Several Kalman filter algorithms are presented for data assimilation and parameter estimation for a nonlinear diffusion model of epithelial cell migration. These include the ensemble Kalman filter with Monte Carlo sampling and a stochastic collocation (SC) Kalman filter with structured sampling. Further, two types of noise are considered: uncorrelated noise, resulting in one stochastic dimension for each element of the spatial grid, and correlated noise parameterized by the Karhunen-Loeve (KL) expansion, resulting in one stochastic dimension for each KL term. The efficiency and accuracy of the four methods are investigated for two cases with synthetic data, with and without noise, as well as data from a laboratory experiment. While all algorithms perform reasonably well in matching the target solution and estimating the diffusion coefficient and the growth rate, the algorithms that employ SC and the KL expansion are computationally more efficient, as they require fewer ensemble members for comparable accuracy. In the case of SC methods, this is due to improved approximation in stochastic space compared to Monte Carlo sampling. In the case of KL methods, the parameterization of the noise results in a stochastic space of smaller dimension. The most efficient method is the one combining SC and the KL expansion. Copyright © 2016 Elsevier Inc. All rights reserved.
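The analysis step shared by these ensemble variants is compact: the ensemble's sample covariances furnish the Kalman gain, and parameters such as the diffusion coefficient can be estimated by appending them to the state vector (state augmentation). A minimal stochastic (perturbed-observation) sketch in Python, not the paper's exact implementation:

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.

    X : (n_state, n_ens) forecast ensemble; estimated parameters (e.g., a
        diffusion coefficient) can be appended to the state (augmentation).
    y : (n_obs,) observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation-error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)     # observation-space anomalies
    Pyy = HA @ HA.T / (n_ens - 1) + R            # innovation covariance
    Pxy = A @ HA.T / (n_ens - 1)                 # state-observation cross covariance
    K = np.linalg.solve(Pyy.T, Pxy.T).T          # Kalman gain K = Pxy Pyy^-1
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - HX)                      # updated (analysis) ensemble

# Toy usage: 2 observed state cells plus 1 augmented, unobserved parameter.
rng = np.random.default_rng(1)
X = rng.normal(1.0, 0.5, size=(3, 50))
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
Xa = enkf_analysis(X, np.array([1.2, 0.8]), H, 0.05 * np.eye(2), rng)
```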
Direct Printing of 1-D and 2-D Electronically Conductive Structures by Molten Lead-Free Solder
Wang, Chien-Hsun; Tsai, Ho-Lin; Hwang, Weng-Sing
2016-01-01
To determine the effects of key experimental parameters on the thermophysical behavior of molten microdroplets, Sn-3Ag-0.5Cu solder balls with an average droplet diameter of 50 μm were prepared. The inkjet printing parameters of the molten microdroplets, such as the dot spacing, stage velocity and sample temperature, were optimized for the 1D and 2D printing of metallic microstructures. The impact and coalescence of molten microdroplets were observed with a high-speed digital camera (HSDC). The line width of each sample was then calculated using a formula over a temperature range of 30 to 70 °C. The results showed that a metallic line with a width of 55 μm can be successfully printed with a dot spacing of 50 μm and a stage velocity of 50 mm∙s−1 at a substrate temperature of 30 °C. The experimental results revealed that the height (from 0.63 to 0.58) and solidification contact angle (from 72° to 56°) of the metallic microdroplets decreased as the temperature of the sample increased from 30 to 70 °C. HSDC observations showed that the quality of the 3D micro patterns improved significantly when the droplets were deposited at 70 °C. PMID:28772361
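The abstract does not state which formula was used for the line width, but a common choice in inkjet printing studies is a volume-conservation (Derby-type) bead model: each droplet's volume spreads into a track whose cross-section is a circular segment set by the contact angle. A hedged sketch of that representative model, not necessarily the authors' formula:

```python
import math

def track_width(d, p, theta_deg):
    """Estimate printed line width by volume conservation.

    A droplet of diameter d deposited every p along the track is assumed to
    spread into a bead whose cross-section is a circular segment with contact
    angle theta; equating droplet volume to bead volume per dot gives the width.
    """
    th = math.radians(theta_deg)
    k = th / math.sin(th) ** 2 - math.cos(th) / math.sin(th)
    return math.sqrt(2.0 * math.pi * d ** 3 / (3.0 * p * k))

# Quoted conditions: 50 um droplets, 50 um dot spacing, 72 deg contact angle.
# The idealized model gives ~70 um, above the measured 55 um; such models
# neglect solidification during spreading.
print(f"{track_width(50.0, 50.0, 72.0):.0f} um")
```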
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Groh, Kim De; Kneubel, Christian A.
2014-01-01
A space experiment flown as part of the Materials International Space Station Experiment 6B (MISSE 6B) was designed to compare the atomic oxygen erosion yield (Ey) of stacked layers of Kapton H polyimide without spacers between the layers to that of stacked Kapton H layers with spacers between them. The results were compared to a solid Kapton H (DuPont, Wilmington, DE) sample. Monte Carlo computational modeling was performed to optimize the atomic oxygen interaction parameter values to match the results of both the MISSE 6B multilayer experiment and the undercut erosion profile from a crack defect in an aluminized Kapton H sample flown on the Long Duration Exposure Facility (LDEF). The Monte Carlo modeling produced credible agreement with the space results, reproducing the increased Ey for all samples with spacers as well as the space-observed enhancement in erosion near the edges of samples due to scattering from the beveled edges of the sample holders.
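Models of this kind typically track individual AO atoms into a defect on a 2D cell grid: on each impact an atom either reacts (removing a polymer cell) with some probability or is re-emitted and may scatter deeper into the undercut cavity; the reaction probabilities and re-emission law are the parameters tuned against flight data. The Python sketch below is a toy version with made-up parameter values and a crude isotropic re-emission rule, purely to illustrate the technique, not the optimized MISSE 6B/LDEF model:

```python
import numpy as np

rng = np.random.default_rng(0)

# 2D cell grid: 0 = vacuum, 1 = erodible polymer, 2 = non-erodible coating.
NY, NX = 40, 61
grid = np.ones((NY, NX), dtype=np.uint8)
grid[0, :] = 2
grid[0, NX // 2 - 1:NX // 2 + 2] = 0          # crack (defect) in the coating

P_DIRECT, P_SCATTERED = 0.10, 0.03            # assumed reaction probabilities
MAX_BOUNCES = 20

def trace(x, y, dx, dy):
    """March a ray in half-cell steps; return hit cell plus last free point."""
    while True:
        nx_, ny_ = x + 0.5 * dx, y + 0.5 * dy
        j, i = int(round(nx_)), int(round(ny_))
        if not (0 <= i < NY and 0 <= j < NX):
            return None                        # left the domain (escaped)
        if grid[i, j]:
            return i, j, x, y                  # occupied cell, pre-impact point
        x, y = nx_, ny_

for _ in range(50_000):
    x = rng.uniform(NX // 2 - 1.5, NX // 2 + 1.5)   # arrive over the crack
    ang = np.arcsin(rng.uniform(-1.0, 1.0))         # cosine-law arrival direction
    dx, dy = np.sin(ang), np.cos(ang)
    hit, p = trace(x, -0.4, dx, dy), P_DIRECT
    for _ in range(MAX_BOUNCES):
        if hit is None:
            break                                   # escaped back to space
        i, j, x, y = hit
        if grid[i, j] == 1 and rng.random() < p:
            grid[i, j] = 0                          # polymer cell eroded away
            break
        ang = rng.uniform(0.0, 2.0 * np.pi)         # crude isotropic re-emission
        dx, dy = np.cos(ang), np.sin(ang)
        hit, p = trace(x, y, dx, dy), P_SCATTERED

depth = (grid[1:] == 0).sum(axis=0)                 # eroded cells per column
print("max erosion depth:", depth.max(), "cells; undercut width:", (depth > 0).sum())
```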
Primary Dendrite Array: Observations from Ground-Based and Space Station Processed Samples
NASA Technical Reports Server (NTRS)
Tewari, Surendra N.; Grugel, Richard N.; Erdman, Robert G.; Poirier, David R.
2012-01-01
Influence of natural convection on primary dendrite array morphology during directional solidification is being investigated under a collaborative European Space Agency-NASA joint research program, Microstructure Formation in Castings of Technical Alloys under Diffusive and Magnetically Controlled Convective Conditions (MICAST). Two Aluminum-7 wt pct Silicon alloy samples, MICAST6 and MICAST7, were directionally solidified in microgravity on the International Space Station. Terrestrially grown dendritic monocrystal cylindrical samples were remelted and directionally solidified at 18 K per centimeter (MICAST6) and 28 K per centimeter (MICAST7). Directional solidification involved a growth-speed step increase (MICAST6, from 5 to 50 microns per second) and a speed decrease (MICAST7, from 20 to 10 microns per second). The distribution and morphology of primary dendrites are currently being characterized in these samples, and also in samples solidified on Earth under nominally similar thermal gradients and growth speeds. Primary dendrite spacing and trunk diameter measurements from this investigation will be presented.
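For context, primary dendrite spacing in directional solidification is commonly correlated with thermal gradient G and growth speed V through a Hunt or Kurz-Fisher type scaling, lambda_1 = A G^(-1/2) V^(-1/4). The sketch below applies that scaling to the MICAST gradients and speed steps; the prefactor A is alloy-dependent and the value used here is a placeholder, not a constant fitted to the flight samples:

```python
# Hunt / Kurz-Fisher type primary spacing correlation: lambda_1 = A * G**-0.5 * V**-0.25
A = 250.0   # um * (K/cm)**0.5 * (um/s)**0.25; assumed placeholder prefactor

def lambda1(G, V):
    """Primary dendrite spacing estimate (um) for gradient G (K/cm), speed V (um/s)."""
    return A * G ** -0.5 * V ** -0.25

for tag, G, V in [("MICAST6 at  5 um/s", 18, 5), ("MICAST6 at 50 um/s", 18, 50),
                  ("MICAST7 at 20 um/s", 28, 20), ("MICAST7 at 10 um/s", 28, 10)]:
    print(f"{tag}: lambda_1 ~ {lambda1(G, V):.0f} um")   # spacing shrinks as V rises
```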
Recursive algorithms for phylogenetic tree counting.
Gavryushkina, Alexandra; Welch, David; Drummond, Alexei J
2013-10-28
In Bayesian phylogenetic inference we are interested in distributions over a space of trees. The number of trees in a tree space is an important characteristic of the space and is useful for specifying prior distributions. When all samples come from the same time point and no prior information is available on divergence times, the tree counting problem is easy. However, when fossil evidence is used in the inference to constrain the tree, or when data are sampled serially, new tree spaces arise and counting the number of trees is more difficult. We describe an algorithm, polynomial in the number of sampled individuals, for counting the resolutions of a constraint tree, assuming that the number of constraints is fixed. We generalise this algorithm to counting the resolutions of a fully ranked constraint tree. We describe a quadratic algorithm for counting the number of possible fully ranked trees on n sampled individuals. We introduce a new type of tree, called a fully ranked tree with sampled ancestors, and describe a cubic-time algorithm for counting the number of such trees on n sampled individuals. These algorithms should be employed for Bayesian Markov chain Monte Carlo inference when fossil data are included or data are serially sampled.
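The "easy" contemporaneous case has classical closed-form counts: (2n-3)!! rooted binary leaf-labelled topologies, and n!(n-1)!/2^(n-1) fully ranked trees (labelled histories) on n contemporaneous tips. A short Python sketch of these baseline counts; the paper's algorithms for constrained and serially sampled tree spaces are not reproduced here:

```python
from math import factorial

def rooted_topologies(n):
    """(2n-3)!! rooted binary leaf-labelled tree topologies on n >= 2 tips."""
    out = 1
    for k in range(3, 2 * n - 2, 2):
        out *= k
    return out

def ranked_trees(n):
    """n!(n-1)!/2^(n-1) fully ranked trees on n contemporaneous tips."""
    return factorial(n) * factorial(n - 1) // 2 ** (n - 1)

for n in (3, 5, 10):
    print(n, rooted_topologies(n), ranked_trees(n))   # e.g. n=4 gives 15 and 18
```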