Sample records for q-space analysis simulations

  1. Light scattering Q-space analysis of irregularly shaped particles

    NASA Astrophysics Data System (ADS)

    Heinson, Yuli W.; Maughan, Justin B.; Heinson, William R.; Chakrabarti, Amitabha; Sorensen, Christopher M.

    2016-01-01

    We report Q-space analysis of light scattering phase function data for irregularly shaped dust particles and of theoretical model output to describe them. This analysis involves plotting the scattered intensity versus the magnitude of the scattering wave vector q = (4π/λ)sin(θ/2), where λ is the optical wavelength and θ is the scattering angle, on a double-logarithmic plot. In q-space all the particle shapes studied display a scattering pattern which includes a q-independent forward scattering regime; a crossover, Guinier regime when q is near the inverse size; a power law regime; and an enhanced backscattering regime. Power law exponents show a quasi-universal functionality with the internal coupling parameter ρ'. The absolute value of the exponents starts at 4 when ρ' < 1, the diffraction limit, and decreases as ρ' increases, approaching a constant 1.75 ± 0.25 when ρ' ≳ 10. The diffraction limit exponent implies that despite their irregular structures, all the particles studied have mass and surface scaling dimensions of Dm = 3 and Ds = 2, respectively. This is different from fractal aggregates, which have a power law exponent equal to the fractal dimension Df because Df = Dm = Ds < 3. Spheres have Dm = 3 and Ds = 2 but show neither a single power law nor the same functionality with ρ'. The results presented here imply that Q-space analysis can differentiate between spheres and these two types of irregularly shaped particles. Furthermore, they are applicable to analysis of the contribution of aerosol radiative forcing to climate change and to analysis of aerosol remote sensing data.
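
    As a concrete illustration of the plotting convention described in this abstract, the sketch below computes q from the scattering angle, forms a placeholder phase function with a Guinier-like roll-over, and fits the power-law exponent on the double-logarithmic plot. The wavelength, particle radius, and intensity model are invented for illustration and are not the authors' data.

    ```python
    import numpy as np

    # Scattering wave vector q = (4*pi/lambda) * sin(theta/2)
    wavelength = 0.532                                   # optical wavelength in micrometers (assumed)
    theta = np.radians(np.linspace(0.5, 175.0, 350))     # scattering angles
    q = (4.0 * np.pi / wavelength) * np.sin(theta / 2.0)

    # Placeholder phase function: flat forward lobe rolling over to a power law,
    # standing in for measured or model-output intensities.
    R = 5.0                                              # effective particle radius (assumed), micrometers
    intensity = 1.0 / (1.0 + (q * R) ** 2)

    # Fit the power-law exponent on the double-logarithmic plot for qR >> 1.
    mask = q * R > 5.0
    slope, _ = np.polyfit(np.log(q[mask]), np.log(intensity[mask]), 1)
    print(f"power-law exponent ~ {slope:.2f}")           # ~ -2 for this placeholder curve
    ```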

  2. Q-space analysis of light scattering by ice crystals

    NASA Astrophysics Data System (ADS)

    Heinson, Yuli W.; Maughan, Justin B.; Ding, Jiachen; Chakrabarti, Amitabha; Yang, Ping; Sorensen, Christopher M.

    2016-12-01

    Q-space analysis is applied to extensive simulations of the single-scattering properties of ice crystals with various habits/shapes over a range of sizes. The analysis uncovers features common to all the shapes: a forward scattering regime with intensity quantitatively related to the Rayleigh scattering by the particle and the internal coupling parameter, followed by a Guinier regime dependent upon the particle size, a complex power law regime with incipient two-dimensional diffraction effects, and, in some cases, an enhanced backscattering regime. The effects of significant absorption on the scattering profile are also studied. The overall features found for the ice crystals are similar to features in scattering from same-sized spheres.

  3. q-Space Upsampling Using x-q Space Regularization.

    PubMed

    Chen, Geng; Dong, Bin; Zhang, Yong; Shen, Dinggang; Yap, Pew-Thian

    2017-09-01

    Acquisition time in diffusion MRI increases with the number of diffusion-weighted images that need to be acquired. Particularly in clinical settings, scan time is limited and only a sparse coverage of the vast q-space is possible. In this paper, we show how non-local self-similar information in the x-q space of diffusion MRI data can be harnessed for q-space upsampling. More specifically, we establish the relationships between signal measurements in x-q space using a patch matching mechanism that caters to unstructured data. We then encode these relationships in a graph and use it to regularize an inverse problem associated with recovering a high q-space resolution dataset from its low-resolution counterpart. Experimental results indicate that the high-resolution datasets reconstructed using the proposed method exhibit greater quality, both quantitatively and qualitatively, than those obtained using conventional methods, such as interpolation using spherical radial basis functions (SRBFs).
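
    The reconstruction step described above amounts to a regularized inverse problem in which a graph built from patch similarities penalizes non-smooth solutions. The toy sketch below shows the generic form of such a graph-Laplacian-regularized least-squares recovery; the measurement operator, chain graph, and regularization weight are placeholders and do not reproduce the authors' x-q patch-matching construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_hi, n_lo = 60, 20                      # high-/low-resolution q-space sample counts (toy sizes)
    A = rng.standard_normal((n_lo, n_hi))    # placeholder sub-sampling/measurement operator
    x_true = np.sin(np.linspace(0, 3 * np.pi, n_hi))
    b = A @ x_true + 0.05 * rng.standard_normal(n_lo)

    # Placeholder neighborhood graph: connect adjacent q-space samples.
    W = np.zeros((n_hi, n_hi))
    for i in range(n_hi - 1):
        W[i, i + 1] = W[i + 1, i] = 1.0
    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian

    lam = 1.0                                # regularization weight (assumed)
    x_hat = np.linalg.solve(A.T @ A + lam * L, A.T @ b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```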

  4. Radial q-space sampling for DSI

    PubMed Central

    Baete, Steven H.; Yutzy, Stephen; Boada, Fernando E.

    2015-01-01

    Purpose: Diffusion Spectrum Imaging (DSI) has been shown to be an effective tool for non-invasively depicting the anatomical details of brain microstructure. Existing implementations of DSI sample the diffusion encoding space using a rectangular grid. Here we present a different implementation of DSI whereby a radially symmetric q-space sampling scheme for DSI (RDSI) is used to improve the angular resolution and accuracy of the reconstructed Orientation Distribution Functions (ODF). Methods: Q-space is sampled by acquiring several q-space samples along a number of radial lines. Each of these radial lines in q-space is analytically connected to a value of the ODF at the same angular location by the Fourier slice theorem. Results: Computer simulations and in vivo brain results demonstrate that RDSI correctly estimates the ODF when moderately high b-values (4000 s/mm²) and number of q-space samples (236) are used. Conclusion: The nominal angular resolution of RDSI depends on the number of radial lines used in the sampling scheme, and only weakly on the maximum b-value. In addition, the radial analytical reconstruction reduces truncation artifacts which affect Cartesian reconstructions. Hence, a radial acquisition of q-space can be favorable for DSI. PMID:26363002
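
    The analytical connection between a radial q-space line and the propagator rests on the Fourier (projection-)slice theorem. The sketch below illustrates that theorem numerically for an assumed Gaussian propagator: the 1D inverse Fourier transform of the signal sampled along one radial line reproduces the projection of the propagator onto that direction. The tensor, line direction, and sampling values are toy assumptions, and the final analytical ODF step of RDSI is not reproduced.

    ```python
    import numpy as np

    # Toy Gaussian propagator P(r) with covariance Sigma, whose diffusion signal is
    # E(q) = exp(-2*pi**2 * q^T Sigma q).  The 1D inverse Fourier transform of the signal
    # sampled along one radial q-space line equals the projection of P onto that direction.
    Sigma = np.diag([1.0e-3, 0.3e-3, 0.3e-3])       # toy displacement covariance, mm^2
    u = np.array([1.0, 0.0, 0.0])                   # direction of the radial line
    sigma2 = float(u @ Sigma @ u)                   # projected variance along u

    n, q_max = 256, 100.0                           # samples per line and max |q| (1/mm, toy values)
    dq = 2 * q_max / n
    s = np.fft.fftfreq(n, d=1.0 / (n * dq))         # signed q along the line, FFT ordering
    E_line = np.exp(-2 * np.pi ** 2 * s ** 2 * sigma2)

    proj = np.fft.fftshift(np.fft.ifft(E_line).real) * (n * dq)   # numerical 1D inverse FT
    r = np.fft.fftshift(np.fft.fftfreq(n, d=dq))                  # displacement coordinate, mm
    analytic = np.exp(-r ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
    print("max |numerical - analytic| =", np.abs(proj - analytic).max())
    ```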

  5. Radial q-space sampling for DSI.

    PubMed

    Baete, Steven H; Yutzy, Stephen; Boada, Fernando E

    2016-09-01

    Diffusion spectrum imaging (DSI) has been shown to be an effective tool for noninvasively depicting the anatomical details of brain microstructure. Existing implementations of DSI sample the diffusion encoding space using a rectangular grid. Here we present a different implementation of DSI whereby a radially symmetric q-space sampling scheme for DSI is used to improve the angular resolution and accuracy of the reconstructed orientation distribution functions. Q-space is sampled by acquiring several q-space samples along a number of radial lines. Each of these radial lines in q-space is analytically connected to a value of the orientation distribution functions at the same angular location by the Fourier slice theorem. Computer simulations and in vivo brain results demonstrate that radial diffusion spectrum imaging correctly estimates the orientation distribution functions when moderately high b-values (4000 s/mm²) and number of q-space samples (236) are used. The nominal angular resolution of radial diffusion spectrum imaging depends on the number of radial lines used in the sampling scheme, and only weakly on the maximum b-value. In addition, the radial analytical reconstruction reduces truncation artifacts which affect Cartesian reconstructions. Hence, a radial acquisition of q-space can be favorable for DSI. Magn Reson Med 76:769-780, 2016. © 2015 Wiley Periodicals, Inc.

  6. Estimation of the Mean Axon Diameter and Intra-axonal Space Volume Fraction of the Human Corpus Callosum: Diffusion q-space Imaging with Low q-values.

    PubMed

    Suzuki, Yuriko; Hori, Masaaki; Kamiya, Kouhei; Fukunaga, Issei; Aoki, Shigeki; van Cauteren, Marc

    2016-01-01

    Q-space imaging (QSI) is a diffusion-weighted imaging (DWI) technique that enables investigation of tissue microstructure. However, for sufficient displacement resolution to measure the microstructure, QSI requires high q-values that are usually difficult to achieve with a clinical scanner. The recently introduced "low q-value method" fits the echo attenuation to only low q-values to extract the root mean square displacement. We investigated the clinical feasibility of the low q-value method for estimating the microstructure of the human corpus callosum using a 3.0-tesla clinical scanner within a clinically feasible scan time. We performed a simulation to explore the acceptable range of maximum q-values for the low q-value method. We simulated echo attenuations caused by restricted diffusion in the intra-axonal space (IAS) and hindered diffusion in the extra-axonal space (EAS) assuming 100,000 cylinders with various diameters, and we estimated mean axon diameter, IAS volume fraction, and EAS diffusivity by fitting echo attenuations with different maximum q-values. Furthermore, we scanned the corpus callosum of 7 healthy volunteers and estimated the mean axon diameter and IAS volume fraction. Good agreement between estimated and defined values in the simulation study with maximum q-values of 700 and 800 cm⁻¹ suggested that the maximum q-value used in the in vivo experiment, 737 cm⁻¹, was reasonable. In the in vivo experiment, the mean axon diameter was larger in the body of the corpus callosum and smaller in the genu and splenium, and this anterior-to-posterior trend is consistent with previously reported histology, although our mean axon diameter seems larger. On the other hand, we found an opposite anterior-to-posterior trend, with a high IAS volume fraction in the genu and splenium and a lower fraction in the body, which is similar to the fiber density reported in the histology study. The low q-value method may provide insights into tissue
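
    A minimal sketch of the low q-value fit described above, under the common low-q (Gaussian) approximation E(q) ≈ exp(−2π²q²⟨z²⟩): a linear fit of ln E against q² then yields the mean squared displacement and hence the root-mean-square displacement. The q-values mirror the range quoted in the abstract, but the signals and the true displacement below are synthetic.

    ```python
    import numpy as np

    # Synthetic echo attenuations at low q-values; the fitted slope of ln E vs q^2
    # equals -2*pi^2*<z^2> under the Gaussian low-q approximation.
    q = np.array([0, 100, 200, 300, 400, 500, 600, 700]) * 1e2   # cm^-1 -> m^-1
    rmsd_true = 5e-6                                             # 5 micrometers (assumed)
    E = np.exp(-2 * np.pi ** 2 * q ** 2 * rmsd_true ** 2)
    E *= 1 + 0.01 * np.random.default_rng(1).standard_normal(E.size)  # measurement noise

    slope, _ = np.polyfit(q ** 2, np.log(E), 1)                  # slope = -2*pi^2*<z^2>
    rmsd_est = np.sqrt(-slope / (2 * np.pi ** 2))
    print(f"estimated RMS displacement: {rmsd_est * 1e6:.2f} micrometers")
    ```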

  7. Discrete-Time Deterministic Q-Learning: A Novel Convergence Analysis.

    PubMed

    Wei, Qinglai; Lewis, Frank L; Sun, Qiuye; Yan, Pengfei; Song, Ruizhuo

    2017-05-01

    In this paper, a novel discrete-time deterministic Q-learning algorithm is developed. In each iteration of the developed Q-learning algorithm, the iterative Q function is updated for all the state and control spaces, instead of updating for a single state and a single control as in the traditional Q-learning algorithm. A new convergence criterion is established to guarantee that the iterative Q function converges to the optimum, where the convergence criterion of the learning rates for traditional Q-learning algorithms is simplified. During the convergence analysis, the upper and lower bounds of the iterative Q function are analyzed to obtain the convergence criterion, instead of analyzing the iterative Q function itself. For convenience of analysis, the convergence properties for the undiscounted case of the deterministic Q-learning algorithm are first developed. Then, considering the discount factor, the convergence criterion for the discounted case is established. Neural networks are used to approximate the iterative Q function and to compute the iterative control law, respectively, for facilitating the implementation of the deterministic Q-learning algorithm. Finally, simulation results and comparisons are given to illustrate the performance of the developed algorithm.
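
    A tabular sketch of the full-space update idea described above: for a toy deterministic system, every discretized state-control pair is updated in each iteration, and the greedy control law is read off the converged Q table. The dynamics, cost, discount factor, and grid are illustrative choices, not the paper's formulation or its neural-network implementation.

    ```python
    import numpy as np

    n_states, n_controls, gamma = 21, 5, 0.95
    x_grid = np.linspace(-1.0, 1.0, n_states)
    u_grid = np.linspace(-0.5, 0.5, n_controls)

    def step(x, u):                      # deterministic toy dynamics x_{k+1} = 0.9 x_k + u_k
        return np.clip(0.9 * x + u, -1.0, 1.0)

    def cost(x, u):                      # utility/cost to be minimized
        return x ** 2 + 0.1 * u ** 2

    Q = np.zeros((n_states, n_controls))
    for _ in range(200):                 # full sweep over the state and control spaces
        Q_new = np.empty_like(Q)
        for i, x in enumerate(x_grid):
            for j, u in enumerate(u_grid):
                x_next = step(x, u)
                k = np.abs(x_grid - x_next).argmin()     # nearest grid state
                Q_new[i, j] = cost(x, u) + gamma * Q[k].min()
        Q = Q_new

    policy = u_grid[Q.argmin(axis=1)]    # greedy (minimizing) control law
    print("control at x=1.0:", policy[-1])
    ```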

  8. Light Scattering Analysis of Irregularly Shaped Dust Particles: A Study Using 3-Dimensional Reconstructions from Focused Ion-Beam (FIB) Tomography and Q-Space Analysis

    NASA Astrophysics Data System (ADS)

    Ortiz-Montalvo, D. L.; Conny, J. M.

    2017-12-01

    We study the scattering properties of irregularly shaped ambient dust particles. The way in which they scatter and absorb light has implications for aerosol optical remote sensing and aerosol radiative forcing applications. However, understanding light scattering and absorption by non-spherical particles can be very challenging. We used focused ion-beam scanning electron microscopy and energy-dispersive x-ray spectroscopy (FIB-SEM-EDS) to reconstruct three-dimensional (3-D) configurations of dust particles collected from urban and Asian sources. The 3-D reconstructions were then used in a discrete dipole approximation method (DDA) to determine their scattering properties for a range of shapes, sizes, and refractive indices. Scattering properties were obtained using the actual shapes of the particles, as well as (theoretical) equivalently sized geometrical shapes such as spheres, ellipsoids, cubes, rectangular prisms, and tetrahedrons. We use Q-space analysis to interpret the angular distribution of the scattered light obtained for each particle. Q-space analysis has recently been used to distinguish scattering by particles of different shapes, and it involves plotting the scattered intensity versus the scattering wave vector (q or qR) on a log-log scale, where q = 2ksin(θ/2), k = 2π/λ, and R = particle effective radius. Results from a limited number of particles show that when Q-space analysis is applied, common patterns appear that agree with previous Q-space studies done on ice crystals and other irregularly shaped particles. More specifically, we found similar Q-space regimes including a forward scattering regime of constant intensity when qR < 1, followed by the Guinier regime when qR ≈ 1, which is then followed by a complex power law regime with a -3 slope regime, a transition regime, and then a -4 slope regime. Currently, Q-space comparisons between actual- and geometric shapes are underway with the objective of determining which geometric shape best

  9. Analysis of the Space Shuttle main engine simulation

    NASA Technical Reports Server (NTRS)

    Deabreu-Garcia, J. Alex; Welch, John T.

    1993-01-01

    This is a final report on an analysis of the Space Shuttle Main Engine Program, a digital simulator code written in Fortran. The research was undertaken in ultimate support of future design studies of a shuttle life-extending Intelligent Control System (ICS). These studies are to be conducted by NASA Lewis Space Research Center. The primary purpose of the analysis was to define the means to achieve a faster running simulation, and to determine if additional hardware would be necessary for speeding up simulations for the ICS project. In particular, the analysis was to consider the use of custom integrators based on the Matrix Stability Region Placement (MSRP) method. In addition to speed of execution, other qualities of the software were to be examined. Among these are the accuracy of computations, the useability of the simulation system, and the maintainability of the program and data files. Accuracy involves control of truncation error of the methods, and roundoff error induced by floating point operations. It also involves the requirement that the user be fully aware of the model that the simulator is implementing.

  10. A 4D Hyperspherical Interpretation of q-Space

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Wu, Yu-Chien; Bendlin, Barbara B.; Alexander, Andrew L.

    2015-01-01

    3D q-space can be viewed as the surface of a 4D hypersphere. In this paper, we seek to develop a 4D hyperspherical interpretation of q-space by projecting it onto a hypersphere and subsequently modeling the q-space signal via 4D hyperspherical harmonics (HSH). Using this orthonormal basis, we derive several well-established q-space indices and numerically estimate the diffusion orientation distribution function (dODF). We also derive the integral transform describing the relationship between the diffusion signal and propagator on a hypersphere. Most importantly, we will demonstrate that for hybrid diffusion imaging (HYDI) acquisitions low order linear expansion of the HSH basis is sufficient to characterize diffusion in neural tissue. In fact, the HSH basis achieves comparable signal and better dODF reconstructions than other well-established methods, such as Bessel Fourier orientation reconstruction (BFOR), using fewer fitting parameters. All in all, this work provides a new way of looking at q-space. PMID:25624043

  11. A 4D hyperspherical interpretation of q-space.

    PubMed

    Pasha Hosseinbor, A; Chung, Moo K; Wu, Yu-Chien; Bendlin, Barbara B; Alexander, Andrew L

    2015-04-01

    3D q-space can be viewed as the surface of a 4D hypersphere. In this paper, we seek to develop a 4D hyperspherical interpretation of q-space by projecting it onto a hypersphere and subsequently modeling the q-space signal via 4D hyperspherical harmonics (HSH). Using this orthonormal basis, we derive several well-established q-space indices and numerically estimate the diffusion orientation distribution function (dODF). We also derive the integral transform describing the relationship between the diffusion signal and propagator on a hypersphere. Most importantly, we will demonstrate that for hybrid diffusion imaging (HYDI) acquisitions low order linear expansion of the HSH basis is sufficient to characterize diffusion in neural tissue. In fact, the HSH basis achieves comparable signal and better dODF reconstructions than other well-established methods, such as Bessel Fourier orientation reconstruction (BFOR), using fewer fitting parameters. All in all, this work provides a new way of looking at q-space. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. XQ-NLM: Denoising Diffusion MRI Data via x-q Space Non-Local Patch Matching.

    PubMed

    Chen, Geng; Wu, Yafeng; Shen, Dinggang; Yap, Pew-Thian

    2016-10-01

    Noise is a major issue influencing quantitative analysis in diffusion MRI. The effects of noise can be reduced by repeated acquisitions, but this leads to long acquisition times that can be unrealistic in clinical settings. For this reason, post-acquisition denoising methods have been widely used to improve SNR. Among existing methods, non-local means (NLM) has been shown to produce good image quality with edge preservation. However, currently the application of NLM to diffusion MRI has been mostly focused on the spatial domain (i.e., the x-space), despite the fact that diffusion data live in a combined space consisting of the x-space and the q-space (i.e., the space of wavevectors). In this paper, we propose to extend NLM to both x-space and q-space. We show how patch matching, as required in NLM, can be performed concurrently in x-q space with the help of azimuthal equidistant projection and rotation invariant features. Extensive experiments on both synthetic and real data confirm that the proposed x-q space NLM (XQ-NLM) outperforms the classic NLM.
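
    The core non-local-means step, weighting samples by patch similarity, can be sketched in a few lines. The toy example below denoises a synthetic 1D signal with Gaussian patch weights; the x-q patch matching, azimuthal equidistant projection, and rotation-invariant features of XQ-NLM are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    clean = np.sin(np.linspace(0, 4 * np.pi, 200))
    noisy = clean + 0.2 * rng.standard_normal(clean.size)

    half, h = 3, 0.5                                   # patch half-width and filtering parameter
    pad = np.pad(noisy, half, mode="reflect")
    patches = np.stack([pad[i:i + 2 * half + 1] for i in range(noisy.size)])

    denoised = np.empty_like(noisy)
    for i in range(noisy.size):
        d2 = ((patches - patches[i]) ** 2).mean(axis=1)   # patch distances to sample i
        w = np.exp(-d2 / h ** 2)                          # similarity weights
        denoised[i] = np.sum(w * noisy) / np.sum(w)

    print("RMSE noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)))
    print("RMSE denoised:", np.sqrt(np.mean((denoised - clean) ** 2)))
    ```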

  13. Integrated dynamic analysis simulation of space stations with controllable solar array

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    A methodology is formulated and presented for the integrated structural dynamic analysis of space stations with controllable solar arrays and non-controllable appendages. The structural system flexibility characteristics are considered in the dynamic analysis by a synthesis technique whereby free-free space station modal coordinates and cantilever appendage coordinates are inertially coupled. A digital simulation of this analysis method is described and verified by comparison of interaction load solutions with other methods of solution. Motion equations are simulated for both the zero gravity and artificial gravity (spinning) orbital conditions. Closed-loop control dynamics for both orientation control of the arrays and attitude control of the space station are provided in the simulation by various generic types of control systems. The capability of the simulation as a design tool is demonstrated by utilizing typical space station and solar array structural representations and a specific structural perturbing force. Response and interaction load solutions are presented for this structural configuration and indicate the importance of using an integrated analysis for the prediction of structural interactions.

  14. Long-Duration Space Flight Provokes Pathologic Q-Tc Interval Prolongation

    NASA Technical Reports Server (NTRS)

    D'Aunno, Dominick S.; Dougherty, Anne H.; DeBlock, Heidi F.; Meck, Janice V.

    2002-01-01

    Space flight has a profound influence on the cardiovascular and autonomic nervous systems. Alterations in baroreflex function, plasma catecholamine concentrations, and arterial pressure regulation have been observed. Changes in autonomic regulation of cardiac function may lead to serious rhythm disturbances. In fact, ventricular tachycardia has been reported during long-duration space flight. The study aim was to determine the effects of space flight on cardiac conduction. Methods and Results: Electrocardiograms (ECGs) and serum electrolytes were obtained before and after short-duration (SD) (4-16 days) and long-duration (LD) (4-6 months) missions. Holter recordings were obtained from 3 different subjects before, during and after a 4-month mission. P-R, R-R, and Q-T intervals were measured manually in a random, blinded fashion and Bazett's formula was used to correct the Q-T interval (Q-Tc). Space flight had no clinically significant effect on electrolyte concentrations. P-R and R-R intervals were decreased after SD flight (p<0.05) and recovered 3 days after landing. In the same subjects, P-R and Q-Tc intervals were prolonged after LD flight (p<0.01). Clinically significant Q-Tc prolongation (>0.44 sec) occurred during the first month of flight and persisted until 3 days after landing (p<0.01). Conclusions: Space flight alters cardiac conduction, with more ominous changes seen with LD missions. Alterations in autonomic tone may explain ECG changes associated with space flight. Primary cardiac changes may also contribute to the conduction changes with LD flight. Q-Tc prolongation may predispose astronauts to ventricular arrhythmias during and after long-duration space flight.
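
    For reference, Bazett's correction mentioned in the abstract divides the measured Q-T interval by the square root of the preceding R-R interval (both in seconds); the 0.44 s threshold quoted above applies to this corrected value.

    ```latex
    \text{Q-T}_c \;=\; \frac{\text{Q-T}}{\sqrt{\text{R-R}}}
    ```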

  15. Guidelines for Using the "Q" Test in Meta-Analysis

    ERIC Educational Resources Information Center

    Maeda, Yukiko; Harwell, Michael R.

    2016-01-01

    The "Q" test is regularly used in meta-analysis to examine variation in effect sizes. However, the assumptions of "Q" are unlikely to be satisfied in practice prompting methodological researchers to conduct computer simulation studies examining its statistical properties. Narrative summaries of this literature are available but…

  16. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
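
    A minimal Monte Carlo event-tree sketch in the spirit of the model described above; the branch probabilities and abort logic are invented placeholders (the abstract itself stresses that its results are for demonstration only and are not official NASA estimates).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_trials = 100_000
    p_engine_out = 0.004           # assumed per-flight probability of a main-engine failure
    p_abort_success = 0.95         # assumed probability that the abort mode completes safely

    outcomes = {"nominal ascent": 0, "successful abort": 0, "loss of mission": 0}
    for _ in range(n_trials):
        if rng.random() < p_engine_out:                # branch: engine failure during ascent
            if rng.random() < p_abort_success:         # branch: abort mode completes
                outcomes["successful abort"] += 1
            else:
                outcomes["loss of mission"] += 1
        else:
            outcomes["nominal ascent"] += 1

    for name, count in outcomes.items():
        print(f"{name:17s}: {count / n_trials:.4f}")
    ```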

  17. Modeling Hubble Space Telescope flight data by Q-Markov cover identification

    NASA Technical Reports Server (NTRS)

    Liu, K.; Skelton, R. E.; Sharkey, J. P.

    1992-01-01

    A state space model for the Hubble Space Telescope under the influence of unknown disturbances in orbit is presented. This model was obtained from flight data by applying the Q-Markov covariance equivalent realization identification algorithm. This state space model guarantees the match of the first Q-Markov parameters and covariance parameters of the Hubble system. The flight data were partitioned into high- and low-frequency components for more efficient Q-Markov cover modeling, to reduce some computational difficulties of the Q-Markov cover algorithm. This identification revealed more than 20 lightly damped modes within the bandwidth of the attitude control system. Comparisons with the analytical (TREETOPS) model are also included.

  18. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    NASA Astrophysics Data System (ADS)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    Acquiring attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, by means of ground-based optical observations is a challenge for space surveillance. In this paper, a method is proposed to estimate the space object (SO) attitude state based on simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics are useful for attitude inversion. Thus, a new approach to space object identification is provided in this paper.

  19. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    NASA Technical Reports Server (NTRS)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a magnetically clean spacecraft in a cost-effective manner. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  20. Chemistry and Molecular Dynamics Simulations of Heme b-HemQ and Coproheme-HemQ

    PubMed Central

    2016-01-01

    Recently, a novel pathway for heme b biosynthesis in Gram-positive bacteria has been proposed. The final poorly understood step is catalyzed by an enzyme called HemQ and includes two decarboxylation reactions leading from coproheme to heme b. Coproheme has been suggested to act as both substrate and redox active cofactor in this reaction. In the study presented here, we focus on HemQs from Listeria monocytogenes (LmHemQ) and Staphylococcus aureus (SaHemQ) recombinantly produced as apoproteins in Escherichia coli. We demonstrate the rapid and two-phase uptake of coproheme by both apo forms and the significant differences in thermal stability of the apo forms, coproheme-HemQ and heme b-HemQ. Reduction of ferric high-spin coproheme-HemQ to the ferrous form is shown to be enthalpically favored but entropically disfavored with standard reduction potentials of −205 ± 3 mV for LmHemQ and −207 ± 3 mV for SaHemQ versus the standard hydrogen electrode at pH 7.0. Redox thermodynamics suggests the presence of a pronounced H-bonding network and restricted solvent mobility in the heme cavity. Binding of cyanide to the sixth coproheme position is monophasic but relatively slow (∼1 × 10⁴ M⁻¹ s⁻¹). On the basis of the available structures of apo-HemQ and modeling of both loaded forms, molecular dynamics simulation allowed analysis of the interaction of coproheme and heme b with the protein as well as the role of the flexibility at the proximal heme cavity and the substrate access channel for coproheme binding and heme b release. Obtained data are discussed with respect to the proposed function of HemQ in monoderm bacteria. PMID:27599156

  1. Pedestrian simulation and distribution in urban space based on visibility analysis and agent simulation

    NASA Astrophysics Data System (ADS)

    Ying, Shen; Li, Lin; Gao, Yurong

    2009-10-01

    Spatial visibility analysis is an important avenue for studying pedestrian behavior, because visual perception of space is the most direct way people acquire environmental information and guide their movements. Based on agent modeling and a top-down method, this paper develops a framework for analyzing pedestrian flow driven by visibility. We use viewsheds in the visibility analysis and impose the resulting parameters on the agent simulation to direct agent motion in urban space. We analyze pedestrian behavior at the micro and macro scales of urban open space. At the micro scale of an urban street or district, an individual agent uses visual affordance to determine its direction of motion. At the macro scale, we compare the distribution of pedestrian flow with the spatial configuration of the urban environment, and mine the relationship between pedestrian flow and the distribution of urban facilities and urban functions. The paper first computes the visibility conditions at vantage points in urban open space, such as the street network, and quantifies the visibility parameters. Multiple agents use these visibility parameters to decide their directions of motion, and the pedestrian flow eventually reaches a stable state in the urban environment through the multi-agent simulation. Finally, the morphology of the visibility parameters and the pedestrian distribution are compared with the urban function and facility layout to confirm the consistency between them, which can be used to support decision making in urban design.
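
    A toy version of the visibility-driven motion rule: an agent on a small occupancy grid casts rays in candidate directions and steps toward the direction with the longest unobstructed view. The grid, ray casting, and "longest view" rule are simplifications for illustration, not the authors' viewshed or multi-agent model.

    ```python
    import numpy as np

    grid = np.zeros((40, 40), dtype=bool)      # False = open space, True = building
    grid[10:30, 18:22] = True                  # one block of buildings

    def visible_distance(pos, direction, step=0.5, max_dist=30.0):
        """March along a ray until it leaves the grid or hits a building."""
        d = 0.0
        while d < max_dist:
            d += step
            x, y = pos + d * direction
            if not (0 <= int(x) < grid.shape[0] and 0 <= int(y) < grid.shape[1]):
                break
            if grid[int(x), int(y)]:
                break
        return d

    pos = np.array([20.0, 5.0])
    for _ in range(25):                        # simple walk guided by the visual field
        angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
        dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
        views = [visible_distance(pos, d) for d in dirs]
        pos = pos + dirs[int(np.argmax(views))]
    print("final position:", pos)
    ```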

  2. openQ*D simulation code for QCD+QED

    NASA Astrophysics Data System (ADS)

    Campos, Isabel; Fritzsch, Patrick; Hansen, Martin; Krstić Marinković, Marina; Patella, Agostino; Ramos, Alberto; Tantalo, Nazario

    2018-03-01

    The openQ*D code for the simulation of QCD+QED with C* boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, the multilevel integrator, the 4th order OMF integrator, the SSE/AVX intrinsics, etc. The photon field is treated as fully dynamical and C* boundary conditions can be chosen in the spatial directions. We discuss the main features of openQ*D, and we show basic test results and performance analysis. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.

  3. Integrated dynamic analysis simulation of space stations with controllable solar arrays (supplemental data and analyses)

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    Space station and solar array data are presented, together with the analyses performed in support of the integrated dynamic analysis study. The analysis methods and the formulated digital simulation were developed. Generic types of control systems are included for space station attitude control and solar array orientation control. These systems have been digitally coded and included in the simulation.

  4. Q-Space Truncation and Sampling in Diffusion Spectrum Imaging

    PubMed Central

    Tian, Qiyuan; Rokem, Ariel; Folkerth, Rebecca D.; Nummenmaa, Aapo; Fan, Qiuyun; Edlow, Brian L.; McNab, Jennifer A.

    2015-01-01

    Purpose: To characterize the effects of q-space truncation and sampling on the spin-displacement probability density function (PDF) in diffusion spectrum imaging (DSI). Methods: DSI data were acquired using the MGH-USC connectome scanner (Gmax = 300 mT/m) with bmax = 30,000 s/mm², 17×17×17, 15×15×15 and 11×11×11 grids in ex vivo human brains and bmax = 10,000 s/mm², 11×11×11 grid in vivo. An additional in vivo scan using bmax = 7,000 s/mm², 11×11×11 grid was performed with a derated gradient strength of 40 mT/m. PDFs and orientation distribution functions (ODFs) were reconstructed with different q-space filtering and PDF integration lengths, and from down-sampled data by factors of two and three. Results: Both ex vivo and in vivo data showed Gibbs ringing in PDFs, which becomes the main source of artifact in the subsequently reconstructed ODFs. For down-sampled data, PDFs interfere with the first replicas or their ringing, leading to obscured orientations in ODFs. Conclusion: The minimum required q-space sampling density corresponds to a field-of-view approximately equal to twice the mean displacement distance (MDD) of the tissue. The 11×11×11 grid is suitable for both ex vivo and in vivo DSI experiments. To minimize the effects of Gibbs ringing, ODFs should be reconstructed from unfiltered q-space data with the integration length over the PDF constrained to around the MDD. PMID:26762670
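
    The stated sampling criterion can be checked with a back-of-the-envelope calculation: the displacement-space field of view of the PDF equals the reciprocal of the q-space sample spacing, and it should be at least about twice the mean displacement distance (MDD). The MDD and maximum q-value below are assumed, illustrative numbers rather than the paper's measurements.

    ```python
    import numpy as np

    mdd = 15e-6                    # assumed mean displacement distance, 15 micrometers
    fov_required = 2 * mdd         # displacement-space field of view ~ 2 x MDD
    dq_max = 1 / fov_required      # coarsest admissible q-space sample spacing

    q_max = 1.6e5                  # assumed maximum q-value along one axis, m^-1
    n_samples = int(np.ceil(2 * q_max / dq_max)) + 1    # grid points spanning -q_max..+q_max
    print(f"max q spacing: {dq_max:.3e} m^-1, grid: {n_samples} x {n_samples} x {n_samples}")
    ```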

  5. Comparing Shock geometry from MHD simulation to that from the Q/A-scaling analysis

    NASA Astrophysics Data System (ADS)

    Li, G.; Zhao, L.; Jin, M.

    2017-12-01

    In large SEP events, ions can be accelerated at CME-driven shocks to very high energies. Spectra of heavy ions in many large SEP events show features such as roll-overs or spectral breaks. In some events, when the spectra are plotted in energy/nucleon, they can be shifted relative to each other so that the spectra align. The amount of shift is charge-to-mass ratio (Q/A) dependent and varies from event to event. In the work of Li et al. (2009), the Q/A dependence of the scaling is related to shock geometry when the CME-driven shock is close to the Sun. For events where multiple in-situ spacecraft observations exist, one may expect that different spacecraft are connected to different portions of the CME-driven shock that have different shock geometries, therefore yielding different Q/A dependences. At the same time, shock geometry can also be obtained from MHD simulations. This means we can compare shock geometry obtained from two completely different approaches: one from MHD simulation and the other from in-situ spectral fitting. In this work, we examine this comparison for selected events.
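
    A sketch of the Q/A scaling idea referred to above, assuming a power-law dependence of each species' spectral break energy on its charge-to-mass ratio, E_b,i = E_b,p (Q_i/A_i)^α; rescaling energy by (Q/A)^α then aligns the breaks. The exponent, charge states, and break energy are invented for illustration; in the cited analysis the scaling is fitted per event and related to shock geometry.

    ```python
    # Assumed power-law scaling of break energies with charge-to-mass ratio (Q/A);
    # dividing each species' break energy by (Q/A)**alpha aligns them with the proton break.
    alpha = 1.5                                         # assumed scaling exponent
    E_b_proton = 10.0                                   # proton break energy (assumed), MeV/nucleon
    species = {"H": 1 / 1, "O": 7 / 16, "Fe": 14 / 56}  # representative Q/A values (assumed states)

    for name, qa in species.items():
        E_b = E_b_proton * qa ** alpha                  # species break energy
        print(f"{name:2s}: break {E_b:6.2f} MeV/nuc -> {E_b / qa ** alpha:5.2f} after rescaling")
    ```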

  6. Q-space truncation and sampling in diffusion spectrum imaging.

    PubMed

    Tian, Qiyuan; Rokem, Ariel; Folkerth, Rebecca D; Nummenmaa, Aapo; Fan, Qiuyun; Edlow, Brian L; McNab, Jennifer A

    2016-12-01

    To characterize the effects of q-space truncation and sampling on the spin-displacement probability density function (PDF) in diffusion spectrum imaging (DSI). DSI data were acquired using the MGH-USC connectome scanner (Gmax = 300 mT/m) with bmax = 30,000 s/mm², 17 × 17 × 17, 15 × 15 × 15 and 11 × 11 × 11 grids in ex vivo human brains and bmax = 10,000 s/mm², 11 × 11 × 11 grid in vivo. An additional in vivo scan using bmax = 7,000 s/mm², 11 × 11 × 11 grid was performed with a derated gradient strength of 40 mT/m. PDFs and orientation distribution functions (ODFs) were reconstructed with different q-space filtering and PDF integration lengths, and from down-sampled data by factors of two and three. Both ex vivo and in vivo data showed Gibbs ringing in PDFs, which becomes the main source of artifact in the subsequently reconstructed ODFs. For down-sampled data, PDFs interfere with the first replicas or their ringing, leading to obscured orientations in ODFs. The minimum required q-space sampling density corresponds to a field-of-view approximately equal to twice the mean displacement distance (MDD) of the tissue. The 11 × 11 × 11 grid is suitable for both ex vivo and in vivo DSI experiments. To minimize the effects of Gibbs ringing, ODFs should be reconstructed from unfiltered q-space data with the integration length over the PDF constrained to around the MDD. Magn Reson Med 76:1750-1763, 2016. © 2016 International Society for Magnetic Resonance in Medicine.

  7. Q-space trajectory imaging for multidimensional diffusion MRI of the human brain

    PubMed Central

    Westin, Carl-Fredrik; Knutsson, Hans; Pasternak, Ofer; Szczepankiewicz, Filip; Özarslan, Evren; van Westen, Danielle; Mattisson, Cecilia; Bogren, Mats; O’Donnell, Lauren; Kubicki, Marek; Topgaard, Daniel; Nilsson, Markus

    2016-01-01

    This work describes a new diffusion MR framework for imaging and modeling of microstructure that we call q-space trajectory imaging (QTI). The QTI framework consists of two parts: encoding and modeling. First we propose q-space trajectory encoding, which uses time-varying gradients to probe a trajectory in q-space, in contrast to traditional pulsed field gradient sequences that attempt to probe a point in q-space. Then we propose a microstructure model, the diffusion tensor distribution (DTD) model, which takes advantage of additional information provided by QTI to estimate a distributional model over diffusion tensors. We show that the QTI framework enables microstructure modeling that is not possible with the traditional pulsed gradient encoding as introduced by Stejskal and Tanner. In our analysis of QTI, we find that the well-known scalar b-value naturally extends to a tensor-valued entity, i.e., a diffusion measurement tensor, which we call the b-tensor. We show that b-tensors of rank 2 or 3 enable estimation of the mean and covariance of the DTD model in terms of a second order tensor (the diffusion tensor) and a fourth order tensor. The QTI framework has been designed to improve discrimination of the sizes, shapes, and orientations of diffusion microenvironments within tissue. We derive rotationally invariant scalar quantities describing intuitive microstructural features including size, shape, and orientation coherence measures. To demonstrate the feasibility of QTI on a clinical scanner, we performed a small pilot study comparing a group of five healthy controls with five patients with schizophrenia. The parameter maps derived from QTI were compared between the groups, and 9 out of the 14 parameters investigated showed differences between groups. The ability to measure and model the distribution of diffusion tensors, rather than a quantity that has already been averaged within a voxel, has the potential to provide a powerful paradigm for the study of

  8. Monte Carlo simulation of a dynamical fermion problem: The light q²q̄² system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grondin, G.

    1991-01-01

    We present results from a Guided Random Walk Monte Carlo simulation of the light q²q̄² system in a Coulomb-plus-linear quark potential model using an Intel iPSC/860 hypercube. A solvable model problem is first considered, after which we study the full q²q̄² system in the (J,I) = (2,2) and (2,0) sectors. We find evidence for no bound states below the vector-vector threshold in these systems. 17 refs., 6 figs.

  9. Assessing the multiscale architecture of muscular tissue with Q-space magnetic resonance imaging: Review.

    PubMed

    Hoffman, Matthew P; Taylor, Erik N; Aninwene, George E; Sadayappan, Sakthivel; Gilbert, Richard J

    2018-02-01

    Contraction of muscular tissue requires the synchronized shortening of myofibers arrayed in complex geometrical patterns. Imaging such myofiber patterns with diffusion-weighted MRI reveals architectural ensembles that underlie force generation at the organ scale. Restricted proton diffusion is a stochastic process resulting from random translational motion that may be used to probe the directionality of myofibers in whole tissue. During diffusion-weighted MRI, magnetic field gradients are applied to determine the directional dependence of proton diffusion through the analysis of a diffusional probability distribution function (PDF). The directions of principal (maximal) diffusion within the PDF are associated with similarly aligned diffusion maxima in adjacent voxels to derive multivoxel tracts. Diffusion-weighted MRI with tractography thus constitutes a multiscale method for depicting patterns of cellular organization within biological tissues. We provide in this review details of the method by which generalized Q-space imaging is used to interrogate multidimensional diffusion space, and thereby to infer the organization of muscular tissue. Q-space imaging derives the lowest possible angular separation of diffusion maxima by optimizing the conditions by which magnetic field gradients are applied to a given tissue. To illustrate, we present the methods and applications associated with Q-space imaging of the multiscale myoarchitecture of the human and rodent tongues. These representations emphasize the intricate and continuous nature of muscle fiber organization and suggest a method to depict structural "blueprints" for skeletal and cardiac muscle tissue. © 2016 Wiley Periodicals, Inc.

  10. Complete integrability of geodesic motion in Sasaki-Einstein toric Y^{p,q} spaces

    NASA Astrophysics Data System (ADS)

    Babalic, Elena Mirela; Visinescu, Mihai

    2015-09-01

    We construct explicitly the constants of motion for geodesics in the five-dimensional Sasaki-Einstein spaces Y^{p,q}. To carry out this task, we use the knowledge of the complete set of Killing vectors and Killing-Yano tensors on these spaces. In spite of the fact that we generate a multitude of constants of motion, only five of them are functionally independent, implying the complete integrability of geodesic flow on Y^{p,q} spaces. In the particular case of the homogeneous Sasaki-Einstein manifold T^{1,1}, the integrals of motion have simpler forms and the relations between them are described in detail.

  11. Joint 6D k-q Space Compressed Sensing for Accelerated High Angular Resolution Diffusion MRI.

    PubMed

    Cheng, Jian; Shen, Dinggang; Basser, Peter J; Yap, Pew-Thian

    2015-01-01

    High Angular Resolution Diffusion Imaging (HARDI) avoids the Gaussian diffusion assumption that is inherent in Diffusion Tensor Imaging (DTI), and is capable of characterizing complex white matter micro-structure with greater precision. However, HARDI methods such as Diffusion Spectrum Imaging (DSI) typically require significantly more signal measurements than DTI, resulting in prohibitively long scanning times. One of the goals in HARDI research is therefore to improve estimation of quantities such as the Ensemble Average Propagator (EAP) and the Orientation Distribution Function (ODF) with a limited number of diffusion-weighted measurements. A popular approach to this problem, Compressed Sensing (CS), affords highly accurate signal reconstruction using significantly fewer (sub-Nyquist) data points than required traditionally. Existing approaches to CS diffusion MRI (CS-dMRI) mainly focus on applying CS in the q-space of diffusion signal measurements and fail to take into consideration information redundancy in the k-space. In this paper, we propose a framework, called 6-Dimensional Compressed Sensing diffusion MRI (6D-CS-dMRI), for reconstruction of the diffusion signal and the EAP from data sub-sampled in both 3D k-space and 3D q-space. To our knowledge, 6D-CS-dMRI is the first work that applies compressed sensing in the full 6D k-q space and reconstructs the diffusion signal in the full continuous q-space and the EAP in continuous displacement space. Experimental results on synthetic and real data demonstrate that, compared with full DSI sampling in k-q space, 6D-CS-dMRI yields excellent diffusion signal and EAP reconstruction with low root-mean-square error (RMSE) using 11 times fewer samples (3-fold reduction in k-space and 3.7-fold reduction in q-space).

  12. Op-amp gyrator simulates high Q inductor

    NASA Technical Reports Server (NTRS)

    Sutherland, W. C.

    1977-01-01

    Gyrator circuit consisting of dual operational amplifier and four resistors inverts impedance of capacitor to simulate inductor. Synthetic inductor has high Q factor, good stability, wide bandwidth, and easily determined value of inductance that is independent of frequency. It readily lends itself to integrated-circuit applications, including filter networks.
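
    The abstract does not give the exact circuit, but one common realization matching its description (two op-amps, four resistors, one capacitor) is the Antoniou generalized impedance converter, whose input impedance is Z_in = sC·R1·R3·R5/R2, i.e. a simulated inductance L = C·R1·R3·R5/R2. The topology choice and the component values below are assumptions for illustration, not taken from the NASA brief.

    ```python
    import numpy as np

    # Simulated inductance of an assumed Antoniou GIC gyrator: L = C * R1 * R3 * R5 / R2.
    C = 100e-9                               # 100 nF capacitor (assumed)
    R1, R2, R3, R5 = 10e3, 10e3, 10e3, 10e3  # four resistors, ohms (assumed)

    L = C * R1 * R3 * R5 / R2                # simulated inductance, henries
    f = 1e3                                  # evaluation frequency, Hz
    X_L = 2 * np.pi * f * L                  # inductive reactance at f
    print(f"L = {L:.3f} H, |X_L| at {f:.0f} Hz = {X_L:.1f} ohms")
    ```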

  13. Analysis of autism susceptibility gene loci on chromosomes 1p, 4p, 6q, 7q, 13q, 15q, 16p, 17q, 19q and 22q in Finnish multiplex families.

    PubMed

    Auranen, M; Nieminen, T; Majuri, S; Vanhala, R; Peltonen, L; Järvelä, I

    2000-05-01

    The role of genetic factors in the etiology of the autistic spectrum of disorders has clearly been demonstrated. Ten chromosomal regions, on chromosomes 1p, 4p, 6q, 7q, 13q, 15q, 16p, 17q, 19q and 22q have potentially been linked to autism.1-8 We have analyzed these chromosomal regions in a total of 17 multiplex families with autism originating from the isolated Finnish population by pairwise linkage analysis and sib-pair analysis. Mild evidence for putative contribution was found only with the 1p chromosomal region in the susceptibility to autism. Our data suggest that additional gene loci exist for autism which will be detectable in and even restricted to the isolated Finnish population.

  14. Prospective estimation of mean axon diameter and extra-axonal space of the posterior limb of the internal capsule in patients with idiopathic normal pressure hydrocephalus before and after a lumboperitoneal shunt by using q-space diffusion MRI.

    PubMed

    Hori, Masaaki; Kamiya, Kouhei; Nakanishi, Atsushi; Fukunaga, Issei; Miyajima, Masakazu; Nakajima, Madoka; Suzuki, Michimasa; Suzuki, Yuriko; Irie, Ryusuke; Kamagata, Koji; Arai, Hajime; Aoki, Shigeki

    2016-09-01

    To prospectively estimate the mean axon diameter (MAD) and extracellular space of the posterior limb of the internal capsule (PLIC) in patients with idiopathic normal pressure hydrocephalus (iNPH) before and after a lumboperitoneal (LP) shunting operation using q-space diffusion MRI analysis. We studied 12 consecutive patients with iNPH and 12 controls at our institution. After conventional magnetic resonance imaging (MRI), q-space image (QSI) data were acquired with a 3-T MRI scanner. The MAD and extra-axonal space of the PLIC before and after LP shunting were calculated using two-component q-space imaging analyses; the before and after values were compared. After LP shunt surgery, the extracellular space of the PLIC was significantly higher than that of the same patients before the operation (one-way analysis of variance (ANOVA) with Scheffé's post-hoc test, P = 0.024). No significant differences were observed in the PLIC axon diameters among normal controls or in patients before and after surgery. Increases in the root mean square displacement in the extra-axonal space of the PLIC in patients with iNPH after an LP shunt procedure are associated with the microstructural changes of white matter and subsequent abatement of patient symptoms. • Q-space diffusion MRI provides information on microstructural changes in the corticospinal tract • Lumboperitoneal (LP) shunting operation is useful for idiopathic normal pressure hydrocephalus • Q-space measurement may be a biomarker for the effect of the LP shunt procedure.

  15. Leveraging EAP-Sparsity for Compressed Sensing of MS-HARDI in (k, q)-Space.

    PubMed

    Sun, Jiaqi; Sakhaee, Elham; Entezari, Alireza; Vemuri, Baba C

    2015-01-01

    Compressed Sensing (CS) for the acceleration of MR scans has been widely investigated in the past decade. Lately, considerable progress has been made in achieving similar speed ups in acquiring multi-shell high angular resolution diffusion imaging (MS-HARDI) scans. Existing approaches in this context were primarily concerned with sparse reconstruction of the diffusion MR signal S(q) in the q-space. More recently, methods have been developed to apply the compressed sensing framework to the 6-dimensional joint (k, q)-space, thereby exploiting the redundancy in this 6D space. To guarantee accurate reconstruction from partial MS-HARDI data, the key ingredients of compressed sensing that need to be brought together are: (1) the function to be reconstructed needs to have a sparse representation, (2) the data for reconstruction ought to be acquired in the dual domain (i.e., incoherent sensing), and (3) the reconstruction process involves a (convex) optimization. In this paper, we present a novel approach that uses partial Fourier sensing in the 6D space of (k, q) for the reconstruction of P(x, r). The distinct feature of our approach is a sparsity model that leverages surfacelets in conjunction with total variation for the joint sparse representation of P(x, r). Thus, our method stands to benefit from the practical guarantees for accurate reconstruction from partial (k, q)-space data. Further, we demonstrate significant savings in acquisition time over diffusion spectral imaging (DSI) which is commonly used as the benchmark for comparisons in reported literature. To demonstrate the benefits of this approach, we present several synthetic and real data examples.

  16. Helium-3 MR q-space imaging with radial acquisition and iterative highly constrained back-projection.

    PubMed

    O'Halloran, Rafael L; Holmes, James H; Wu, Yu-Chien; Alexander, Andrew; Fain, Sean B

    2010-01-01

    An undersampled diffusion-weighted stack-of-stars acquisition is combined with iterative highly constrained back-projection to perform hyperpolarized helium-3 MR q-space imaging with combined regional correction of radiofrequency- and T1-related signal loss in a single breath-held scan. The technique is tested in computer simulations and phantom experiments and demonstrated in a healthy human volunteer with whole-lung coverage in a 13-sec breath-hold. Measures of lung microstructure at three different lung volumes are evaluated using inhaled gas volumes of 500 mL, 1000 mL, and 1500 mL to demonstrate feasibility. Phantom results demonstrate that the proposed technique is in agreement with theoretical values, as well as with a fully sampled two-dimensional Cartesian acquisition. Results from the volunteer study demonstrate that the root mean squared diffusion distance increased significantly from the 500-mL volume to the 1000-mL volume. This technique represents the first demonstration of a spatially resolved hyperpolarized helium-3 q-space imaging technique and shows promise for microstructural evaluation of lung disease in three dimensions. Copyright (c) 2009 Wiley-Liss, Inc.

  17. q-Deformed Minkowski Algebra and Its Space-Time Lattice

    NASA Astrophysics Data System (ADS)

    Wess, J.

    We have asked how the Heisenberg relations of space and time change if we replace the Lorentz group by a q-deformed Lorentz group (Lorek et al. 1997).

  18. Simulations of Low-q Disruptions in the Compact Toroidal Hybrid Experiment

    NASA Astrophysics Data System (ADS)

    Howell, E. C.; Hanson, J. D.; Ennis, D. A.; Hartwell, G. J.; Maurer, D. A.

    2017-10-01

    Resistive MHD simulations of low-q disruptions in the Compact Toroidal Hybrid Device (CTH) are performed using the NIMROD code. CTH is a current-carrying stellarator used to study the effects of 3D shaping on MHD stability. Experimentally, it is observed that the application of 3D vacuum fields allows CTH to operate with edge safety factor less than 2.0. However, these low-q discharges often disrupt after peak current if the applied 3D fields are too weak. Nonlinear simulations are initialized using model VMEC equilibria representative of low-q discharges with weak vacuum transform. Initially a series of symmetry preserving island chains are excited at the q=6/5, 7/5, 8/5, and 9/5 rational surfaces. These island chains act as transport barriers preventing stochastic magnetic fields in the edge from penetrating into the core. As the simulation progresses, predominantly m/n=3/2 and 4/3 instabilities are destabilized. As these instabilities grow to large amplitude they destroy the symmetry preserving islands leading to large regions of stochastic fields. A current spike and loss of core thermal confinement occurs when the innermost island chain (6/5) is destroyed. Work Supported by US-DOE Grant #DE-FG02-03ER54692.

  19. 25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection

    NASA Technical Reports Server (NTRS)

    Packard, Edward

    2008-01-01

    Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5kw, 20k, Helium Refrigerators; Affect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated

  20. Space radiator simulation system analysis

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time-dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two-dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.
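
    A lumped single-node sketch of the transient energy balance underlying the analysis described above: absorbed environmental flux plus coolant heat input, balanced against re-radiation and integrated explicitly in time. The emissivity, area, capacitance, and heat loads are assumed values, and the report's two-dimensional panel/tube model is not reproduced.

    ```python
    import numpy as np

    sigma = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
    eps, area = 0.85, 1.0   # emissivity and panel area (assumed), m^2
    m_cp = 5.0e3            # lumped thermal capacitance, J/K (assumed)
    q_env = 250.0           # absorbed solar + albedo + planetary flux, W (assumed)
    q_coolant = 400.0       # heat delivered by the coolant loop, W (assumed)

    T, dt = 290.0, 1.0      # initial temperature (K) and time step (s)
    for _ in range(20_000): # explicit Euler integration toward steady state
        q_rad = eps * sigma * area * T ** 4
        T += dt * (q_env + q_coolant - q_rad) / m_cp

    print(f"steady-state panel temperature ~ {T:.1f} K, rejecting {eps * sigma * area * T ** 4:.0f} W")
    ```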

  1. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis)

    NASA Astrophysics Data System (ADS)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micro meteoroid impact, as well as the optional definition of particular spacecraft specific influences like tank sloshing, reaction wheel behaviour

  2. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.
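
    A generic sketch of this kind of dispersion study is given below: parameter perturbations are sampled at their 3 sigma levels, each case is propagated, and state dispersions are collected at a fixed time slice. The parameter list, sensitivities, and propagate() function are placeholders, not the BRM 1 models.

      import numpy as np

      rng = np.random.default_rng(0)
      sigmas = {"platform_tilt": 1e-4, "thrust_scale": 0.01, "density_scale": 0.05}

      def propagate(pert):
          # toy linear sensitivities standing in for the full trajectory simulation
          return np.array([1e4 * pert["thrust_scale"], 5e3 * pert["density_scale"],
                           2e6 * pert["platform_tilt"], 0.0, 0.0, 0.0])

      samples = np.array([propagate({k: rng.normal(0.0, s) for k, s in sigmas.items()})
                          for _ in range(2000)])
      print(3.0 * samples.std(axis=0))   # 3-sigma state dispersions at the time slice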

  3. Space Station communications and tracking systems modeling and RF link simulation

    NASA Technical Reports Server (NTRS)

    Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.

    1986-01-01

    In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, SCSS user's manual, and the software located in the NASA JSC system analysis division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.

  4. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    NASA Technical Reports Server (NTRS)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  5. An integrative approach to space-flight physiology using systems analysis and mathematical simulation

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.; White, R. J.; Rummel, J. A.

    1980-01-01

    An approach was developed to aid in the integration of many of the biomedical findings of space flight, using systems analysis. The mathematical tools used in accomplishing this task include an automated data base, a biostatistical and data analysis system, and a wide variety of mathematical simulation models of physiological systems. A keystone of this effort was the evaluation of physiological hypotheses using the simulation models and the prediction of the consequences of these hypotheses on many physiological quantities, some of which were not amenable to direct measurement. This approach led to improvements in the model, refinements of the hypotheses, a tentative integrated hypothesis for adaptation to weightlessness, and specific recommendations for new flight experiments.

  6. Diversity of nursing student views about simulation design: a q-methodological study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2015-05-01

    Education of future nurses benefits from well-designed simulation activities. Skillful teaching with simulation requires educators to be constantly aware of how students experience learning and perceive educators' actions. Because revision of simulation activities considers feedback elicited from students, it is crucial to understand the perspective from which students base their response. In a Q-methodological approach, 45 nursing students rank-ordered 60 opinion statements about simulation design into a distribution grid. Factor analysis revealed that nursing students hold five distinct and uniquely personal perspectives-Let Me Show You, Stand By Me, The Agony of Defeat, Let Me Think It Through, and I'm Engaging and So Should You. Results suggest that nurse educators need to reaffirm that students clearly understand the purpose of each simulation activity. Nurse educators should incorporate presimulation assignments to optimize learning and help allay anxiety. The five perspectives discovered in this study can serve as a tool to discern individual students' learning needs. Copyright 2015, SLACK Incorporated.

  7. On the collaborative design and simulation of space camera: STOP (structural/thermal/optical) analysis

    NASA Astrophysics Data System (ADS)

    Duan, Pengfei; Lei, Wenping

    2017-11-01

    A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a Space Camera. Separate design models are normally constructed with each discipline's CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to requirements that have been levied on each discipline, and technical interaction between the different disciplines is limited and infrequent. As a result, a unified view of the Space Camera design across discipline boundaries is not directly possible in this approach, and generating one would require a large, manual, and error-prone process. A collaborative environment built on abstract models and performance templates allows engineering data and CAD/CAE results to be shared across these discipline boundaries within a common interface, so that it can help attain rapid multivariate design and directly evaluate optical performance under environmental loadings. A small interdisciplinary engineering team from the Beijing Institute of Space Mechanics and Electricity has recently conducted a Structural/Thermal/Optical (STOP) analysis of a space camera with this collaborative environment. STOP analysis evaluates the changes in image quality that arise from the structural deformations when the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which 1. Independent thermal and structural meshes are generated. 2. The thermal mesh and relevant engineering data for material properties and thermal boundary conditions are then used to compute temperature distributions at nodal points in both the thermal and structural meshes through Thermal Desktop, a COTS thermal design and analysis code. 3. Thermally induced structural
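
    The hand-off order of such a STOP loop can be summarized in a few lines of Python. Every function below is a stub standing in for a real tool (the thermal solver in the Thermal Desktop role, a structural FEM code, and an optical code), so only the data flow is meaningful; the names and return values are illustrative assumptions.

      # Schematic of the STOP data flow; stubs replace the real solvers.
      def generate_meshes(cad_model):      return ("thermal_mesh", "structural_mesh")   # step 1
      def solve_thermal(mesh, case):       return {"node_temps_K": [293.0, 295.5]}      # step 2
      def map_temperatures(temps, mesh):   return temps
      def solve_structural(mesh, temps):   return {"displacements_um": [1.2, -0.4]}     # step 3
      def evaluate_image_quality(deform):  return {"wavefront_error_nm": 35.0}

      def stop_analysis(cad_model, tvac_case):
          thermal_mesh, struct_mesh = generate_meshes(cad_model)
          temps = solve_thermal(thermal_mesh, tvac_case)
          deform = solve_structural(struct_mesh, map_temperatures(temps, struct_mesh))
          return evaluate_image_quality(deform)   # change in image quality for this case

      print(stop_analysis("camera_cad_model", "tvac_case_1"))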

  8. SpaceNet: Modeling and Simulating Space Logistics

    NASA Technical Reports Server (NTRS)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.

  9. RTDS-Based Design and Simulation of Distributed P-Q Power Resources in Smart Grid

    NASA Astrophysics Data System (ADS)

    Taylor, Zachariah David

    In this Thesis, we propose to utilize a battery system together with its power electronics interfaces and bidirectional charger as a distributed P-Q resource in power distribution networks. First, we present an optimization-based approach to operate such distributed P-Q resources based on the characteristics of the battery and charger system as well as the features and needs of the power distribution network. Then, we use the RTDS Simulator, which is an industry-standard simulation tool of power systems, to develop two RTDS-based design approaches. The first design is based on an ideal four-quadrant distributed P-Q power resource. The second design is based on a detailed four-quadrant distributed P-Q power resource that is developed using power electronics components. The hardware and power electronics circuitry as well as the control units are explained for the second design. After that, given the two-RTDS designs, we conducted extensive RTDS simulations to assess the performance of the designed distributed P-Q Power Resource in an IEEE 13 bus test system. We observed that the proposed design can noticeably improve the operational performance of the power distribution grid in at least four key aspects: reducing power loss, active power peak load shaving at substation, reactive power peak load shaving at substation, and voltage regulation. We examine these performance measures across three design cases: Case 1: There is no P-Q Power Resource available on the power distribution network. Case 2: The installed P-Q Power Resource only supports active power, i.e., it only utilizes its battery component. Case 3: The installed P-Q Power Resource supports both active and reactive power, i.e., it utilizes both its battery component and its power electronics charger component. In the end, we present insightful interpretations on the simulation results and suggest some future works.
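
    A minimal sketch of the dispatch idea, keeping a requested P/Q set-point inside the charger's apparent-power capability and respecting a battery state-of-charge floor, is shown below. It is not the thesis's optimization formulation or its RTDS models; the ratings and limits are hypothetical.

      import math

      def dispatch_pq(p_request_kw, q_request_kvar, s_max_kva, soc, soc_min=0.1):
          """Clip a requested P/Q set-point to the converter's P-Q capability circle."""
          p = 0.0 if (p_request_kw > 0 and soc <= soc_min) else p_request_kw  # no discharge below SOC floor
          q = q_request_kvar
          s = math.hypot(p, q)
          if s > s_max_kva:                      # scale back to stay inside the apparent-power rating
              scale = s_max_kva / s
              p, q = p * scale, q * scale
          return p, q

      print(dispatch_pq(80.0, 60.0, s_max_kva=90.0, soc=0.5))   # -> roughly (72, 54)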

  10. GSFC Space Simulation Laboratory Contamination Philosophy: Efficient Space Simulation Chamber Cleaning Techniques

    NASA Technical Reports Server (NTRS)

    Roman, Juan A.; Stitt, George F.; Roman, Felix R.

    1997-01-01

    This paper will provide a general overview of the molecular contamination philosophy of the Space Simulation Test Engineering Section and how the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) space simulation laboratory controls and maintains the cleanliness of all its facilities, thereby minimizing downtime between tests. It will also briefly cover the proper selection of, and safety precautions needed when using, some chemical solvents for wiping, washing, or spraying thermal shrouds when molecular contaminants increase to unacceptable background levels.

  11. Analysis of the Thermo-Elastic Response of Space Reflectors to Simulated Space Environment

    NASA Astrophysics Data System (ADS)

    Allegri, G.; Ivagnes, M. M.; Marchetti, M.; Poscente, F.

    2002-01-01

    The evaluation of space environment effects on materials and structures is a key issue in the proper design of long-duration missions: since a large fraction of the satellites operating in the Earth orbital environment are employed for telecommunications, the development of space antennas and reflectors featuring high dimensional stability under space environment interactions represents a major challenge for designers. The structural layout of state-of-the-art space antennas and reflectors is very complex, since several different sensitive elements and materials are employed: particular care must be taken in evaluating the actual geometrical configuration of the reflectors operating in the space environment, since very limited distortions of the designed layout can produce severe effects on the quality of the signal both received and transmitted, especially for antennas operating at high frequencies. The effects of thermal loads due to direct sunlight exposure and to Earth and Moon albedo can be easily taken into account employing the standard methods of structural analysis: on the other hand, thermal cycling and exposure to the vacuum environment produce a long-term damage accumulation which affects the whole structure. The typical effects of this exposure are the outgassing of polymeric materials and the contamination of the exposed surfaces, which can appreciably affect the thermo-mechanical properties of the materials themselves and, therefore, the global structural response. The main aim of the present paper is to evaluate the synergistic effects of thermal cycling and of exposure to the high vacuum environment on an innovative antenna developed by Alenia Spazio S.p.A.: to this purpose, both experimental and numerical research activities have been carried out. A complete prototype of the antenna was exposed to the space environment simulated by the SAS facility: the latter consists of a high vacuum chamber, equipped with

  12. Sigma model Q-balls and Q-stars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbin, Y.

    2007-10-15

    A new kind of Q-balls is found: Q-balls in a nonlinear sigma model. Their main properties are presented together with those of their self-gravitating generalization, sigma model Q-stars. A simple special limit of solutions which are bound by gravity alone ('sigma stars') is also discussed briefly. The analysis is based on calculating the mass, global U(1) charge and binding energy for families of solutions parametrized by the central value of the scalar field. Two kinds (differing by the potential term) of the new sigma model Q-balls and Q-stars are analyzed. They are found to share some characteristics while differing in other respects, like their properties for weak central scalar fields, which depend strongly on the form of the potential term. They are also compared with their ordinary counterparts and, although similar in some respects, significant differences are found, like the existence of an upper bound on the central scalar field. A special subset of the sigma model Q-stars contains those which do not possess a flat space limit. Their relation with sigma star solutions is discussed.

  13. The Q continuum simulation: Harnessing the power of GPU accelerated supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitmann, Katrin; Frontiere, Nicholas; Sewell, Chris

    2015-08-01

    Modeling large-scale sky survey observations is a key driver for the continuing development of high-resolution, large-volume, cosmological simulations. We report the first results from the "Q Continuum" cosmological N-body simulation run carried out on the GPU-accelerated supercomputer Titan. The simulation encompasses a volume of (1300 Mpc)^3 and evolves more than half a trillion particles, leading to a particle mass resolution of m_p ≃ 1.5 × 10^8 M_⊙. At this mass resolution, the Q Continuum run is currently the largest cosmology simulation available. It enables the construction of detailed synthetic sky catalogs, encompassing different modeling methodologies, including semi-analytic modeling and sub-halo abundance matching in a large, cosmological volume. Here we describe the simulation and outputs in detail and present first results for a range of cosmological statistics, such as mass power spectra, halo mass functions, and halo mass-concentration relations for different epochs. We also provide details on challenges connected to running a simulation on almost 90% of Titan, one of the fastest supercomputers in the world, including our usage of Titan's GPU accelerators.
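
    The quoted particle mass can be checked from the simulated volume, particle count, and mean matter density. The cosmological parameters and particle count below are assumptions (roughly WMAP-7-like values and an assumed 8192^3 particles), since the abstract does not list them.

      # Quick consistency check of the quoted particle mass (assumed parameters).
      omega_m, h = 0.265, 0.71
      rho_crit = 2.775e11 * h**2            # critical density, Msun / Mpc^3
      volume = 1300.0**3                    # simulated volume, Mpc^3
      n_particles = 8192**3                 # "more than half a trillion" (assumed 8192^3)
      m_p = omega_m * rho_crit * volume / n_particles
      print(f"{m_p:.2e} Msun per particle") # ~1.5e8, consistent with the abstract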

  14. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A recommendation and a specification for the visual simulation system design for the space shuttle mission simulator are presented. A recommended visual system is described which most nearly meets the visual design requirements. The cost analysis of the recommended system covering design, development, manufacturing, and installation is reported. Four alternate systems are analyzed.

  15. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  16. Efficient Computation of Anharmonic Force Constants via q-space, with Application to Graphene

    NASA Astrophysics Data System (ADS)

    Kornbluth, Mordechai; Marianetti, Chris

    We present a new approach for extracting anharmonic force constants from a sparse sampling of the anharmonic dynamical tensor. We calculate the derivative of the energy with respect to q-space displacements (phonons) and strain, which guarantees the absence of supercell image errors. Central finite differences provide a well-converged quadratic error tail for each derivative, separating the contribution of each anharmonic order. These derivatives populate the anharmonic dynamical tensor in a sparse mesh that bounds the Brillouin Zone, which ensures comprehensive sampling of q-space while exploiting small-cell calculations for efficient, high-throughput computation. This produces a well-converged and precisely-defined dataset, suitable for big-data approaches. We transform this sparsely-sampled anharmonic dynamical tensor to real-space anharmonic force constants that obey full space-group symmetries by construction. Machine-learning techniques identify the range of real-space interactions. We show the entire process executed for graphene, up to and including the fifth-order anharmonic force constants. This method successfully calculates strain-based phonon renormalization in graphene, even under large strains, which solves a major shortcoming of previous potentials.
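
    The central-difference idea can be illustrated on a toy one-dimensional energy function; energy() below is a stand-in for a total-energy evaluation at a frozen displacement amplitude, not the paper's actual workflow. The antisymmetric combination cancels the even-order terms and isolates the odd-order derivative with a quadratic error tail.

      # Five-point central difference for a third derivative at u = 0, error O(h^2).
      def energy(u):
          return 0.5 * 3.1 * u**2 + (1.0 / 6.0) * 4.2 * u**3   # toy potential: harmonic + cubic

      def third_derivative(f, h):
          return (f(2*h) - 2*f(h) + 2*f(-h) - f(-2*h)) / (2 * h**3)

      for h in (0.2, 0.1, 0.05):
          # recovers the cubic coefficient 4.2 (exactly here, since the toy has no higher orders)
          print(h, third_derivative(energy, h))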

  17. BFV-BRST analysis of the classical and quantum q-deformations of the sl(2) algebra

    NASA Astrophysics Data System (ADS)

    Dayi, O. F.

    1994-01-01

    The BFV-BRST charge for q-deformed algebras is not unique. Different constructions of it in the classical as well as in the quantum phase space for the q-deformed algebra sl_q(2) are discussed. Moreover, deformation of the phase space without deforming the generators of sl(2) is considered. The ℏ-q-deformation of the phase space is shown to yield Witten's second deformation. To study the BFV-BRST cohomology problem when both the quantum phase space and the group are deformed, a two-parameter deformation of sl(2) is proposed, and its BFV-BRST charge is given.

  18. Wake Encounter Analysis for a Closely Spaced Parallel Runway Paired Approach Simulation

    NASA Technical Reports Server (NTRS)

    Mckissick,Burnell T.; Rico-Cusi, Fernando J.; Murdoch, Jennifer; Oseguera-Lohr, Rosa M.; Stough, Harry P, III; O'Connor, Cornelius J.; Syed, Hazari I.

    2009-01-01

    A Monte Carlo simulation of simultaneous approaches performed by two transport category aircraft from the final approach fix to a pair of closely spaced parallel runways was conducted to explore the aft boundary of the safe zone in which separation assurance and wake avoidance are provided. The simulation included variations in runway centerline separation, initial longitudinal spacing of the aircraft, crosswind speed, and aircraft speed during the approach. The data from the simulation showed that the majority of the wake encounters occurred near or over the runway and the aft boundaries of the safe zones were identified for all simulation conditions.
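
    A schematic Monte Carlo loop of the kind described above is sketched below. The wake-transport model and the encounter test are crude placeholders, not the study's aircraft or wake models, and the sampling ranges are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)

      def wake_encounter(centerline_sep_m, initial_spacing_m, crosswind_mps, speed_mps):
          time_in_trail = initial_spacing_m / speed_mps      # exposure time behind the lead aircraft
          lateral_drift = crosswind_mps * time_in_trail      # placeholder wake lateral transport
          return lateral_drift > centerline_sep_m            # wake reaches the adjacent approach

      n_runs, encounters = 10000, 0
      for _ in range(n_runs):
          encounters += wake_encounter(
              centerline_sep_m=rng.uniform(230.0, 470.0),
              initial_spacing_m=rng.uniform(150.0, 900.0),
              crosswind_mps=rng.uniform(0.0, 5.0),
              speed_mps=rng.uniform(65.0, 75.0),
          )
      print(f"encounter fraction: {encounters / n_runs:.3f}")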

  19. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  20. Apu/hydraulic/actuator Subsystem Computer Simulation. Space Shuttle Engineering and Operation Support, Engineering Systems Analysis. [for the space shuttle

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Major developments are examined which have taken place to date in the analysis of the power and energy demands on the APU/Hydraulic/Actuator Subsystem for the space shuttle during the entry-to-touchdown (not including rollout) flight regime. These developments are given in the form of two subroutines which were written for use with the Space Shuttle Functional Simulator. The first subroutine calculates the power and energy demand on each of the three hydraulic systems due to control surface (inboard/outboard elevons, rudder, speedbrake, and body flap) activity. The second subroutine incorporates the R. I. priority rate limiting logic which limits control surface deflection rates as a function of the number of failed hydraulic systems. Typical results of this analysis are included, and listings of the subroutines are presented in appendices.
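
    An illustrative sketch of this kind of demand calculation is shown below (it is not the shuttle subroutines): hydraulic power demand is taken as hinge moment times surface deflection rate, with surface rates limited according to the number of failed hydraulic systems. The rate-limit table and surface values are hypothetical.

      import math

      RATE_LIMIT_DEG_S = {0: 20.0, 1: 14.0, 2: 7.0}   # allowed rate vs. failed systems (hypothetical)

      def hydraulic_power_w(surfaces, failed_systems):
          """surfaces: list of (hinge_moment_Nm, commanded_rate_deg_per_s) tuples."""
          limit = RATE_LIMIT_DEG_S[failed_systems]
          total = 0.0
          for moment, rate in surfaces:
              rate = max(-limit, min(limit, rate))            # priority rate limiting
              total += abs(moment) * abs(math.radians(rate))  # mechanical power demand, W
          return total

      print(hydraulic_power_w([(5.0e4, 10.0), (2.0e4, -25.0)], failed_systems=1))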

  1. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  2. A space systems perspective of graphics simulation integration

    NASA Technical Reports Server (NTRS)

    Brown, R.; Gott, C.; Sabionski, G.; Bochsler, D.

    1987-01-01

    Creation of an interactive display environment can expose issues in system design and operation not apparent from nongraphics development approaches. Large amounts of information can be presented in a short period of time. Processes can be simulated and observed before committing resources. In addition, changes in the economics of computing have enabled broader graphics usage beyond traditional engineering and design into integrated telerobotics and Artificial Intelligence (AI) applications. The highly integrated nature of space operations often tends to rely upon visually intensive man-machine communication to ensure success. Graphics simulation activities at the Mission Planning and Analysis Division (MPAD) of NASA's Johnson Space Center are focusing on the evaluation of a wide variety of graphical analyses within the context of present and future space operations. Several telerobotics and AI application studies utilizing graphical simulation are described. The presentation includes portions of videotape illustrating technology developments involving: (1) coordinated manned maneuvering unit and remote manipulator system operations, (2) a helmet mounted display system, and (3) an automated rendezvous application utilizing expert system and voice input/output technology.

  3. Eighteenth Space Simulation Conference: Space Mission Success Through Testing

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Compiler)

    1994-01-01

    The Institute of Environmental Sciences' Eighteenth Space Simulation Conference, 'Space Mission Success Through Testing' provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme 'Space Mission Success Through Testing.'

  4. The tracking analysis in the Q-weak experiment

    NASA Astrophysics Data System (ADS)

    Pan, J.; Androic, D.; Armstrong, D. S.; Asaturyan, A.; Averett, T.; Balewski, J.; Beaufait, J.; Beminiwattha, R. S.; Benesch, J.; Benmokhtar, F.; Birchall, J.; Carlini, R. D.; Cates, G. D.; Cornejo, J. C.; Covrig, S.; Dalton, M. M.; Davis, C. A.; Deconinck, W.; Diefenbach, J.; Dowd, J. F.; Dunne, J. A.; Dutta, D.; Duvall, W. S.; Elaasar, M.; Falk, W. R.; Finn, J. M.; Forest, T.; Gaskell, D.; Gericke, M. T. W.; Grames, J.; Gray, V. M.; Grimm, K.; Guo, F.; Hoskins, J. R.; Johnston, K.; Jones, D.; Jones, M.; Jones, R.; Kargiantoulakis, M.; King, P. M.; Korkmaz, E.; Kowalski, S.; Leacock, J.; Leckey, J.; Lee, A. R.; Lee, J. H.; Lee, L.; MacEwan, S.; Mack, D.; Magee, J. A.; Mahurin, R.; Mammei, J.; Martin, J. W.; McHugh, M. J.; Meekins, D.; Mei, J.; Michaels, R.; Micherdzinska, A.; Mkrtchyan, A.; Mkrtchyan, H.; Morgan, N.; Myers, K. E.; Narayan, A.; Ndukum, L. Z.; Nelyubin, V.; Nuruzzaman; van Oers, W. T. H.; Opper, A. K.; Page, S. A.; Pan, J.; Paschke, K. D.; Phillips, S. K.; Pitt, M. L.; Poelker, M.; Rajotte, J. F.; Ramsay, W. D.; Roche, J.; Sawatzky, B.; Seva, T.; Shabestari, M. H.; Silwal, R.; Simicevic, N.; Smith, G. R.; Solvignon, P.; Spayde, D. T.; Subedi, A.; Subedi, R.; Suleiman, R.; Tadevosyan, V.; Tobias, W. A.; Tvaskis, V.; Waidyawansa, B.; Wang, P.; Wells, S. P.; Wood, S. A.; Yang, S.; Young, R. D.; Zhamkochyan, S.

    2016-12-01

    The Q-weak experiment at Jefferson Laboratory measured the parity violating asymmetry (A_PV) in elastic electron-proton scattering at small momentum transfer squared (Q^2 = 0.025 (GeV/c)^2), with the aim of extracting the proton's weak charge (Q_W^p) to an accuracy of 5%. As one of the major uncertainty contribution sources to Q_W^p, Q^2 needs to be determined to ~1% so as to reach the proposed experimental precision. For this purpose, two sets of high resolution tracking chambers were employed in the experiment, to measure tracks before and after the magnetic spectrometer. Data collected by the tracking system were then reconstructed with dedicated software into individual electron trajectories for experimental kinematics determination. The Q-weak kinematics and the analysis scheme for tracking data are briefly described here. The sources that contribute to the uncertainty of Q^2 are discussed, and the current analysis status is reported.
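
    For reference, the four-momentum transfer squared in elastic electron-proton scattering, reconstructed from the beam energy E and the scattered-electron angle θ measured by the tracking system, follows the standard elastic kinematic relation below (standard kinematics, not a formula quoted from the Q-weak analysis itself):

      Q^2 = 4\,E\,E'\,\sin^2(\theta/2)
          = \frac{4E^2\sin^2(\theta/2)}{1 + (2E/M_p)\sin^2(\theta/2)},
      \qquad E' = \frac{E}{1 + (2E/M_p)\sin^2(\theta/2)},

    where E' is the scattered-electron energy, M_p the proton mass, and the electron mass is neglected.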

  5. The tracking analysis in the Q-weak experiment

    DOE PAGES

    Pan, J.; Androic, D.; Armstrong, D. S.; ...

    2016-11-21

    Here, the Q-weak experiment at Jefferson Laboratory measured the parity violating asymmetry (A_PV) in elastic electron-proton scattering at small momentum transfer squared (Q^2 = 0.025 (GeV/c)^2), with the aim of extracting the proton's weak charge (Q_W^p) to an accuracy of 5 %. As one of the major uncertainty contribution sources to Q_W^p, Q^2 needs to be determined to ~1 % so as to reach the proposed experimental precision. For this purpose, two sets of high resolution tracking chambers were employed in the experiment, to measure tracks before and after the magnetic spectrometer. Data collected by the tracking system were then reconstructed with dedicated software into individual electron trajectories for experimental kinematics determination. The Q-weak kinematics and the analysis scheme for tracking data are briefly described here. The sources that contribute to the uncertainty of Q^2 are discussed, and the current analysis status is reported.

  6. Fifteenth Space Simulation Conference: Support the Highway to Space Through Testing

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph (Editor)

    1988-01-01

    The Institute of Environmental Sciences Fifteenth Space Simulation Conference, Support the Highway to Space Through Testing, provided participants a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, thermal simulation and protection, contamination, and techniques of test measurements.

  7. Fourteenth Space Simulation Conference: Testing for a Permanent Presence in Space

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Editor)

    1986-01-01

    The Institute of Environmental Sciences Fourteenth Space Simulation Conference, Testing for a Permanent Presence in Space, provided participants a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, thermal simulation, and protection, contamination, and techniques of test measurements.

  8. The future of simulations for space applications

    NASA Astrophysics Data System (ADS)

    Matsumoto, H.

    Space development has been rapidly increasing and there will be huge investment by business markets in space development and applications such as space factories and Solar Power Stations (SPS). In such a situation, we would like to send a warning message regarding the future of space simulations. It is widely recognized that space simulations have contributed to the quantitative understanding of various plasma phenomena occurring in the solar-terrestrial environment. In the current century, however, in addition to the conventional contribution to solar-terrestrial physics, we also have to pay attention to the application of space simulation to human activities in space. We believe that space simulations can be a powerful and helpful tool for understanding the spacecraft-environment interactions occurring in space development and applications. The global influence on the plasmasphere of heavy ions exhausted from electric propulsion can also be analyzed by the combination of MHD and particle simulations. The results obtained in the simulations can provide very significant and beneficial information so that we can minimize the undesirable effects in space development and applications. 1. Brief history of ISSS and contribution to space plasma physics: Numerical simulation has been largely recognized as a powerful tool in the advance of space plasma physics. The International School for Space Simulation (ISSS) series was set up in order to emphasize such a recognition in the early eighties, on the common initiative of M. Ashour-Abdalla, R. Gendrin, T. Sato and myself. The preceding five ISSS's (in Japan, USA, France, Japan, and Japan again) have greatly contributed to the promotion and advance of computer simulations as well as the education of students trying to start simulation studies for their own research objectives.

  9. The Seventeenth Space Simulation Conference. Terrestrial Test for Space Success

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Compiler)

    1992-01-01

    The Institute of Environmental Sciences' Seventeenth Space Simulation Conference, 'Terrestrial Test for Space Success' provided participants with a forum to acquire and exchange information on the state of the art in space simulation, test technology, atomic oxygen, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme of 'terrestrial test for space success.'

  10. Ninth Conference on Space Simulation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The papers presented in this conference provided an international dialogue and a meaningful exchange in the simulation of space environments as well as the evolution of these technological advances into other fields. The papers represent a significant contribution to the understanding of space simulation problems and the utilization of this knowledge. The topics of the papers include; spacecraft testing; facilities and test equipment; system and subsystem test; life sciences, medicine and space; physical environmental factors; chemical environmental factors; contamination; space physics; and thermal protection.

  11. Q-Sample Construction: A Critical Step for a Q-Methodological Study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2016-01-01

    Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (concourse) on a particular topic of interest from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes to conduct Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives that nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and for representativeness of opinions contained within the concourse. The Q-sample was piloted and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.

  12. Molecular analysis of Hb Q-H disease and Hb Q-Hb E in a Singaporean family.

    PubMed

    Tan, J; Tay, J S; Wong, Y C; Kham, S K; Bte Abd Aziz, N; Teo, S H; Wong, H B

    1995-01-01

    Hb Q (alpha 74 Asp-His) results from a mutation in the alpha-gene such that abnormal alpha Q-chains are synthesized. The alpha Q-chains combine with the normal beta A-chains to form the abnormal Hb alpha 2Q beta 2A (Hb Q). Hb Q-H disease is rare, and has been reported only in the Chinese. We report here a Chinese family in which the mother was diagnosed with Hb Q-H disease, the father with Hb E heterozygosity, and a child with Hb Q-E-thalassemia. Thalassemia screening of the mother's blood revealed a Hb level of 6.8 g/dl with low MCV and MCH. Her blood film was indicative of thalassemia. Cellulose acetate electrophoresis showed Hb H and Hb Q with the absence of Hb A. Globin chain biosynthesis was carried out and alpha Q- and beta-chains were detected. Normal alpha-chains were absent. Digestion of the mother's DNA with Bam HI and Bgl II followed by hybridization with the 1.5 kb alpha-Pst probe showed a two alpha-gene deletion on one chromosome and the alpha Q-chain mutant with the -alpha 4.2 defect on the other chromosome. DNA amplification studies indicated the two-gene deletion to be of the --SEA/ defect. The patient was concluded to have Hb Q-H disease (--SEA/-alpha 4.2Q). Cellulose acetate electrophoresis of the father's blood showed the presence of Hb A, F and E. Molecular analysis of the father's DNA confirmed an intact set of alpha-genes (alpha alpha/alpha alpha). Globin chain biosynthesis of fetal blood of their child showed gamma, beta A, beta E, alpha A and alpha Q-chains. Molecular analysis of the child's DNA showed one alpha-gene deletion, thus giving a genotype of alpha alpha/-alpha 4.2Q beta beta E.

  13. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on-board the Freedom Space Station. The further analysis performed on the SCS study as part of task 2-Perform Studies and Parametric Analysis-of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of task 1, and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS task 3-Develop and present SCS requirements, and SCS task 4-develop SCS conceptual designs. The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.

  14. Killing Forms on the Five-Dimensional Einstein-Sasaki Y(p, q) Spaces

    NASA Astrophysics Data System (ADS)

    Visinescu, Mihai

    2012-12-01

    We present the complete set of Killing-Yano tensors on the five-dimensional Einstein-Sasaki Y(p, q) spaces. Two new Killing-Yano tensors are identified, associated with the complex volume form of the Calabi-Yau metric cone. The corresponding hidden symmetries are not anomalous and the geodesic equations are superintegrable.

  15. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  16. Using BMDP and SPSS for a Q factor analysis.

    PubMed

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.

  17. Evolution of axis ratios from phase space dynamics of triaxial collapse

    NASA Astrophysics Data System (ADS)

    Nadkarni-Ghosh, Sharvari; Arya, Bhaskar

    2018-04-01

    We investigate the evolution of axis ratios of triaxial haloes using the phase space description of triaxial collapse. In this formulation, the evolution of the triaxial ellipsoid is described in terms of the dynamics of eigenvalues of three important tensors: the Hessian of the gravitational potential, the tensor of velocity derivatives, and the deformation tensor. The eigenvalues of the deformation tensor are directly related to the parameters that describe triaxiality, namely, the minor-to-major and intermediate-to-major axis ratios (s and q) and the triaxiality parameter T. Using the phase space equations, we evolve the eigenvalues and examine the evolution of the probability distribution function (PDF) of the axis ratios as a function of mass scale and redshift for Gaussian initial conditions. We find that the ellipticity and prolateness increase with decreasing mass scale and decreasing redshift. These trends agree with previous analytic studies but differ from numerical simulations. However, the PDF of the scaled parameter q̃ = (q − s)/(1 − s) follows a universal distribution over two decades in mass range and redshifts, which is in qualitative agreement with the universality for the conditional PDF reported in simulations. We further show using the phase space dynamics that, in fact, q̃ is a phase space invariant and is conserved individually for each halo. These results demonstrate that the phase space analysis is a useful tool that provides a different perspective on the evolution of perturbations and can be applied to more sophisticated models in the future.
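
    A small worked example of the shape parameters involved is given below, computing s, q, the triaxiality parameter T, and the scaled parameter q̃ from the three axis lengths of an ellipsoid. The axis lengths are arbitrary illustrative numbers; the standard definition T = (a² − b²)/(a² − c²) is used.

      def shape_parameters(a, b, c):
          """a >= b >= c are the major, intermediate, and minor axes of the ellipsoid."""
          s, q = c / a, b / a                         # minor-to-major, intermediate-to-major ratios
          T = (a**2 - b**2) / (a**2 - c**2)           # triaxiality parameter
          q_tilde = (q - s) / (1.0 - s)               # the scaled parameter reported to be invariant
          return s, q, T, q_tilde

      print(shape_parameters(1.0, 0.7, 0.4))          # -> (0.4, 0.7, ~0.607, 0.5)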

  18. GCR Simulator Development Status at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Slaba, T. C.; Norbury, J. W.; Blattnig, S. R.

    2015-01-01

    There are large uncertainties connected to the biological response to exposure to galactic cosmic rays (GCR) on long duration deep space missions. In order to reduce the uncertainties and gain understanding about the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed with mono-energetic ion beams. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment at the NASA Space Radiation Laboratory (NSRL) are discussed. First, comparisons are made between direct simulation of the external, free space GCR field, and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, a reference environment for the GCR simulator and suitable for deep space missions is identified and described in terms of fluence and integrated dosimetric quantities. Analysis results are given to justify the use of a single reference field over a range of shielding conditions and solar activities. Third, an approach for simulating the reference field at NSRL is presented. The approach directly considers the hydrogen and helium energy spectra, and the heavier ions are collectively represented by considering the linear energy transfer (LET) spectrum. While many more aspects of the experimental setup need to be considered before final implementation of the GCR simulator, this preliminary study provides useful information that should aid the final design. Possible drawbacks of the proposed methodology are discussed and weighed

  19. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  20. Hypermultiplet gaugings and supersymmetric solutions from 11D and massive IIA supergravity on H^{(p,q)} spaces

    NASA Astrophysics Data System (ADS)

    Guarino, Adolfo

    2018-03-01

    Supersymmetric AdS_4, AdS_2 × Σ_2 and asymptotically AdS_4 black hole solutions are studied in the context of non-minimal N=2 supergravity models involving three vector multiplets (STU-model) and Abelian gaugings of the universal hypermultiplet moduli space. Such models correspond to consistent subsectors of the SO(p,q) and ISO(p,q) gauged maximal supergravities that arise from the reduction of 11D and massive IIA supergravity on H^(p,q) spaces down to four dimensions. A unified description of all the models is provided in terms of a square-root prepotential and the gauging of a duality-hidden symmetry pair of the universal hypermultiplet. Some aspects of M-theory and massive IIA holography are mentioned in passing.

  1. 21st Space Simulation Conference: The Future of Space Simulation Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Compiler)

    2000-01-01

    The Institute of Environmental Sciences and Technology's Twenty-first Space Simulation Conference, "The Future of Space Testing in the 21st Century" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, programs/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Future of Space Testing in the 21st Century."

  2. Lipschitz and Besov spaces in quantum calculus

    NASA Astrophysics Data System (ADS)

    Nemri, Akram; Selmi, Belgacem

    2016-08-01

    The purpose of this paper is to investigate harmonic analysis on the time scale 𝕋_q, q ∈ (0, 1), and to introduce q-weighted Besov spaces, subspaces of L^p(𝕋_q), generalizing the classical ones. Further, an example of a q-weight w_{α,β}(·; q) is introduced and studied. We give a new characterization of the q-Besov space using the q-Poisson kernel and the g_1 Littlewood-Paley operator.

  3. Molecular dynamics simulation studies of the wild type and E92Q/N155H mutant of Elvitegravir-resistance HIV-1 integrase.

    PubMed

    Chen, Qi; Cheng, Xiaolin; Wei, Dongqing; Xu, Qin

    2015-03-01

    Although Elvitegravir (EVG) is a newly developed antiretroviral drug to treat the acquired immunodeficiency syndrome (AIDS), drug resistance has already been found in the clinic, such as E92Q/N155H and Q148H/G140S. Several structural investigations have already been reported to reveal the molecular mechanism of the drug resistance. As a full length crystal structure for HIV-1 integrase is still unsolved, we herein use the crystal structure of the full length prototype foamy virus (PFV) integrase in complex with viral DNA and the inhibitor Elvitegravir as a template to construct the wild type and E92Q/N155H mutant systems of HIV-1 integrase. Molecular dynamics simulations were used to reveal the binding mode and the drug resistance of the EVG ligand in E92Q/N155H. Several important interactions were discovered between the mutated residues and the residues in the active site of the E92Q/N155H double mutant pattern, and cross correlation and clustering methods were used for detailed analysis. The results from the MD simulation studies will be used to guide the experimental efforts of developing novel inhibitors against drug-resistant HIV integrase mutants.
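
    As an illustration of the cross-correlation analysis mentioned above, the sketch below computes a dynamic cross-correlation matrix from coordinate fluctuations over a trajectory array (frames × atoms × 3). The data here are random stand-ins, not the HIV-1 integrase trajectories from the study.

      import numpy as np

      def cross_correlation(traj):
          fluct = traj - traj.mean(axis=0)                        # fluctuations about the mean structure
          dots = np.einsum('fik,fjk->ij', fluct, fluct) / len(traj)
          norms = np.sqrt(np.diag(dots))
          return dots / np.outer(norms, norms)                    # C_ij in [-1, 1]

      traj = np.random.default_rng(2).normal(size=(500, 50, 3))   # 500 frames, 50 atoms (stand-in data)
      C = cross_correlation(traj)
      print(C.shape, C[0, 0])                                     # (50, 50), 1.0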

  4. Evaluation of radioisotope tracer and activation analysis techniques for contamination monitoring in space environment simulation chambers

    NASA Technical Reports Server (NTRS)

    Smathers, J. B.; Kuykendall, W. E., Jr.; Wright, R. E., Jr.; Marshall, J. R.

    1973-01-01

    Radioisotope measurement techniques and neutron activation analysis are evaluated for use in identifying and locating contamination sources in space environment simulation chambers. The alpha range method allows the determination of total contaminant concentration in vapor state and condensate state. A Cf-252 neutron activation analysis system for detecting oils and greases tagged with stable elements is described. While neutron activation analysis of tagged contaminants offers specificity, an on-site system is extremely costly to implement and provides only marginal detection sensitivity under even the most favorable conditions.

  5. Thermally Induced Vibrations of the Hubble Space Telescope's Solar Array 3 in a Test Simulated Space Environment

    NASA Technical Reports Server (NTRS)

    Early, Derrick A.; Haile, William B.; Turczyn, Mark T.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    NASA Goddard Space Flight Center and the European Space Agency (ESA) conducted a disturbance verification test on a flight Solar Array 3 (SA3) for the Hubble Space Telescope using the ESA Large Space Simulator (LSS) in Noordwijk, the Netherlands. The LSS cyclically illuminated the SA3 to simulate orbital temperature changes in a vacuum environment. Data acquisition systems measured signals from force transducers and accelerometers resulting from thermally induced vibrations of the SA3. The LSS with its seismic mass boundary provided an excellent background environment for this test. This paper discusses the analysis performed on the measured transient SA3 responses and provides a summary of the results.

  6. Next Generation Simulation Framework for Robotic and Human Space Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  7. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and Real-Time Simulation plays a key role in the Space domain, being exploited for mission and systems analysis and engineering as well as for crew training and operational support. One of the most popular standards is the 1516-2010 IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) can interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.

  8. Seismic imaging of Q structures by a trans-dimensional coda-wave analysis

    NASA Astrophysics Data System (ADS)

    Takahashi, Tsutomu

    2017-04-01

    Wave scattering and intrinsic attenuation are important processes for describing the incoherent and complex wave trains of high frequency seismic waves (>1 Hz). The multiple lapse time window analysis (MLTWA) has been used to estimate scattering and intrinsic Q values by assuming constant Q in a study area (e.g., Hoshiba 1993). This study generalizes the MLTWA to estimate lateral variations of Q values under the Bayesian framework in a dimension-variable space. The study area is partitioned into small areas by means of the Voronoi tessellation. Scattering and intrinsic Q in each small area are constant. We define a misfit function for spatiotemporal variations of wave energy as with the original MLTWA, and maximize the posterior probability with changes not only to the Q values but also to the number and spatial layout of the Voronoi cells. This maximization is conducted by means of the reversible jump Markov chain Monte Carlo (rjMCMC) method (Green 1995), since the number of unknown parameters (i.e., the dimension of the posterior probability) is variable. After convergence to the maximum posterior, we estimate Q structures from the ensemble averages of MCMC samples around the maximum posterior probability. Synthetic tests showed stable reconstructions of input structures with reasonable error distributions. We applied this method to seismic waveform data recorded by ocean bottom seismographs at the outer-rise area off Tohoku, and estimated Q values at 4-8 Hz, 8-16 Hz and 16-32 Hz. Intrinsic Q is nearly constant in all frequency bands, and scattering Q shows two distinct strong scattering regions at a petit-spot area and a high-seismicity area. These strong scattering regions are probably related to magma inclusions and fractured structure, respectively. The difference between these two areas becomes clearer at high frequencies. This means that scale dependences of inhomogeneities, or smaller scale inhomogeneity, are important for discussing medium properties and the origins of structural variations. While the generalized MLTWA is based on
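
    A minimal sketch of the piecewise-constant model used in such a trans-dimensional parameterization is given below: each Voronoi nucleus carries its own scattering and intrinsic Q, and any point takes the values of its nearest nucleus. The rjMCMC birth/death machinery is omitted, and the coordinates and Q values are hypothetical.

      import numpy as np

      def q_at(points, nuclei, q_scat, q_int):
          """Return (Q_scattering, Q_intrinsic) at each point from its nearest Voronoi nucleus."""
          d = np.linalg.norm(points[:, None, :] - nuclei[None, :, :], axis=2)
          idx = d.argmin(axis=1)                  # nearest-nucleus (Voronoi cell) assignment
          return q_scat[idx], q_int[idx]

      nuclei = np.array([[0.0, 0.0], [50.0, 20.0], [20.0, 60.0]])   # cell nuclei, km (hypothetical)
      q_scat = np.array([300.0, 80.0, 150.0])
      q_int = np.array([600.0, 500.0, 550.0])
      print(q_at(np.array([[10.0, 5.0], [45.0, 30.0]]), nuclei, q_scat, q_int))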

  9. Compact Q-balls and Q-shells in a scalar electrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arodz, H.; Lis, J.

    2009-02-15

    We investigate spherically symmetric nontopological solitons in electrodynamics with a scalar field self-interaction U ∼ |ψ| taken from the complex signum-Gordon model. We find Q-balls for small absolute values of the total electric charge Q, and Q-shells when |Q| is large enough. In both cases the charge density exactly vanishes outside certain compact regions in the three-dimensional space. The dependence of the total energy E of small Q-balls on the total electric charge has the form E ∼ |Q|^(5/6), while in the case of very large Q-shells, E ∼ |Q|^(7/6).

  10. Simulator of Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Jennings, Esther; Gao, Jay; Segui, John; Kwong, Winston

    2005-01-01

    Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) is a suite of software tools that simulates the behaviors of communication networks to be used in space exploration and predicts the performance of established and emerging space communication protocols and services. MACHETE consists of four general software systems: (1) a system for kinematic modeling of planetary and spacecraft motions; (2) a system for characterizing the engineering impact on the bandwidth and reliability of deep-space and in-situ communication links; (3) a system for generating traffic loads and modeling protocol behaviors and state machines; and (4) a user-interface system for visualizing performance metrics. The kinematic-modeling system makes it possible to characterize space link connectivity effects, including occultations and signal losses arising from dynamic slant-range changes and antenna radiation patterns. The link-engineering system also accounts for antenna radiation patterns and other phenomena, including modulations, data rates, coding, noise, and multipath fading. The protocol system utilizes information from the kinematic-modeling and link-engineering systems to simulate operational scenarios of space missions and evaluate overall network performance. In addition, a Communications Effect Server (CES) interface for MACHETE has been developed to facilitate hybrid simulation of space communication networks with actual flight/ground software/hardware embedded in the overall system.

  11. Q-Space Scattering Power Laws and the Interior Fields of Particles

    DTIC Science & Technology

    2016-02-12

    This work studied the relationship between light scattered by particles of any shape and the interior field of that particle. (Final Report, 7 Jul 2014 - 6 Apr 2015: Q-space Scattering Power Laws and the Interior Fields of Particles; approved for public release, distribution unlimited.)

  12. q-deformed superstatistics of the Schrödinger equation in commutative and noncommutative spaces with magnetic field

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-01-01

    We discuss the q-deformed algebra and study the Schrödinger equation in commutative and noncommutative spaces under an external magnetic field. We obtain the energy spectrum by an analytical method and calculate the thermodynamic properties of the system using q-deformed superstatistics. In effect, we derive a generalized version of ordinary superstatistics for non-equilibrium systems, and different effective Boltzmann factor descriptions are derived. In addition, we discuss the results for various values of θ in commutative and noncommutative spaces and plot some figures to illustrate them.
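
    For orientation, the standard Beck-Cohen superstatistics construction referred to above can be written as follows; this is the textbook form, not necessarily the exact q-deformed generalization derived by the authors.

    % Standard superstatistics: the effective Boltzmann factor is an average of
    % ordinary Boltzmann factors over a distribution f(\beta) of inverse temperatures.
    % For a Gamma (chi-squared) distributed \beta it reduces to a Tsallis-type power law.
    \begin{align}
      B(E) &= \int_0^{\infty} f(\beta)\, e^{-\beta E}\, d\beta, \\
      f(\beta) &= \frac{1}{\Gamma\!\left(\tfrac{1}{q-1}\right)}
                  \left(\frac{1}{(q-1)\beta_0}\right)^{\frac{1}{q-1}}
                  \beta^{\frac{1}{q-1}-1} e^{-\beta/[(q-1)\beta_0]}
      \quad\Longrightarrow\quad
      B(E) = \bigl[1 + (q-1)\beta_0 E\bigr]^{-\frac{1}{q-1}} .
    \end{align}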

  13. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC fully remedies, improves, or augments current deep space knowledge deficiencies relative to existing operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated

  14. Statistical Analysis of Q-matrix Based Diagnostic Classification Models

    PubMed Central

    Chen, Yunxiao; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2014-01-01

    Diagnostic classification models have recently gained prominence in educational assessment, psychiatric evaluation, and many other disciplines. Central to the model specification is the so-called Q-matrix that provides a qualitative specification of the item-attribute relationship. In this paper, we develop identifiability theory for the Q-matrix under the DINA and DINO models. We further propose an estimation procedure for the Q-matrix through regularized maximum likelihood. The applicability of this procedure is not limited to the DINA or the DINO model; it can be applied to essentially all Q-matrix based diagnostic classification models. Simulation studies are conducted to illustrate its performance. Furthermore, two case studies are presented. The first case is a data set on fraction subtraction (educational application) and the second case is a subsample of the National Epidemiological Survey on Alcohol and Related Conditions concerning the social anxiety disorder (psychiatric application). PMID:26294801
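
    The role the Q-matrix plays in such models can be illustrated with the DINA item response function. The following sketch only evaluates response probabilities for given slip and guessing parameters (all values below are hypothetical); it is not the regularized maximum-likelihood estimation procedure proposed in the paper.

    # Minimal sketch of the DINA item response function in which the Q-matrix appears.
    # It shows how the Q-matrix maps attribute profiles to response probabilities; it
    # is not the paper's regularized maximum-likelihood estimator.
    import numpy as np

    def dina_probabilities(Q, alpha, slip, guess):
        """P(X_ij = 1 | alpha_i) under DINA.
        Q     : (J items x K attributes) 0/1 matrix
        alpha : (N subjects x K attributes) 0/1 mastery profiles
        slip, guess : length-J arrays of item slip and guessing parameters."""
        # eta_ij = 1 iff subject i masters every attribute required by item j
        eta = np.all(alpha[:, None, :] >= Q[None, :, :], axis=2).astype(float)
        return (1.0 - slip) ** eta * guess ** (1.0 - eta)

    # Hypothetical toy example: 2 attributes, 3 items, 2 subjects.
    Q = np.array([[1, 0], [0, 1], [1, 1]])
    alpha = np.array([[1, 0], [1, 1]])
    slip = np.array([0.1, 0.2, 0.1])
    guess = np.array([0.2, 0.1, 0.05])
    print(dina_probabilities(Q, alpha, slip, guess))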

  15. Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.

    PubMed

    OConnor, William; Runquist, Elizabeth A

    2008-07-01

    Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to by-pass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10% in comparison with 4 to 34% for the threshold method. In contrast, ratios determined by an extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
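
    A simplified sketch of the kind of log-linear window fitting described above is shown below. It is an illustration only, not the published Q-Anal code: the fluorescence curves are made up, and the steepest-window criterion stands in for the iterative error-minimizing regression of the actual algorithm.

    # Sketch: locate the exponential phase of a qPCR curve with a sliding log-linear
    # fit, estimate efficiency from the slope, interpolate Ct at a common threshold,
    # and form a relative expression ratio. Data and window length are made up.
    import numpy as np

    def exponential_phase_fit(fluorescence, window=5):
        """Fit straight lines to sliding windows of log10(fluorescence) and keep the
        steepest one, taken here as the exponential phase of the amplification curve."""
        logf = np.log10(fluorescence)
        cycles = np.arange(len(fluorescence), dtype=float)
        best = None
        for start in range(len(fluorescence) - window + 1):
            slope, intercept = np.polyfit(cycles[start:start + window],
                                          logf[start:start + window], 1)
            if best is None or slope > best[0]:
                best = (slope, intercept)
        slope, intercept = best
        return 10.0 ** slope, slope, intercept      # efficiency = amplification per cycle

    def ct_at_threshold(slope, intercept, threshold):
        """Cycle at which the fitted exponential phase crosses the fluorescence threshold."""
        return (np.log10(threshold) - intercept) / slope

    # Made-up fluorescence curves: baseline plus exponential growth that plateaus.
    cycles = np.arange(40)
    target    = 0.01 + 1e-6 * np.minimum(1.90 ** cycles, 1e6)
    reference = 0.01 + 1e-6 * np.minimum(1.95 ** cycles, 1e6)

    E_t, m_t, b_t = exponential_phase_fit(target)
    E_r, m_r, b_r = exponential_phase_fit(reference)
    threshold = 0.05                                 # chosen inside both exponential phases
    ct_t = ct_at_threshold(m_t, b_t, threshold)
    ct_r = ct_at_threshold(m_r, b_r, threshold)
    ratio = E_r ** ct_r / E_t ** ct_t                # one conventional relative-expression form
    print(E_t, E_r, ct_t, ct_r, ratio)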

  16. Space robot simulator vehicle

    NASA Technical Reports Server (NTRS)

    Cannon, R. H., Jr.; Alexander, H.

    1985-01-01

    A Space Robot Simulator Vehicle (SRSV) was constructed to model a free-flying robot capable of doing construction, manipulation and repair work in space. The SRSV is intended as a test bed for development of dynamic and static control methods for space robots. The vehicle is built around a two-foot-diameter air-cushion vehicle that carries batteries, power supplies, gas tanks, computer, reaction jets and radio equipment. It is fitted with one or two two-link manipulators, which may be of many possible designs, including flexible-link versions. Both the vehicle body and its first arm are nearly complete. Inverse dynamic control of the robot's manipulator has been successfully simulated using equations generated by the dynamic simulation package SDEXACT. In this mode, the position of the manipulator tip is controlled not by fixing the vehicle base through thruster operation, but by controlling the manipulator joint torques to achieve the desired tip motion, while allowing for the free motion of the vehicle base. One of the primary goals is to minimize use of the thrusters in favor of intelligent control of the manipulator. Ways to reduce the computational burden of control are described.

  17. STS (Space Transportation System) Task Simulator.

    DTIC Science & Technology

    1985-08-15

    The Space Transportation System (STS) task simulator models relative motion in the Clohessy-Wiltshire coordinate system: the Clohessy-Wiltshire equations for terminal rendezvous/docking, with the earth modeled as a uniform sphere, are applied to obtain relative motion and to model orbital drift, and rotational accelerations are applied to the present quaternions.
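
    For reference, the standard Clohessy-Wiltshire (Hill) equations for relative motion about a circular target orbit, as used for terminal rendezvous/docking, are given below (x radial, y along-track, z cross-track, n the target mean motion, a the applied acceleration); this is the textbook form, not necessarily the exact formulation in the report.

    % Standard Clohessy--Wiltshire (Hill) equations for relative motion near a
    % circular target orbit.
    \begin{align}
      \ddot{x} - 2n\dot{y} - 3n^{2}x &= a_x, \\
      \ddot{y} + 2n\dot{x}           &= a_y, \\
      \ddot{z} + n^{2}z              &= a_z.
    \end{align}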

  18. System-Level Reuse of Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Hazen, Michael R.; Williams, Joseph C.

    2004-01-01

    One of the best ways to enhance space systems simulation fidelity is to leverage off of (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is in a different coding language or other barriers arise that make one want to just start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. Case studies include (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency- provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss simulation reuse candidates based on an apparent lack of suitability. A more careful examination based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates revealed significant reuse potential. Specific steps used to conduct the detailed assessments are discussed. The steps include the following: 1) Identifying reuse candidates; 2) Requirements compatibility assessment; 3) Maturity assessment; 4) Life-cycle cost determination; and 5) Risk assessment. Observations and conclusions are presented related to the real cost of system-level simulation component reuse. Finally, lessons learned that relate to maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable for use in the development of space systems simulations in the future.

  19. Convexity and concavity constants in Lorentz and Marcinkiewicz spaces

    NASA Astrophysics Data System (ADS)

    Kaminska, Anna; Parrish, Anca M.

    2008-07-01

    We provide here the formulas for the q-convexity and q-concavity constants for function and sequence Lorentz spaces associated to either decreasing or increasing weights. This also yields the formula for the q-convexity constants in function and sequence Marcinkiewicz spaces. In this paper we extend and enhance the results from [G.J.O. Jameson, The q-concavity constants of Lorentz sequence spaces and related inequalities, Math. Z. 227 (1998) 129-142] and [A. Kaminska, A.M. Parrish, The q-concavity and q-convexity constants in Lorentz spaces, in: Banach Spaces and Their Applications in Analysis, Conference in Honor of Nigel Kalton, May 2006, Walter de Gruyter, Berlin, 2007, pp. 357-373].

  20. Berthing simulator for space station and orbiter

    NASA Technical Reports Server (NTRS)

    Veerasamy, Sam

    1991-01-01

    The development of a real-time man-in-the-loop berthing simulator is in progress at NASA Lyndon B. Johnson Space Center (JSC) to conduct a parametric study and to measure forces during contact conditions of the actual docking mechanisms for the Space Station Freedom and the orbiter. In berthing, the docking ports of the Space Station and the orbiter are brought together using the orbiter robotic arm to control the relative motion of the vehicles. The berthing simulator consists of a dynamics docking test system (DDTS), computer system, simulator software, and workstations. In the DDTS, the Space Station and orbiter docking mechanisms are mounted on a six-degree-of-freedom (6 DOF) table and on a fixed platform above the table. Six load cells are used on the fixed platform to measure forces during contact conditions of the docking mechanisms. Two Encore Concept 32/9780 computers are used to simulate the orbiter robotic arm and to operate the berthing simulator. A systematic procedure for real-time dynamic initialization is being developed to synchronize the Space Station docking port trajectory with the 6 DOF table movement. The berthing test can be conducted manually or automatically and can be extended to any two orbiting vehicles using a simulated robotic arm. The real-time operation of the berthing simulator is briefly described.

  1. Stability of iterative procedures with errors for approximating common fixed points of a couple of q-contractive-like mappings in Banach spaces

    NASA Astrophysics Data System (ADS)

    Zeng, Lu-Chuan; Yao, Jen-Chih

    2006-09-01

    Recently, Agarwal, Cho, Li and Huang [R.P. Agarwal, Y.J. Cho, J. Li, N.J. Huang, Stability of iterative procedures with errors approximating common fixed points for a couple of quasi-contractive mappings in q-uniformly smooth Banach spaces, J. Math. Anal. Appl. 272 (2002) 435-447] introduced the new iterative procedures with errors for approximating the common fixed point of a couple of quasi-contractive mappings and showed the stability of these iterative procedures with errors in Banach spaces. In this paper, we introduce a new concept of a couple of q-contractive-like mappings (q>1) in a Banach space and apply these iterative procedures with errors for approximating the common fixed point of the couple of q-contractive-like mappings. The results established in this paper improve, extend and unify the corresponding ones of Agarwal, Cho, Li and Huang [R.P. Agarwal, Y.J. Cho, J. Li, N.J. Huang, Stability of iterative procedures with errors approximating common fixed points for a couple of quasi-contractive mappings in q-uniformly smooth Banach spaces, J. Math. Anal. Appl. 272 (2002) 435-447], Chidume [C.E. Chidume, Approximation of fixed points of quasi-contractive mappings in Lp spaces, Indian J. Pure Appl. Math. 22 (1991) 273-386], Chidume and Osilike [C.E. Chidume, M.O. Osilike, Fixed points iterations for quasi-contractive maps in uniformly smooth Banach spaces, Bull. Korean Math. Soc. 30 (1993) 201-212], Liu [Q.H. Liu, On Naimpally and Singh's open questions, J. Math. Anal. Appl. 124 (1987) 157-164; Q.H. Liu, A convergence theorem of the sequence of Ishikawa iterates for quasi-contractive mappings, J. Math. Anal. Appl. 146 (1990) 301-305], Osilike [M.O. Osilike, A stable iteration procedure for quasi-contractive maps, Indian J. Pure Appl. Math. 27 (1996) 25-34; M.O. Osilike, Stability of the Ishikawa iteration method for quasi-contractive maps, Indian J. Pure Appl. Math. 28 (1997) 1251-1265] and many others in the literature.

  2. Simulated Wake Characteristics Data for Closely Spaced Parallel Runway Operations Analysis

    NASA Technical Reports Server (NTRS)

    Guerreiro, Nelson M.; Neitzke, Kurt W.

    2012-01-01

    A simulation experiment was performed to generate and compile wake characteristics data relevant to the evaluation and feasibility analysis of closely spaced parallel runway (CSPR) operational concepts. While the experiment in this work is not tailored to any particular operational concept, the generated data applies to the broader class of CSPR concepts, where a trailing aircraft on a CSPR approach is required to stay ahead of the wake vortices generated by a lead aircraft on an adjacent CSPR. Data for wake age, circulation strength, and wake altitude change, at various lateral offset distances from the wake-generating lead aircraft approach path were compiled for a set of nine aircraft spanning the full range of FAA and ICAO wake classifications. A total of 54 scenarios were simulated to generate data related to key parameters that determine wake behavior. Of particular interest are wake age characteristics that can be used to evaluate both time- and distance- based in-trail separation concepts for all aircraft wake-class combinations. A simple first-order difference model was developed to enable the computation of wake parameter estimates for aircraft models having weight, wingspan and speed characteristics similar to those of the nine aircraft modeled in this work.

  3. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) ∙ Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
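
    A minimal sketch of the efficiency-weighted Cq idea is given below, assuming a simple unpaired design with one target and one reference gene; the numbers are invented and the code is illustrative, not the authors' published implementation.

    # Sketch: work with w = log10(E) * Cq in log scale, subtract the reference gene,
    # and only convert to a fold change at the end. Numbers are made up.
    import numpy as np
    from scipy import stats

    def efficiency_weighted_cq(E, Cq):
        """Efficiency-weighted Cq value, log10(E) * Cq (a log10-scale quantity)."""
        return np.log10(E) * np.asarray(Cq, dtype=float)

    # Hypothetical unpaired design: control vs. treated samples, one target and one
    # reference gene, with well-specific efficiencies near 2 (perfect doubling).
    ctrl_target = efficiency_weighted_cq(1.95, [24.1, 24.5, 23.9])
    ctrl_ref    = efficiency_weighted_cq(1.98, [18.2, 18.0, 18.3])
    trt_target  = efficiency_weighted_cq(1.95, [22.0, 21.7, 22.3])
    trt_ref     = efficiency_weighted_cq(1.98, [18.1, 18.4, 18.2])

    # Reference-corrected log10 abundance (up to a constant): larger = more template.
    ctrl = ctrl_ref - ctrl_target
    trt  = trt_ref - trt_target

    t_stat, p_value = stats.ttest_ind(trt, ctrl, equal_var=False)
    fold_change = 10.0 ** (trt.mean() - ctrl.mean())   # leave log scale only at the end
    print(f"fold change ~ {fold_change:.2f}, p = {p_value:.3g}")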

  4. The Use of Microgravity Simulators for Space Research

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Richards, Stephanie E.; Richards, Jeffrey T.; Levine, Howard G.

    2016-01-01

    The spaceflight environment is known to influence biological processes ranging from stimulation of cellular metabolism to possible impacts on cellular damage repair, suppression of immune functions, and bone loss in astronauts. Microgravity is one of the most significant stress factors experienced by living organisms during spaceflight, and therefore, understanding cellular responses to altered gravity at the physiological and molecular level is critical for expanding our knowledge of life in space. Since opportunities to conduct experiments in space are scarce, various microgravity simulators and analogues have been widely used in space biology ground studies. Even though simulated microgravity conditions have produced some, but not all of the biological effects observed in the true microgravity environment, they provide test beds that are effective, affordable, and readily available to facilitate microgravity research. Kennedy Space Center (KSC) provides ground microgravity simulator support to offer a variety of microgravity simulators and platforms for Space Biology investigators. Assistance will be provided by both KSC and external experts in molecular biology, microgravity simulation, and engineering. Comparisons between the physical differences in microgravity simulators, examples of experiments using the simulators, and scientific questions regarding the use of microgravity simulators will be discussed.

  5. The Use of Microgravity Simulators for Space Research

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Richards, Stephanie E.; Wade, Randall I.; Richards, Jeffrey T.; Fritsche, Ralph F.; Levine, Howard G.

    2016-01-01

    The spaceflight environment is known to influence biological processes ranging from stimulation of cellular metabolism to possible impacts on cellular damage repair, suppression of immune functions, and bone loss in astronauts. Microgravity is one of the most significant stress factors experienced by living organisms during spaceflight, and therefore, understanding cellular responses to altered gravity at the physiological and molecular level is critical for expanding our knowledge of life in space. Since opportunities to conduct experiments in space are scarce, various microgravity simulators and analogues have been widely used in space biology ground studies. Even though simulated microgravity conditions have produced some, but not all of the biological effects observed in the true microgravity environment, they provide test beds that are effective, affordable, and readily available to facilitate microgravity research. A Micro-g Simulator Center is being developed at Kennedy Space Center (KSC) to offer a variety of microgravity simulators and platforms for Space Biology investigators. Assistance will be provided by both KSC and external experts in molecular biology, microgravity simulation, and engineering. Comparisons between the physical differences in microgravity simulators, examples of experiments using the simulators, and scientific questions regarding the use of microgravity simulators will be discussed.

  6. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we develop a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ascii format. This unique system will help simulation beginners to start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.

  7. Analysis of Shared Haplotypes amongst Palauans Maps Loci for Psychotic Disorders to 4q28 and 5q23-q31.

    PubMed

    Bodea, Corneliu A; Middleton, Frank A; Melhem, Nadine M; Klei, Lambertus; Song, Youeun; Tiobech, Josepha; Marumoto, Pearl; Yano, Victor; Faraone, Stephen V; Roeder, Kathryn; Myles-Worsley, Marina; Devlin, Bernie; Byerley, William

    2017-02-01

    To localize genetic variation affecting risk for psychotic disorders in the population of Palau, we genotyped DNA samples from 203 Palauan individuals diagnosed with psychotic disorders, broadly defined, and 125 control subjects using a genome-wide single nucleotide polymorphism array. Palau has unique features advantageous for this study: due to its population history, Palauans are substantially interrelated; affected individuals often, but not always, cluster in families; and we have essentially complete ascertainment of affected individuals. To localize risk variants to genomic regions, we evaluated long-shared haplotypes, ≥10 Mb, identifying clusters of affected individuals who share such haplotypes. This extensive sharing, typically identical by descent, was significantly greater in cases than population controls, even after controlling for relatedness. Several regions of the genome exhibited substantial excess of shared haplotypes for affected individuals, including 3p21, 3p12, 4q28, and 5q23-q31. Two of these regions, 4q28 and 5q23-q31, showed significant linkage by traditional LOD score analysis and could harbor variants of more sizeable risk for psychosis or a multiplicity of risk variants. The pattern of haplotype sharing in 4q28 highlights PCDH10 , encoding a cadherin-related neuronal receptor, as possibly involved in risk.

  8. The Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.

    2007-01-01

    The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that ease the burden of developing distributed simulations and provide a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for each of these work areas, with specific examples of simulations that support NASA's exploration initiatives.

  9. Space-flight simulations of calcium metabolism using a mathematical model of calcium regulation

    NASA Technical Reports Server (NTRS)

    Brand, S. N.

    1985-01-01

    The results of a series of simulation studies of calcium metabolic changes which have been recorded during human exposure to bed rest and space flight are presented. Space flight and bed rest data demonstrate losses of total body calcium during exposure to hypogravic environments. These losses are evidenced by higher than normal rates of urine calcium excretion and by negative calcium balances. In addition, intestinal absorption rates and bone mineral content are assumed to decrease. The bed rest and space flight simulations were executed on a mathematical model of the calcium metabolic system. The purpose of the simulations is to theoretically test hypotheses and predict system responses occurring during a given experimental stress, in this case hypogravity, through the comparison of simulation and experimental data and through the analysis of model structure and system responses. The model reliably simulates the responses of selected bed rest and space flight parameters. When experimental data are available, the simulated skeletal responses and regulatory factors involved in the responses agree with space flight data collected on rodents. In addition, areas within the model that need improvement are identified.

  10. Comparative proteomic analysis of rice after seed ground simulated radiation and spaceflight explains the radiation effects of space environment

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Shi, Jinming; Liang, Shujian; Lei, Huang; Shenyi, Zhang; Sun, Yeqing

    In previous work, we compared the proteomic profiles of rice plants grown after seed spaceflight with ground controls by two-dimensional difference gel electrophoresis (2-D DIGE) and found that the protein expression profiles were changed after exposure of the seeds to the space environment. Spaceflight represents a complex environmental condition in which several interacting factors such as cosmic radiation, microgravity, and space magnetic fields are involved. Rice seeds are in the dormant stage of plant development and show high resistance against stresses, so the highly ionizing (HZE) radiation in space is considered the main factor causing biological effects in seeds. To further investigate the radiation effects of the space environment, we performed on-ground simulated HZE-particle irradiation and compared the proteomes of plants grown from irradiated seeds with those of plants grown from seeds flown on the 20th recoverable satellite, using the same rice variety. Space ionization involves low doses but high-energy particles; to search for these particle effects, ground irradiations with the same low dose (2 mGy) but different linear energy transfer (LET) values (13.3 keV/µm C, 30 keV/µm C, 31 keV/µm Ne, 62.2 keV/µm C, 500 keV/µm Fe) were performed. Using 2-D DIGE coupled with clustering and principal component analysis (PCA) for data processing and comparison, we found that the holistic protein expression patterns of plants irradiated by LET 62.2 keV/µm carbon particles were most similar to spaceflight. In addition, although the space environment presents a low-dose radiation (0.177 mGy/day on the satellite), the equivalent simulated radiation dose effects should still be evaluated: irradiations with LET 62.2 keV/µm carbon particles at different cumulative doses (2 mGy, 20 mGy, 200 mGy, 2000 mGy) were further carried out and showed that the 2 mGy radiation still shared the most similar proteomic profiles with spaceflight, confirming the low-dose effects of space radiation. Therefore, in the protein expression level
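
    The comparison strategy described above (projection of expression profiles with PCA, followed by a similarity assessment across conditions) can be sketched generically as follows; the spot-volume matrix below is random placeholder data, not the study's measurements.

    # Generic sketch: project per-sample protein expression profiles (e.g., 2-D DIGE
    # spot volumes) with PCA and ask which ground-radiation condition lies closest to
    # the spaceflight samples. The data here are random placeholders.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    conditions = ["control", "spaceflight", "13.3keV_C", "30keV_C", "62.2keV_C", "500keV_Fe"]
    n_replicates, n_spots = 3, 200
    # Rows = samples, columns = log-transformed spot volumes (placeholder values).
    X = rng.normal(size=(len(conditions) * n_replicates, n_spots))
    labels = np.repeat(conditions, n_replicates)

    scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
    centroids = {c: scores[labels == c].mean(axis=0) for c in conditions}

    # Distance of each condition centroid from the spaceflight centroid in PC space.
    for c in conditions:
        d = np.linalg.norm(centroids[c] - centroids["spaceflight"])
        print(f"{c:>12s}: distance to spaceflight = {d:.2f}")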

  11. Experimental determination of pore shapes using phase retrieval from q-space NMR diffraction

    NASA Astrophysics Data System (ADS)

    Demberg, Kerstin; Laun, Frederik Bernd; Bertleff, Marco; Bachert, Peter; Kuder, Tristan Anselm

    2018-05-01

    This paper presents an approach to solving the phase problem in nuclear magnetic resonance (NMR) diffusion pore imaging, a method that allows imaging the shape of arbitrary closed pores filled with an NMR-detectable medium for investigation of the microstructure of biological tissue and porous materials. Classical q-space imaging composed of two short diffusion-encoding gradient pulses yields, analogously to diffraction experiments, the modulus squared of the Fourier transform of the pore image which entails an inversion problem: An unambiguous reconstruction of the pore image requires both magnitude and phase. Here the phase information is recovered from the Fourier modulus by applying a phase retrieval algorithm. This allows omitting experimentally challenging phase measurements using specialized temporal gradient profiles. A combination of the hybrid input-output algorithm and the error reduction algorithm was used with dynamically adapting support (shrinkwrap extension). No a priori knowledge on the pore shape was fed to the algorithm except for a finite pore extent. The phase retrieval approach proved successful for simulated data with and without noise and was validated in phantom experiments with well-defined pores using hyperpolarized xenon gas.
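
    The error-reduction and hybrid input-output updates mentioned above are standard Fienup-type iterations. The following minimal sketch reconstructs a toy two-dimensional "pore" from its Fourier modulus with a fixed, loose support; the dynamically adapting (shrinkwrap) support used in the paper is omitted, so this is an illustration rather than the authors' implementation.

    # Minimal sketch of Fienup-type phase retrieval from a Fourier modulus with a
    # fixed support, alternating hybrid input-output (HIO) and error-reduction (ER).
    import numpy as np

    def project_modulus(g, measured_modulus):
        """Replace the Fourier modulus of g by the measured one, keeping the phase."""
        G = np.fft.fft2(g)
        return np.real(np.fft.ifft2(measured_modulus * np.exp(1j * np.angle(G))))

    def phase_retrieval(measured_modulus, support, n_hio=200, n_er=50, beta=0.9):
        g = np.random.rand(*measured_modulus.shape) * support
        for _ in range(n_hio):                       # HIO: feedback outside the constraints
            gp = project_modulus(g, measured_modulus)
            violate = (~support) | (gp < 0)
            g = np.where(violate, g - beta * gp, gp)
        for _ in range(n_er):                        # ER: hard projection onto the constraints
            gp = project_modulus(g, measured_modulus)
            g = np.where(support & (gp > 0), gp, 0.0)
        return g

    # Toy example: a rectangular "pore" and a loose circular support.
    n = 64
    y, x = np.mgrid[:n, :n]
    pore = ((np.abs(x - n // 2) < 10) & (np.abs(y - n // 2) < 6)).astype(float)
    support = (x - n // 2) ** 2 + (y - n // 2) ** 2 < (n // 3) ** 2
    modulus = np.abs(np.fft.fft2(pore))              # q-space data: modulus only, phase lost
    recovered = phase_retrieval(modulus, support)
    print("correlation with truth:", np.corrcoef(pore.ravel(), recovered.ravel())[0, 1])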

  12. Experimental determination of pore shapes using phase retrieval from q-space NMR diffraction.

    PubMed

    Demberg, Kerstin; Laun, Frederik Bernd; Bertleff, Marco; Bachert, Peter; Kuder, Tristan Anselm

    2018-05-01

    This paper presents an approach to solving the phase problem in nuclear magnetic resonance (NMR) diffusion pore imaging, a method that allows imaging the shape of arbitrary closed pores filled with an NMR-detectable medium for investigation of the microstructure of biological tissue and porous materials. Classical q-space imaging composed of two short diffusion-encoding gradient pulses yields, analogously to diffraction experiments, the modulus squared of the Fourier transform of the pore image which entails an inversion problem: An unambiguous reconstruction of the pore image requires both magnitude and phase. Here the phase information is recovered from the Fourier modulus by applying a phase retrieval algorithm. This allows omitting experimentally challenging phase measurements using specialized temporal gradient profiles. A combination of the hybrid input-output algorithm and the error reduction algorithm was used with dynamically adapting support (shrinkwrap extension). No a priori knowledge on the pore shape was fed to the algorithm except for a finite pore extent. The phase retrieval approach proved successful for simulated data with and without noise and was validated in phantom experiments with well-defined pores using hyperpolarized xenon gas.

  13. Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser

    NASA Astrophysics Data System (ADS)

    Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu

    2018-02-01

    Cleaning space debris with lasers is a hot topic in the field of space security research, and impulse characteristics are the basis of cleaning space debris with lasers. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris irradiated by a multi-pulse laser is established based on the area matrix method. The calculation method for impulse and impulsive moment under multi-pulse irradiation is given, and the calculation process for the total impulse under multi-pulse irradiation is analyzed. Taking a typical non-planar space debris object (a cube) as an example, the impulse characteristics of space debris irradiated by a multi-pulse laser are simulated and analyzed. The effects of initial angular velocity, spot size, and pulse frequency on the impulse characteristics are investigated.

  14. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results

    NASA Astrophysics Data System (ADS)

    Karimabadi, Homa

    2012-03-01

    Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and magnetic reconnection process which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to the so-called space weather where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep, (b) Presentation of a new approach to data analysis that we refer to as Physics Mining which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets. (c) Presentation of several recent discoveries in studies of space plasmas including the role of vortex formation and resulting turbulence in magnetized plasmas.

  15. Planetary and Space Simulation Facilities (PSI) at DLR

    NASA Astrophysics Data System (ADS)

    Panitz, Corinna; Rabbow, E.; Rettberg, P.; Kloss, M.; Reitz, G.; Horneck, G.

    2010-05-01

    The Planetary and Space Simulation facilities at DLR offer the possibility to expose biological and physical samples, individually or integrated into space hardware, to defined and controlled space conditions such as ultra-high vacuum, low temperature, and extraterrestrial UV radiation. An X-ray facility is available for simulating the ionizing component. All of the simulation facilities are required for the preparation of space experiments: for testing the newly developed space hardware; for investigating the effect of different space parameters on biological systems as a preparation for the flight experiment; for performing the 'Experiment Verification Tests' (EVT) for the specification of the test parameters; and for the 'Experiment Sequence Tests' (EST), simulating sample assembly, exposure to selected space parameters, and sample disassembly. To test the compatibility of the different biological and chemical systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed, among many others for the ESA facilities of the ongoing missions EXPOSE-R and EXPOSE-E on board the International Space Station (ISS). Several experiment verification tests (EVTs) and an experiment sequence test (EST) have been conducted in the carefully equipped and monitored planetary and space simulation facilities (PSI) of the Institute of Aerospace Medicine at DLR in Cologne, Germany. These ground-based pre-flight studies allowed the investigation of a much wider variety of samples and the selection of the most promising organisms for the flight experiment. EXPOSE-E was attached to the outer balcony of the European Columbus module of the ISS in February 2008 and stayed in space for 1.5 years; EXPOSE-R was attached to the Russian Zvezda module of the ISS in spring 2009 and the mission duration will be approximately 1.5 years. The missions will give new insights into the survivability of terrestrial

  16. Micromechanics analysis of space simulated thermal deformations and stresses in continuous fiber reinforced composites

    NASA Technical Reports Server (NTRS)

    Bowles, David E.

    1990-01-01

    Space simulated thermally induced deformations and stresses in continuous fiber reinforced composites were investigated with a micromechanics analysis. The investigation focused on two primary areas. First, available explicit expressions for predicting the effective coefficients of thermal expansion (CTEs) for a composite were compared with each other, and with a finite element (FE) analysis, developed specifically for this study. Analytical comparisons were made for a wide range of fiber/matrix systems, and predicted values were compared with experimental data. The second area of investigation focused on the determination of thermally induced stress fields in the individual constituents. Stresses predicted from the FE analysis were compared to those predicted from a closed-form solution to the composite cylinder (CC) model, for two carbon fiber/epoxy composites. A global-local formulation, combining laminated plate theory and FE analysis, was used to determine the stresses in multidirectional laminates. Thermally induced damage initiation predictions were also made.
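
    One widely cited set of explicit micromechanics expressions of the kind compared in such studies is Schapery's estimate for the longitudinal and transverse CTEs of a unidirectional composite; it is quoted here for orientation and is not necessarily the specific set of expressions evaluated in the report.

    % Schapery-type explicit estimate for the effective CTEs of a unidirectional
    % fiber/matrix composite; subscripts f and m denote fiber and matrix, V the volume
    % fractions, E the axial moduli, and nu the Poisson ratios.
    \begin{align}
      \alpha_{1} &= \frac{E_{f}\alpha_{f}V_{f} + E_{m}\alpha_{m}V_{m}}
                        {E_{f}V_{f} + E_{m}V_{m}}, \\
      \alpha_{2} &\approx (1+\nu_{f})\,\alpha_{f}V_{f} + (1+\nu_{m})\,\alpha_{m}V_{m}
                    - \alpha_{1}\,\nu_{12},
      \qquad \nu_{12} = \nu_{f}V_{f} + \nu_{m}V_{m}.
    \end{align}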

  17. A TREETOPS simulation of the Hubble Space Telescope-High Gain Antenna interaction

    NASA Technical Reports Server (NTRS)

    Sharkey, John P.

    1987-01-01

    Virtually any project dealing with the control of a Large Space Structure (LSS) will involve some level of verification by digital computer simulation. While the Hubble Space Telescope might not normally be included in a discussion of LSS, it is presented to highlight a recently developed simulation and analysis program named TREETOPS. TREETOPS provides digital simulation, linearization, and control system interaction of flexible, multibody spacecraft which admit to a point-connected tree topology. The HST application of TREETOPS is intended to familiarize the LSS community with TREETOPS by presenting a user perspective of its key features.

  18. ACES: Space shuttle flight software analysis expert system

    NASA Technical Reports Server (NTRS)

    Satterwhite, R. Scott

    1990-01-01

    The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.

  19. Baccalaureate nursing students' perspectives of peer tutoring in simulation laboratory, a Q methodology study.

    PubMed

    Li, Ting; Petrini, Marcia A; Stone, Teresa E

    2018-02-01

    The study aim was to identify the perspectives of baccalaureate nursing students toward peer tutoring in the simulation laboratory. Insight into the nursing students' experiences, and baseline data related to their perception of peer tutoring, will assist in improving nursing education. Q methodology was applied to explore the students' perspectives of peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Each participant sorted 58 selected Q statements into the shape of a normal distribution using an 11-point bipolar scale with a range from -5 to +5. The PQ Method software analyzed the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety Net" support environment), and Factor III ("Mentoring" learn how to learn). The findings of this study indicate that peer tutoring is an effective supplementary strategy to promote baccalaureate students' knowledge acquisition, establish a supportive safety net, and facilitate their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. 26th Space Simulation Conference Proceedings. Environmental Testing: The Path Forward

    NASA Technical Reports Server (NTRS)

    Packard, Edward A.

    2010-01-01

    Topics covered include: A Multifunctional Space Environment Simulation Facility for Accelerated Spacecraft Materials Testing; Exposure of Spacecraft Surface Coatings in a Simulated GEO Radiation Environment; Gravity-Offloading System for Large-Displacement Ground Testing of Spacecraft Mechanisms; Microscopic Shutters Controlled by cRIO in Sounding Rocket; Application of a Physics-Based Stabilization Criterion to Flight System Thermal Testing; Upgrade of a Thermal Vacuum Chamber for 20 Kelvin Operations; A New Approach to Improve the Uniformity of Solar Simulator; A Perfect Space Simulation Storm; A Planetary Environmental Simulator/Test Facility; Collimation Mirror Segment Refurbishment inside ESA s Large Space; Space Simulation of the CBERS 3 and 4 Satellite Thermal Model in the New Brazilian 6x8m Thermal Vacuum Chamber; The Certification of Environmental Chambers for Testing Flight Hardware; Space Systems Environmental Test Facility Database (SSETFD), Website Development Status; Wallops Flight Facility: Current and Future Test Capabilities for Suborbital and Orbital Projects; Force Limited Vibration Testing of JWST NIRSpec Instrument Using Strain Gages; Investigation of Acoustic Field Uniformity in Direct Field Acoustic Testing; Recent Developments in Direct Field Acoustic Testing; Assembly, Integration and Test Centre in Malaysia: Integration between Building Construction Works and Equipment Installation; Complex Ground Support Equipment for Satellite Thermal Vacuum Test; Effect of Charging Electron Exposure on 1064nm Transmission through Bare Sapphire Optics and SiO2 over HfO2 AR-Coated Sapphire Optics; Environmental Testing Activities and Capabilities for Turkish Space Industry; Integrated Circuit Reliability Simulation in Space Environments; Micrometeoroid Impacts and Optical Scatter in Space Environment; Overcoming Unintended Consequences of Ambient Pressure Thermal Cycling Environmental Tests; Performance and Functionality Improvements to Next Generation

  1. Singular dynamics of a q-difference Painlevé equation in its initial-value space

    NASA Astrophysics Data System (ADS)

    Joshi, N.; Lobb, S. B.

    2016-01-01

    We construct the initial-value space of a q-discrete first Painlevé equation explicitly and describe the behaviours of its solutions w(n) in this space as n → ∞, with particular attention paid to neighbourhoods of exceptional lines and irreducible components of the anti-canonical divisor. These results show that trajectories starting in domains bounded away from the origin in initial value space are repelled away from such singular lines. However, the dynamical behaviours in neighbourhoods containing the origin are complicated by the merger of two simple base points at the origin in the limit. We show that these lead to a saddle-point-type behaviour in a punctured neighbourhood of the origin.

  2. High-performing simulations of the space radiation environment for the International Space Station and Apollo Missions

    NASA Astrophysics Data System (ADS)

    Lund, Matthew Lawrence

    The space radiation environment is a significant challenge to future manned and unmanned space travel. Future missions will rely increasingly on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Thus dosimetry analysis for deep space missions is poorly supported by currently available data, and there is a need to develop dosimetry-predicting models for extended deep space missions. GEANT4, a Monte Carlo toolkit written in C++, provides powerful capabilities for simulating radiation transport in arbitrary media, including spacecraft and the space environment. The newest version of GEANT4 supports multithreading and MPI, resulting in faster distributed processing of simulations on high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember computational clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry, reducing simulation time to hours instead of weeks without post-simulation processing. Additionally, this thesis introduces a new set of detectors besides the historically used International Commission on Radiation Units and Measurements (ICRU) spheres for calculating dose distributions, including a Thermoluminescent Detector (TLD), a Tissue Equivalent Proportional Counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalents based on International Commission on Radiological Protection (ICRP) standards. The models developed in this thesis predict dose depositions in the International Space Station and during the Apollo missions, showing good agreement with experimental measurements

  3. Concurrent processing simulation of the space station

    NASA Technical Reports Server (NTRS)

    Gluck, R.; Hale, A. L.; Sunkel, John W.

    1989-01-01

    The development of a new capability for the time-domain simulation of multibody dynamic systems and its application to the study of large-angle rotational maneuvers of the Space Station is described. The effort was divided into three sequential tasks, each of which required significant advancement of the state of the art: (1) the development of an explicit mathematical model, via symbol manipulation, of a flexible multibody dynamic system; (2) the development of a methodology for balancing the computational load of an explicit mathematical model for concurrent processing; and (3) the implementation and successful simulation of the above on a prototype Custom Architectured Parallel Processing System (CAPPS) containing eight processors. The throughput rate achieved by the CAPPS, operating at only 70 percent efficiency, was 3.9 times greater than that obtained sequentially by the IBM 3090 supercomputer simulating the same problem. More significantly, analysis of the results leads to the conclusion that the relative cost effectiveness of concurrent versus sequential digital computation will grow substantially as the computational load is increased. This is a welcome development in an era when very complex and cumbersome mathematical models of large space vehicles must be used as substitutes for full-scale testing, which has become impractical.

  4. Thermal design and simulation of an attitude-varied space camera

    NASA Astrophysics Data System (ADS)

    Wang, Chenjie; Yang, Wengang; Feng, Liangjie; Li, XuYang; Wang, Yinghao; Fan, Xuewu; Wen, Desheng

    2015-10-01

    An attitude-varied space camera changes attitude continually while it is working; its attitude changes by large angles in a short time, leading to significant changes in heat flux. Moreover, the complicated internal heat sources, other payloads, and the satellite platform also introduce thermal coupling effects into the space camera. For a space camera located on a two-dimensional rotating platform, a detailed thermal design is accomplished by means of thermal isolation, thermal transmission, and temperature compensation, etc. The extreme simulation cases of both high temperature and low temperature are then chosen, considering the obscuration by the satellite platform and other payloads, as well as the heat flux analysis of the light entrance and the radiator surface of the camera. NEVEDA and SindaG are used to establish the simulation model of the camera and the analysis is carried out. The results indicate that, under both passive and active thermal control, the temperature of the optical components is 20±1°C, with both radial and axial temperature gradients less than 0.3°C, while the temperature of the main structural components is 20±2°C, and the temperature fluctuation of the focal plane assemblies is 3.0-9.5°C. The simulation shows that the thermal control system can meet the needs of the mission, and the thermal design is efficient and reasonable.

  5. Molecular cytogenetic analysis of de novo dup(5)(q35.2q35.3) and review of the literature of pure partial trisomy 5q.

    PubMed

    Chen, Chih-Ping; Lin, Shuan-Pei; Lin, Chyi-Chyang; Chen, Yann-Jang; Chern, Schu-Rern; Li, Yueh-Chun; Hsieh, Lie-Jiau; Lee, Chen-Chi; Pan, Chen-Wen; Wang, Wayseen

    2006-07-15

    An 11-year-old girl presented with the phenotype of microcephaly, moderate mental retardation, motor retardation, short stature, strabismus, brachydactyly, and facial dysmorphism. She had undergone surgery for inguinal hernias. Detailed examinations of the heart and other internal organs revealed normal findings. Her karyotype was 46,XX,dup(5)(q35.2q35.3) de novo. Molecular cytogenetic analysis showed a paternally derived 5q35.2 --> q35.3 direct duplication and led to a correlation between the particular genotype and phenotype. This is the first description of a direct duplication of 5q35.2 --> q35.3. Our case represents the smallest distal duplication of chromosome 5q that is not associated with congenital heart defects. Our case also represents the smallest distal duplication of chromosome 5q that is associated with short stature and microcephaly. Mutations or deletions of the NSD1 gene, mapped to 5q35.2 --> q35.3, have been known to cause Sotos syndrome with cerebral gigantism, macrocephaly, advanced bone age, and overgrowth. Our case provides evidence that the gene dosage effect of the NSD1 gene causes a reversed phenotype of microcephaly and short stature. Copyright 2006 Wiley-Liss, Inc.

  6. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation is included. The input required for the execution of all program options is described. Several examples of program output are contained in this section. Sample output includes the radiator performance during ascent, reentry and orbit.

  7. A strategy for analysis of (molecular) equilibrium simulations: Configuration space density estimation, clustering, and visualization

    NASA Astrophysics Data System (ADS)

    Hamprecht, Fred A.; Peter, Christine; Daura, Xavier; Thiel, Walter; van Gunsteren, Wilfred F.

    2001-02-01

    We propose an approach for summarizing the output of long simulations of complex systems, affording a rapid overview and interpretation. First, multidimensional scaling techniques are used in conjunction with dimension reduction methods to obtain a low-dimensional representation of the configuration space explored by the system. A nonparametric estimate of the density of states in this subspace is then obtained using kernel methods. The free energy surface is calculated from that density, and the configurations produced in the simulation are then clustered according to the topography of that surface, such that all configurations belonging to one local free energy minimum form one class. This topographical cluster analysis is performed using basin spanning trees which we introduce as subgraphs of Delaunay triangulations. Free energy surfaces obtained in dimensions lower than four can be visualized directly using iso-contours and -surfaces. Basin spanning trees also afford a glimpse of higher-dimensional topographies. The procedure is illustrated using molecular dynamics simulations on the reversible folding of peptide analogs. Finally, we emphasize the intimate relation of density estimation techniques to modern enhanced sampling algorithms.
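
    The first steps of the strategy described above can be sketched as follows: embed the configurations in a low-dimensional space, estimate the density there with a kernel method, and convert it to a free energy via F = -kT ln(rho). The Delaunay/basin-spanning-tree clustering is omitted, and the "trajectory" below is synthetic placeholder data.

    # Minimal sketch: dimension reduction, kernel density estimate, and free energy
    # surface F = -kT ln(rho) on a grid. Illustrative only; not the authors' code.
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.stats import gaussian_kde

    kT = 2.494  # kJ/mol at ~300 K (illustrative units)

    # Placeholder trajectory: 2000 "configurations" with 30 coordinates, drawn from
    # two overlapping Gaussians so that the surface has two basins.
    rng = np.random.default_rng(1)
    traj = np.vstack([rng.normal(0.0, 1.0, size=(1000, 30)),
                      rng.normal(2.0, 1.0, size=(1000, 30))])

    # Dimension reduction to a 2-D representation of configuration space.
    proj = PCA(n_components=2).fit_transform(traj)

    # Nonparametric density estimate and free energy on a grid.
    kde = gaussian_kde(proj.T)
    gx, gy = np.meshgrid(np.linspace(proj[:, 0].min(), proj[:, 0].max(), 80),
                         np.linspace(proj[:, 1].min(), proj[:, 1].max(), 80))
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
    free_energy = -kT * np.log(density + 1e-12)
    free_energy -= free_energy.min()        # set the global minimum to zero
    print("free-energy range (kJ/mol):", free_energy.max())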

  8. Pulmonary manifestations of Q fever: analysis of 38 patients.

    PubMed

    Kelm, Diana J; White, Darin B; Fadel, Hind J; Ryu, Jay H; Maldonado, Fabien; Baqir, Misbah

    2017-10-01

    Lung involvement in both acute and chronic Q fever is not well described, with only a few reported cases of pseudotumor or pulmonary fibrosis in chronic Q fever. The aim of this study was to better understand the pulmonary manifestations of Q fever. We conducted a retrospective cohort study of patients with a diagnosis of Q fever at Mayo Clinic Rochester. A total of 69 patients were initially identified between 2001 and 2014. Thirty-eight patients were included in this study; the remaining 31 were excluded because 3 were pediatric patients, 20 did not meet serologic criteria for Q fever, and 8 did not have imaging available at the time of initial diagnosis. Descriptive analysis was conducted using JMP software. The median age was 57 years [interquartile range (IQR) 43, 62], 84% were from the Midwest, and 13% worked in an occupation involving animals. The most common presentation was fever (61%). Respiratory symptoms, such as cough, were noted in only 4 patients (11%). Twelve patients (29%) had abnormal imaging studies attributed to Q fever. Three patients (25%) with acute Q fever had findings of consolidation, lymphadenopathy, pleural effusions, and nonspecific pulmonary nodules. Radiographic findings of chronic Q fever were seen in 9 patients (75%) and included consolidation, ground-glass opacities, pleural effusions, lymphadenopathy, pulmonary edema, and lung pseudotumor. Our results demonstrate that pulmonary manifestations are uncommon in Q fever but include cough and consolidation for acute Q fever and radiographic findings of pulmonary edema with pleural effusions, consolidation, and pseudotumor in those with chronic Q fever.

  9. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility

    NASA Technical Reports Server (NTRS)

    Moehlmann, D.; Kochan, H.

    1992-01-01

    The Space Simulator of the German Aerospace Research Establishment at Cologne, formerly used for testing satellites, has served since 1987 as the central unit within the research sub-program 'Comet-Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct we gained experience in sample-handling under simulated space conditions. In broadening the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory-Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on thermophysical properties of the Martian surface and energy transport (and related gas transport) through the surface. These laboratory simulation activities can be related to space missions as typical pre-mission and during-the-mission support of experiment design and operations (simulation in parallel). Post-mission experiments for confirmation and interpretation of results are of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.

  10. NIMROD Simulations of Low-q Disruptions in the Compact Toroidal Hybrid Device (CTH)

    NASA Astrophysics Data System (ADS)

    Howell, E. C.; Pandya, M. D.; Hanson, J. D.; Mauer, D. A.; Ennis, D. A.; Hartwell, G. J.

    2016-10-01

    Nonlinear MHD simulations of low-q disruptions in CTH are presented. CTH is a current-carrying stellarator that is used to study the effects of 3D shaping. The application of 3D shaping stabilizes low-q disruptions in CTH. The amount of 3D shaping is controlled by adjusting the external rotational transform, and it is characterized by the ratio of the external rotational transform to the total transform: f = ι_vac / ι. Disruptions are routinely observed during operation with weak shaping (f < 0.05). The frequency of disruptions decreases with increasing amounts of 3D shaping, and the disruptions are completely suppressed for f > 0.1. Nonlinear simulations are performed using the NIMROD code to better understand how the shaping suppresses the disruptions. Comparisons of runs with weak (f = 0.04) and strong (f = 0.10) shaping are shown. This material is based upon work supported by Auburn University and the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences under Award Numbers DE-FG02-03ER54692 and DE-FG02-00ER54610.

  11. Continuous state-space representation of a bucket-type rainfall-runoff model: a case study with the GR4 model using state-space GR4 (version 1.0)

    NASA Astrophysics Data System (ADS)

    Santos, Léonard; Thirel, Guillaume; Perrin, Charles

    2018-04-01

    In many conceptual rainfall-runoff models, the water balance differential equations are not explicitly formulated. These differential equations are solved sequentially by splitting the equations into terms that can be solved analytically with a technique called "operator splitting". As a result, only the solutions of the split equations are used to present the different models. This article provides a methodology to make the governing water balance equations of a bucket-type rainfall-runoff model explicit and to solve them continuously. This is done by setting up a comprehensive state-space representation of the model. By representing it in this way, the operator splitting, which makes the structural analysis of the model more complex, could be removed. In this state-space representation, the lag functions (unit hydrographs), which are frequent in rainfall-runoff models and make the resolution of the representation difficult, are first replaced by a so-called "Nash cascade" and then solved with a robust numerical integration technique. To illustrate this methodology, the GR4J model is taken as an example. The substitution of the unit hydrographs with a Nash cascade, even if it modifies the model behaviour when solved using operator splitting, does not modify it when the state-space representation is solved using an implicit integration technique. Indeed, the flow time series simulated by the new representation of the model are very similar to those simulated by the classic model. The use of a robust numerical technique that approximates a continuous-time model also improves the lag parameter consistency across time steps and provides a more time-consistent model with time-independent parameters.
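
    A minimal sketch of the Nash cascade idea: a chain of linear reservoirs replaces the unit hydrograph, and each time step is advanced with an implicit (backward Euler) update. The number of reservoirs, residence time, and forcing below are illustrative assumptions, not the GR4J configuration.

    ```python
    import numpy as np

    def nash_cascade(precip, n_res=3, k=2.0, dt=1.0):
        """Route an input series through a cascade of n_res linear reservoirs.

        dS_i/dt = q_{i-1} - S_i / k,  q_i = S_i / k  (q_0 = precipitation)
        Each step uses an implicit (backward Euler) update; because the system is
        lower triangular, the reservoirs can be updated sequentially.
        """
        S = np.zeros(n_res)                      # reservoir storages
        outflow = np.zeros(len(precip))
        for t, p in enumerate(precip):
            inflow = p
            for i in range(n_res):
                S[i] = (S[i] + dt * inflow) / (1.0 + dt / k)   # implicit update
                inflow = S[i] / k                              # feeds the next reservoir
            outflow[t] = inflow
        return outflow

    # A unit pulse of rainfall: the cascade response plays the role of a lag function.
    rain = np.zeros(30); rain[0] = 10.0
    print(np.round(nash_cascade(rain), 3))
    ```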

  12. Atomic clock ensemble in space (ACES) data analysis

    NASA Astrophysics Data System (ADS)

    Meynadier, F.; Delva, P.; le Poncin-Lafitte, C.; Guerlin, C.; Wolf, P.

    2018-02-01

    The Atomic Clock Ensemble in Space (ACES/PHARAO mission, ESA & CNES) will be installed on board the International Space Station (ISS) next year. A crucial part of this experiment is its two-way microwave link (MWL), which will compare the timescale generated on board with those provided by several ground stations disseminated on the Earth. A dedicated data analysis center is being implemented at SYRTE—Observatoire de Paris, where our team currently develops theoretical modelling, numerical simulations and the data analysis software itself. In this paper, we present some key aspects of the MWL measurement method and the associated algorithms for simulations and data analysis. We show the results of tests using simulated data with fully realistic effects such as fundamental measurement noise, Doppler, atmospheric delays, or cycle ambiguities. We demonstrate satisfactory performance of the software with respect to the specifications of the ACES mission. The main scientific product of our analysis is the clock desynchronisation between ground and space clocks, i.e. the difference of proper times between the space clocks and ground clocks at participating institutes. While in flight, this measurement will allow for tests of general relativity and Lorentz invariance at unprecedented levels, e.g. measurement of the gravitational redshift at the 3×10⁻⁶ level. As a specific example, we use real ISS orbit data with estimated errors at the 10 m level to study the effect of such errors on the clock desynchronisation obtained from MWL data. We demonstrate that the resulting effects are totally negligible.
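
    For orientation, the size of the relativistic effect being measured can be estimated from textbook formulas for the gravitational redshift and second-order Doppler shift; the circular-orbit assumption and the 400 km ISS altitude used below are simplifications, not actual ACES orbit products.

    ```python
    # First-order estimate of the fractional frequency offset of an ISS clock
    # relative to a ground clock: gravitational redshift minus second-order Doppler.
    GM  = 3.986004418e14   # Earth's gravitational parameter, m^3 s^-2
    R_E = 6.371e6          # mean Earth radius, m
    c   = 2.99792458e8     # speed of light, m/s
    h   = 4.0e5            # assumed ISS altitude, m (simplification)

    r = R_E + h
    grav    = GM * (1.0 / R_E - 1.0 / r) / c**2   # clock higher in the potential runs fast
    v2      = GM / r                              # circular-orbit speed squared
    doppler = v2 / (2.0 * c**2)                   # time dilation makes the orbiting clock run slow

    dy = grav - doppler
    print(f"gravitational term : {grav:+.2e}")
    print(f"2nd-order Doppler  : {-doppler:+.2e}")
    print(f"net fractional rate: {dy:+.2e}  (~{dy * 86400 * 1e6:+.1f} microseconds/day)")
    ```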

  13. Free-space optical channel simulator for weak-turbulence conditions.

    PubMed

    Bykhovsky, Dima

    2015-11-01

    Free-space optical (FSO) communication may be severely influenced by the inevitable turbulence effect that results in channel gain fluctuations and fading. The objective of this paper is to provide a simple and effective simulator of the weak-turbulence FSO channel that emulates the influence of the temporal covariance effect. Specifically, the proposed model is based on lognormal distributed samples with a corresponding correlation time. The simulator is based on the solution of the first-order stochastic differential equation (SDE). The results of the provided SDE analysis reveal its efficacy for turbulent channel modeling.
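
    A minimal sketch of such a simulator under stated assumptions: an Ornstein-Uhlenbeck process (a first-order SDE) drives the log-amplitude, so the exponentiated samples are lognormal with a prescribed correlation time. The correlation time and log-amplitude standard deviation below are assumed values, not fitted channel parameters.

    ```python
    import numpy as np

    def fso_weak_turbulence_gain(n, dt, tau, sigma_x, rng=None):
        """Temporally correlated lognormal channel gain h[n] = exp(2*X[n]).

        X follows an Ornstein-Uhlenbeck process (first-order SDE) with correlation
        time tau and stationary std sigma_x, so successive gain samples are
        lognormal and correlated over roughly tau seconds.
        """
        rng = rng or np.random.default_rng()
        a = np.exp(-dt / tau)                       # exact one-step autocorrelation
        x = np.empty(n)
        x[0] = rng.normal(0.0, sigma_x)
        noise = rng.normal(0.0, sigma_x * np.sqrt(1.0 - a * a), size=n)
        for k in range(1, n):
            x[k] = a * x[k - 1] + noise[k]
        # Shift the log-amplitude so the mean channel gain is 1 (energy conservation).
        return np.exp(2.0 * x - 2.0 * sigma_x**2)

    h = fso_weak_turbulence_gain(n=100_000, dt=1e-4, tau=1e-2, sigma_x=0.1)
    print("mean gain:", h.mean(), " scintillation index:", h.var() / h.mean()**2)
    ```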

  14. Experience of nursing students with standardized patients in simulation-based learning: Q-methodology study.

    PubMed

    Ha, Eun-Ho

    2018-04-23

    Standardized patients (SPs) boost self-confidence, improve problem solving, enhance critical thinking, and advance clinical judgment of nursing students. The aim of this study was to examine nursing students' experience with SPs in simulation-based learning. Q-methodology was used, in a department of nursing in Seoul, South Korea. A total of 47 fourth-year undergraduate nursing students ranked 42 Q statements about experiences with SPs into a normal distribution grid. The following three viewpoints were obtained: 1) SPs are helpful for patient care (patient-centered view), 2) SP roles are important for nursing student learning (SP role-centered view), and 3) SPs can promote competency of nursing students (student-centered view). These results indicate that SPs may improve nursing students' confidence and nursing competency. Professors should reflect these three viewpoints in simulation-based learning to effectively engage SPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Simulations of molecular diffusion in lattices of cells: insights for NMR of red blood cells.

    PubMed

    Regan, David G; Kuchel, Philip W

    2002-07-01

    The pulsed field-gradient spin-echo (PGSE) nuclear magnetic resonance (NMR) experiment, conducted on a suspension of red blood cells (RBC) in a strong magnetic field yields a q-space plot consisting of a series of maxima and minima. This is mathematically analogous to a classical optical diffraction pattern. The method provides a noninvasive and novel means of characterizing cell suspensions that is sensitive to changes in cell shape and packing density. The positions of the features in a q-space plot characterize the rate of exchange across the membrane, cell dimensions, and packing density. A diffusion tensor, containing information regarding the diffusion anisotropy of the system, can also be derived from the PGSE NMR data. In this study, we carried out Monte Carlo simulations of diffusion in suspensions of "virtual" cells that had either biconcave disc (as in RBC) or oblate spheroid geometry. The simulations were performed in a PGSE NMR context thus enabling predictions of q-space and diffusion tensor data. The simulated data were compared with those from real PGSE NMR diffusion experiments on RBC suspensions that had a range of hematocrit values. Methods that facilitate the processing of q-space data were also developed.
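
    The diffraction-like q-space features produced by restricted diffusion can be illustrated with a far simpler geometry than a lattice of biconcave cells: a Monte Carlo random walk between two reflecting plates, with the echo attenuation evaluated in the short-gradient-pulse approximation. The plate spacing, diffusivity, and diffusion time below are arbitrary illustrative values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    a      = 5e-6     # plate separation, m (illustrative)
    D      = 2e-9     # diffusion coefficient, m^2/s
    Delta  = 20e-3    # diffusion time, s (long enough for full restriction)
    steps  = 2000
    n_walk = 50_000

    dt  = Delta / steps
    sig = np.sqrt(2 * D * dt)

    x0 = rng.uniform(0.0, a, n_walk)       # walkers start uniformly between the plates
    x  = x0.copy()
    for _ in range(steps):
        x += rng.normal(0.0, sig, n_walk)
        # Reflecting boundaries: fold positions back into [0, a].
        x = np.abs(x)
        x = a - np.abs(a - x)

    # Echo attenuation E(q) = |<exp(i 2*pi*q*(x(Delta) - x(0)))>|, short-pulse limit.
    q = np.array([0.5 / a, 1.0 / a, 1.5 / a, 2.0 / a])
    E = np.array([abs(np.mean(np.exp(2j * np.pi * qi * (x - x0)))) for qi in q])
    print("q*a :", q * a)
    print("E(q):", np.round(E, 4))   # diffraction-like minima expected near q*a = 1, 2
    ```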

  16. 2q37 Deletion syndrome confirmed by high-resolution cytogenetic analysis

    PubMed Central

    Cho, Eun-Kyung; Kim, Jinsup; Yang, Aram; Jin, Dong-Kyu

    2017-01-01

    Chromosome 2q37 deletion syndrome is a rare chromosomal disorder characterized by mild to moderate developmental delay, brachydactyly of the third to fifth digits or toes, short stature, obesity, hypotonia, a characteristic facial appearance, and autism spectrum disorder. Here, we report on a patient with 2q37 deletion presenting with dilated cardiomyopathy (DCMP). Congenital heart malformations have been noted in up to 20% of patients with 2q37 deletions. However, DCMP has not been reported in 2q37 deletion patients previously. The patient exhibited the characteristic facial appearance (a flat nasal bridge, deep-set eyes, arched eyebrows, and a thin upper lip), developmental delay, mild mental retardation, peripheral nerve palsy, and Albright hereditary osteodystrophy (AHO)-like phenotypes (short stature and brachydactyly). Conventional chromosomal analysis results were normal; however, microarray-based comparative genomic hybridization revealed terminal deletion at 2q37.1q37.3. In addition, the patient was confirmed to have partial growth hormone (GH) deficiency and had shown a significant increase in growth rate after substitutive GH therapy. Chromosome 2q37 deletion syndrome should be considered in the differential diagnosis of patients presenting with AHO features, especially in the presence of facial dysmorphism. When patients are suspected of having a 2q37 deletion, high-resolution cytogenetic analysis is recommended. PMID:28690993

  17. Relationship Between pH and Electrochemical Corrosion Behavior of Thermal-Sprayed Ni-Al-Coated Q235 Steel in Simulated Soil Solutions

    NASA Astrophysics Data System (ADS)

    Wei, Wei; Wu, Xin-qiang; Ke, Wei; Xu, Song; Feng, Bing; Hu, Bo-tao

    2017-09-01

    Electrochemical corrosion behavior of a thermal-sprayed Ni-Al-coated Q235 steel was investigated in simulated soil solutions at different pH values using measurements of potentiodynamic polarization curves and electrochemical impedance spectroscopy, as well as surface analyses including x-ray diffraction analysis, scanning electron microscopy equipped with energy-dispersive x-ray spectroscopy, and x-ray photoelectron spectroscopy. The results showed that the corrosion resistance of the Ni-Al-coated Q235 steel was dependent on the pH of the test solution. From pH = 3.53 to pH = 4.79, the corrosion resistance of the coated steel increased rapidly. In the pH range from 4.79 to 12.26, the corrosion resistance exhibited no significant change. At pH 13.25, the corrosion resistance of the sample was found to decrease. The calculated corrosion rate of Ni-Al-coated Q235 steel was lower than that of the uncoated Q235 steel and galvanized steel in all the test solutions. Over a wide range of pH values, the Ni-Al-coated Q235 steel exhibited extremely good corrosion resistance. The experimental data together with the potential-pH diagrams provided a basis for a detailed discussion of the related corrosion mechanisms of the coated steel.

  18. A TT&C Performance Simulator for Space Exploration and Scientific Satellites - Architecture and Applications

    NASA Astrophysics Data System (ADS)

    Donà, G.; Faletra, M.

    2015-09-01

    This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed with a model-based approach using standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenario parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator's capabilities.

  19. Analysis of the Space Propulsion System Problem Using RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis; Rabiti, Cristian

    This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENvironment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures) and to perform both Monte Carlo sampling of randomly distributed events and Event Tree-based analysis. In order to facilitate the input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN also allows interfacing with several numerical codes, such as RELAP5 and RELAP-7, and with ad hoc system simulators. For the space propulsion system problem, an ad hoc simulator was developed in the Python language and then interfaced to RAVEN. This simulator fully models both deterministic behaviors (e.g., system dynamics and interactions between system components) and stochastic behaviors (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling-based methodologies (i.e., Monte Carlo). This analysis is carried out both to determine the reliability of the space propulsion system and to propagate the uncertainties associated with a specific set of parameters. As also indicated in the scope of the benchmark problem, the results generated by the stochastic analysis are used to generate risk-informed insights such as conditions under which different strategies can be followed.
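
    The stochastic part of such an analysis can be sketched with a few lines of Monte Carlo: sample component failure times, apply a system success criterion, and estimate reliability. The component set (two distribution lines feeding four thrusters), failure rates, mission time, and 3-of-4 success rule below are hypothetical and are not taken from the benchmark problem.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    T_MISSION   = 8760.0          # mission time, h (assumed)
    LAMBDA_LINE = 2e-5            # distribution-line failure rate, 1/h (assumed)
    LAMBDA_THR  = 5e-5            # thruster failure rate, 1/h (assumed)
    N_SAMPLES   = 100_000

    def system_success(line_fail, thr_fail):
        """Hypothetical success criterion: each pair of thrusters is fed by one line;
        a thruster is usable if both it and its line survive; need >= 3 usable."""
        usable = (thr_fail > T_MISSION) & (line_fail[[0, 0, 1, 1]] > T_MISSION)
        return usable.sum() >= 3

    failures = 0
    for _ in range(N_SAMPLES):
        line_fail = rng.exponential(1.0 / LAMBDA_LINE, size=2)
        thr_fail  = rng.exponential(1.0 / LAMBDA_THR,  size=4)
        failures += not system_success(line_fail, thr_fail)

    print(f"estimated mission reliability: {1.0 - failures / N_SAMPLES:.4f}")
    ```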

  20. Study of ceramic products and processing techniques in space. [using computerized simulation

    NASA Technical Reports Server (NTRS)

    Markworth, A. J.; Oldfield, W.

    1974-01-01

    An analysis of the solidification kinetics of beta alumina in a zero-gravity environment was carried out, using computer-simulation techniques, in order to assess the feasibility of producing high-quality single crystals of this material in space. The two coupled transport processes included were movement of the solid-liquid interface and diffusion of sodium atoms in the melt. Results of the simulation indicate that appreciable crystal-growth rates can be attained in space. Considerations were also made of the advantages offered by high-quality single crystals of beta alumina for use as a solid electrolyte; these clearly indicate that space-grown materials are superior in many respects to analogous terrestrially-grown crystals. Likewise, economic considerations, based on the rapidly expanding technological applications for beta alumina and related fast ionic conductors, reveal that the many superior qualities of space-grown material justify the added expense and experimental detail associated with space processing.

  1. Reduction of Simulation Times for High-Q Structures using the Resonance Equation

    DOE PAGES

    Hall, Thomas Wesley; Bandaru, Prabhakar R.; Rees, Daniel Earl

    2015-11-17

    Simulating steady state performance of high quality factor (Q) resonant RF structures is computationally difficult for structures with sizes on the order of more than a few wavelengths because of the long times (on the order of ~ 0.1 ms) required to achieve steady state in comparison with the maximum time step that can be used in the simulation (typically, on the order of ~ 1 ps). This paper presents analytical and computational approaches that can be used to accelerate the simulation of the steady state performance of such structures. The basis of the proposed approach is the utilization of a larger amplitude signal at the beginning of the simulation to achieve steady state earlier than with the nominal input signal. Finally, the methodology for finding the necessary input signal is discussed in detail, and the validity of the approach is evaluated.
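
    The idea can be seen with a one-equation envelope model of a high-Q cavity: the stored-field amplitude approaches steady state with a fill time constant of roughly 2Q/ω, so temporarily overdriving the input reaches the nominal steady-state amplitude much sooner. The frequency, Q, and boost factor below are illustrative assumptions, not the paper's structures.

    ```python
    import numpy as np

    f0    = 1.0e9                        # resonant frequency, Hz (illustrative)
    Q     = 1.0e5                        # quality factor (illustrative)
    tau   = 2.0 * Q / (2 * np.pi * f0)   # cavity fill time constant, s
    A_ss  = 1.0                          # steady-state amplitude for the nominal drive
    boost = 5.0                          # temporary over-drive factor

    def fill_time(drive_boost, tol=0.01):
        """Time for the envelope A(t) to come within tol of the nominal A_ss.
        While boosted, A relaxes toward drive_boost*A_ss (envelope model dA/dt = (target - A)/tau)."""
        dt, t, A = tau / 200.0, 0.0, 0.0
        while A < (1.0 - tol) * A_ss:
            target = drive_boost * A_ss if A < A_ss else A_ss
            A += dt * (target - A) / tau
            t += dt
        return t

    print(f"tau = {tau * 1e6:.1f} us")
    print(f"nominal drive : {fill_time(1.0) * 1e6:.1f} us to reach 99% of steady state")
    print(f"boosted drive : {fill_time(boost) * 1e6:.1f} us to reach 99% of steady state")
    ```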

  2. Space plasma simulations; Proceedings of the Second International School for Space Simulations, Kapaa, HI, February 4-15, 1985. Parts 1 & 2

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, M. (Editor); Dutton, D. A. (Editor)

    1985-01-01

    Space plasma simulations, observations, and theories are discussed. Papers are presented on the capabilities of various types of simulation codes and simulation models. Consideration is given to plasma waves in the earth's magnetotail, outer planet magnetosphere, geospace, and the auroral and polar cap regions. Topics discussed include space plasma turbulent dissipation, the kinetics of plasma waves, wave-particle interactions, whistler mode propagation, global energy regulation, and auroral arc formation.

  3. Multi-mission space vehicle subsystem analysis tools

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Wood, E.

    2003-01-01

    Spacecraft engineers often rely on specialized simulation tools to facilitate the analysis, design and operation of space systems. Unfortunately, these tools are often designed for one phase of a single mission and cannot be easily adapted to other phases or other missions. The Multi-Mission Space Vehicle Subsystem Analysis Tools are designed to provide a solution to this problem.

  4. Manufacture of Cryoshroud Surfaces for Space Simulation Chambers

    NASA Technical Reports Server (NTRS)

    Ash, Gary S.

    2008-01-01

    Environmental test chambers for space applications use internal shrouds to simulate temperature conditions encountered in space. Shroud temperatures may range from +150 C to -253 C (20 K), and internal surfaces are coated with special high emissivity/absorptivity paints. To obtain temperature uniformity over large areas, detailed thermal design is required for placement of tubing for gaseous or liquid nitrogen and helium and other exotic heat exchange fluids. The recent increase in space simulation activity related to the James Webb Space Telescope has led to the design of new cryogenic shrouds to meet critical needs in instrument package testing. This paper will review the design and manufacturing of shroud surfaces for several of these programs, including fabrication methods and the selection and application of paints for simulation chambers.

  5. An accurate test for homogeneity of odds ratios based on Cochran's Q-statistic.

    PubMed

    Kulinskaya, Elena; Dollinger, Michael B

    2015-06-10

    A frequently used statistic for testing homogeneity in a meta-analysis of K independent studies is Cochran's Q. For a standard test of homogeneity the Q statistic is referred to a chi-square distribution with K-1 degrees of freedom. For the situation in which the effects of the studies are logarithms of odds ratios, the chi-square distribution is much too conservative for moderate size studies, although it may be asymptotically correct as the individual studies become large. Using a mixture of theoretical results and simulations, we provide formulas to estimate the shape and scale parameters of a gamma distribution to fit the distribution of Q. Simulation studies show that the gamma distribution is a good approximation to the distribution for Q. Use of the gamma distribution instead of the chi-square distribution for Q should eliminate inaccurate inferences in assessing homogeneity in a meta-analysis. (A computer program for implementing this test is provided.) This hypothesis test is competitive with the Breslow-Day test both in accuracy of level and in power.
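
    The simulation component of such a study can be reproduced schematically: generate K homogeneous studies of moderate size, compute Cochran's Q on the log odds ratios, and fit a gamma distribution to the simulated Q values by the method of moments. The study sizes, event probability, and continuity correction below are arbitrary choices and do not reproduce the paper's fitted formulas.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    K, n, p_ctrl, true_or = 8, 40, 0.3, 1.0     # K small studies with a common odds ratio of 1

    def cochran_q():
        """One simulated meta-analysis of K 2x2 tables under homogeneity."""
        p_trt = true_or * p_ctrl / (1 - p_ctrl + true_or * p_ctrl)   # same OR in every study
        a = rng.binomial(n, p_trt, K) + 0.5       # 0.5 added to avoid zero cells
        b = n - a + 1.0
        c = rng.binomial(n, p_ctrl, K) + 0.5
        d = n - c + 1.0
        theta = np.log((a * d) / (b * c))         # study log odds ratios
        w = 1.0 / (1/a + 1/b + 1/c + 1/d)         # inverse-variance weights
        theta_bar = np.sum(w * theta) / np.sum(w)
        return np.sum(w * (theta - theta_bar) ** 2)

    Q = np.array([cochran_q() for _ in range(20_000)])

    # Method-of-moments gamma fit vs the nominal chi-square with K-1 degrees of freedom.
    shape = Q.mean() ** 2 / Q.var()
    scale = Q.var() / Q.mean()
    crit = stats.chi2.ppf(0.95, K - 1)
    print(f"gamma shape {shape:.2f}, scale {scale:.2f} "
          f"(chi-square would correspond to shape {(K - 1) / 2}, scale 2)")
    print(f"actual size of the nominal 5% chi-square test: {(Q > crit).mean():.3f}")
    ```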

  6. Space Simulation, 7th. [facilities and testing techniques

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Space simulation facilities and techniques are outlined that encompass thermal scale modeling, computerized simulations, reentry materials, spacecraft contamination, solar simulation, vacuum tests, and heat transfer studies.

  7. Formulation of consumables management models: Consumables analysis/crew simulator interface requirements

    NASA Technical Reports Server (NTRS)

    Zamora, M. A.

    1977-01-01

    Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: techniques for providing consumables analysis support to crew training simulators for advanced spacecraft programs, and the applicability of these techniques to the crew training simulator for the Space Shuttle program in particular.

  8. The General-Use Nodal Network Solver (GUNNS) Modeling Package for Space Vehicle Flow System Simulation

    NASA Technical Reports Server (NTRS)

    Harvey, Jason; Moore, Michael

    2013-01-01

    The General-Use Nodal Network Solver (GUNNS) is a modeling software package that combines nodal analysis and the hydraulic-electric analogy to simulate fluid, electrical, and thermal flow systems. GUNNS is developed by L-3 Communications under the TS21 (Training Systems for the 21st Century) project for NASA Johnson Space Center (JSC), primarily for use in space vehicle training simulators at JSC. It has sufficient compactness and fidelity to model the fluid, electrical, and thermal aspects of space vehicles in real-time simulations running on commodity workstations, for vehicle crew and flight controller training. It has a reusable and flexible component and system design, and a Graphical User Interface (GUI), providing capability for rapid GUI-based simulator development, ease of maintenance, and associated cost savings. GUNNS is optimized for NASA's Trick simulation environment, but can be run independently of Trick.
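
    The core numerical idea, nodal analysis combined with the hydraulic-electric analogy, reduces to assembling a conductance matrix from the network links and solving for the node potentials. The small three-node example below is a generic illustration of that step, not the GUNNS data model or API.

    ```python
    import numpy as np

    def solve_nodal_network(n_nodes, links, sources, ground=0):
        """Solve G*x = b for node potentials (voltage, pressure, or temperature).

        links   : list of (node_i, node_j, conductance)
        sources : dict {node: injected flow (current / mass flow / heat)}
        ground  : reference node held at potential 0
        """
        G = np.zeros((n_nodes, n_nodes))
        b = np.zeros(n_nodes)
        for i, j, g in links:                     # stamp each link into the matrix
            G[i, i] += g; G[j, j] += g
            G[i, j] -= g; G[j, i] -= g
        for node, flow in sources.items():
            b[node] += flow
        keep = [k for k in range(n_nodes) if k != ground]   # eliminate the reference node
        x = np.zeros(n_nodes)
        x[keep] = np.linalg.solve(G[np.ix_(keep, keep)], b[keep])
        return x

    # Three-node example: node 1 is fed 2.0 units of flow; two paths lead back to ground.
    potentials = solve_nodal_network(
        n_nodes=3,
        links=[(1, 0, 0.5), (1, 2, 1.0), (2, 0, 0.25)],
        sources={1: 2.0})
    print(potentials)
    ```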

  9. Multi-physics simulations of space weather

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Cohen, Ofer; Glocer, Alex; Manchester, Ward, IV; Ridley, Aaron

    Presently magnetohydrodynamic (MHD) models represent the "workhorse" technology for simulating the space environment from the solar corona to the ionosphere. While these models are very successful in describing many important phenomena, they are based on a low-order moment approximation of the phase-space distribution function. In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. BATS-R-US can solve the equations of "standard" ideal MHD, but it can also go beyond this first approximation. It can solve resistive MHD, Hall MHD, semi-relativistic MHD (that keeps the displacement current), multispecies (different ion species have different continuity equations) and multifluid (all ion species have separate continuity, momentum and energy equations) MHD. Recently we added two-fluid Hall MHD (solving the electron and ion energy equations separately) and are working on extended magnetohydrodynamics with anisotropic pressures. This talk will show the effects of added physics and compare space weather simulation results to "standard" ideal MHD.
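
    For reference, the "standard" ideal MHD system that these extensions build upon can be written in conservative form as follows (textbook notation with vacuum permeability μ0 and adiabatic index γ; this is not a transcription of the BATS-R-US equations):

    ```latex
    \begin{aligned}
    &\partial_t \rho + \nabla\cdot(\rho\mathbf{u}) = 0,\\
    &\partial_t(\rho\mathbf{u}) + \nabla\cdot\Big[\rho\mathbf{u}\mathbf{u}
       + \Big(p + \tfrac{B^2}{2\mu_0}\Big)\mathbf{I} - \tfrac{\mathbf{B}\mathbf{B}}{\mu_0}\Big] = 0,\\
    &\partial_t \mathbf{B} - \nabla\times(\mathbf{u}\times\mathbf{B}) = 0, \qquad \nabla\cdot\mathbf{B} = 0,\\
    &\partial_t E + \nabla\cdot\Big[\Big(E + p + \tfrac{B^2}{2\mu_0}\Big)\mathbf{u}
       - \tfrac{(\mathbf{u}\cdot\mathbf{B})\,\mathbf{B}}{\mu_0}\Big] = 0, \qquad
    E = \frac{p}{\gamma-1} + \frac{\rho u^2}{2} + \frac{B^2}{2\mu_0}.
    \end{aligned}
    ```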

  10. Preliminary X-ray analysis of twinned crystals of the Q88Y25_Lacpl esterase from Lactobacillus plantarum WCFS1

    PubMed Central

    Álvarez, Yanaisis; Esteban-Torres, María; Acebrón, Iván; de las Rivas, Blanca; Muñoz, Rosario; Martínez-Ripoll, Martín; Mancheño, José M.

    2011-01-01

    Q88Y25_Lacpl is an esterase produced by the lactic acid bacterium Lactobacillus plantarum WCFS1 that shows amino-acid sequence similarity to carboxylesterases from the hormone-sensitive lipase family, in particular the AFEST esterase from the archaeon Archaeoglobus fulgidus and the hyperthermophilic esterase EstEI isolated from a metagenomic library. N-terminally His6-tagged Q88Y25_Lacpl has been overexpressed in Escherichia coli BL21 (DE3) cells, purified and crystallized at 291 K using the hanging-drop vapour-diffusion method. Mass spectrometry was used to determine the purity and homogeneity of the enzyme. Crystals of His6-tagged Q88Y25_Lacpl were prepared in a solution containing 2.8 M sodium acetate trihydrate pH 7.0. X-ray diffraction data were collected to 2.24 Å resolution on beamline ID29 at the ESRF. The apparent crystal point group was 422; however, initial global analysis of the intensity statistics (data processed with high symmetry in space group I422) and subsequent tests on data processed with low symmetry (space group I4) showed that the crystals were almost perfectly merohedrally twinned. Most probably, the true space group is I4, with unit-cell parameters a = 169.05, b = 169.05, c = 183.62 Å. PMID:22102251

  11. Efficient LBM visual simulation on face-centered cubic lattices.

    PubMed

    Petkov, Kaloian; Qiu, Feng; Fan, Zhe; Kaufman, Arie E; Mueller, Klaus

    2009-01-01

    The Lattice Boltzmann method (LBM) for visual simulation of fluid flow generally employs cubic Cartesian (CC) lattices such as the D3Q13 and D3Q19 lattices for the particle transport. However, the CC lattices lead to suboptimal representation of the simulation space. We introduce the face-centered cubic (FCC) lattice, fD3Q13, for LBM simulations. Compared to the CC lattices, the fD3Q13 lattice creates a more isotropic sampling of the simulation domain and its single lattice speed (i.e., link length) simplifies the computations and data storage. Furthermore, the fD3Q13 lattice can be decomposed into two independent interleaved lattices, one of which can be discarded, which doubles the simulation speed. The resulting LBM simulation can be efficiently mapped to the GPU, further increasing the computational performance. We show the numerical advantages of the FCC lattice on channeled flow in 2D and the flow-past-a-sphere benchmark in 3D. In both cases, the comparison is against the corresponding CC lattices using the analytical solutions for the systems as well as velocity field visualizations. We also demonstrate the performance advantages of the fD3Q13 lattice for interactive simulation and rendering of hot smoke in an urban environment using thermal LBM.

  12. Analysis of Waves in Space Plasma (WISP) near field simulation and experiment

    NASA Technical Reports Server (NTRS)

    Richie, James E.

    1992-01-01

    The WISP payload, scheduled for a 1995 space transportation system (shuttle) flight, will include a high-power transmitter on board operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results. An investigation of the accuracy of the modeling approach is also included. The report begins with a description of the WISP experiment. A description of the model used to simulate the cargo bay follows. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. A discussion of the methods used to verify the accuracy of the model is shown to illustrate appropriate methods for obtaining this information. Finally, suggestions for future work are provided.

  13. The structure of liquid water up to 360 MPa from x-ray diffraction measurements using a high Q-range and from molecular simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, L. B.; Mineral Physics Institute, Stony Brook University, Stony Brook, New York, New York 11794-2100; Galib, M.

    2016-04-07

    X-ray diffraction measurements of liquid water are reported at pressures up to 360 MPa corresponding to a density of 0.0373 molecules per Å³. The measurements were conducted at a spatial resolution corresponding to Q_max = 16 Å⁻¹. The method of data analysis and measurement in this study follows the earlier benchmark results reported for water under ambient conditions having a density of 0.0333 molecules per Å³ and Q_max = 20 Å⁻¹ [J. Chem. Phys. 138, 074506 (2013)] and at 70 °C having a density of 0.0327 molecules per Å³ and Q_max = 20 Å⁻¹ [J. Chem. Phys. 141, 214507 (2014)]. The structure of water is very different at these three different T and P state points and thus they provide the basis for evaluating the fidelity of molecular simulation. Measurements show that at 360 MPa, the 4 waters residing in the region between 2.3 and 3 Å are nearly unchanged: the peak position, shape, and coordination number are nearly identical to their values under ambient conditions. However, in the region above 3 Å, large structural changes occur with the collapse of the well-defined 2nd shell and shifting of higher shells to shorter distances. The measured structure is compared to simulated structure using intermolecular potentials described by both first-principles methods (revPBE-D3) and classical potentials (TIP4P/2005, MB-pol, and mW). The DFT-based revPBE-D3 method and the many-body empirical potential model MB-pol provide the best overall representation of the ambient, high-temperature, and high-pressure data. The revPBE-D3, MB-pol, and TIP4P/2005 models capture the densification mechanism, whereby the non-bonded 5th nearest neighbor molecule, which partially encroaches the 1st shell at ambient pressure, is pushed further into the local tetrahedral arrangement at higher pressures by the more distant molecules filling the void space in the network between the 1st and 2nd shells.

  14. Analysis of space reactor system components: Investigation through simulation and non-nuclear testing

    NASA Astrophysics Data System (ADS)

    Bragg-Sitton, Shannon M.

    The use of fission energy in space power and propulsion systems offers considerable advantages over chemical propulsion. Fission provides over six orders of magnitude higher energy density, which translates to higher vehicle specific impulse and lower specific mass. These characteristics enable ambitious space exploration missions. The natural space radiation environment provides an external source of protons and high energy, high Z particles that can result in the production of secondary neutrons through interactions in reactor structures. Applying the approximate proton source in geosynchronous orbit during a solar particle event, investigation using MCNPX 2.5.b for proton transport through the SAFE-400 heat pipe cooled reactor indicates an incoming secondary neutron current of (1.16 ± 0.03) × 10⁷ n/s at the core-reflector interface. This neutron current may affect reactor operation during low power maneuvers (e.g., start-up) and may provide a sufficient reactor start-up source. It is important that a reactor control system be designed to automatically adjust to changes in reactor power levels, maintaining nominal operation without user intervention. A robust, autonomous control system is developed and analyzed for application during reactor start-up, accounting for fluctuations in the radiation environment that result from changes in vehicle location or temporal variations in the radiation field. Development of a nuclear reactor for space applications requires a significant amount of testing prior to deployment of a flight unit. High confidence in fission system performance can be obtained through relatively inexpensive non-nuclear tests performed in relevant environments, with the heat from nuclear fission simulated using electric resistance heaters. A series of non-nuclear experiments was performed to characterize various aspects of reactor operation. This work includes measurement of reactor core deformation due to material thermal expansion and

  15. On alternative q-Weibull and q-extreme value distributions: Properties and applications

    NASA Astrophysics Data System (ADS)

    Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin

    2018-01-01

    Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between some existing q-Weibull distributions and q-extreme value distributions that is parallel to the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from The National Highway Traffic Safety Administration.
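
    For context, the q-deformed functions underlying such constructions are the standard Tsallis q-logarithm and q-exponential, which reduce to the ordinary logarithm and exponential as q → 1; this is how the conventional link (the logarithm of a Weibull variable follows an extreme value distribution) is generalized. Only these standard definitions are shown here; the paper's specific alternative q-Weibull density is not reproduced:

    ```latex
    \ln_q(x) \;=\; \frac{x^{\,1-q} - 1}{1-q}, \qquad
    e_q(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}}, \qquad
    \lim_{q\to 1}\ln_q(x) = \ln x, \quad \lim_{q\to 1} e_q(x) = e^{x}.
    ```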

  16. Q?rius: An innovative and new interactive educational space at the Smithsonian Institution's National Museum of Natural History, in Washington, D.C

    NASA Astrophysics Data System (ADS)

    Blankenbicker, R.

    2013-12-01

    The Fall of 2013 marks the opening of Q?rius ('curious'), a 10,000 square foot, interactive educational space at the Smithsonian Institution's National Museum of Natural History. Representing the 7 areas of the museum's research divisions, Q?rius includes a publicly accessible collection of over 6,000 natural history objects and multiple opportunities for visitors to engage themselves in natural history and the research conducted at the museum in various settings, including a lab, theater, and studio. A digital component to the space allows visitors to save parts of their experiences to a personal account, which they can later access remotely from their home or school. The space also serves as a tool for scientists to conduct outreach programs for museum visitors and for schools across the country through distance learning capabilities. Geology content for Q?rius was developed through collaboration between the Office of Education and Outreach and the Department of Mineral Sciences, as well as scientists and educators from outside agencies. Current experiences for the public include modeling plate tectonics and how they change rocks on small and large scales, identifying minerals in rocks, and using Earth to understand Martian geology. A school program adds the concept of drill cores and natural resources to the plate tectonics activity, which allows discussion about resource extraction. Developing experiences for Q?rius in all content areas took place over 2 phases; first, through taking prototypes into the museum exhibition halls to test with visitors through several iterations, and second in the new space, where all of the activities could be tested as a group and in the appropriate environment. By the time this abstract has been submitted, the official opening will not have occurred, though Q?rius will have been open for about 1 month by the time of the 2013 AGU annual conference, allowing us to further evaluate the development of the space.

  17. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  18. Simulations of molecular diffusion in lattices of cells: insights for NMR of red blood cells.

    PubMed Central

    Regan, David G; Kuchel, Philip W

    2002-01-01

    The pulsed field-gradient spin-echo (PGSE) nuclear magnetic resonance (NMR) experiment, conducted on a suspension of red blood cells (RBC) in a strong magnetic field yields a q-space plot consisting of a series of maxima and minima. This is mathematically analogous to a classical optical diffraction pattern. The method provides a noninvasive and novel means of characterizing cell suspensions that is sensitive to changes in cell shape and packing density. The positions of the features in a q-space plot characterize the rate of exchange across the membrane, cell dimensions, and packing density. A diffusion tensor, containing information regarding the diffusion anisotropy of the system, can also be derived from the PGSE NMR data. In this study, we carried out Monte Carlo simulations of diffusion in suspensions of "virtual" cells that had either biconcave disc (as in RBC) or oblate spheroid geometry. The simulations were performed in a PGSE NMR context thus enabling predictions of q-space and diffusion tensor data. The simulated data were compared with those from real PGSE NMR diffusion experiments on RBC suspensions that had a range of hematocrit values. Methods that facilitate the processing of q-space data were also developed. PMID:12080109

  19. Measuring patients' satisfaction with their anti-TNF treatment in severe Crohn's disease: scoring and psychometric validation of the Satisfaction for PAtients in Crohn's diseasE Questionnaire (SPACE-Q(©)).

    PubMed

    Gilet, Hélène; Arnould, Benoit; Fofana, Fatoumata; Clerson, Pierre; Colombel, Jean-Frédéric; D'Hondt, Olivier; Faure, Patrick; Hagège, Hervé; Nachury, Maria; Nahon, Stéphane; Tucat, Gilbert; Vandromme, Luc; Cazala-Telinge, Ines; Thibout, Emmanuel

    2014-01-01

    Severe Crohn's disease management includes anti-tumor necrosis factor (anti-TNF) drugs that differ from early-stage treatments regarding efficacy, safety, and convenience. This study aimed to finalize and psychometrically validate the Satisfaction for PAtients in Crohn's diseasE Questionnaire (SPACE-Q(©)), developed to measure satisfaction with anti-TNF treatment in patients with severe Crohn's disease. A total of 279 patients with severe Crohn's disease receiving anti-TNF therapy completed the SPACE-Q 62-item pilot version at inclusion and 12 and 13 weeks after first anti-TNF injection. The final SPACE-Q scoring was defined using multitrait and regression analyses and clinical relevance considerations. Psychometric validation included clinical validity against Harvey-Bradshaw score, concurrent validity against Treatment Satisfaction Questionnaire for Medication (TSQM), internal consistency reliability, test-retest reliability, and responsiveness against the patient global impression of change (PGIC). Quality of completion was good (55%-67% of patients completed all items). Four items were removed from the questionnaire. Eleven scores were defined within the final 58-item SPACE-Q: disease control; symptoms, anal symptoms, and quality of life transition scales; tolerability; convenience; expectation confirmation toward efficacy, side effects, and convenience; satisfaction with treatment; and motivation. Scores met standards for concurrent validity (correlation between SPACE-Q satisfaction with treatment and TSQM satisfaction scores =0.59), internal consistency reliability (Cronbach's α=0.67-0.93), test-retest reliability (intraclass correlations =0.62-0.91), and responsiveness (improvement in treatment experience assessed by the SPACE-Q for patients reporting improvement on the PGIC). Significantly different mean scores were observed between groups of patients with different Harvey-Bradshaw disease severity scores. The SPACE-Q is a valid, reliable, and responsive instrument for measuring satisfaction with anti-TNF treatment in patients with severe Crohn's disease.

  20. Q-controlled amplitude modulation atomic force microscopy in liquids: An analysis

    NASA Astrophysics Data System (ADS)

    Hölscher, H.; Schwarz, U. D.

    2006-08-01

    An analysis of amplitude modulation atomic force microscopy in liquids is presented with respect to the application of the Q-Control technique. The equation of motion is solved by numerical and analytic methods with and without Q-Control in the presence of a simple model interaction force adequate for many liquid environments. In addition, the authors give an explicit analytical formula for the tip-sample indentation showing that higher Q factors reduce the tip-sample force. It is found that Q-Control suppresses unwanted deformations of the sample surface, leading to the enhanced image quality reported in several experimental studies.

  1. Linezolid pharmacokinetics in MDR-TB: a systematic review, meta-analysis and Monte Carlo simulation

    PubMed Central

    Pertinez, Henry; Bonnett, Laura; Hodel, Eva Maria; Dartois, Véronique; Johnson, John L; Caws, Maxine; Bolhuis, Mathieu; Alffenaar, Jan-Willem C; Davies, Geraint; Sloan, Derek J

    2018-01-01

    Objectives: The oxazolidinone linezolid is an effective component of drug-resistant TB treatment, but its use is limited by toxicity and the optimum dose is uncertain. Current strategies are not informed by clinical pharmacokinetic (PK)/pharmacodynamic (PD) data; we aimed to address this gap. Methods: We defined linezolid PK/PD targets for efficacy (fAUC0–24:MIC >119 mg/L/h) and safety (fCmin <1.38 mg/L). We extracted individual-level linezolid PK data from existing studies on TB patients and performed meta-analysis, producing summary estimates of fAUC0–24 and fCmin for published doses. Combining these with a published MIC distribution, we performed Monte Carlo simulations of target attainment. Results: The efficacy target was attained in all simulated individuals at 300 mg q12h and 600 mg q12h, but only 20.7% missed the safety target at 300 mg q12h versus 98.5% at 600 mg q12h. Although suggesting 300 mg q12h should be used preferentially, these data were reliant on a single centre. Efficacy and safety targets were missed by 41.0% and 24.2%, respectively, at 300 mg q24h and by 44.6% and 27.5%, respectively, at 600 mg q24h. However, the confounding effect of between-study heterogeneity on target attainment for q24h regimens was considerable. Conclusions: Linezolid dosing at 300 mg q12h may retain the efficacy of the 600 mg q12h licensed dosing with improved safety. Data to evaluate commonly used 300 mg q24h and 600 mg q24h doses are limited. Comprehensive, prospectively obtained PK/PD data for linezolid doses in drug-resistant TB treatment are required. PMID:29584861
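
    A schematic Monte Carlo target-attainment calculation of the kind described: sample individual exposures and MICs, then count how often the efficacy and safety targets are met. The lognormal exposure parameters and the MIC distribution below are placeholders, not the pooled estimates of this meta-analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    # Placeholder between-patient exposure distributions for one regimen (NOT study values).
    fauc  = rng.lognormal(mean=np.log(90.0), sigma=0.4, size=N)   # fAUC0-24, mg*h/L
    fcmin = rng.lognormal(mean=np.log(0.8),  sigma=0.5, size=N)   # fCmin, mg/L

    # Placeholder MIC distribution on a doubling-dilution grid (probabilities sum to 1).
    mics  = np.array([0.125, 0.25, 0.5, 1.0, 2.0])
    probs = np.array([0.05, 0.25, 0.45, 0.20, 0.05])
    mic   = rng.choice(mics, size=N, p=probs)

    efficacy_ok = (fauc / mic) > 119.0     # fAUC0-24 : MIC efficacy target
    safety_ok   = fcmin < 1.38             # trough-related safety target

    print(f"P(efficacy target attained): {efficacy_ok.mean():.3f}")
    print(f"P(safety target missed)    : {1 - safety_ok.mean():.3f}")
    print(f"P(both targets met)        : {(efficacy_ok & safety_ok).mean():.3f}")
    ```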

  2. Damping in Space Constructions

    NASA Astrophysics Data System (ADS)

    de Vreugd, Jan; de Lange, Dorus; Winters, Jasper; Human, Jet; Kamphues, Fred; Tabak, Erik

    2014-06-01

    Monolithic structures are often used in optomechanical designs for space applications to achieve high dimensional stability and to prevent possible backlash and friction phenomena. The capacity of monolithic structures to dissipate mechanical energy is, however, limited by the high Q-factor, which might result in high stresses during dynamic launch loads like random vibration, sine sweeps and shock. To reduce the Q-factor in space applications, the effect of constrained layer damping (CLD) is investigated in this work. To predict the damping increase, the CLD effect is implemented locally at the supporting struts in an existing FE model of an optical instrument. Numerical simulations show that the effect of local damping treatment in this instrument could reduce the vibrational stresses by 30-50%. Validation experiments on a simple structure showed good agreement between measured and predicted damping properties. This paper presents material characterization, material modeling, numerical implementation of damping models in finite element code, numerical results on space hardware and the results of validation experiments.

  3. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

    The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on concepts related to airports with Super Dense Operations where new airport runway configurations (e.g. parallel runways), sequencing, merging, and spacing are some of the concepts considered. We focus on modeling and simulating a complementary airborne and ground system for AM&S to increase efficiency and capacity of these high density terminal areas. From a ground systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation Systems (ACES) software to model and simulate AM&S concepts and algorithms.

  4. Genome-wide linkage meta-analysis identifies susceptibility loci at 2q34 and 13q31.3 for genetic generalized epilepsies.

    PubMed

    Leu, Costin; de Kovel, Carolien G F; Zara, Federico; Striano, Pasquale; Pezzella, Marianna; Robbiano, Angela; Bianchi, Amedeo; Bisulli, Francesca; Coppola, Antonietta; Giallonardo, Anna Teresa; Beccaria, Francesca; Trenité, Dorothée Kasteleijn-Nolst; Lindhout, Dick; Gaus, Verena; Schmitz, Bettina; Janz, Dieter; Weber, Yvonne G; Becker, Felicitas; Lerche, Holger; Kleefuss-Lie, Ailing A; Hallman, Kerstin; Kunz, Wolfram S; Elger, Christian E; Muhle, Hiltrud; Stephani, Ulrich; Møller, Rikke S; Hjalgrim, Helle; Mullen, Saul; Scheffer, Ingrid E; Berkovic, Samuel F; Everett, Kate V; Gardiner, Mark R; Marini, Carla; Guerrini, Renzo; Lehesjoki, Anna-Elina; Siren, Auli; Nabbout, Rima; Baulac, Stephanie; Leguern, Eric; Serratosa, Jose M; Rosenow, Felix; Feucht, Martha; Unterberger, Iris; Covanis, Athanasios; Suls, Arvid; Weckhuysen, Sarah; Kaneva, Radka; Caglayan, Hande; Turkdogan, Dilsad; Baykan, Betul; Bebek, Nerses; Ozbek, Ugur; Hempelmann, Anne; Schulz, Herbert; Rüschendorf, Franz; Trucks, Holger; Nürnberg, Peter; Avanzini, Giuliano; Koeleman, Bobby P C; Sander, Thomas

    2012-02-01

    Genetic generalized epilepsies (GGEs) have a lifetime prevalence of 0.3% with heritability estimates of 80%. A considerable proportion of families with siblings affected by GGEs presumably display an oligogenic inheritance. The present genome-wide linkage meta-analysis aimed to map: (1) susceptibility loci shared by a broad spectrum of GGEs, and (2) seizure type-related genetic factors preferentially predisposing to either typical absence or myoclonic seizures, respectively. Meta-analysis of three genome-wide linkage datasets was carried out in 379 GGE-multiplex families of European ancestry including 982 relatives with GGEs. To dissect out seizure type-related susceptibility genes, two family subgroups were stratified comprising 235 families with predominantly genetic absence epilepsies (GAEs) and 118 families with an aggregation of juvenile myoclonic epilepsy (JME). To map shared and seizure type-related susceptibility loci, both nonparametric loci (NPL) and parametric linkage analyses were performed for a broad trait model (GGEs) in the entire set of GGE-multiplex families and a narrow trait model (typical absence or myoclonic seizures) in the subgroups of JME and GAE families. For the entire set of 379 GGE-multiplex families, linkage analysis revealed six loci achieving suggestive evidence for linkage at 1p36.22, 3p14.2, 5q34, 13q12.12, 13q31.3, and 19q13.42. The linkage finding at 5q34 was consistently supported by both NPL and parametric linkage results across all three family groups. A genome-wide significant nonparametric logarithm of odds score of 3.43 was obtained at 2q34 in 118 JME families. Significant parametric linkage to 13q31.3 was found in 235 GAE families assuming recessive inheritance (heterogeneity logarithm of odds = 5.02). Our linkage results support an oligogenic predisposition of familial GGE syndromes. The genetic risk factor at 5q34 confers risk to a broad spectrum of familial GGE syndromes, whereas susceptibility loci at 2q34 and 13q31

  5. Exit probability of the one-dimensional q-voter model: Analytical results and simulations for large networks

    NASA Astrophysics Data System (ADS)

    Timpanaro, André M.; Prado, Carmen P. C.

    2014-05-01

    We discuss the exit probability of the one-dimensional q-voter model and present tools to obtain estimates about this probability, both through simulations in large networks (around 10⁷ sites) and analytically in the limit where the network is infinitely large. We argue that the result E(ρ) = ρ^q / [ρ^q + (1-ρ)^q], which was found in three previous works [F. Slanina, K. Sznajd-Weron, and P. Przybyła, Europhys. Lett. 82, 18006 (2008), 10.1209/0295-5075/82/18006; R. Lambiotte and S. Redner, Europhys. Lett. 82, 18007 (2008), 10.1209/0295-5075/82/18007, for the case q = 2; and P. Przybyła, K. Sznajd-Weron, and M. Tabiszewski, Phys. Rev. E 84, 031117 (2011), 10.1103/PhysRevE.84.031117, for q > 2] using small networks (around 10³ sites), is a good approximation, but there are noticeable deviations that appear even for small systems and that do not disappear when the system size is increased (with the notable exception of the case q = 2). We also show that, under some simple and intuitive hypotheses, the exit probability must obey the inequality ρ^q / [ρ^q + (1-ρ)] ≤ E(ρ) ≤ ρ / [ρ + (1-ρ)^q] in the infinite size limit. We believe this settles in the negative the suggestion made [S. Galam and A. C. R. Martins, Europhys. Lett. 95, 48005 (2011), 10.1209/0295-5075/95/48005] that this result would be a finite size effect, with the exit probability actually being a step function. We also show how the result that the exit probability cannot be a step function can be reconciled with the Galam unified frame, which was also a source of controversy.
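
    A small Monte Carlo estimate of the exit probability in one common formulation of the 1D q-voter model (a randomly chosen agent adopts the unanimous state of q nearest-neighbour picks drawn with repetition). The tiny ring and low trial count keep the runtime modest; this simplified variant is illustrative only and does not reproduce the authors' large-network simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def exit_probability(rho, q=3, n=16, trials=100, max_updates=400_000):
        """Fraction of runs ending in all-(+1) consensus from an initial density rho.

        Variant used here: pick a random site, draw q of its two nearest neighbours
        at random (with repetition); the site adopts their state only if unanimous.
        """
        hits = 0
        for _ in range(trials):
            s = np.where(rng.random(n) < rho, 1, -1)
            for _ in range(max_updates):
                if abs(s.sum()) == n:                 # consensus reached
                    break
                i = rng.integers(n)
                nbr = rng.choice([s[(i - 1) % n], s[(i + 1) % n]], size=q)
                if abs(nbr.sum()) == q:               # unanimous influence group
                    s[i] = nbr[0]
            hits += s.sum() == n
        return hits / trials

    for rho in (0.25, 0.5, 0.75):
        est = exit_probability(rho)
        pred = rho**3 / (rho**3 + (1 - rho)**3)       # E(rho) from the formula quoted above, q = 3
        print(f"rho={rho:.2f}  simulated E={est:.2f}  rho^q/[rho^q+(1-rho)^q]={pred:.2f}")
    ```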

  6. EPIC Radiance Simulator for Deep Space Climate ObserVatoRy (DSCOVR)

    NASA Technical Reports Server (NTRS)

    Lyapustin, Alexei; Marshak, Alexander; Wang, Yujie; Korkin, Sergey; Herman, Jay

    2011-01-01

    The Deep Space Climate ObserVatoRy (DSCOVR) is a planned space weather mission for Sun and Earth observations from the Lagrangian L1 point. On board DSCOVR is the multispectral imager EPIC, designed for unique observations of the fully illuminated disk of the Earth with high temporal and 10 km spatial resolution. Depending on latitude, EPIC will observe the same Earth surface area during the course of the day over a wide range of solar and view zenith angles in the backscattering view geometry, with scattering angles of 164°-172°. To understand the information content of EPIC data for analysis of the Earth's clouds, aerosols and surface properties, an EPIC radiance Simulator was developed covering the UV-VIS-NIR range, including the oxygen A and B bands (λ = 340, 388, 443, 555, 680, 687.7, 763.3, 779.5 nm). The Simulator uses ancillary data (surface pressure/height, NCEP wind speed) as well as MODIS-based geophysical fields such as spectral surface bidirectional reflectance, column water vapor, and properties of aerosols and clouds including optical depth, effective radius, phase and cloud top height. The original simulations are conducted at 1 km resolution using the look-up table approach and then are averaged to 10 km EPIC radiances. This talk will give an overview of the EPIC Simulator with analysis of results over the continental USA and northern Atlantic.

  7. An IBM PC-based math model for space station solar array simulation

    NASA Technical Reports Server (NTRS)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
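
    The abstract names the five inputs of the linear dc cell model; the sketch below shows one simple piecewise-linear stand-in built from those electrical parameters, with orbit geometry folded in only as a cosine illumination factor. The cell values, array string sizes, and the cosine factor are hypothetical illustrations, not the report's actual IOC array model.

        import numpy as np

        # Hypothetical cell parameters [A, V]: short-circuit current, open-circuit
        # voltage, maximum-power voltage and maximum-power current.
        ISC, VOC, VMP, IMP = 2.6, 0.60, 0.50, 2.4

        def cell_current(v, sun_angle_deg=0.0):
            # Piecewise-linear I-V curve scaled by a simple cosine illumination factor
            illum = max(float(np.cos(np.radians(sun_angle_deg))), 0.0)
            isc, imp = ISC * illum, IMP * illum
            if v <= VMP:                                  # near-constant-current region
                return isc - (isc - imp) * v / VMP
            if v <= VOC:                                  # steep region toward open circuit
                return imp * (VOC - v) / (VOC - VMP)
            return 0.0

        def array_current(v_array, n_series=400, n_parallel=80, sun_angle_deg=0.0):
            # Cells in series add voltage; parallel strings add current
            return n_parallel * cell_current(v_array / n_series, sun_angle_deg)

        for v in np.linspace(0.0, 400 * VOC, 5):
            print(round(v, 1), "V ->", round(array_current(v), 1), "A")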

  8. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory

    PubMed Central

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.

    2017-01-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  9. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory.

    PubMed

    Norbury, John W; Schimmerling, Walter; Slaba, Tony C; Azzam, Edouard I; Badavi, Francis F; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A; Blattnig, Steve R; Boothman, David A; Borak, Thomas B; Britten, Richard A; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S; Eisch, Amelia J; Robin Elgart, S; Goodhead, Dudley T; Guida, Peter M; Heilbronn, Lawrence H; Hellweg, Christine E; Huff, Janice L; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A; Norman, Ryan B; Ottolenghi, Andrea; Patel, Zarana S; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A; Semones, Edward; Shay, Jerry W; Shurshakov, Vyacheslav A; Sihver, Lembit; Simonsen, Lisa C; Story, Michael D; Turker, Mitchell S; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J

    2016-02-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. Published by Elsevier Ltd.

  10. Three-Dimensional Analysis of Deep Space Network Antenna Coverage

    NASA Technical Reports Server (NTRS)

    Kegege, Obadiah; Fuentes, Michael; Meyer, Nicholas; Sil, Amy

    2012-01-01

    There is a need to understand NASA's Deep Space Network (DSN) coverage gaps and any limitations to providing redundant communication coverage for future deep space missions, especially manned missions to the Moon and Mars. The DSN antennas are required to provide continuous communication coverage for deep space flights, interplanetary missions, and deep space scientific observations. The DSN consists of ground antennas located at three sites: Goldstone in the USA, Canberra in Australia, and Madrid in Spain. These locations are not separated by exactly 120 degrees, and some DSN antennas are located in bowl-shaped mountainous terrain to shield against radio-frequency interference, resulting in a coverage gap in the southern hemisphere for the current DSN architecture. To analyze the extent of this gap and other coverage limitations, simulations of the DSN architecture were performed. In addition to the physical properties of the DSN assets, the simulation incorporated communication forward-link calculations and azimuth/elevation masks that account for the effects of terrain at each DSN antenna. Analysis of the simulation data was performed to create coverage profiles, with the receiver placed at deep space altitudes ranging from 2 million to 10 million km and a spherical grid resolution of 0.25 degrees in longitude and latitude. With the results of these simulations, two- and three-dimensional representations of the regions with and without communication coverage were developed, showing the size and shape of the communication coverage gap projected in space. The significance of this communication coverage gap is also analyzed from the simulation data.
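
    A much-simplified version of such a coverage study can be sketched as follows: place the three complexes on a spherical Earth, sweep a latitude/longitude grid of target directions at a fixed deep-space range, and count a grid cell as covered when at least one station sees it above a minimum elevation mask. The station coordinates, the uniform 10-degree mask, and the coarse 5-degree grid below are illustrative stand-ins for the per-antenna terrain masks and 0.25-degree grid of the actual analysis.

        import numpy as np

        R_EARTH = 6371.0  # km, spherical Earth for this sketch

        # Approximate DSN complex locations (deg latitude, deg longitude)
        STATIONS = {"Goldstone": (35.4, -116.9), "Canberra": (-35.4, 149.0), "Madrid": (40.4, -4.2)}

        def ecef(lat_deg, lon_deg, r=R_EARTH):
            lat, lon = np.radians(lat_deg), np.radians(lon_deg)
            return r * np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

        def elevation(station_xyz, target_xyz):
            # Elevation angle (deg) of the target as seen from the station
            los = target_xyz - station_xyz
            up = station_xyz / np.linalg.norm(station_xyz)
            return np.degrees(np.arcsin(np.dot(los, up) / np.linalg.norm(los)))

        def coverage_fraction(alt_km=2.0e6, step_deg=5.0, min_elev_deg=10.0):
            # Area-weighted fraction of target directions visible to at least one station
            covered = total = 0.0
            for lat in np.arange(-90 + step_deg / 2, 90, step_deg):
                w = np.cos(np.radians(lat))
                for lon in np.arange(-180, 180, step_deg):
                    tgt = ecef(lat, lon, R_EARTH + alt_km)
                    ok = any(elevation(ecef(*s), tgt) >= min_elev_deg for s in STATIONS.values())
                    covered += w * ok
                    total += w
            return covered / total

        print("coverage fraction at 2 million km:", round(coverage_fraction(), 3))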

  11. L718Q mutant EGFR escapes covalent inhibition by stabilizing a non-reactive conformation of the lung cancer drug osimertinib† †Electronic supplementary information (ESI) available: pKa shift for Cys797; geometries of TSs identified with QM/MM calculations; analysis of the minimum free-energy path for Cys797 alkylation; analysis of MD replicas; convergence for US simulations; replica of simulation of Cys797 alkylation; conformational FESs obtained from each MD replica. See DOI: 10.1039/c7sc04761d

    PubMed Central

    Callegari, D.; Ranaghan, K. E.; Woods, C. J.; Minari, R.; Tiseo, M.; Mor, M.; Mulholland, A. J.

    2018-01-01

    Osimertinib is a third-generation inhibitor approved for the treatment of non-small cell lung cancer. It overcomes resistance to first-generation inhibitors by incorporating an acrylamide group which alkylates Cys797 of EGFR T790M. The mutation of a residue in the P-loop (L718Q) was shown to cause resistance to osimertinib, but the molecular mechanism of this process is unknown. Here, we investigated the inhibitory process for EGFR T790M (susceptible to osimertinib) and EGFR T790M/L718Q (resistant to osimertinib), by modelling the chemical step (i.e., alkylation of Cys797) using QM/MM simulations and the recognition step by MD simulations coupled with free-energy calculations. The calculations indicate that L718Q has a negligible impact on both the activation energy for Cys797 alkylation and the free-energy of binding for the formation of the non-covalent complex. The results show that Gln718 affects the conformational space of the EGFR–osimertinib complex, stabilizing a conformation of acrylamide which prevents reaction with Cys797. PMID:29732058

  12. Simulations of the observation of clouds and aerosols with the Experimental Lidar in Space Equipment system.

    PubMed

    Liu, Z; Voelger, P; Sugimoto, N

    2000-06-20

    We carried out a simulation study for the observation of clouds and aerosols with the Japanese Experimental Lidar in Space Equipment (ELISE), which is a two-wavelength backscatter lidar with three detection channels. The National Space Development Agency of Japan plans to launch the ELISE on the Mission Demonstrate Satellite 2 (MDS-2). In the simulations, the lidar return signals for the ELISE are calculated for an artificial, two-dimensional atmospheric model including different types of clouds and aerosols. The signal detection processes are simulated realistically by inclusion of various sources of noise. The lidar signals that are generated are then used as input for simulations of data analysis with inversion algorithms to investigate retrieval of the optical properties of clouds and aerosols. The results demonstrate that the ELISE can provide global data on the structures and optical properties of clouds and aerosols. We also conducted an analysis of the effects of cloud inhomogeneity on retrievals from averaged lidar profiles. We show that the effects are significant for space lidar observations of optically thick broken clouds.

  13. Genetics of recurrent early-onset major depression (GenRED): significant linkage on chromosome 15q25-q26 after fine mapping with single nucleotide polymorphism markers.

    PubMed

    Levinson, Douglas F; Evgrafov, Oleg V; Knowles, James A; Potash, James B; Weissman, Myrna M; Scheftner, William A; Depaulo, J Raymond; Crowe, Raymond R; Murphy-Eberenz, Kathleen; Marta, Diana H; McInnis, Melvin G; Adams, Philip; Gladis, Madeline; Miller, Erin B; Thomas, Jo; Holmans, Peter

    2007-02-01

    The authors studied a dense map of single nucleotide polymorphism (SNP) DNA markers on chromosome 15q25-q26 to maximize the informativeness of genetic linkage analyses in a region where they previously reported suggestive evidence for linkage of recurrent early-onset major depressive disorder. In 631 European-ancestry families with multiple cases of recurrent early-onset major depressive disorder, 88 SNPs were genotyped, and multipoint allele-sharing linkage analyses were carried out. Marker-marker linkage disequilibrium was minimized, and a simulation study with founder haplotypes from these families suggested that linkage scores were not inflated by linkage disequilibrium. The dense SNP map increased the information content of the analysis from around 0.7 to over 0.9. The maximum evidence for linkage was the Kong and Cox likelihood-ratio score statistic Z_LR = 4.69 at 109.8 cM. The exact p value was below the genomewide significance threshold. By contrast, in the genome scan with microsatellite markers at 9 cM spacing, the maximum Z_LR for European-ancestry families was 3.43 (106.53 cM). It was estimated that the linked locus or loci in this region might account for a population-wide increase in risk to siblings of cases of 20% or less. This region has produced modestly positive evidence for linkage to depression and related traits in other studies. These results suggest that DNA sequence variations in one or more genes in the 15q25-q26 region can increase susceptibility to major depression and that efforts are warranted to identify these genes.
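
    For readers less familiar with the Kong and Cox statistic, the conversions below show the standard asymptotic relationships used to interpret such scores: Z_LR is treated as approximately standard normal under the null hypothesis, and the equivalent LOD score is Z_LR^2 / (2 ln 10). These are generic pointwise conversions, not the empirical genome-wide p-value reported in the study.

        import math

        def zlr_to_lod(z):
            # Equivalent LOD score for a Kong-Cox allele-sharing statistic
            return z * z / (2.0 * math.log(10.0))

        def zlr_to_pvalue(z):
            # One-sided pointwise p-value under the standard-normal approximation
            return 0.5 * math.erfc(z / math.sqrt(2.0))

        for z in (3.43, 4.69):
            print(f"Z_LR = {z:.2f}: LOD ~ {zlr_to_lod(z):.2f}, pointwise p ~ {zlr_to_pvalue(z):.1e}")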

  14. OASIS - ORBIT ANALYSIS AND SIMULATION SOFTWARE

    NASA Technical Reports Server (NTRS)

    Wu, S. C.

    1994-01-01

    The Orbit Analysis and Simulation Software, OASIS, is a software system developed for covariance and simulation analyses of problems involving earth satellites, especially the Global Positioning System (GPS). It provides a flexible, versatile and efficient accuracy analysis tool for earth satellite navigation and GPS-based geodetic studies. To make future modifications and enhancements easy, the system is modular, with five major modules: PATH/VARY, REGRES, PMOD, FILTER/SMOOTHER, and OUTPUT PROCESSOR. PATH/VARY generates satellite trajectories. Among the factors taken into consideration are: 1) the gravitational effects of the planets, moon and sun; 2) space vehicle orientation and shapes; 3) solar pressure; 4) solar radiation reflected from the surface of the earth; 5) atmospheric drag; and 6) space vehicle gas leaks. The REGRES module reads the user's input, then determines if a measurement should be made based on geometry and time. PMOD modifies a previously generated REGRES file to facilitate various analysis needs. FILTER/SMOOTHER is especially suited to a multi-satellite precise orbit determination and geodetic-type problems. It can be used for any situation where parameters are simultaneously estimated from measurements and a priori information. Examples of nonspacecraft areas of potential application might be Very Long Baseline Interferometry (VLBI) geodesy and radio source catalogue studies. OUTPUT PROCESSOR translates covariance analysis results generated by FILTER/SMOOTHER into user-desired easy-to-read quantities, performs mapping of orbit covariances and simulated solutions, transforms results into different coordinate systems, and computes post-fit residuals. The OASIS program was developed in 1986. It is designed to be implemented on a DEC VAX 11/780 computer using VAX VMS 3.7 or higher. It can also be implemented on a Micro VAX II provided sufficient disk space is available.

  15. Maternal uniparental disomy of chromosome 14 in a boy with t(14q14q) associated with a paternal t(13q14q)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomkins, D.J.; Waye, J.S.; Whelan, D.T.

    An 11-year-old boy was referred for chromosomal analysis because of precocious development and behavioral problems suggestive of the fragile X syndrome. The cytogenetic fragile X studies were normal, but a routine GTG-banded karyotype revealed an abnormal male karyotype with a Robertsonian translocation between the two chromosome 14s: 46,XY,t(14q14q). Paternal karyotyping revealed another abnormal karyotype: 46,XY,t(13q14q). A brother had the same karyotype as the father; the mother was deceased. In order to determine if the apparently balanced t(14q14q) in the proband might be the cause of the clinical findings, molecular analysis of the origin of the chromosome 14s was initiated. Southern blotting and hybridization with D4S13 showed that the proband had two copies of one maternal allele which was shared by his brother. The brother's second allele corresponded to one of the paternal alleles; the proband had no alleles from the father. Analysis of four other VNTRs demonstrated the probability of paternity to be greater than 99%. Thus, the t(14q14q) was most likely composed of two maternal chromosome 14s. Further characterization of the t(14q14q) by dinucleotide repeat polymorphic markers is in progress to determine whether it has arisen from maternal isodisomy or heterodisomy. Several cases of uniparental disomy for chromosome 14 have been reported recently. Paternal disomy appears to be associated with more severe congenital anomalies and mental retardation, whereas maternal disomy may be associated with premature puberty and minimal intellectual impairment. The origin of the t(14q14q) in the present case may be related to the paternal translocation, as the segregation of the t(13q14q) in meiosis could lead to sperm that are nullisomic for chromosome 14.

  16. Optical Analysis of Transparent Polymeric Material Exposed to Simulated Space Environment

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Finckenor, Miria M.

    1999-01-01

    Transparent polymeric materials are being designed and utilized as solar concentrating lenses for spacecraft power and propulsion systems. These polymeric lenses concentrate solar energy onto energy conversion devices such as solar cells and thermal energy systems. The conversion efficiency is directly related to the transmissivity of the polymeric lens. The Environmental Effects Group of the Marshall Space Flight Center's Materials, Processes, and Manufacturing Department exposed a variety of materials to a simulated space environment and evaluated them for any change in optical transmission. These materials include Lexan(TM), polyethylene terephthalate (PET), several formulations of Tefzel(TM) and Teflon(TM), and silicone DC 93-500. Samples were exposed to a minimum of 1000 Equivalent Sun Hours (ESH) of near-UV radiation (250-400 nm wavelength). Data will be presented on materials exposed to charged particle radiation equivalent to a five-year dose in geosynchronous orbit. These exposures were performed in MSFC's Combined Environmental Effects Test Chamber, a unique facility with the capability to expose materials simultaneously or sequentially to protons, low-energy electrons, high-energy electrons, near-UV radiation, and vacuum UV radiation. Prolonged exposure to the space environment will decrease the polymer film's transmission and thus reduce the conversion efficiency. A method was developed to normalize the transmission loss and thus rank the materials according to their tolerance to space environmental exposure. Spectral results and the material ranking according to transmission loss are presented.

  17. Space Simulation Chamber Rescues Water Damaged Books.

    ERIC Educational Resources Information Center

    American School and University, 1981

    1981-01-01

    More than 4,000 valuable water-damaged books were restored by using a space-simulation chamber at the Lockheed Missile and Space Company. It was the fifth time that the chamber has been used for the restoration of valuable books and documents. (Author/MLF)

  18. To Create Space on Earth: The Space Environment Simulation Laboratory and Project Apollo

    NASA Technical Reports Server (NTRS)

    Walters, Lori C.

    2003-01-01

    Few undertakings in the history of humanity can compare to the great technological achievement known as Project Apollo. Among those who witnessed Armstrong's flickering television image were thousands of people who had directly contributed to this historic moment. Amongst those in this vast anonymous cadre were the personnel of the Space Environment Simulation Laboratory (SESL) at the Manned Spacecraft Center (MSC) in Houston, Texas. SESL houses two large thermal-vacuum chambers with solar simulation capabilities. At a time when NASA engineers had a limited understanding of the effects of the extremes of space on hardware and crews, SESL was designed to literally create the conditions of space on Earth. With interior dimensions of 90 feet in height and a 55-foot diameter, Chamber A dwarfed the Apollo command/service module (CSM) it was constructed to test. The chamber's vacuum pumping capacity of 1 x 10^-6 torr can simulate an altitude greater than 130 miles above the Earth. A "lunar plane" capable of rotating a 150,000-pound test vehicle 180 deg replicates the revolution of a craft in space. To reproduce the temperature extremes of space, interior chamber walls cool to -280 F as two banks of carbon arc modules simulate the unfiltered solar light/heat of the Sun. With capabilities similar to those of Chamber A, early Chamber B tests included the Gemini modular maneuvering unit, the Apollo EVA mobility unit, and the lunar module. Since Gemini astronaut Charles Bassett first ventured into the chamber in 1966, Chamber B has assisted astronauts in testing hardware and preparing them for work in the harsh extremes of space.

  19. Modeling and simulation for space medicine operations: preliminary requirements considered

    NASA Technical Reports Server (NTRS)

    Dawson, D. L.; Billica, R. D.; McDonald, P. V.

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to use more effectively the limited time available for astronaut medical training. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for evaluating other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offer a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  20. Space shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1980-01-01

    The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.
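
    The report's nonrecursive filter design is not reproduced in the abstract; as a rough illustration of how a gust series with a von Karman spectrum can be synthesized, the sketch below samples a standard one-sided von Karman longitudinal PSD and builds a series by inverse FFT with random phases. The turbulence scale length, sample spacing, and the spectral-synthesis shortcut are assumptions for illustration only.

        import numpy as np

        def von_karman_psd(omega, sigma=1.0, L=200.0):
            # One-sided von Karman longitudinal gust PSD (variance per unit spatial frequency)
            return sigma**2 * (2.0 * L / np.pi) / (1.0 + (1.339 * L * omega)**2) ** (5.0 / 6.0)

        def synthesize_gusts(n=2**14, dx=10.0, sigma=1.0, L=200.0, seed=1):
            # Spectral synthesis of a gust series sampled every dx metres along the path
            rng = np.random.default_rng(seed)
            domega = 2.0 * np.pi / (n * dx)
            omega = np.arange(1, n // 2 + 1) * domega
            amp = np.sqrt(2.0 * von_karman_psd(omega, sigma, L) * domega)
            phase = rng.uniform(0.0, 2.0 * np.pi, omega.size)
            spec = np.zeros(n // 2 + 1, dtype=complex)
            spec[1:] = 0.5 * n * amp * np.exp(1j * phase)   # so irfft returns a sum of cosines
            return np.fft.irfft(spec, n=n)

        u = synthesize_gusts()
        print("rms gust (target ~ sigma = 1):", round(float(u.std()), 3))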

  1. A Real-time 3D Visualization of Global MHD Simulation for Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Murata, K.; Matsuoka, D.; Kubo, T.; Shimazu, H.; Tanaka, T.; Fujita, S.; Watari, S.; Miyachi, H.; Yamamoto, K.; Kimura, E.; Ishikura, S.

    2006-12-01

    Recently, many satellites for communication networks and scientific observation have been launched in the vicinity of the Earth (geo-space). The electromagnetic (EM) environments around these spacecraft are always influenced by the solar wind blowing from the Sun and by induced electromagnetic fields. They occasionally cause various troubles or damage, such as electrification and interference, to the spacecraft. It is important to forecast the geo-space EM environment, just as it is to forecast the weather on the ground. Owing to the recent remarkable progress of super-computer technologies, numerical simulations have become powerful research methods in solar-terrestrial physics. For the needs of space weather forecasting, NICT (National Institute of Information and Communications Technology) has developed a real-time global MHD simulation system of solar wind-magnetosphere-ionosphere couplings, which runs on an SX-6 super-computer. The real-time solar wind parameters from the ACE spacecraft, at every one minute, are adopted as boundary conditions for the simulation. Simulation results (2-D plots) are updated every 1 minute on a NICT website. However, 3-D visualization of simulation results is indispensable to forecast space weather more accurately. In the present study, we develop a real-time 3-D website for the global MHD simulations. The 3-D visualizations of the simulation results are updated every 20 minutes in the following three formats: (1) streamlines of magnetic field lines, (2) isosurfaces of temperature in the magnetosphere, and (3) isolines of conductivity and an orthogonal plane of potential in the ionosphere. For the present study, a 3-D viewer application (ActiveX) running in the Internet Explorer browser was developed using AVS/Express. Numerical data are saved in HDF5 format data files every 1 minute. Users can easily search, retrieve and plot past simulation results (3D visualization data and numerical data) by using

  2. Open-flavor charm and bottom sqq̄Q̄ and qqq̄Q̄ tetraquark states

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Chen, Hua-Xing; Liu, Xiang; Steele, T. G.; Zhu, Shi-Lin

    2017-06-01

    We provide comprehensive investigations of the mass spectrum of exotic open-flavor charmed/bottom sqq̄c̄, qqq̄c̄, sqq̄b̄, and qqq̄b̄ tetraquark states with various spin-parity assignments J^P = 0^+, 1^+, 2^+ and 0^-, 1^- in the framework of QCD sum rules. In the diquark configuration, we construct the diquark-antidiquark interpolating tetraquark currents using the color-antisymmetric scalar and axial-vector diquark fields. Stable mass sum rules are established in reasonable parameter working ranges, which are used to give reliable mass predictions for these tetraquark states. We obtain the mass spectra for the open-flavor charmed/bottom sqq̄c̄, qqq̄c̄, sqq̄b̄, and qqq̄b̄ tetraquark states with various spin-parity quantum numbers. In addition, we suggest searching for exotic doubly-charged tetraquarks, such as [sd][ūc̄] → D_s^(*)- π^-, in future experiments at facilities such as BESIII, BelleII, PANDA, LHCb, and CMS.

  3. Neutral Buoyancy Simulator - Space Station

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Skylab's success proved that scientific experimentation in a low gravity environment was essential to scientific progress. A more permanent structure was needed to provide this space laboratory. President Ronald Reagan, on January 25, 1984, during his State of the Union address, declared that the United States should exploit the new frontier of space, and directed NASA to build a permanent manned space station within a decade. The idea was that the space station would not only be used as a laboratory for the advancement of science and medicine, but would also provide a staging area for building a lunar base and mounting manned expeditions to Mars and elsewhere in the solar system. President Reagan invited the international community to join with the United States in this endeavor. NASA and several countries moved forward with this concept. By December 1985, the first phase of the space station was well underway with the design concept for the crew compartments and laboratories. Pictured are two NASA astronauts at Marshall Space Flight Center's (MSFC) Neutral Buoyancy Simulator (NBS), practicing construction techniques they later used to construct the space station after it was deployed.

  4. Modeling and Simulation for Multi-Missions Space Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Chang, Max

    2011-01-01

    Asteroids and Near-Earth Objects [NEOs] are of great interest for future space missions. The Multi-Mission Space Exploration Vehicle [MMSEV] is being considered for future Near Earth Object missions and requires detailed planning and study of its Guidance, Navigation, and Control [GNC]. A possible mission of the MMSEV to a NEO would be to navigate the spacecraft to a stationary orbit with respect to the rotating asteroid and proceed to anchor into the surface of the asteroid with robotic arms. The Dynamics and Real-Time Simulation [DARTS] laboratory develops reusable models and simulations for the design and analysis of missions. In this paper, the development of guidance and anchoring models are presented together with their role in achieving mission objectives and relationships to other parts of the simulation. One important aspect of guidance is in developing methods to represent the evolution of kinematic frames related to the tasks to be achieved by the spacecraft and its robot arms. In this paper, we compare various types of mathematical interpolation methods for position and quaternion frames. Subsequent work will be on analyzing the spacecraft guidance system with different movements of the arms. With the analyzed data, the guidance system can be adjusted to minimize the errors in performing precision maneuvers.
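
    The interpolation methods compared in the study are not spelled out in the abstract; as a generic point of reference, the sketch below implements spherical linear interpolation (SLERP), the standard constant-angular-rate interpolant for unit quaternions. The fallback to normalized linear interpolation for nearly parallel quaternions is a common numerical safeguard, and the example quaternions are arbitrary.

        import numpy as np

        def slerp(q0, q1, t):
            # Spherical linear interpolation between unit quaternions (length-4 arrays)
            q0 = np.asarray(q0, float) / np.linalg.norm(q0)
            q1 = np.asarray(q1, float) / np.linalg.norm(q1)
            dot = float(np.dot(q0, q1))
            if dot < 0.0:                      # take the shorter great-circle arc
                q1, dot = -q1, -dot
            if dot > 0.9995:                   # nearly parallel: lerp and renormalise
                q = q0 + t * (q1 - q0)
                return q / np.linalg.norm(q)
            theta = np.arccos(np.clip(dot, -1.0, 1.0))
            return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

        q_a = np.array([1.0, 0.0, 0.0, 0.0])                              # identity attitude
        q_b = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])  # 90 deg about x
        print(slerp(q_a, q_b, 0.5))            # expect roughly a 45 deg rotation about x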

  5. Robotic space simulation integration of vision algorithms into an orbital operations simulation

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.

    1987-01-01

    In order to successfully plan and analyze future space activities, computer-based simulations of activities in low earth orbit will be required to model and integrate vision and robotic operations with vehicle dynamics and proximity operations procedures. The orbital operations simulation (OOS) is configured and enhanced as a testbed for robotic space operations. Vision integration algorithms are being developed in three areas: preprocessing, recognition, and attitude/attitude rates. The vision program (Rice University) was modified for use in the OOS. Systems integration testing is now in progress.

  6. Two-Stage Design Method for Enhanced Inductive Energy Transmission with Q-Constrained Planar Square Loops.

    PubMed

    Eteng, Akaa Agbaeze; Abdul Rahim, Sharul Kamal; Leow, Chee Yen; Chew, Beng Wah; Vandenbosch, Guy A E

    2016-01-01

    Q-factor constraints are usually imposed on conductor loops employed as proximity range High Frequency Radio Frequency Identification (HF-RFID) reader antennas to ensure adequate data bandwidth. However, pairing such low Q-factor loops in inductive energy transmission links restricts the link transmission performance. The contribution of this paper is to assess the improvement that is reached with a two-stage design method, concerning the transmission performance of a planar square loop relative to an initial design, without compromise to a Q-factor constraint. The first stage of the synthesis flow is analytical in approach, and determines the number and spacing of turns by which coupling between similar paired square loops can be enhanced with low deviation from the Q-factor limit presented by an initial design. The second stage applies full-wave electromagnetic simulations to determine more appropriate turn spacing and widths to match the Q-factor constraint, and achieve improved coupling relative to the initial design. Evaluating the design method in a test scenario yielded a more than 5% increase in link transmission efficiency, as well as an improvement in the link fractional bandwidth by more than 3%, without violating the loop Q-factor limit. These transmission performance enhancements are indicative of a potential for modifying proximity HF-RFID reader antennas for efficient inductive energy transfer and data telemetry links.
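
    A compact way to see why the Q-factor cap limits transmission is the standard two-coil result for optimally loaded resonant coils, eta_max = x / (1 + sqrt(1 + x))^2 with x = k^2 Q1 Q2; the sketch below evaluates it for a few coupling values. The Q limit and coupling coefficients are hypothetical numbers, not the measured loops of the paper.

        import math

        def link_efficiency(k, q1, q2):
            # Maximum power-transfer efficiency of a two-coil resonant inductive link
            x = (k ** 2) * q1 * q2
            return x / (1.0 + math.sqrt(1.0 + x)) ** 2

        q_limit = 35.0                      # hypothetical Q cap set by the data-bandwidth need
        for k in (0.02, 0.05, 0.10):
            print(f"k = {k:.2f}  ->  efficiency = {link_efficiency(k, q_limit, q_limit):.3f}")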

  7. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  8. Geometry of the q-exponential distribution with dependent competing risks and accelerated life testing

    NASA Astrophysics Data System (ADS)

    Zhang, Fode; Shi, Yimin; Wang, Ruibing

    2017-02-01

    In the information geometry suggested by Amari (1985) and Amari et al. (1987), a parametric statistical model can be regarded as a differentiable manifold with the parameter space as a coordinate system. Noting that the q-exponential distribution plays an important role in Tsallis statistics (see Tsallis, 2009), this paper investigates the geometry of the q-exponential distribution with dependent competing risks and accelerated life testing (ALT). A copula function based on the q-exponential function, which can be considered as a generalized Gumbel copula, is discussed to illustrate the structure of the dependent random variables. Employing two iterative algorithms, simulation results are given to compare the performance of the estimations and the levels of association under different hybrid progressive censoring schemes (HPCSs).
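
    For concreteness, the sketch below evaluates the Tsallis q-exponential e_q(x) = [1 + (1-q)x]_+^(1/(1-q)) and the density of the corresponding q-exponential lifetime distribution, f(x) = (2-q) λ e_q(-λx) for 1 ≤ q < 2. These are standard definitions from the Tsallis literature; the parameter values are arbitrary and the copula construction itself is not reproduced here.

        import numpy as np

        def exp_q(x, q):
            # Tsallis q-exponential: reduces to exp(x) as q -> 1
            x = np.asarray(x, float)
            if abs(q - 1.0) < 1e-12:
                return np.exp(x)
            base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
            return base ** (1.0 / (1.0 - q))

        def q_exponential_pdf(x, q, lam):
            # Lifetime density f(x) = (2-q)*lam*e_q(-lam*x), x >= 0, valid for 1 <= q < 2
            return (2.0 - q) * lam * exp_q(-lam * np.asarray(x, float), q)

        x = np.linspace(0.0, 5.0, 6)
        print(q_exponential_pdf(x, q=1.5, lam=1.0).round(4))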

  9. Space flight visual simulation.

    PubMed

    Xu, L

    1985-01-01

    In this paper, based on the scenes of stars seen by astronauts in their orbital flights, we have studied the mathematical model which must be constructed for a CGI system to realize space flight visual simulation. Considering such factors as the revolution and rotation of the Earth, the exact date, time, and site of orbital injection of the spacecraft, as well as its orbital flight and attitude motion, we first defined all the instantaneous lines of sight and visual fields of astronauts in space. Then, through a series of coordinate transforms, the pictures of the star scenes changing with time and space were generated one by one mathematically. In the procedure, we designed a method of three successive "mathematical cuttings." Finally, we obtained each instantaneous picture of the star scenes observed by astronauts through the window of the cockpit. The dynamic shadowing by the Earth in the varying star-scene pictures could also be displayed.

  10. The evolution of space simulation

    NASA Technical Reports Server (NTRS)

    Edwards, Arthur A.

    1992-01-01

    Thirty years have passed since the first large (more than 15 ft diameter) thermal vacuum space simulation chambers were built in this country. Many changes have been made since then, and the industry has learned a great deal as the designs have evolved in that time. I was fortunate to have been part of that beginning, and have participated in many of the changes that have occurred since. While talking with vacuum friends recently, I realized that many of the engineers working in the industry today may not be aware of the evolution of space simulation because they did not experience the changes that brought us today's technology. With that in mind, it seems to be appropriate to take a moment and review some of the events that were a big part of the past thirty years in the thermal vacuum business. Perhaps this review will help to understand a little of the 'why' as well as the 'how' of building and operating large thermal vacuum chambers.

  11. Simulation of MEMS for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mott, Brent; Kuhn, Jonathan; Broduer, Steve (Technical Monitor)

    2001-01-01

    The NASA Goddard Space Flight Center (GSFC) is developing optical micro-electromechanical system (MEMS) components for potential application in Next Generation Space Telescope (NGST) science instruments. In this work, we present an overview of the electro-mechanical simulation of three MEMS components for NGST, which include a reflective micro-mirror array and transmissive microshutter array for aperture control for a near infrared (NIR) multi-object spectrometer and a large aperture MEMS Fabry-Perot tunable filter for a NIR wide field camera. In all cases the device must operate at cryogenic temperatures with low power consumption and low, complementary metal oxide semiconductor (CMOS) compatible, voltages. The goal of our simulation efforts is to adequately predict both the performance and the reliability of the devices during ground handling, launch, and operation to prevent failures late in the development process and during flight. This goal requires detailed modeling and validation of complex electro-thermal-mechanical interactions and very large non-linear deformations, often involving surface contact. Various parameters such as spatial dimensions and device response are often difficult to measure reliably at these small scales. In addition, these devices are fabricated from a wide variety of materials including surface micro-machined aluminum, reactive ion etched (RIE) silicon nitride, and deep reactive ion etched (DRIE) bulk single crystal silicon. The above broad set of conditions combine to be a formidable challenge for space flight qualification analysis. These simulations represent NASA/GSFC's first attempts at implementing a comprehensive strategy to address complex MEMS structures.

  12. Ascent trajectory dispersion analysis for WTR heads-up space shuttle trajectory

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The results of a Space Transportation System ascent trajectory dispersion analysis are discussed. The purpose is to provide critical trajectory parameter values for assessing the Space Shuttle in a heads-up configuration launched from the Western Test Range (WTR). This analysis was conducted using a trajectory profile based on a launch from the WTR in December. The analysis consisted of the following steps: (1) nominal trajectories were simulated under the conditions specified by baseline reference mission guidelines; (2) dispersion trajectories were simulated using predetermined parametric variations; (3) requirements for a system-related composite trajectory were determined by a root-sum-square (RSS) analysis of the positive deviations between values of the aerodynamic heating indicator (AHI) generated by the dispersion and nominal trajectories; (4) using the RSS assessment as a guideline, the system-related composite trajectory was simulated by combinations of dispersion parameters which represented major contributors; (5) an assessment of environmental perturbations via an RSS analysis was made by combining plus or minus 2 sigma atmospheric density variations and 95% directional design wind dispersions; (6) maximum aerodynamic heating trajectories were simulated by variation of dispersion parameters which would emulate the summation of the system-related RSS and environmental RSS values of AHI. The maximum aerodynamic heating trajectories were simulated consistent with the directional winds used in the environmental analysis.
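
    Step (3) above amounts to a root-sum-square combination of the positive AHI deviations; a minimal sketch of that bookkeeping is shown below. The AHI values are made-up numbers in arbitrary units, included purely to show the arithmetic.

        import math

        def rss_composite(nominal_ahi, dispersed_ahis):
            # Root-sum-square of the positive AHI deviations from the nominal trajectory
            deltas = [max(d - nominal_ahi, 0.0) for d in dispersed_ahis]
            return math.sqrt(sum(x * x for x in deltas))

        nominal = 100.0
        dispersions = [103.0, 98.0, 107.0, 101.5, 95.0, 104.0]   # hypothetical dispersion runs
        print("composite AHI increment:", round(rss_composite(nominal, dispersions), 2))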

  13. Processing of Lunar Soil Simulant for Space Exploration Applications

    NASA Technical Reports Server (NTRS)

    Sen, Subhayu; Ray, Chandra S.; Reddy, Ramana

    2005-01-01

    NASA's long-term vision for space exploration includes developing human habitats and conducting scientific investigations on planetary bodies, especially on the Moon and Mars. To reduce the level of up-mass, the processing and utilization of planetary in-situ resources is recognized as an important element of this vision. Within this scope and context, we have undertaken a general effort aimed primarily at extracting and refining metals and at developing glass, glass-ceramic, or traditional ceramic type materials using lunar soil simulants. In this paper we will present preliminary results of our effort on carbothermal reduction of oxides for elemental extraction and on zone refining for obtaining high-purity metals. In addition, we will demonstrate the possibility of developing glasses from lunar soil simulant for fixing nuclear waste from potential nuclear power generators on planetary bodies. Compositional analysis, x-ray diffraction patterns, and differential thermal analysis of processed samples will be presented.

  14. Simulated space environment tests on cadmium sulfide solar cells

    NASA Technical Reports Server (NTRS)

    Clarke, D. R.; Oman, H.

    1971-01-01

    Cadmium sulfide (Cu2S-CdS) solar cells were tested under simulated space environmental conditions. Some cells were thermally cycled with illumination from a Xenon-arc solar simulator. A cycle was one hour of illumination followed immediately by one-half hour of darkness. In the light, the cells reached an equilibrium temperature of 60 C (333 K), and in the dark the cell temperature dropped to -120 C (153 K). Other cells were constantly illuminated with a Xenon-arc solar simulator. The equilibrium temperature of these cells was 55 C (328 K). The black vacuum chamber walls were cooled with liquid nitrogen to simulate a space heat sink. Chamber pressure was maintained at 0.000001 torr or less. Almost all of the solar cells tested degraded in power when exposed to a simulated space environment of either thermal cycling or constant illumination. The cells tested the longest were exposed to 10,050 thermal cycles.

  15. Simulation Modeling and Performance Evaluation of Space Networks

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John

    2006-01-01

    In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive extendable tool, the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000 and is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate the orbital geometry information and contact opportunities. Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments. Stressed networking environments include those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, the ability to cope with intermittent connectivity, the ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol

  16. Simulation of Range Safety for the NASA Space Shuttle

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Sepulveda, Jose; Compton, Jeppie; Turner, Robert

    2005-01-01

    This paper describes a simulation environment that seamlessly combines a number of safety and environmental models for the launch phase of a NASA Space Shuttle mission. The components of this simulation environment represent the different systems that must interact in order to determine the expectation of casualties (E_c) resulting from the toxic effects of the gas dispersion that occurs after a disaster affecting a Space Shuttle within 120 seconds of lift-off. Space Shuttle reliability models, trajectory models, weather dissemination systems, population models, the amount and type of toxicants, gas dispersion models, human response functions to toxicants, and a geographical information system are all integrated to create this environment. This simulation environment can help safety managers estimate the population at risk in order to plan evacuation, make sheltering decisions, determine the resources required to provide aid and comfort, and mitigate damages in case of a disaster. This simulation environment may also be modified and used for the landing phase of a space vehicle, but that is not discussed in this paper.
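
    At its simplest, the expectation of casualties is a population-weighted sum of individual casualty probabilities over the affected area; the sketch below shows that bookkeeping with made-up population cells. In the real environment, the per-cell probabilities come from the reliability, trajectory, dispersion, and human-response models described above.

        def expectation_of_casualties(cells):
            # cells: iterable of (population, probability of casualty per person) pairs
            return sum(population * p_casualty for population, p_casualty in cells)

        # Hypothetical population cells downwind of the launch site
        cells = [(12000, 1.0e-6), (45000, 2.5e-7), (300000, 1.0e-8)]
        print(f"E_c = {expectation_of_casualties(cells):.4f}")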

  17. The space transformation in the simulation of multidimensional random fields

    USGS Publications Warehouse

    Christakos, G.

    1987-01-01

    Space transformations are proposed as a mathematically meaningful and practically comprehensive approach to simulating multidimensional random fields. Within this context the turning bands method of simulation is reconsidered and improved in both the space and frequency domains.
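
    The turning bands machinery itself is too involved to reproduce here, but the closely related spectral (random-cosine) approach conveys the idea of building a multidimensional field from one-dimensional harmonics. The sketch below simulates a 2-D stationary Gaussian field with covariance exp(-||h||^2/a^2) by summing cosines whose frequencies are drawn from the field's spectral measure; the grid size, correlation length, and harmonic count are arbitrary, and this is a stand-in for, not an implementation of, the paper's method.

        import numpy as np

        def gaussian_field_2d(nx=64, ny=64, dx=1.0, a=10.0, n_harmonics=500, seed=0):
            # Random-cosine (spectral) simulation of a 2-D Gaussian random field
            rng = np.random.default_rng(seed)
            w = rng.normal(0.0, np.sqrt(2.0) / a, size=(n_harmonics, 2))  # spectral samples
            phi = rng.uniform(0.0, 2.0 * np.pi, n_harmonics)
            xs, ys = np.arange(nx) * dx, np.arange(ny) * dx
            X, Y = np.meshgrid(xs, ys, indexing="ij")
            pts = np.stack([X.ravel(), Y.ravel()], axis=1)
            Z = np.sqrt(2.0 / n_harmonics) * np.cos(pts @ w.T + phi).sum(axis=1)
            return Z.reshape(nx, ny)

        field = gaussian_field_2d()
        print("mean ~ 0, variance ~ 1:", round(float(field.mean()), 3), round(float(field.var()), 3))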

  18. Optical Analysis of Transparent Polymeric Material Exposed to Simulated Space Environment

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Finckenor, Miria M.

    2000-01-01

    Many innovations in spacecraft power and propulsion have recently been tested at NASA, particularly in non-chemical propulsion. One improvement in solar array technology is solar concentration using thin polymer film Fresnel lenses. Weight and cost savings were proven with the Solar Concentrator Arrays with Refractive Linear Element Technology (SCARLET)-II array on NASA's Deep Space 1 spacecraft. The Fresnel lens concentrates solar energy onto high-efficiency solar cells, decreasing the area of solar cells needed for power. Continued efficiency of this power system relies on the thin film's durability in the space environment and on maintaining transmission in the 300-1000 nm bandwidth. Various polymeric materials have been tested for use in solar concentrators, including Lexan(TM), polyethylene terephthalate (PET), several formulations of Tefzel(TM) and Teflon(TM), and DC 93-500, the material selected for SCARLET-II. Also tested were several innovative materials, including Langley Research Center's CP1 and CP2 polymers and atomic oxygen-resistant polymers developed by Triton Systems, Inc. The Environmental Effects Group of the Marshall Space Flight Center's Materials, Processes, and Manufacturing Department exposed these materials to a simulated space environment and evaluated them for any change in optical transmission. Samples were exposed to a minimum of 1000 equivalent Sun hours of near-UV radiation (250-400 nm wavelength). Materials that appeared robust after near-UV exposure were then exposed to charged particle radiation equivalent to a five-year dose in geosynchronous orbit. These exposures were performed in MSFC's Combined Environmental Effects Test Chamber, a unique facility with the capability to expose materials simultaneously or sequentially to protons, low-energy electrons, high-energy electrons, near-UV radiation, and vacuum UV radiation. Reflectance measurements can be made on the samples in vacuum. Prolonged exposure to the space environment will

  19. Pulsed optical fibre lasers: Self-pulsation, Q-switching and tissue interactions

    NASA Astrophysics Data System (ADS)

    El-Sherif, Ashraf Fathy

    near 2 μm is presented. Appropriate design precautions have been undertaken to ensure that prelasing does not occur. In this system, the main Q-switched pulse may be followed by one pulse of lower amplitude ("postlasing") when an optimised quarter-wave voltage of 750 V is applied. It was found that the laser produced 320 ns pulses with 2.5 mJ pulse energy and 3.3 kW peak power at low repetition rates of 50-70 Hz. This is the first time that such studies of electro-optic modulator (EOM) Q-switched Tm3+ fibre lasers have been reported. The maximum peak power was obtained for an optimum cavity length of 2.15 meters, made up of the fibre length, a broadband beamsplitter polarizer, the Q-switch crystal, and passive space. Computer simulation of Tm3+-doped silica and Er3+-doped fluorozirconate fibre lasers using general laser analysis and design (GLAD) software has been successfully investigated for the first time. Input files, which are very similar to a programming language, are created to model three designs of fibre lasers: two Tm3+-doped silica fibre lasers, core pumped at 1.57 μm and cladding pumped at 790 nm, and one 2.7 μm Er3+-doped fluorozirconate fibre laser cladding pumped at 975 nm. Results are presented from a relatively comprehensive computer model, which simulates CW operation of the fibre lasers. The simulation suggests that to enhance the conversion energy we have to optimise between the absorption coefficient of the fibre and the diffraction algorithms. Comparisons of soft and hard tissue ablation with high peak power Q-switched and CW Tm3+-silica fibre lasers are presented. The ablation of chicken breast and lamb liver tissues as soft tissues and cartilage as a hard tissue has been investigated using a free-running CW Tm3+-doped fibre laser (wavelength 1.99 μm, with self-pulsation durations ranging from 1 to a few tens of microseconds) and the Q-switched operation of the same laser (pulse duration ranging from 150 ns to 900 ns and pulse repetition rates from 100 Hz to 17 k

  20. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  1. Space Communications and Navigation (SCaN) Network Simulation Tool Development and Its Use Cases

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Borgen, Richard; Nguyen, Sam; Segui, John; Stoenescu, Tudor; Wang, Shin-Ywan; Woo, Simon; Barritt, Brian; Chevalier, Christine; Eddy, Wesley

    2009-01-01

    In this work, we focus on the development of a simulation tool to assist in analysis of current and future (proposed) network architectures for NASA. Specifically, the Space Communications and Navigation (SCaN) Network is being architected as an integrated set of new assets and a federation of upgraded legacy systems. The SCaN architecture for the initial missions for returning humans to the moon and beyond will include the Space Network (SN) and the Near-Earth Network (NEN). In addition to SCaN, the initial mission scenario involves a Crew Exploration Vehicle (CEV), the International Space Station (ISS) and NASA Integrated Services Network (NISN). We call the tool being developed the SCaN Network Integration and Engineering (SCaN NI&E) Simulator. The intended uses of such a simulator are: (1) to characterize performance of particular protocols and configurations in mission planning phases; (2) to optimize system configurations by testing a larger parameter space than may be feasible in either production networks or an emulated environment; (3) to test solutions in order to find issues/risks before committing more significant resources needed to produce real hardware or flight software systems. We describe two use cases of the tool: (1) standalone simulation of CEV to ISS baseline scenario to determine network performance, (2) participation in Distributed Simulation Integration Laboratory (DSIL) tests to perform function testing and verify interface and interoperability of geographically dispersed simulations/emulations.

  2. A continuum model for dynamic analysis of the Space Station

    NASA Technical Reports Server (NTRS)

    Thomas, Segun

    1989-01-01

    A dynamic analysis of the International Space Station using MSC/NASTRAN involved 1312 rod elements, 62 beam elements, 489 nodes, and 1473 dynamic degrees of freedom. A realtime, man-in-the-loop simulation of such a model is impractical. This paper discusses the mathematical model for realtime dynamic simulation of the Space Station. Several key questions in structures and structural dynamics are addressed. First, to achieve a significant reduction in the number of dynamic degrees of freedom, a continuum equivalent representation of the Space Station truss structure is developed which accounts for the asymmetry of the basic configuration and results in coupling of extensional and transverse deformations. Next, dynamic equations for the continuum equivalent of the Space Station truss structure are formulated using a matrix version of Kane's dynamical equations. Flexibility is accounted for by using a theory that accommodates extension, bending in two principal planes, and shear displacement. Finally, constraint equations suitable for dynamic analysis of flexible bodies with closed-loop configurations are developed, and the solution of the resulting system of equations is based on the zero eigenvalue theorem.

  3. IM and Q-D Rules: An Analysis by French Club MURAT

    DTIC Science & Technology

    1996-08-01

    IM(1) and Q-D Rules: An Analysis by French Club MURAT, by Jean ISLER (2), Jean G. GOLIGER (3), Daniel BOCHAND (4), Georges QUEROL (5), Louis PICARD (6) and Joël FERRON (7). CLUB MURAT - BP 129, 78148 VELIZY CEDEX, FRANCE. Tel: (33) (1) 39.46.15.50; Fax: (33) (1) 39.46.15.38. ABSTRACT: The

  4. Dissipative stability analysis and control of two-dimensional Fornasini-Marchesini local state-space model

    NASA Astrophysics Data System (ADS)

    Wang, Lanning; Chen, Weimin; Li, Lizhen

    2017-06-01

    This paper is concerned with the problems of dissipative stability analysis and control of the two-dimensional (2-D) Fornasini-Marchesini local state-space (FM LSS) model. Based on the characteristics of the system model, a novel definition of 2-D FM LSS (Q, S, R)-α-dissipativity is given first, and then a sufficient condition in terms of linear matrix inequality (LMI) is proposed to guarantee the asymptotical stability and 2-D (Q, S, R)-α-dissipativity of the systems. As its special cases, 2-D passivity performance and 2-D H∞ performance are also discussed. Furthermore, by use of this dissipative stability condition and projection lemma technique, 2-D (Q, S, R)-α-dissipative state-feedback control problem is solved as well. Finally, a numerical example is given to illustrate the effectiveness of the proposed method.

  5. Performance Analysis of Hybrid PON (WDM-TDM) with Equal and Unequal Channel Spacing

    NASA Astrophysics Data System (ADS)

    Sharma, Ramandeep; Dewra, Sanjeev; Rani, Aruna

    2016-06-01

    In this work, a hybrid WDM-TDM PON has been evaluated by comparing downstream wavelengths with equal and unequal channel spacing at 5 Gbit/s per wavelength in a triple-play-services scenario with 128 optical network units (ONUs). The triple-play services (data, voice, and video signals) are transmitted up to a 50 km distance with a Q factor of 6.68 and a BER of 3.64e-012 using unequal channel spacing, and up to a 45 km distance with a Q factor of 6.33 and a BER of 2.40e-011 using equal channel spacing, in the downstream direction. It has been observed that downstream wavelengths with unequal channel spacing provide better results than equal channel spacing.
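
    The Q factors and BERs quoted above are linked, under a Gaussian-noise assumption, by the textbook relation BER ≈ 0.5 erfc(Q/sqrt(2)); the sketch below evaluates it for the two reported Q values. The simulator's own eye-diagram analysis need not match this idealized estimate exactly.

        import math

        def ber_from_q(q_factor):
            # Gaussian-noise estimate of the bit error rate from the eye Q factor
            return 0.5 * math.erfc(q_factor / math.sqrt(2.0))

        for q in (6.33, 6.68):
            print(f"Q = {q:.2f} -> estimated BER ~ {ber_from_q(q):.2e}")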

  6. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubenets, Elena R.

    We prove the existence for each Hilbert space of two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper also point to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  7. 20th Space Simulation Conference: The Changing Testing Paradigm

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Compiler)

    1998-01-01

    The Institute of Environmental Sciences' Twentieth Space Simulation Conference, "The Changing Testing Paradigm" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Changing Testing Paradigm."

  8. Indian LSSC (Large Space Simulation Chamber) facility

    NASA Technical Reports Server (NTRS)

    Brar, A. S.; Prasadarao, V. S.; Gambhir, R. D.; Chandramouli, M.

    1988-01-01

    The Indian Space Agency has undertaken a major project to acquire in-house capability for thermal and vacuum testing of large satellites. This Large Space Simulation Chamber (LSSC) facility will be located in Bangalore and is to be operational in 1989. The facility is capable of providing 4 meter diameter solar simulation with provision to expand to 4.5 meter diameter at a later date. With such provisions as controlled variations of shroud temperatures and availability of infrared equipment as alternative sources of thermal radiation, this facility will be amongst the finest anywhere. The major design concept and major aspects of the LSSC facility are presented here.

  9. A Transportation Model for a Space Colonization and Manufacturing System: A Q-GERT Simulation.

    DTIC Science & Technology

    1982-12-01

    The record's abstract survives only as OCR fragments; recoverable content includes citations of Thomas A. Heppenheimer's Colonies in Space and the space colonization work of Gerard K. O'Neill, an acknowledgment of Captain John D. Rask as co-developer of a simple transportation-system model under Colonel Thomas D. Clark, and a reference to a Delphi study for a space problem.

  10. Interplanetary Transit Simulations Using the International Space Station

    NASA Technical Reports Server (NTRS)

    Charles, John B.; Arya, M.; Kundrot, C. E.

    2010-01-01

    We evaluated the space life sciences utility of the International Space Station (ISS) to simulate the outbound transit portion of missions to Mars and Near Earth Asteroids (NEA) to investigate biomedical and psychological aspects of such transits, to develop and test space operation procedures compatible with communication delays and outages, and to demonstrate and validate technologies and countermeasures. Two major categories of space life sciences activities can capitalize on ISS capabilities. The first includes studies that require ISS (or a comparable facility), typically for access to prolonged weightlessness. The second includes studies that do not strictly require ISS but can exploit it to maximize their scientific return more efficiently and productively than in ground-based simulations. For these studies, ISS offers a high fidelity analog for fundamental factors on future missions, such as crew composition, mission control personnel, operational tasks and workload, real-world risk, and isolation, and can mimic the effects of distance and limited accessibility. In addition to conducting Mars- and NEA-transit simulations on 6-month ISS increments, extending the current ISS increment duration from 6 months to 9 or even 12 months will provide opportunities for enhanced and focused research relevant to long duration Mars and NEA missions. Increasing the crew duration may pose little additional risk to crewmembers beyond that currently accepted on 6-month increments, but additional medical monitoring capabilities will be required beyond those currently used for ISS operations. Finally, while presenting major logistical challenges, such a simulation followed by a post-landing simulation of Mars exploration could provide quantitative evidence of capabilities in an actual mission. Thus, the use of ISS to simulate aspects of Mars and NEA missions seems practical. If it were to be implemented without major disruption of ongoing ISS activities, then planning should begin soon.

  11. Siblings with opposite chromosome constitutions, dup(2q)/del(7q) and del(2q)/dup(7q).

    PubMed

    Shim, Sung Han; Shim, Jae Sun; Min, Kyunghoon; Lee, Hee Song; Park, Ji Eun; Park, Sang Hee; Hwang, Euna; Kim, Minyoung

    2014-01-15

    Chromosome 7q36 microdeletion syndrome is a rare genomic disorder characterized by underdevelopment of the brain, microcephaly, anomalies of the sex organs, and language problems. Developmental delay, intellectual disability, autistic spectrum disorders, BDMR syndrome, and unusual facial morphology are the key features of the chromosome 2q37 microdeletion syndrome. A genetic screening of two brothers with global developmental delay, using high-resolution chromosomal analysis and subtelomeric multiplex ligation-dependent probe amplification, revealed subtelomeric rearrangements at the same sites, 2q37.2 and 7q35, with the deletion and duplication reversed between the brothers. Both showed dysmorphic facial features, severe physical and intellectual developmental disability, and abnormal genitalia, with differences between their phenotypes. No other family members had abnormal phenotypes. Genetic analysis of the parents suggested adjacent-1 segregation of a maternal balanced translocation as the mechanism of the rearrangements. By comparing the phenotypes of our patients with previous reports on similar patients, we sought information on the genes involved and their chromosomal locations. © 2013.

  12. Q&A: The space poet

    NASA Astrophysics Data System (ADS)

    Hoffman, Jascha

    2011-11-01

    Tracy K. Smith has her head in the stars. Thanks to her late father's job as an engineer on the Hubble Space Telescope, the US poet gathers inspiration from astrophysics and cosmology. Published this year, her third collection, Life on Mars, explores the future of human life, the great beyond and her father's death. As she prepares for a poetry reading at the Space Telescope Science Institute in Baltimore, Maryland, Smith talks about the limits of space and time.

  13. Time-domain self-consistent theory of frequency-locking regimes in gyrotrons with low-Q resonators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ginzburg, N. S., E-mail: ginzburg@appl.sci-nnov.ru; Nizhny Novgorod State University, 603950, gagarin av., 23, Nizhny Novgorod; Sergeev, A. S.

    2015-03-15

    A time-domain theory of frequency locking in gyrotron oscillators with low-Q resonators has been developed. The theory is based on describing wave propagation with a parabolic equation, with the external signal taken into account through a modification of the boundary conditions. We show that the developed model can be used effectively to simulate both single-mode and multi-mode operation regimes in gyrotrons driven by an external signal. For the case of low-Q resonators, typical of powerful gyrotrons, the external signal can significantly influence the axial field profile inside the interaction space and, correspondingly, the electron orbital efficiency.

  14. Data Analysis of Sequences and qPCR for Microbial Communities during Algal Blooms

    EPA Pesticide Factsheets

    A training opportunity is open to a student highly motivated in microbial research to conduct sequence analysis, explore novel genes and metabolic pathways, validate the resulting findings using qPCR/RT-qPCR, and summarize the findings.

  15. Performance of cancer cluster Q-statistics for case-control residential histories

    PubMed Central

    Sloan, Chantel D.; Jacquez, Geoffrey M.; Gallagher, Carolyn M.; Ward, Mary H.; Raaschou-Nielsen, Ole; Nordsborg, Rikke Baastrup; Meliker, Jaymie R.

    2012-01-01

    Few investigations of health event clustering have evaluated residential mobility, though causative exposures for chronic diseases such as cancer often occur long before diagnosis. Recently developed Q-statistics incorporate human mobility into disease cluster investigations by quantifying space- and time-dependent nearest neighbor relationships. Using residential histories from two cancer case-control studies, we created simulated clusters to examine Q-statistic performance. Results suggest that the intersection of cases with significant clustering over their life course, Qi, with cases who are constituents of significant local clusters at given times, Qit, yielded the best performance, which improved with increasing cluster size. Upon comparison, a larger proportion of true positives were detected with Kulldorff's spatial scan method if the time of clustering was provided. We recommend using Q-statistics to identify when and where clustering may have occurred, followed by the scan method to localize the candidate clusters. Future work should investigate the generalizability of these findings. PMID:23149326
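
    A minimal sketch of the nearest-neighbour count underlying the local statistic Qit (variable names here are hypothetical; the actual Q-statistics of Jacquez and colleagues operate on matched case-control residential histories and assess significance by randomization):

        import numpy as np

        def q_it(coords, is_case, i, k=5):
            """Count how many of the k nearest neighbours of case i are also cases
            at one time point. coords: (n, 2) array of residential locations at that
            time; is_case: boolean array of length n."""
            d = np.linalg.norm(coords - coords[i], axis=1)
            d[i] = np.inf                      # exclude the index case itself
            neighbours = np.argsort(d)[:k]
            return int(is_case[neighbours].sum())

        def q_i(coord_history, is_case, i, k=5):
            """Life-course statistic: sum of Qit over the residential history of case i."""
            return sum(q_it(coords_t, is_case, i, k) for coords_t in coord_history)

    Significance would then be judged against the distribution of these counts under random relabelling of cases and controls, which is the detection performance the study evaluates.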

  16. Dutch modality exclusivity norms: Simulating perceptual modality in space.

    PubMed

    Speed, Laura J; Majid, Asifa

    2017-12-01

    Perceptual information is important for the meaning of nouns. We present modality exclusivity norms for 485 Dutch nouns rated on visual, auditory, haptic, gustatory, and olfactory associations. We found these nouns are highly multimodal. They were rated most dominant in vision, and least in olfaction. A factor analysis identified two main dimensions: one loaded strongly on olfaction and gustation (reflecting joint involvement in flavor), and a second loaded strongly on vision and touch (reflecting joint involvement in manipulable objects). In a second study, we validated the ratings with similarity judgments. As expected, words from the same dominant modality were rated more similar than words from different dominant modalities; but - more importantly - this effect was enhanced when word pairs had high modality strength ratings. We further demonstrated the utility of our ratings by investigating whether perceptual modalities are differentially experienced in space, in a third study. Nouns were categorized into their dominant modality and used in a lexical decision experiment where the spatial position of words was either in proximal or distal space. We found words dominant in olfaction were processed faster in proximal than distal space compared to the other modalities, suggesting olfactory information is mentally simulated as "close" to the body. Finally, we collected ratings of emotion (valence, dominance, and arousal) to assess its role in perceptual space simulation, but the valence did not explain the data. So, words are processed differently depending on their perceptual associations, and strength of association is captured by modality exclusivity ratings.

  17. Experimental study and simulation of space charge stimulated discharge

    NASA Astrophysics Data System (ADS)

    Noskov, M. D.; Malinovski, A. S.; Cooke, C. M.; Wright, K. A.; Schwab, A. J.

    2002-11-01

    The electrical discharge of volume distributed space charge in poly(methylmethacrylate) (PMMA) has been investigated both experimentally and by computer simulation. The experimental space charge was implanted in dielectric samples by exposure to a monoenergetic electron beam of 3 MeV. Electrical breakdown through the implanted space charge region within the sample was initiated by a local electric field enhancement applied to the sample surface. A stochastic-deterministic dynamic model for electrical discharge was developed and used in a computer simulation of these breakdowns. The model employs stochastic rules to describe the physical growth of the discharge channels, and deterministic laws to describe the electric field, the charge, and energy dynamics within the discharge channels and the dielectric. Simulated spatial-temporal and current characteristics of the expanding discharge structure during physical growth are quantitatively compared with the experimental data to confirm the discharge model. It was found that a single fixed set of physically based dielectric parameter values was adequate to simulate the complete family of experimental space charge discharges in PMMA. It is proposed that such a set of parameters also provides a useful means to quantify the breakdown properties of other dielectrics.

  18. 20th Space Simulation Conference: The Changing Testing Paradigm

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Compiler)

    1999-01-01

    The Institute of Environmental Sciences and Technology's Twentieth Space Simulation Conference, "The Changing Testing Paradigm" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Changing Testing Paradigm."

  19. Q-mode versus R-mode principal component analysis for linear discriminant analysis (LDA)

    NASA Astrophysics Data System (ADS)

    Lee, Loong Chuen; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2017-05-01

    Much of the literature applies Principal Component Analysis (PCA) as a preliminary visualization method, a variable construction method, or both. The focus of PCA can be on the samples (R-mode PCA) or on the variables (Q-mode PCA). Traditionally, R-mode PCA has been the usual approach to reducing high-dimensional data before the application of Linear Discriminant Analysis (LDA) to solve classification problems. The output of PCA consists of two new matrices, known as the loadings and scores matrices. Each matrix can be used to produce a plot: the loadings plot aids identification of important variables, whereas the scores plot presents the spatial distribution of samples on new axes known as Principal Components (PCs). Fundamentally, the scores matrix is always the input for building the classification model. A recent paper used Q-mode PCA, but the focus of the analysis was not on the variables but on the samples. As a result, the authors exchanged the use of the loadings and scores plots: clustering of samples was studied using the loadings plot, whereas the scores plot was used to identify important manifest variables. The aim of this study is therefore to statistically validate that practice. Evaluation is based on the external error of LDA models as a function of the number of PCs. In addition, bootstrapping was conducted to evaluate the external error of each LDA model. Results show that LDA models built from PCs of R-mode PCA give logical performance and unbiased external error estimates, whereas those built with Q-mode PCA show the opposite. We therefore conclude that PCs produced from Q-mode PCA are not statistically stable and should not be applied to problems of classifying samples, but rather of classifying variables. We hope this paper provides some insight into these disputed issues.
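
    A minimal sketch of the workflow the study evaluates, using scikit-learn (this follows the abstract's R-mode usage, in which PCA scores of the samples-by-variables matrix are the inputs to LDA; the bootstrap loop is only indicative and the function names are ours):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split

        def external_error(X, y, n_components=5, seed=0):
            """Fit PCA on training samples, feed the scores to LDA, and report
            the classification error on held-out samples (X, y are numpy arrays)."""
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                      random_state=seed)
            pca = PCA(n_components=n_components).fit(X_tr)
            lda = LinearDiscriminantAnalysis().fit(pca.transform(X_tr), y_tr)
            return 1.0 - lda.score(pca.transform(X_te), y_te)

        def bootstrap_error(X, y, n_components=5, n_boot=200, seed=0):
            """Bootstrap estimate of the external error for a given number of PCs."""
            rng = np.random.default_rng(seed)
            errs = [external_error(X[idx], y[idx], n_components, seed)
                    for idx in (rng.integers(0, len(y), len(y)) for _ in range(n_boot))]
            return float(np.mean(errs))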

  20. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
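
    As a generic illustration of the actor idea described above (an active object that owns its thread of control and is driven only by messages, in contrast to a passive object invoked through method calls; this sketch is not the design analyzed in the paper):

        import queue
        import threading

        class Actor:
            """Active object: owns a thread and a mailbox; work is requested by
            sending messages rather than by calling methods directly."""
            def __init__(self):
                self.mailbox = queue.Queue()
                threading.Thread(target=self._run, daemon=True).start()

            def send(self, message):
                self.mailbox.put(message)

            def _run(self):
                while True:
                    message = self.mailbox.get()
                    if message is None:        # poison pill stops the actor
                        break
                    self.handle(message)

            def handle(self, message):
                print(f"processing simulation event: {message}")

        # a = Actor(); a.send(("advance_time", 0.1)); a.send(None)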

  1. Evaluation of the effects of solar radiation on glass. [space environment simulation

    NASA Technical Reports Server (NTRS)

    Firestone, R. F.; Harada, Y.

    1979-01-01

    The degradation of glass used on space structures due to electromagnetic and particulate radiation in a space environment was evaluated. The space environment was defined and a simulated space exposure apparatus was constructed. Four optical materials were exposed to simulated solar and particulate radiation in a space environment. Sapphire and fused silica experienced little change in transmittance, while optical crown glass and ultra low expansion glass darkened appreciably. Specimen selection and preparation, exposure conditions, and the effect of simulated exposure are discussed. A selective bibliography of the effect of radiation on glass is included.

  2. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated database, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  3. Requirements for Modeling and Simulation for Space Medicine Operations: Preliminary Considerations

    NASA Technical Reports Server (NTRS)

    Dawson, David L.; Billica, Roger D.; Logan, James; McDonald, P. Vernon

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to more effectively use the limited time available for astronaut medical training. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for evaluating other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offer a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  4. Distributed interactive communication in simulated space-dwelling groups.

    PubMed

    Brady, Joseph V; Hienz, Robert D; Hursh, Steven R; Ragusa, Leonard C; Rouse, Charles O; Gasior, Eric D

    2004-03-01

    This report describes the development and preliminary application of an experimental test bed for modeling human behavior in the context of a computer generated environment to analyze the effects of variations in communication modalities, incentives and stressful conditions. In addition to detailing the methodological development of a simulated task environment that provides for electronic monitoring and recording of individual and group behavior, the initial substantive findings from an experimental analysis of distributed interactive communication in simulated space dwelling groups are described. Crews of three members each (male and female) participated in simulated "planetary missions" based upon a synthetic scenario task that required identification, collection, and analysis of geologic specimens with a range of grade values. The results of these preliminary studies showed clearly that cooperative and productive interactions were maintained between individually isolated and distributed individuals communicating and problem-solving effectively in a computer-generated "planetary" environment over extended time intervals without benefit of one another's physical presence. Studies on communication channel constraints confirmed the functional interchangeability between available modalities with the highest degree of interchangeability occurring between Audio and Text modes of communication. The effects of task-related incentives were determined by the conditions under which they were available with Positive Incentives effectively attenuating decrements in performance under stressful time pressure. c2003 Elsevier Ltd. All rights reserved.

  5. Planetary and Space Simulation Facilities PSI at DLR for Astrobiology

    NASA Astrophysics Data System (ADS)

    Rabbow, E.; Rettberg, P.; Panitz, C.; Reitz, G.

    2008-09-01

    Ground-based experiments, conducted in the controlled planetary and space environment simulation facilities PSI at DLR, are used to investigate astrobiological questions and to complement the corresponding experiments in LEO, for example on free-flying satellites or on space exposure platforms on the ISS. In-orbit exposure facilities can only accommodate a limited number of experiments for exposure to space parameters such as high vacuum, intense radiation of galactic and solar origin, and microgravity, sometimes also technically adapted to simulate extraterrestrial planetary conditions like those on Mars. Ground-based experiments in carefully equipped and monitored simulation facilities allow the investigation of the effects of simulated single environmental parameters, and selected combinations of them, on a much wider variety of samples. In PSI at DLR, international science consortia have performed astrobiological investigations and space experiment preparations, exposing organic compounds and a wide range of organisms, from bacterial spores to complex microbial communities, lichens, and even animals such as tardigrades, to simulated planetary or space environment parameters, in pursuit of exobiological questions on resistance to extreme environments and on the origin and distribution of life. The Planetary and Space Simulation facilities PSI of the Institute of Aerospace Medicine at DLR in Köln, Germany, provide high vacuum with controlled residual gas composition, ionizing radiation from an X-ray tube, polychromatic UV radiation in the range 170-400 nm, VIS and IR radiation or individual monochromatic UV wavelengths, and temperature regulation from -20°C to +80°C at the sample site, individually or in selected combinations, in nine modular facilities of varying sizes; the facilities are presented here together with selected experiments performed in them.

  6. pcr: an R package for quality assessment, analysis and testing of qPCR data

    PubMed Central

    Ahmed, Mahmoud

    2018-01-01

    Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis, and statistical testing of qPCR data. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group tests and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied the different quality assessment, analysis, and statistical testing methods in the pcr package and compared the results to the originally published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions, allowing biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
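
    A minimal illustration of the Double Delta CT model mentioned above, written in Python for consistency with the other sketches in this document (the pcr package itself is an R package, and its actual function names are not shown here):

        import numpy as np

        def ddct_relative_expression(ct_target_treat, ct_ref_treat,
                                     ct_target_ctrl, ct_ref_ctrl):
            """Double Delta CT: relative expression = 2 ** (-ddCt), where
            dCt = Ct(target) - Ct(reference) per group and
            ddCt = dCt(treatment) - dCt(control). Inputs are replicate CT values."""
            d_ct_treat = np.mean(ct_target_treat) - np.mean(ct_ref_treat)
            d_ct_ctrl = np.mean(ct_target_ctrl) - np.mean(ct_ref_ctrl)
            return 2.0 ** (-(d_ct_treat - d_ct_ctrl))

        # Target amplifies ~2 cycles earlier in treatment -> roughly 4-fold up-regulation
        print(ddct_relative_expression([24.1, 24.3], [18.0, 18.1],
                                       [26.2, 26.4], [18.1, 18.0]))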

  7. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-01

    Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a SNAP derivative reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate imbedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  8. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  9. High resolution chromosome 3p, 8p, 9q and 22q allelotyping analysis in the pathogenesis of gallbladder carcinoma

    PubMed Central

    Wistuba, I I; Maitra, A; Carrasco, R; Tang, M; Troncoso, P; Minna, J D; Gazdar, A F

    2002-01-01

    Our recent genome-wide allelotyping analysis of gallbladder carcinoma identified 3p, 8p, 9q and 22q as chromosomal regions with frequent loss of heterozygosity. The present study was undertaken to more precisely identify the presence and location of regions of frequent allele loss involving those chromosomes in gallbladder carcinoma. Microdissected tissue from 24 gallbladder carcinomas was analysed for PCR-based loss of heterozygosity using 81 microsatellite markers spanning chromosome 3p (n=26), 8p (n=14), 9q (n=29) and 22q (n=12) regions. We also studied the role of those allele losses in gallbladder carcinoma pathogenesis by examining 45 microdissected normal and dysplastic gallbladder epithelia accompanying gallbladder carcinoma, using 17 microsatellite markers. Overall frequencies of loss of heterozygosity at 3p (100%), 8p (100%), 9q (88%), and 22q (92%) sites were very high in gallbladder carcinoma, and we identified 13 distinct regions undergoing frequent loss of heterozygosity in tumours. Allele losses were frequently detected in normal and dysplastic gallbladder epithelia. There was a progressive increase of the overall loss of heterozygosity frequency with increasing severity of histopathological changes. Allele losses were not random and followed a sequence. This study refines several distinct chromosome 3p, 8p, 9q and 22q regions undergoing frequent allele loss in gallbladder carcinoma that will aid in the positional identification of tumour suppressor genes involved in gallbladder carcinoma pathogenesis. British Journal of Cancer (2002) 87, 432–440. doi:10.1038/sj.bjc.6600490 www.bjcancer.com © 2002 Cancer Research UK PMID:12177780

  10. Launching a Dream. A Teachers Guide to a Simulated Space Shuttle Mission.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Cleveland, OH. Lewis Research Center.

    This publication is about imagination, teamwork, creativity, and a host of other ingredients required to carry out a dream. It is about going into space--going into space as part of a simulated space shuttle mission. The publication highlights two simulated shuttle missions cosponsored by the National Aeronautics and Space Administration (NASA)…

  11. Symplectic multiparticle tracking model for self-consistent space-charge simulation

    DOE PAGES

    Qiang, Ji

    2017-01-23

    Symplectic tracking is important in accelerator beam dynamics simulation. So far, to the best of our knowledge, there is no self-consistent symplectic space-charge tracking model available in the accelerator community. In this paper, we present a two-dimensional and a three-dimensional symplectic multiparticle spectral model for space-charge tracking simulation. This model includes both the effect from external fields and the effect of self-consistent space-charge fields using a split-operator method. Such a model preserves the phase space structure and shows much less numerical emittance growth than the particle-in-cell model in the illustrative examples.
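
    A one-dimensional toy sketch of the split-operator idea described above: the symplectic map of the external-field Hamiltonian is alternated with a momentum kick from the self-field of the particle distribution. The histogram-based self-field below is a crude placeholder, not the spectral space-charge solver of the paper.

        import numpy as np

        def external_half_map(x, p, dt, k_ext=1.0):
            """Half-step kick-drift for a linear external focusing field (symplectic)."""
            p = p - 0.5 * dt * k_ext * x
            x = x + 0.5 * dt * p
            return x, p

        def space_charge_kick(x, p, dt, strength=1e-3, bins=64):
            """Full-step momentum kick from an approximate 1-D self-field,
            estimated from a histogram of the charge density (illustrative only)."""
            rho, edges = np.histogram(x, bins=bins)
            centers = 0.5 * (edges[:-1] + edges[1:])
            field = np.cumsum(rho) - 0.5 * rho.sum()   # cumulative density as a stand-in field
            p = p + dt * strength * np.interp(x, centers, field)
            return x, p

        def step(x, p, dt):
            x, p = external_half_map(x, p, dt)
            x, p = space_charge_kick(x, p, dt)
            x, p = external_half_map(x, p, dt)
            return x, p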

  12. Symplectic multiparticle tracking model for self-consistent space-charge simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiang, Ji

    Symplectic tracking is important in accelerator beam dynamics simulation. So far, to the best of our knowledge, there is no self-consistent symplectic space-charge tracking model available in the accelerator community. In this paper, we present a two-dimensional and a three-dimensional symplectic multiparticle spectral model for space-charge tracking simulation. This model includes both the effect from external fields and the effect of self-consistent space-charge fields using a split-operator method. Such a model preserves the phase space structure and shows much less numerical emittance growth than the particle-in-cell model in the illustrative examples.

  13. Special "space" suit for the Reduced Gravity Walking Simulator

    NASA Image and Video Library

    1965-05-05

    Special "space" suit for the Reduced Gravity Walking Simulator located at the Lunar Landing Facility. The purpose of this simulator was to study the subject while walking, jumping or running. Researchers conducted studies of various factors such as fatigue limit, energy expenditure, and speed of locomotion. A.W. Vigil described the purpose of the simulator in his paper "Discussion of Existing and Planned Simulators for Space Research," "When the astronauts land on the moon they will be in an unfamiliar environment involving, particularly, a gravitational field only one-sixth as strong as on earth. A novel method of simulating lunar gravity has been developed and is supported by a puppet-type suspension system at the end of a long pendulum. A floor is provided at the proper angle so that one-sixth of the subject's weight is supported by the floor with the remainder being supported by the suspension system. This simulator allows almost complete freedom in vertical translation and pitch and is considered to be a very realistic simulation of the lunar walking problem. For this problem this simulator suffers only slightly from the restrictions in lateral movement it puts on the test subject. This is not considered a strong disadvantage for ordinary walking problems since most of the motions do, in fact, occur in the vertical plane. However, this simulation technique would be severely restrictive if applied to the study of the extra-vehicular locomotion problem, for example, because in this situation complete six degrees of freedom are rather necessary. This technique, in effect, automatically introduces a two-axis attitude stabilization system into the problem. The technique could, however, be used in preliminary studies of extra-vehicular locomotion where, for example, it might be assumed that one axis of the attitude control system on the astronaut maneuvering unit may have failed." -- Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center

  14. An Orion/Ares I Launch and Ascent Simulation: One Segment of the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Crues, Edwin Z.; Blum, Mike G.; Alofs, Cathy; Busto, Juan

    2007-01-01

    This paper describes the architecture and implementation of a distributed launch and ascent simulation of NASA's Orion spacecraft and Ares I launch vehicle. This simulation is one segment of the Distributed Space Exploration Simulation (DSES) Project. The DSES project is a research and development collaboration between NASA centers which investigates technologies and processes for distributed simulation of complex space systems in support of NASA's Exploration Initiative. DSES is developing an integrated end-to-end simulation capability to support NASA development and deployment of new exploration spacecraft and missions. This paper describes the first in a collection of simulation capabilities that DSES will support.

  15. Fuzzy Q-Learning for Generalization of Reinforcement Learning

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1996-01-01

    Fuzzy Q-Learning, introduced earlier by the author, is an extension of Q-Learning into fuzzy environments. GARIC is a methodology for fuzzy reinforcement learning. In this paper, we introduce GARIC-Q, a new method for incremental Dynamic Programming using a society of intelligent agents that are controlled at the top level by Fuzzy Q-Learning, while at the local level each agent learns and operates based on GARIC. GARIC-Q improves the speed and applicability of Fuzzy Q-Learning through generalization of the input space using fuzzy rules, and bridges the gap between Q-Learning and rule-based intelligent systems.

  16. Thirteenth Space Simulation Conference. The Payload: Testing for Success

    NASA Technical Reports Server (NTRS)

    Stecher, J. (Editor)

    1984-01-01

    Information on the state of the art in space simulation, test technology, thermal simulation and protection, contamination, and test measurements and techniques is presented. Simulation of upper-atmosphere oxygen is discussed. Problems and successes of retrieving and repairing orbiting spacecraft by utilizing the shuttle are outlined.

  17. Development of Models for High Precision Simulation of the Space Mission Microscope

    NASA Astrophysics Data System (ADS)

    Bremer, Stefanie; List, Meike; Selig, Hanns; Lämmerzahl, Claus

    MICROSCOPE is a French space mission for testing the Weak Equivalence Principle (WEP). The mission goal is the determination of the Eötvös parameter with an accuracy of 10^-15. This will be achieved by means of two high-precision capacitive differential accelerometers built by the French institute ONERA. At the German institute ZARM, drop tower tests are carried out to verify the payload performance. Additionally, the mission data evaluation is prepared in close cooperation with the French partners CNES, ONERA, and OCA. To this end, a comprehensive simulation of the real system, including the science signal and all error sources, is being built for the development and testing of data reduction and data analysis algorithms to extract the WEP violation signal. Currently, the High Performance Satellite Dynamics Simulator (HPS), a cooperation project of ZARM and the DLR Institute of Space Systems, is being adapted to the MICROSCOPE mission for the simulation of test mass and satellite dynamics. Models of environmental disturbances such as solar radiation pressure are considered as well. Furthermore, the on-board capacitive sensors are modeled in detail.

  18. Understanding the Primary School Students' Van Hiele Levels of Geometry Thinking in Learning Shapes and Spaces: A Q-Methodology

    ERIC Educational Resources Information Center

    Hock, Tan Tong; Tarmizi, Rohani Ahmad; Yunus, Aida Suraya Md.; Ayub, Ahmad Fauzi

    2015-01-01

    This study was conducted using a new hybrid method of research which combined qualitative and quantitative designs to investigate the viewpoints of primary school students' conceptual understanding in learning geometry from the aspect of shapes and spaces according to van Hiele theory. Q-methodology is used in this research to find out what…

  19. Using Numerical Modeling to Simulate Space Capsule Ground Landings

    NASA Technical Reports Server (NTRS)

    Heymsfield, Ernie; Fasanella, Edwin L.

    2009-01-01

    Experimental work is being conducted at the National Aeronautics and Space Administration's (NASA) Langley Research Center (LaRC) to investigate ground landing capabilities of the Orion crew exploration vehicle (CEV). The Orion capsule is NASA's replacement for the Space Shuttle. The Orion capsule will service the International Space Station and be used for future space missions to the Moon and to Mars. To evaluate the feasibility of Orion ground landings, a series of capsule impact tests are being performed at the NASA Langley Landing and Impact Research Facility (LandIR). The experimental results derived at LandIR provide a means to validate and calibrate nonlinear dynamic finite element models, which are also being developed during this study. Because of the high cost and time involvement intrinsic to full-scale testing, numerical simulations are favored over experimental work. Once the numerical model has been validated against actual test responses, impact simulations will be conducted to study multiple impact scenarios not practical to test. Twenty-one swing tests using the LandIR gantry were conducted during the June 07 through October 07 time period to evaluate the Orion's impact response. Results for two capsule initial pitch angles, 0 deg and -15 deg, along with their computer simulations using LS-DYNA are presented in this article. A soil-vehicle friction coefficient of 0.45 was determined by comparing the test stopping distance with computer simulations. In addition, soil modeling accuracy is presented by comparing vertical penetrometer impact tests with computer simulations for the soil model used during the swing tests.
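
    As a back-of-the-envelope indication of how a friction coefficient can be inferred from stopping distance (the LS-DYNA model in the study also captures soil deformation and capsule dynamics, so this relation is illustrative only): if a capsule slides out horizontally at speed v and sliding friction is the only decelerating force, then

        d \approx \frac{v^{2}}{2\,\mu\,g} \quad\Longleftrightarrow\quad \mu \approx \frac{v^{2}}{2\,g\,d},

    so running simulations with trial values of μ and matching the predicted stopping distance d to the measured one brackets the effective soil-vehicle friction coefficient.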

  20. Neuropsychological Endophenotype Approach to Genome-wide Linkage Analysis Identifies Susceptibility Loci for ADHD on 2q21.1 and 13q12.11

    PubMed Central

    Rommelse, Nanda N.J.; Arias-Vásquez, Alejandro; Altink, Marieke E.; Buschgens, Cathelijne J.M.; Fliers, Ellen; Asherson, Philip; Faraone, Stephen V.; Buitelaar, Jan K.; Sergeant, Joseph A.; Oosterlaan, Jaap; Franke, Barbara

    2008-01-01

    ADHD linkage findings have not all been consistently replicated, suggesting that other approaches to linkage analysis in ADHD might be necessary, such as the use of (quantitative) endophenotypes (heritable traits associated with an increased risk for ADHD). Genome-wide linkage analyses were performed in the Dutch subsample of the International Multi-Center ADHD Genetics (IMAGE) study comprising 238 DSM-IV combined-type ADHD probands and their 112 affected and 195 nonaffected siblings. Eight candidate neuropsychological ADHD endophenotypes with heritabilities > 0.2 were used as quantitative traits. In addition, an overall component score of neuropsychological functioning was used. A total of 5407 autosomal single-nucleotide polymorphisms (SNPs) were used to run multipoint regression-based linkage analyses. Two significant genome-wide linkage signals were found, one for Motor Timing on chromosome 2q21.1 (LOD score: 3.944) and one for Digit Span on 13q12.11 (LOD score: 3.959). Ten suggestive linkage signals were found (LOD scores ≥ 2) on chromosomes 2p, 2q, 3p, 4q, 8q, 12p, 12q, 14q, and 17q. The suggestive linkage signal for the component score that was found at 2q14.3 (LOD score: 2.878) overlapped with the region significantly linked to Motor Timing. Endophenotype approaches may increase power to detect susceptibility loci in ADHD and possibly in other complex disorders. PMID:18599010

  1. Swift Observatory Space Simulation Testing

    NASA Technical Reports Server (NTRS)

    Espiritu, Mellina; Choi, Michael K.; Scocik, Christopher S.

    2004-01-01

    The Swift Observatory is a Middle-Class Explorer (MIDEX) mission featuring a rapidly re-pointing spacecraft with immediate data distribution capability to the astronomical community. Its primary objectives are to characterize and determine the origin of Gamma Ray Bursts (GRBs) and to use the collected data on GRB phenomena in order to probe the universe and gain insight into the physics of black hole formation and the early universe. The main components of the spacecraft are the Burst Alert Telescope (BAT), Ultraviolet and Optical Telescope (UVOT), X-Ray Telescope (XRT), and Optical Bench (OB) instruments coupled with the Swift spacecraft (S/C) bus. The Swift Observatory will be tested at the Space Environment Simulation (SES) chamber at the Goddard Space Flight Center from May to June 2004 in order to characterize its thermal behavior in a vacuum environment. In order to simulate the independent thermal zones required by the BAT, XRT, UVOT, and OB instruments, the spacecraft is mounted on a chariot structure capable of maintaining adiabatic interfaces and enclosed in a modified, four-section MSX fixture in order to accommodate the strategic placement of seven cryopanels (on four circuits), four heater panels, and a radiation source burst simulator mechanism. There are additionally 55 heater circuits on the spacecraft. To mitigate possible migration of silicone contaminants from BAT to the XRT and UVOT instruments, a contamination enclosure is to be fabricated around the BAT at the uppermost section of the MSX fixture. This paper discusses the test requirements and the implemented thermal vacuum test configuration for the Swift Observatory.

  2. Partial trisomy 12q24.31----qter.

    PubMed Central

    Tajara, E H; Varella-Garcia, M; Gusson, A C

    1985-01-01

    Clinical details of a male child with the karyotype 46,XY,-4,+der(4),t(4;12)(p16;q24.31)mat are reported and compared with those of other known cases of partial trisomy of the distal region of 12q. This condition is apparently associated with mental and psychomotor retardation, widely spaced eyes, flat nasal bridge, low-set ears, down-turned mouth, micrognathia, loose skin at the nape, widely spaced nipples, simian creases, clinodactyly, abnormalities of the genitourinary system, alterations in the sacrococcygeal region, and deformities of the lower limbs. In the majority of the reported cases, the breakpoint was in the 12q24 region and the trisomy resulted from adjacent-1 segregation of a maternal balanced translocation. PMID:3981585

  3. Time-varying q-deformed dark energy interacts with dark matter

    NASA Astrophysics Data System (ADS)

    Dil, Emre; Kolay, Erdinç

    We propose a new model for studying the dark constituents of the universe by regarding the dark energy as a q-deformed scalar field interacting with the dark matter, in the framework of standard general relativity. Here we assume that the number of particles in each mode of the q-deformed scalar field varies in time by the particle creation and annihilation. We first describe the q-deformed scalar field dark energy quantum-field theoretically, then construct the action and the dynamical structure of these interacting dark sectors, in order to study the dynamics of the model. We perform the phase space analysis of the model to confirm and interpret our proposal by searching the stable attractor solutions implying the late-time accelerating phase of the universe. We then obtain the result that when interaction and equation-of-state parameter of the dark matter evolve from the present day values into a particular value, the dark energy turns out to be a q-deformed scalar field.

  4. Energy consumption analysis of the Venus Deep Space Station (DSS-13)

    NASA Technical Reports Server (NTRS)

    Hayes, N. V.

    1983-01-01

    This report continues the energy consumption analysis and verification study of the tracking stations of the Goldstone Deep Space Communications Complex, and presents an audit of the Venus Deep Space Station (DSS 13). Due to the non-continuous radioastronomy research and development operations at the station, estimations of energy usage were employed in the energy consumption simulation of both the 9-meter and 26-meter antenna buildings. A 17.9% decrease in station energy consumption was experienced over the 1979-1981 years under study. A comparison of the ECP computer simulations and the station's main watt-hour meter readings showed good agreement.

  5. Virtual Observatories for Space Physics Observations and Simulations: New Routes to Efficient Access and Visualization

    NASA Technical Reports Server (NTRS)

    Roberts, Aaron

    2005-01-01

    New tools for data access and visualization promise to make the analysis of space plasma data both more efficient and more powerful, especially for answering questions about the global structure and dynamics of the Sun-Earth system. We will show how existing tools (particularly the Virtual Space Physics Observatory, VSPO, and the Visual System for Browsing, Analysis and Retrieval of Data, ViSBARD; look for the acronyms in Google) already provide rapid access to such information as spacecraft orbits, browse plots, and detailed data, as well as visualizations that can quickly unite our view of multispacecraft observations. We will show movies illustrating multispacecraft observations of the solar wind and magnetosphere during a magnetic storm, and of simulated 30-spacecraft observations derived from MHD simulations of the magnetosphere sampled along likely trajectories of the spacecraft for the MagCon mission. An important issue remaining to be solved is how best to integrate simulation data and services into the Virtual Observatory environment, and this talk will hopefully stimulate further discussion along these lines.

  6. CheS-Mapper 2.0 for visual validation of (Q)SAR models

    PubMed Central

    2014-01-01

    Background Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Even though a number of visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: it is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract: Comparing actual and predicted activity values with CheS-Mapper.

  7. Dual keel Space Station payload pointing system design and analysis feasibility study

    NASA Technical Reports Server (NTRS)

    Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.

    1988-01-01

    A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbances applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.

  8. Essential energy space random walk via energy space metadynamics method to accelerate molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Li, Hongzhi; Min, Donghong; Liu, Yusong; Yang, Wei

    2007-09-01

    To overcome the possible pseudoergodicity problem, molecular dynamics simulations can be accelerated via the realization of an energy space random walk. To achieve this, a biased free energy function (BFEF) needs to be obtained a priori. Although the quality of the BFEF is essential for sampling efficiency, its generation is usually tedious and nontrivial. In this work, we present an energy space metadynamics algorithm to obtain BFEFs efficiently and robustly. Moreover, to deal with the diffusive sampling problem caused by the random walk in the total energy space, the idea of the original umbrella sampling method is generalized to a random walk in the essential energy space, which includes only the energy terms determining the conformation of a region of interest. This essential energy space generalization allows efficient localized enhanced sampling and also offers the possibility of further improving sampling efficiency when high-frequency energy terms irrelevant to the target events are left unactivated. The energy space metadynamics method and its generalization to the essential energy space for molecular dynamics acceleration are demonstrated in simulations of a pentanelike system, the blocked alanine dipeptide model, and the leucine model.
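
    A minimal sketch of the energy-space metadynamics idea, under the assumption that the bias is a function f(E) of the (essential) potential energy built from periodically deposited Gaussians; this is an illustrative outline, not the authors' implementation.

        import numpy as np

        class EnergyMetadynamics:
            """Accumulate a bias f(E) on an energy grid by depositing Gaussian hills
            at the currently visited energy (illustrative sketch)."""
            def __init__(self, e_min, e_max, n_bins=400, height=0.1, width=1.0):
                self.grid = np.linspace(e_min, e_max, n_bins)
                self.bias = np.zeros(n_bins)   # accumulated f(E)
                self.height = height           # hill height (energy units)
                self.width = width             # hill width (energy units)

            def deposit(self, energy):
                """Add a Gaussian hill centred at the current energy."""
                self.bias += self.height * np.exp(-(self.grid - energy) ** 2
                                                  / (2.0 * self.width ** 2))

            def force_scale(self, energy):
                """Return 1 + df/dE at the current energy; with the biased potential
                U'(x) = U(x) + f(U(x)), forces are the unbiased forces times this factor."""
                dfdE = np.gradient(self.bias, self.grid)
                return 1.0 + np.interp(energy, self.grid, dfdE)

            def bfef_estimate(self):
                """The negative of the accumulated bias estimates the biased free
                energy function (BFEF), up to an additive constant."""
                return -self.bias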

  9. Improving Tribological Properties of Multialkylated Cyclopentanes under Simulated Space Environment: Two Feasible Approaches.

    PubMed

    Fan, Xiaoqiang; Wang, Liping; Li, Wen; Wan, Shanhong

    2015-07-08

    Space mechanisms require multialkylated cyclopentanes (MACs) to be more lubricious, more reliable, more durable, and better adapted to harsh space environments. In this study, two kinds of additives were added to MACs to improve their tribological properties under simulated space environments: (a) solid nanoparticles (tungsten disulfide (WS2), tungsten trioxide (WO3), lanthanum oxide (La2O3), and lanthanum trifluoride (LaF3)) for steel/steel contacts; (b) liquid additives such as zinc dialkyldithiophosphate (ZDDP) and molybdenum dialkyldithiocarbamate (MoDTC) for steel/steel and steel/diamond-like carbon (DLC) contacts. The results show that, under harsh simulated space environments, addition of the solid nanoparticles to MACs reduces wear by up to one order of magnitude, while the liquid additives simultaneously reduce friction and wear by 80% and 93%, respectively. Friction mechanisms were proposed on the basis of surface/interface analysis techniques such as X-ray photoelectron spectroscopy (XPS) and time-of-flight secondary ion mass spectrometry (TOF-SIMS). The effect of the solid nanoparticles in reducing friction and wear mainly depends on their surface enhancement effect, while that of the liquid additives is attributed to the formation of a tribochemical reaction film derived from ZDDP and MoDTC on the sliding surfaces.

  10. Interplanetary Transit Simulations Using the International Space Station

    NASA Technical Reports Server (NTRS)

    Charles, J. B.; Arya, Maneesh

    2010-01-01

    It has been suggested that the International Space Station (ISS) be utilized to simulate the transit portion of long-duration missions to Mars and near-Earth asteroids (NEA). The ISS offers a unique environment for such simulations, providing researchers with a high-fidelity platform to study, enhance, and validate technologies and countermeasures for these long-duration missions. From a space life sciences perspective, two major categories of human research activities have been identified that will harness the various capabilities of the ISS during the proposed simulations. The first category includes studies that require the use of the ISS, typically because of the need for prolonged weightlessness. The ISS is currently the only available platform capable of providing researchers with access to a weightless environment over an extended duration. In addition, the ISS offers high fidelity for other fundamental space environmental factors, such as isolation, distance, and accessibility. The second category includes studies that do not require use of the ISS in the strictest sense, but can exploit its use to maximize their scientific return more efficiently and productively than in ground-based simulations. In addition to conducting Mars and NEA simulations on the ISS, increasing the current increment duration on the ISS from 6 months to a longer duration will provide opportunities for enhanced and focused research relevant to long-duration Mars and NEA missions. Although it is currently believed that increasing the ISS crew increment duration to 9 or even 12 months will pose little additional risk to crewmembers, additional medical monitoring capabilities may be required beyond those currently used for the ISS operations. The use of the ISS to simulate aspects of Mars and NEA missions seems practical, and it is recommended that planning begin soon, in close consultation with all international partners.

  11. Principles of magnetohydrodynamic simulation in space plasmas

    NASA Technical Reports Server (NTRS)

    Sato, T.

    1985-01-01

    Attention is given to the philosophical as well as physical principles that are essential to the establishment of MHD simulation studies for solar plasma research, assuming the capabilities of state-of-the-art computers and emphasizing the importance of 'local' MHD simulation. Solar-terrestrial plasma space is divided into several elementary regions where a macroscopic elementary energy conversion process could conceivably occur; the local MHD simulation is defined as self-contained in each of the regions. The importance of, and the difficulties associated with, the boundary condition are discussed in detail. The roles of diagnostics and of the finite difference method are noted.

  12. Intracellular flow cytometry may be combined with good quality and high sensitivity RT-qPCR analysis.

    PubMed

    Sandstedt, Mikael; Jonsson, Marianne; Asp, Julia; Dellgren, Göran; Lindahl, Anders; Jeppsson, Anders; Sandstedt, Joakim

    2015-12-01

    Flow cytometry (FCM) has become a well-established method for analysis of both intracellular and cell-surface proteins, while quantitative RT-PCR (RT-qPCR) is used to determine gene expression with high sensitivity and specificity. Combining these two methods would be of great value. The effects of intracellular staining on RNA integrity and RT-qPCR sensitivity and quality have not, however, been fully examined. We, therefore, intended to assess these effects further. Cells from the human lung cancer cell line A549 were fixed, permeabilized and sorted by FCM. Sorted cells were analyzed using RT-qPCR. RNA integrity was determined by RNA quality indicator analysis. A549 cells were then mixed with cells of the mouse cardiomyocyte cell line HL-1. A549 cells were identified by the cell surface marker ABCG2, while HL-1 cells were identified by intracellular cTnT. Cells were sorted and analyzed by RT-qPCR. Finally, cell cultures from human atrial biopsies were used to evaluate the effects of fixation and permeabilization on RT-qPCR analysis of nonimmortalized cells stored prior to analysis by FCM. A large amount of RNA could be extracted even when cells had been fixed and permeabilized. Permeabilization resulted in increased RNA degradation and a moderate decrease in RT-qPCR sensitivity. Gene expression levels were also affected to a moderate extent. Sorted populations from the mixed A549 and HL-1 cell samples showed gene expression patterns that corresponded to FCM data. When samples were stored before FCM sorting, the RT-qPCR analysis could still be performed with high sensitivity and quality. In summary, our results show that intracellular FCM may be performed with only minor impairment of the RT-qPCR sensitivity and quality when analyzing sorted cells; however, these effects should be considered when comparing RT-qPCR data of not fixed samples with those of fixed and permeabilized samples. © 2015 International Society for Advancement of Cytometry.

  13. Axon diameter and intra-axonal volume fraction of the corticospinal tract in idiopathic normal pressure hydrocephalus measured by q-space imaging.

    PubMed

    Kamiya, Kouhei; Hori, Masaaki; Miyajima, Masakazu; Nakajima, Madoka; Suzuki, Yuriko; Kamagata, Koji; Suzuki, Michimasa; Arai, Hajime; Ohtomo, Kuni; Aoki, Shigeki

    2014-01-01

    Previous studies suggest that compression and stretching of the corticospinal tract (CST) potentially cause treatable gait disturbance in patients with idiopathic normal pressure hydrocephalus (iNPH). Measurement of axon diameter with diffusion MRI has recently been used to investigate microstructural alterations in neurological diseases. In this study, we investigated alterations in the axon diameter and intra-axonal fraction of the CST in iNPH by q-space imaging (QSI) analysis. Nineteen patients with iNPH and 10 age-matched controls were recruited. QSI data were obtained with a 3-T system by using a single-shot echo planar imaging sequence with the diffusion gradient applied parallel to the antero-posterior axis. By using a two-component low-q fit model, the root mean square displacements of intra-axonal space ( =  axon diameter) and intra-axonal volume fraction of the CST were calculated at the levels of the internal capsule and body of the lateral ventricle, respectively. Wilcoxon's rank-sum test revealed a significant increase in CST intra-axonal volume fraction at the paraventricular level in patients (p<0.001), whereas no significant difference was observed in the axon diameter. At the level of the internal capsule, neither axon diameter nor intra-axonal volume fraction differed significantly between the two groups. Our results suggest that in patients with iNPH, the CST does not undergo irreversible axonal damage but is rather compressed and/or stretched owing to pressure from the enlarged ventricle. These analyses of axon diameter and intra-axonal fraction yield insights into microstructural alterations of the CST in iNPH.
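
    The two-component low-q fit mentioned above can be illustrated with a short sketch. The block below is only a toy reconstruction, assuming the common Gaussian low-q relation E(q) = exp(-2π²q²⟨z²⟩) for each compartment; the function name two_compartment, the q values, and all numerical parameters are hypothetical and are not taken from the study.

      # Illustrative two-compartment low-q fit of q-space signal decay (not the
      # authors' exact pipeline). Assumes the Gaussian low-q approximation
      # E(q) = exp(-2*pi^2*q^2*<z^2>) per compartment; q in 1/um, <z^2> in um^2.
      import numpy as np
      from scipy.optimize import curve_fit

      def two_compartment(q, f_intra, z2_intra, z2_extra):
          """Signal decay E(q) as a weighted sum of two Gaussian compartments."""
          return (f_intra * np.exp(-2 * np.pi**2 * q**2 * z2_intra)
                  + (1 - f_intra) * np.exp(-2 * np.pi**2 * q**2 * z2_extra))

      # Hypothetical q values and noisy, normalized signal for the demo
      q = np.linspace(0.0, 0.05, 16)                        # 1/um
      E_meas = two_compartment(q, 0.4, 4.0, 60.0)
      E_meas += np.random.default_rng(0).normal(0, 0.005, q.size)

      p0 = (0.5, 5.0, 50.0)
      bounds = ([0, 0, 0], [1, np.inf, np.inf])
      (f_intra, z2_in, z2_ex), _ = curve_fit(two_compartment, q, E_meas, p0=p0, bounds=bounds)

      print(f"intra-axonal volume fraction ~ {f_intra:.2f}")
      print(f"RMS intra-axonal displacement ~ {np.sqrt(z2_in):.2f} um")  # proxy for axon diameter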

  14. Psychosocial value of space simulation for extended spaceflight

    NASA Technical Reports Server (NTRS)

    Kanas, N.

    1997-01-01

    There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues that have been reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  15. Aeroacoustic Simulations of Tandem Cylinders with Subcritical Spacing

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.; Khorrami, Mehdi R.; Neuhart, Dan H.; Hutcheson, Florence V.; Brooks, Thomas F.; Stead, Daniel J.

    2008-01-01

    Tandem cylinders are being studied because they model a variety of component level interactions of landing gear. The present effort is directed at the case of two identical cylinders with their centroids separated in the streamwise direction by 1.435 diameters. Experiments in the Basic Aerodynamic Research Tunnel and Quiet Flow Facility at NASA Langley Research Center have provided an extensive experimental database of the nearfield flow and radiated noise. The measurements were conducted at a Mach number of 0.1285 and Reynolds number of 1.66×10⁵ based on the cylinder diameter. A trip was used on the upstream cylinder to ensure a fully turbulent flow separation and, hence, to simulate a major aspect of high Reynolds number flow. The parallel computational effort uses the three-dimensional Navier-Stokes solver CFL3D with a hybrid, zonal turbulence model that turns off the turbulence production term everywhere except in a narrow ring surrounding solid surfaces. The experiments exhibited an asymmetry in the surface pressure that was persistent despite attempts to eliminate it through small changes in the configuration. To model the asymmetry, the simulations were run with the cylinder configuration at a nonzero but small angle of attack. The computed results and experiments are in general agreement that vortex shedding for the spacing studied herein is weak relative to that observed at supercritical spacings. Although the shedding was subdued in the simulations, it was still more prominent than in the experiments. Overall, the simulation comparisons with measured near-field data and the radiated acoustics are reasonable, especially if one is concerned with capturing the trends relative to larger cylinder spacings. However, the flow details of the 1.435 diameter spacing have not been captured in full even though very fine grid computations have been performed. Some of the discrepancy may be associated with the simulation's inexact representation of the

  16. Space-based infrared sensors of space target imaging effect analysis

    NASA Astrophysics Data System (ADS)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    Target identification is one of the core problems of ballistic missile defense systems, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a space-based infrared sensor imaging model of a ballistic target treated as a point source against the planet's atmosphere; it then simulates the infrared imaging of the ballistic target from two aspects, the space-based sensor camera parameters and the target characteristics, and analyzes the imaging effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target.

  17. Nonlinear Landau damping and formation of Bernstein-Greene-Kruskal structures for plasmas with q-nonextensive velocity distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghunathan, M.; Ganesh, R.

    2013-03-15

    In the past, long-time evolution of an initial perturbation in collisionless Maxwellian plasma (q = 1) has been simulated numerically. The controversy over the nonlinear fate of such electrostatic perturbations was resolved by Manfredi [Phys. Rev. Lett. 79, 2815-2818 (1997)] using long-time simulations up to t = 1600 ω_p⁻¹. The oscillations were found to continue indefinitely, leading to Bernstein-Greene-Kruskal (BGK)-like phase-space vortices (from here on referred to as 'BGK structures'). Using a newly developed, high resolution 1D Vlasov-Poisson solver based on a piecewise-parabolic method (PPM) advection scheme, we investigate the nonlinear Landau damping in 1D plasma described by toy q-distributions for long times, up to t = 3000 ω_p⁻¹. We show that BGK structures are found only for a certain range of q-values around q = 1. Beyond this window, for the generic parameters, no BGK structures were observed. We observe that for values of q < 1, where velocity distributions have long tails, strong Landau damping inhibits the formation of BGK structures. On the other hand, for q > 1, where the distribution has a sharp fall in velocity, the formation of BGK structures is rendered difficult due to the high wave number damping imposed by the steep velocity profile, which had not been previously reported. Wherever relevant, we compare our results with past work.
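
    For readers unfamiliar with the q-distributions referred to above, the sketch below evaluates a toy q-nonextensive velocity distribution using a sign convention consistent with this abstract (q < 1 gives long power-law tails, q > 1 gives a sharp cutoff, and q → 1 recovers a Maxwellian). The parametrization and the omitted normalization constant are assumptions for illustration and may differ from the paper's.

      # Toy q-nonextensive velocity distribution, unnormalized, using a convention
      # consistent with the abstract (q < 1: power-law tails; q > 1: hard cutoff;
      # q -> 1 recovers a Maxwellian). The cited paper's exact parametrization
      # and normalization may differ.
      import numpy as np

      def f_q(v, q, v_t=1.0):
          """Unnormalized q-distribution of velocity v with thermal speed v_t."""
          if abs(q - 1.0) < 1e-12:
              return np.exp(-v**2 / (2 * v_t**2))            # Maxwellian limit
          arg = 1.0 - (q - 1.0) * v**2 / (2 * v_t**2)
          return np.where(arg > 0,
                          np.power(np.clip(arg, 0, None), 1.0 / (q - 1.0)),
                          0.0)

      # Compare the tail value at v = 3 v_t for three q values
      for q in (0.7, 1.0, 1.3):
          print(q, f_q(np.array([3.0]), q))  # largest for q < 1, zero beyond the q > 1 cutoff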

  18. Development of a Space Radiation Monte Carlo Computer Simulation

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence S.

    1997-01-01

    The ultimate purpose of this effort is to undertake the development of a computer simulation of the radiation environment encountered in spacecraft which is based upon the Monte Carlo technique. The current plan is to adapt and modify a Monte Carlo calculation code known as FLUKA, which is presently used in high energy and heavy ion physics, to simulate the radiation environment present in spacecraft during missions. The initial effort would be directed towards modeling the MIR and Space Shuttle environments, but the long range goal is to develop a program for the accurate prediction of the radiation environment likely to be encountered on future planned endeavors such as the Space Station, a Lunar Return Mission, or a Mars Mission. The longer the mission, especially those which will not have the shielding protection of the earth's magnetic field, the more critical the radiation threat will be. The ultimate goal of this research is to produce a code that will be useful to mission planners and engineers who need to have detailed projections of radiation exposures at specified locations within the spacecraft and for either specific times during the mission or integrated over the entire mission. In concert with the development of the simulation, it is desired to integrate it with a state-of-the-art interactive 3-D graphics-capable analysis package known as ROOT, to allow easy investigation and visualization of the results. The efforts reported on here include the initial development of the program and the demonstration of the efficacy of the technique through a model simulation of the MIR environment. This information was used to write a proposal to obtain follow-on permanent funding for this project.

  19. Laboratory simulation of space plasma phenomena*

    NASA Astrophysics Data System (ADS)

    Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.

    2017-12-01

    Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated in detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to investigations such as plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and diagnostic development. *This work supported by the NRL Base Program.

  20. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of mainly distribution voltage level and load composition. The analysis is expedited by use of a Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, as is done here, can constitute part of a planning tool for a space power distribution system.

  1. Interstitial duplication 8q22-q24: Report of a case proven by FISH with mapped cosmid probes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wakui, Keiko; Ohashi, Hirofumi; Yamagishi, Akira

    We report on a 6-month-old malformed female infant with a de novo interstitial duplication of an 8q22-q24 segment. She had an excess dark-band on the 8q distal region by GTG-banded chromosome analysis, which was likely to be 8q23. We performed FISH analysis using cosmid probes mapped to 8q23 and proved that the patient had an 8q duplication including the 8q23 region. 20 refs., 2 figs., 1 tab.

  2. Simulation of medical Q-switch flash-pumped Er:YAG laser

    NASA Astrophysics Data System (ADS)

    Wang, Yan-lin; Huang, Chuyun; Yao, Yucheng; Zou, Xiaolin

    2011-01-01

    The Er:YAG laser, with a wavelength of 2940 nm, is strongly absorbed by water; the absorption coefficient is as high as 13,000 cm⁻¹. Because of this strong water absorption, the erbium laser produces a shallow penetration depth and little injury to surrounding tissue in most soft and hard tissues. At the same time, the interaction between 2940 nm radiation and water-saturated biological tissue is equivalent to instantaneous heating within a limited volume, resulting in micro-explosions that remove tissue. Different parameters can be set to cut enamel, dentin, caries, and soft tissue. For the development and optimization of laser systems, laser modeling is a practical way to predict the influence of various parameters on laser performance. Given the currently low output power of erbium lasers, the performance of a flash-pumped Er:YAG laser was simulated to obtain its optical output in theory. A rate-equation model was developed and used to predict the change of the population densities in the various manifolds, and Q-switched laser output was simulated for different design parameters; the results showed that the Er:YAG laser can achieve a maximum average output power of 9.8 W under the given parameters. The model can be used to find potential laser systems that meet application requirements.
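
    As a rough illustration of the rate-equation/Q-switch approach described above, the sketch below integrates generic two-level Q-switched laser rate equations (photon number and population inversion) during the pulse buildup. It is not the paper's multi-manifold Er:YAG model, and every parameter value is a placeholder.

      # Minimal Q-switched pulse buildup using generic two-level rate equations,
      # NOT the multi-manifold Er:YAG model of the paper. Pump and spontaneous
      # decay are neglected during the short pulse; all values are placeholders.
      import numpy as np
      from scipy.integrate import solve_ivp

      sigma = 3e-20      # stimulated-emission cross section, cm^2 (placeholder)
      c     = 3e10       # speed of light, cm/s
      V     = 0.1        # mode volume, cm^3 (placeholder)
      tau_c = 1e-8       # cavity photon lifetime, s (placeholder)
      gamma = 1.0        # inversion reduction factor
      B     = sigma * c / V

      def rates(t, y):
          phi, n = y                        # photon number, inversion population
          dphi = phi * (B * n - 1.0 / tau_c)
          dn   = -gamma * B * n * phi
          return [dphi, dn]

      n0  = 5e18                            # initial inversion after pumping (placeholder)
      y0  = [1.0, n0]                       # one seed photon starts the pulse
      sol = solve_ivp(rates, (0, 2e-7), y0, max_step=1e-10)

      phi = sol.y[0]
      print(f"peak photon number ~ {phi.max():.3e} at t ~ {sol.t[phi.argmax()]*1e9:.1f} ns")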

  3. The structure of liquid water up to 360 MPa from x-ray diffraction measurements using a high Q-range and from molecular simulation

    DOE PAGES

    Skinner, L. B.; Galib, M.; Fulton, J. L.; ...

    2016-04-04

    In this study, x-ray diffraction measurements of liquid water are reported at pressures up to 360 MPa, corresponding to a density of 0.0373 molecules per Å³. The measurements were conducted at a spatial resolution corresponding to Q_max = 16 Å⁻¹. The method of data analysis and measurement in this study follows the earlier benchmark results reported for water under ambient conditions, having a density of 0.0333 molecules per Å³ and Q_max = 20 Å⁻¹ [J. Chem. Phys. 138, 074506 (2013)], and at 70°C, having a density of 0.0327 molecules per Å³ and Q_max = 20 Å⁻¹ [J. Chem. Phys. 141, 214507 (2014)]. The structure of water is very different at these three different T and P state points and thus they provide the basis for evaluating the fidelity of molecular simulation. Measurements show that at 360 MPa, the 4 waters residing in the region between 2.3 and 3 Å are nearly unchanged: the peak position, shape, and coordination number are nearly identical to their values under ambient conditions. However, in the region above 3 Å, large structural changes occur with the collapse of the well-defined 2nd shell and shifting of higher shells to shorter distances. The measured structure is compared to simulated structure using intermolecular potentials described by both first-principles methods (revPBE-D3) and classical potentials (TIP4P/2005, MB-pol, and mW). The DFT-based revPBE-D3 method and the many-body empirical potential model MB-pol provide the best overall representation of the ambient, high-temperature, and high-pressure data. Finally, the revPBE-D3, MB-pol, and TIP4P/2005 models capture the densification mechanism, whereby the non-bonded 5th nearest neighbor molecule, which partially encroaches the 1st shell at ambient pressure, is pushed further into the local tetrahedral arrangement at higher pressures by the more distant molecules filling the void space in the network between the 1st and 2nd shells.

  4. SU(p,q) coherent states and a Gaussian de Finetti theorem

    NASA Astrophysics Data System (ADS)

    Leverrier, Anthony

    2018-04-01

    We prove a generalization of the quantum de Finetti theorem when the local space is an infinite-dimensional Fock space. In particular, instead of considering the action of the permutation group on n copies of that space, we consider the action of the unitary group U(n) on the creation operators of the n modes and define a natural generalization of the symmetric subspace as the space of states invariant under unitaries in U(n). Our first result is a complete characterization of this subspace, which turns out to be spanned by a family of generalized coherent states related to the special unitary group SU(p, q) of signature (p, q). More precisely, this construction yields a unitary representation of the noncompact simple real Lie group SU(p, q). We therefore find a dual unitary representation of the pair of groups U(n) and SU(p, q) on an n(p + q)-mode Fock space. The (Gaussian) SU(p, q) coherent states resolve the identity on the symmetric subspace, which implies a Gaussian de Finetti theorem stating that tracing over a few modes of a unitary-invariant state yields a state close to a mixture of Gaussian states. As an application of this de Finetti theorem, we show that the n × n upper-left submatrix of an m × m Haar-invariant unitary matrix is close in total variation distance to a matrix of independent normal variables if n³ = O(m).

  5. Modelling and simulation of Space Station Freedom berthing dynamics and control

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Garrison, James L., Jr.; Montgomery, Raymond C.; Wu, Shih-Chin; Stockwell, Alan E.; Demeo, Martha E.

    1994-01-01

    A large-angle, flexible, multibody, dynamic modeling capability has been developed to help validate numerical simulations of the dynamic motion and control forces which occur during berthing of Space Station Freedom to the Shuttle Orbiter in the early assembly flights. This paper outlines the dynamics and control of the station, the attached Shuttle Remote Manipulator System, and the orbiter. The simulation tool developed for the analysis is described and the results of two simulations are presented. The first is a simulated maneuver from a gravity-gradient attitude to a torque equilibrium attitude using the station reaction control jets. The second simulation is the berthing of the station to the orbiter with the station control moment gyros actively maintaining an estimated torque equilibrium attitude. The influence of the elastic dynamic behavior of the station and of the Remote Manipulator System on the attitude control of the station/orbiter system during each maneuver was investigated. The flexibility of the station and the arm were found to have only a minor influence on the attitude control of the system during the maneuvers.

  6. A rare de novo interstitial duplication of 15q15.3q21.2 in a boy with severe short stature, hypogonadism, global developmental delay and intellectual disability.

    PubMed

    Yuan, Haiming; Meng, Zhe; Zhang, Lina; Luo, Xiangyang; Liu, Liping; Chen, Mengfan; Li, Xinwei; Zhao, Weiwei; Liang, Liyang

    2016-01-01

    Interstitial duplications distal to 15q13 are very rare. Here, we report a 14-year-old boy with severe short stature, delayed bone age, hypogonadism, global developmental delay and intellectual disability. He had distinctive facial features including macrocephaly, broad forehead, deep-set and widely spaced eyes, broad nose bridge, shallow philtrum and thick lips. A de novo 6.4 Mb interstitial duplication of 15q15.3q21.2 was detected by chromosomal microarray analysis. We compared our patient's clinical phenotypes with those of several individuals with overlapping duplications, and several candidate genes responsible for the phenotypes were identified as well. The results suggest a novel contiguous gene duplication syndrome characterized by shared features including short stature, hypogonadism, global developmental delay and other congenital anomalies.

  7. Issues in visual support to real-time space system simulation solved in the Systems Engineering Simulator

    NASA Technical Reports Server (NTRS)

    Yuen, Vincent K.

    1989-01-01

    The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible, windows and CCTVs, with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.

  8. Triply heavy QQQ̄q̄ tetraquark states

    NASA Astrophysics Data System (ADS)

    Jiang, Jin-Feng; Chen, Wei; Zhu, Shi-Lin

    2017-11-01

    Within the framework of QCD sum rules, we have investigated the tetraquark states with three heavy quarks. We systematically construct the interpolating currents for the possible ccc̄q̄, ccb̄q̄, bcb̄q̄, bbb̄q̄ tetraquark states with quantum numbers J^P = 0^+ and J^P = 1^+. Using these interpolating currents, we have calculated the two-point correlation functions and extracted the mass spectra for the above tetraquark states. We also discuss the decay patterns of these tetraquarks, and notice that the ccc̄q̄, ccb̄q̄, bcb̄q̄ states may decay quickly with a narrow width due to their mass spectra. The bbb̄q̄ tetraquarks are expected to be very narrow resonances since their OZI (Okubo-Zweig-Iizuka)-allowed decay modes are kinematically forbidden. These states may be searched for in the final states with a B meson plus a light meson or photon.

  9. Investigation of 15q11-q13, 16p11.2 and 22q13 CNVs in Autism Spectrum Disorder Brazilian Individuals with and without Epilepsy

    PubMed Central

    Moreira, Danielle P.; Griesi-Oliveira, Karina; Bossolani-Martins, Ana L.; Lourenço, Naila C. V.; Takahashi, Vanessa N. O.; da Rocha, Kátia M.; Moreira, Eloisa S.; Vadasz, Estevão; Meira, Joanna Goes Castro; Bertola, Debora; Halloran, Eoghan O’; Magalhães, Tiago R.; Fett-Conte, Agnes C.; Passos-Bueno, Maria Rita

    2014-01-01

    Copy number variations (CNVs) are an important cause of ASD and those located at 15q11-q13, 16p11.2 and 22q13 have been reported as the most frequent. These CNVs exhibit variable clinical expressivity and those at 15q11-q13 and 16p11.2 also show incomplete penetrance. In the present work, through multiplex ligation-dependent probe amplification (MLPA) analysis of 531 ethnically admixed ASD-affected Brazilian individuals, we found that the combined prevalence of the 15q11-q13, 16p11.2 and 22q13 CNVs is 2.1% (11/531). Parental origin could be determined in 8 of the affected individuals, and revealed that 4 of the CNVs represent de novo events. Based on CNV prediction analysis from genome-wide SNP arrays, the size of those CNVs ranged from 206 kb to 2.27 Mb and those at 15q11-q13 were limited to the 15q13.3 region. In addition, this analysis also revealed 6 additional CNVs in 5 out of 11 affected individuals. Finally, we observed that the combined prevalence of CNVs at 15q13.3 and 22q13 in ASD-affected individuals with epilepsy (6.4%) was higher than that in ASD-affected individuals without epilepsy (1.3%; p<0.014). Therefore, our data show that the prevalence of CNVs at 15q13.3, 16p11.2 and 22q13 in Brazilian ASD-affected individuals is comparable to that estimated for ASD-affected individuals of pure or predominant European ancestry. Also, it suggests that the likelihood of a greater number of positive MLPA results might be found for the 15q13.3 and 22q13 regions by prioritizing ASD-affected individuals with epilepsy. PMID:25255310

  10. Investigation of 15q11-q13, 16p11.2 and 22q13 CNVs in autism spectrum disorder Brazilian individuals with and without epilepsy.

    PubMed

    Moreira, Danielle P; Griesi-Oliveira, Karina; Bossolani-Martins, Ana L; Lourenço, Naila C V; Takahashi, Vanessa N O; da Rocha, Kátia M; Moreira, Eloisa S; Vadasz, Estevão; Meira, Joanna Goes Castro; Bertola, Debora; O'Halloran, Eoghan; Magalhães, Tiago R; Fett-Conte, Agnes C; Passos-Bueno, Maria Rita

    2014-01-01

    Copy number variations (CNVs) are an important cause of ASD and those located at 15q11-q13, 16p11.2 and 22q13 have been reported as the most frequent. These CNVs exhibit variable clinical expressivity and those at 15q11-q13 and 16p11.2 also show incomplete penetrance. In the present work, through multiplex ligation-dependent probe amplification (MLPA) analysis of 531 ethnically admixed ASD-affected Brazilian individuals, we found that the combined prevalence of the 15q11-q13, 16p11.2 and 22q13 CNVs is 2.1% (11/531). Parental origin could be determined in 8 of the affected individuals, and revealed that 4 of the CNVs represent de novo events. Based on CNV prediction analysis from genome-wide SNP arrays, the size of those CNVs ranged from 206 kb to 2.27 Mb and those at 15q11-q13 were limited to the 15q13.3 region. In addition, this analysis also revealed 6 additional CNVs in 5 out of 11 affected individuals. Finally, we observed that the combined prevalence of CNVs at 15q13.3 and 22q13 in ASD-affected individuals with epilepsy (6.4%) was higher than that in ASD-affected individuals without epilepsy (1.3%; p<0.014). Therefore, our data show that the prevalence of CNVs at 15q13.3, 16p11.2 and 22q13 in Brazilian ASD-affected individuals is comparable to that estimated for ASD-affected individuals of pure or predominant European ancestry. Also, it suggests that the likelihood of a greater number of positive MLPA results might be found for the 15q13.3 and 22q13 regions by prioritizing ASD-affected individuals with epilepsy.

  11. UCLA IGPP Space Plasma Simulation Group

    NASA Technical Reports Server (NTRS)

    1998-01-01

    During the past 10 years the UCLA IGPP Space Plasma Simulation Group has pursued its theoretical effort to develop a Mission Oriented Theory (MOT) for the International Solar Terrestrial Physics (ISTP) program. This effort has been based on a combination of approaches: analytical theory, large scale kinetic (LSK) calculations, global magnetohydrodynamic (MHD) simulations and self-consistent plasma kinetic (SCK) simulations. These models have been used to formulate a global interpretation of local measurements made by the ISTP spacecraft. The regions of applications of the MOT cover most of the magnetosphere: the solar wind, the low- and high-latitude magnetospheric boundary, the near-Earth and distant magnetotail, and the auroral region. Most recent investigations include: plasma processes in the electron foreshock, response of the magnetospheric cusp, particle entry in the magnetosphere, sources of observed distribution functions in the magnetotail, transport of oxygen ions, self-consistent evolution of the magnetotail, substorm studies, effects of explosive reconnection, and auroral acceleration simulations.

  12. Quantum Bundle Description of Quantum Projective Spaces

    NASA Astrophysics Data System (ADS)

    Ó Buachalla, Réamonn

    2012-12-01

    We realise Heckenberger and Kolb's canonical calculus on quantum projective (N-1)-space C_q[CP^(N-1)] as the restriction of a distinguished quotient of the standard bicovariant calculus for the quantum special unitary group C_q[SU_N]. We introduce a calculus on the quantum sphere C_q[S^(2N-1)] in the same way. With respect to these choices of calculi, we present C_q[CP^(N-1)] as the base space of two different quantum principal bundles, one with total space C_q[SU_N], and the other with total space C_q[S^(2N-1)]. We go on to give C_q[CP^(N-1)] the structure of a quantum framed manifold. More specifically, we describe the module of one-forms of Heckenberger and Kolb's calculus as an associated vector bundle to the principal bundle with total space C_q[SU_N]. Finally, we construct strong connections for both bundles.

  13. A Q-Band Free-Space Characterization of Carbon Nanotube Composites

    PubMed Central

    Hassan, Ahmed M.; Garboczi, Edward J.

    2016-01-01

    We present a free-space measurement technique for non-destructive non-contact electrical and dielectric characterization of nano-carbon composites in the Q-band frequency range of 30 GHz to 50 GHz. The experimental system and error correction model accurately reconstruct the conductivity of composite materials that are either thicker than the wave penetration depth, and therefore exhibit negligible microwave transmission (less than −40 dB), or thinner than the wave penetration depth and, therefore, exhibit significant microwave transmission. This error correction model implements a fixed wave propagation distance between antennas and corrects the complex scattering parameters of the specimen from two references, an air slab having geometrical propagation length equal to that of the specimen under test, and a metallic conductor, such as an aluminum plate. Experimental results were validated by reconstructing the relative dielectric permittivity of known dielectric materials and then used to determine the conductivity of nano-carbon composite laminates. This error correction model can simplify routine characterization of thin conducting laminates to just one measurement of scattering parameters, making the method attractive for research, development, and for quality control in the manufacturing environment. PMID:28057959

  14. Simulation of Constrained Musculoskeletal Systems in Task Space.

    PubMed

    Stanev, Dimitar; Moustakas, Konstantinos

    2018-02-01

    This paper proposes an operational task space formalization of constrained musculoskeletal systems, motivated by its promising results in the field of robotics. The change of representation requires different algorithms for solving the inverse and forward dynamics simulation in the task space domain. We propose an extension to the direct marker control and an adaptation of the computed muscle control algorithms for solving the inverse kinematics and muscle redundancy problems, respectively. Experimental evaluation demonstrates that this framework is not only successful in dealing with the inverse dynamics problem, but also provides an intuitive way of studying and designing simulations, facilitating assessment prior to any experimental data collection. The incorporation of constraints in the derivation unveils an important extension of this framework toward addressing systems that use absolute coordinates and topologies that contain closed kinematic chains. Task space projection reveals a more intuitive encoding of the motion planning problem, allows for better correspondence between observed and estimated variables, provides the means to effectively study the role of kinematic redundancy, and most importantly, offers an abstract point of view and control, which can be advantageous toward further integration with high level models of the precommand level. Task-based approaches could be adopted in the design of simulation related to the study of constrained musculoskeletal systems.
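
    The operational (task) space quantities that this formalization builds on can be sketched with the classical robotics relations the abstract alludes to: the task-space inertia, the dynamically consistent Jacobian pseudoinverse, and the null-space projector. The block below uses random stand-in matrices and is not the paper's musculoskeletal implementation; all names and dimensions are hypothetical.

      # Sketch of the classical operational-space mapping used in robotics:
      # task-space inertia Lambda, dynamically consistent pseudoinverse J_bar,
      # and the torque realizing a desired task acceleration. Random stand-ins
      # replace a real musculoskeletal model; the velocity-product (Jdot*qd)
      # term is omitted for brevity.
      import numpy as np

      rng = np.random.default_rng(1)
      n_dof, n_task = 7, 3
      M = rng.normal(size=(n_dof, n_dof))
      M = M @ M.T + n_dof * np.eye(n_dof)             # symmetric positive-definite inertia
      J = rng.normal(size=(n_task, n_dof))            # task Jacobian
      h = rng.normal(size=n_dof)                      # Coriolis/gravity generalized forces

      M_inv  = np.linalg.inv(M)
      Lambda = np.linalg.inv(J @ M_inv @ J.T)         # task-space inertia
      J_bar  = M_inv @ J.T @ Lambda                   # dynamically consistent pseudoinverse
      N      = np.eye(n_dof) - J.T @ J_bar.T          # torque-level null-space projector

      xdd_des = np.array([0.1, -0.2, 0.05])           # desired task acceleration
      F   = Lambda @ xdd_des + J_bar.T @ h            # task-space force
      tau = J.T @ F + N @ np.zeros(n_dof)             # joint torques (+ optional posture torque)
      print(tau)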

  15. Q-learning residual analysis: application to the effectiveness of sequences of antipsychotic medications for patients with schizophrenia.

    PubMed

    Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas

    2016-06-15

    Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
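
    A minimal two-stage Q-learning pass with residual diagnostics, on simulated data with generic linear working models, can look like the sketch below. It illustrates the standard procedure the abstract builds on, not the modified residual-analysis method proposed in the paper; all variable names and coefficients are hypothetical.

      # Two-stage Q-learning on simulated data with residual checks at each stage.
      # Generic linear working models; not the paper's modified procedure.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n = 500
      X1 = rng.normal(size=(n, 1))                  # baseline covariate
      A1 = rng.choice([-1, 1], size=n)              # stage-1 treatment
      X2 = 0.5 * X1[:, 0] + rng.normal(size=n)      # time-varying covariate
      A2 = rng.choice([-1, 1], size=n)              # stage-2 treatment
      Y  = X1[:, 0] + A2 * (0.7 + 0.4 * X2) + 0.3 * A1 + rng.normal(size=n)  # final outcome

      # Stage 2: regress Y on (X1, X2, A2, A2*X2)
      D2 = np.column_stack([X1[:, 0], X2, A2, A2 * X2])
      q2 = LinearRegression().fit(D2, Y)
      res2 = Y - q2.predict(D2)                     # stage-2 residuals (plot vs fitted to check fit)

      # Pseudo-outcome: predicted value under the better of the two stage-2 treatments
      D2_opt = np.column_stack([X1[:, 0], X2, np.ones(n), X2])
      D2_alt = np.column_stack([X1[:, 0], X2, -np.ones(n), -X2])
      Y_tilde = np.maximum(q2.predict(D2_opt), q2.predict(D2_alt))

      # Stage 1: regress the pseudo-outcome on (X1, A1, A1*X1)
      D1 = np.column_stack([X1[:, 0], A1, A1 * X1[:, 0]])
      q1 = LinearRegression().fit(D1, Y_tilde)
      res1 = Y_tilde - q1.predict(D1)               # stage-1 residuals

      print("stage-2 residual SD:", res2.std(), " stage-1 residual SD:", res1.std())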

  16. Research and development at the Marshall Space Flight Center Neutral Buoyancy Simulator

    NASA Technical Reports Server (NTRS)

    Kulpa, Vygantas P.

    1987-01-01

    The Neutral Buoyancy Simulator (NBS), a facility designed to imitate zero-gravity conditions, was used to test the Experimental Assembly of Structures in Extravehicular Activity (EASE) and the Assembly Concept for Construction of Erectable Space Structures (ACCESS). Neutral Buoyancy Simulator applications and operations; early space structure research; development of the EASE/ACCESS experiments; and improvement of NBS simulation are summarized.

  17. Fiber estimation and tractography in diffusion MRI: Development of simulated brain images and comparison of multi-fiber analysis methods at clinical b-values

    PubMed Central

    Wilkins, Bryce; Lee, Namgyun; Gajawelli, Niharika; Law, Meng; Leporé, Natasha

    2015-01-01

    Advances in diffusion-weighted magnetic resonance imaging (DW-MRI) have led to many alternative diffusion sampling strategies and analysis methodologies. A common objective among methods is estimation of white matter fiber orientations within each voxel, as doing so permits in-vivo fiber-tracking and the ability to study brain connectivity and networks. Knowledge of how DW-MRI sampling schemes affect fiber estimation accuracy, and consequently tractography and the ability to recover complex white-matter pathways, as well as differences between results due to choice of analysis method and which method(s) perform optimally for specific data sets, all remain important problems, especially as tractography-based studies become common. In this work we begin to address these concerns by developing sets of simulated diffusion-weighted brain images which we then use to quantitatively evaluate the performance of six DW-MRI analysis methods in terms of estimated fiber orientation accuracy, false-positive (spurious) and false-negative (missing) fiber rates, and fiber-tracking. The analysis methods studied are: 1) a two-compartment “ball and stick” model (BSM) (Behrens et al., 2003); 2) a non-negativity constrained spherical deconvolution (CSD) approach (Tournier et al., 2007); 3) analytical q-ball imaging (QBI) (Descoteaux et al., 2007); 4) q-ball imaging with Funk-Radon and Cosine Transform (FRACT) (Haldar and Leahy, 2013); 5) q-ball imaging within constant solid angle (CSA) (Aganj et al., 2010); and 6) a generalized Fourier transform approach known as generalized q-sampling imaging (GQI) (Yeh et al., 2010). We investigate these methods using 20, 30, 40, 60, 90 and 120 evenly distributed q-space samples of a single shell, and focus on a signal-to-noise ratio (SNR = 18) and diffusion-weighting (b = 1000 s/mm2) common to clinical studies. We found the BSM and CSD methods consistently yielded the least fiber orientation error and simultaneously greatest detection rate of

  18. Fiber estimation and tractography in diffusion MRI: development of simulated brain images and comparison of multi-fiber analysis methods at clinical b-values.

    PubMed

    Wilkins, Bryce; Lee, Namgyun; Gajawelli, Niharika; Law, Meng; Leporé, Natasha

    2015-04-01

    Advances in diffusion-weighted magnetic resonance imaging (DW-MRI) have led to many alternative diffusion sampling strategies and analysis methodologies. A common objective among methods is estimation of white matter fiber orientations within each voxel, as doing so permits in-vivo fiber-tracking and the ability to study brain connectivity and networks. Knowledge of how DW-MRI sampling schemes affect fiber estimation accuracy, tractography and the ability to recover complex white-matter pathways, differences between results due to choice of analysis method, and which method(s) perform optimally for specific data sets, all remain important problems, especially as tractography-based studies become common. In this work, we begin to address these concerns by developing sets of simulated diffusion-weighted brain images which we then use to quantitatively evaluate the performance of six DW-MRI analysis methods in terms of estimated fiber orientation accuracy, false-positive (spurious) and false-negative (missing) fiber rates, and fiber-tracking. The analysis methods studied are: 1) a two-compartment "ball and stick" model (BSM) (Behrens et al., 2003); 2) a non-negativity constrained spherical deconvolution (CSD) approach (Tournier et al., 2007); 3) analytical q-ball imaging (QBI) (Descoteaux et al., 2007); 4) q-ball imaging with Funk-Radon and Cosine Transform (FRACT) (Haldar and Leahy, 2013); 5) q-ball imaging within constant solid angle (CSA) (Aganj et al., 2010); and 6) a generalized Fourier transform approach known as generalized q-sampling imaging (GQI) (Yeh et al., 2010). We investigate these methods using 20, 30, 40, 60, 90 and 120 evenly distributed q-space samples of a single shell, and focus on a signal-to-noise ratio (SNR = 18) and diffusion-weighting (b = 1000 s/mm²) common to clinical studies. We found that the BSM and CSD methods consistently yielded the least fiber orientation error and simultaneously greatest detection rate of fibers. Fiber detection
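
    The "evenly distributed q-space samples of a single shell" used above can be approximated in many ways; the sketch below uses a simple Fibonacci-spiral construction on the hemisphere (antipodal directions being equivalent in DW-MRI). Published schemes are typically electrostatic-repulsion optimized, so this is only a stand-in for generating N directions.

      # Roughly uniform single-shell sampling directions via a Fibonacci spiral
      # mapped to the upper hemisphere. Only an illustrative stand-in for the
      # optimized schemes normally used in DW-MRI protocols.
      import numpy as np

      def hemisphere_directions(n):
          """Return n unit vectors approximately evenly spread over a hemisphere."""
          golden = (1 + 5 ** 0.5) / 2
          i = np.arange(n)
          z = (i + 0.5) / n                        # z in (0, 1): upper hemisphere only
          phi = 2 * np.pi * i / golden             # golden-ratio azimuthal increments
          r = np.sqrt(1 - z**2)
          return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

      for n in (20, 30, 60):
          d = hemisphere_directions(n)
          # Smallest angle between any two directions (treating +/-d as identical)
          dots = np.abs(d @ d.T) - np.eye(n)       # zero out self-similarity on the diagonal
          print(n, "directions, closest pair separated by ~",
                round(float(np.degrees(np.arccos(dots.max()))), 1), "deg")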

  19. Ordered-subsets linkage analysis detects novel Alzheimer disease loci on chromosomes 2q34 and 15q22.

    PubMed

    Scott, William K; Hauser, Elizabeth R; Schmechel, Donald E; Welsh-Bohmer, Kathleen A; Small, Gary W; Roses, Allen D; Saunders, Ann M; Gilbert, John R; Vance, Jeffery M; Haines, Jonathan L; Pericak-Vance, Margaret A

    2003-11-01

    Alzheimer disease (AD) is a complex disorder characterized by a wide range, within and between families, of ages at onset of symptoms. Consideration of age at onset as a covariate in genetic-linkage studies may reduce genetic heterogeneity and increase statistical power. Ordered-subsets analysis includes continuous covariates in linkage analysis by rank ordering families by a covariate and summing LOD scores to find a subset giving a significantly increased LOD score relative to the overall sample. We have analyzed data from 336 markers in 437 multiplex (≥2 sampled individuals with AD) families included in a recent genomic screen for AD loci. To identify genetic heterogeneity by age at onset, families were ordered by increasing and decreasing mean and minimum ages at onset. Chromosomewide significance of increases in the LOD score in subsets relative to the overall sample was assessed by permutation. A statistically significant increase in the nonparametric multipoint LOD score was observed on chromosome 2q34, with a peak LOD score of 3.2 at D2S2944 (P=.008) in 31 families with a minimum age at onset between 50 and 60 years. The LOD score in the chromosome 9p region previously linked to AD increased to 4.6 at D9S741 (P=.01) in 334 families with minimum age at onset between 60 and 75 years. LOD scores were also significantly increased on chromosome 15q22: a peak LOD score of 2.8 (P=.0004) was detected at D15S1507 (60 cM) in 38 families with minimum age at onset ≥79 years, and a peak LOD score of 3.1 (P=.0006) was obtained at D15S153 (62 cM) in 43 families with mean age at onset >80 years. Thirty-one families were contained in both 15q22 subsets, indicating that these results are likely detecting the same locus. There is little overlap in these subsets, underscoring the utility of age at onset as a marker of genetic heterogeneity. These results indicate that linkage to chromosome 9p is strongest in late-onset AD and that regions on chromosome 2q34 and 15q22 are
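
    The ordered-subsets idea described above (rank families by a covariate, scan cumulative LOD sums, and assess the maximum by permutation) can be sketched as follows. The per-family LOD contributions and the covariate are simulated placeholders, not the study's data.

      # Toy ordered-subsets analysis: families are ranked by a covariate (mean age
      # at onset), per-family LOD scores are cumulatively summed along the ranking,
      # and the maximum subset LOD is compared against a permutation null.
      import numpy as np

      rng = np.random.default_rng(2)
      n_fam = 100
      onset = rng.uniform(50, 85, n_fam)                 # covariate: mean age at onset
      lod   = rng.normal(0.02, 0.15, n_fam)              # per-family LOD contributions (simulated)
      lod[onset < 60] += 0.12                            # planted signal in early-onset families

      def max_subset_lod(lod, order):
          """Largest cumulative LOD over nested subsets taken in the given order."""
          return float(np.cumsum(lod[order]).max())

      observed = max_subset_lod(lod, np.argsort(onset))  # families ordered by increasing onset

      # Permutation null: break the link between the covariate ranking and LOD scores
      null = np.array([max_subset_lod(lod, rng.permutation(n_fam)) for _ in range(2000)])
      p_value = float((null >= observed).mean())
      print(f"max subset LOD = {observed:.2f}, permutation p = {p_value:.3f}")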

  20. Quantitative chemical exchange saturation transfer (qCEST) MRI - omega plot analysis of RF-spillover-corrected inverse CEST ratio asymmetry for simultaneous determination of labile proton ratio and exchange rate.

    PubMed

    Wu, Renhua; Xiao, Gang; Zhou, Iris Yuwen; Ran, Chongzhao; Sun, Phillip Zhe

    2015-03-01

    Chemical exchange saturation transfer (CEST) MRI is sensitive to labile proton concentration and exchange rate, thus allowing measurement of dilute CEST agent and microenvironmental properties. However, CEST measurement depends not only on the CEST agent properties but also on the experimental conditions. Quantitative CEST (qCEST) analysis has been proposed to address the limitation of the commonly used simplistic CEST-weighted calculation. Recent research has shown that the concomitant direct RF saturation (spillover) effect can be corrected using an inverse CEST ratio calculation. We postulated that a simplified qCEST analysis is feasible with omega plot analysis of the inverse CEST asymmetry calculation. Specifically, simulations showed that the numerically derived labile proton ratio and exchange rate were in good agreement with input values. In addition, the qCEST analysis was confirmed experimentally in a phantom with concurrent variation in CEST agent concentration and pH. Also, we demonstrated that the derived labile proton ratio increased linearly with creatine concentration (P < 0.01) while the pH-dependent exchange rate followed a dominantly base-catalyzed exchange relationship (P < 0.01). In summary, our study verified that a simplified qCEST analysis can simultaneously determine labile proton ratio and exchange rate in a relatively complex in vitro CEST system. Copyright © 2015 John Wiley & Sons, Ltd.
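
    A rough omega-plot sketch is given below, assuming the inverse CEST metric is linear in 1/ω₁² with intercept a = 1/(f_r·k_sw·T1w) and slope b = k_sw/(f_r·T1w), so that k_sw ≈ √(b/a) and f_r ≈ 1/(T1w·√(a·b)). These expressions are a common textbook approximation used here only for illustration; the RF-spillover-corrected formulas of the paper differ in their details, and all numerical values are placeholders.

      # Omega-plot sketch: fit a line to an inverse CEST metric vs 1/omega1^2 and
      # recover labile proton ratio f_r and exchange rate k_sw from the intercept
      # and slope, under the simplified (uncorrected) linear form noted above.
      import numpy as np

      T1w  = 2.0                           # water T1, s (assumed known)
      k_sw = 500.0                         # "true" exchange rate, 1/s (demo ground truth)
      f_r  = 1.0e-3                        # "true" labile proton ratio (demo ground truth)

      B1 = np.array([0.5, 0.75, 1.0, 1.5, 2.0])            # saturation amplitudes, uT
      omega1 = 2 * np.pi * 42.58 * B1                      # rad/s (gamma/2pi = 42.58 Hz/uT)
      inv_cestr = 1/(f_r*k_sw*T1w) + (k_sw/(f_r*T1w)) / omega1**2   # synthetic measurements

      b, a = np.polyfit(1.0 / omega1**2, inv_cestr, 1)     # slope b, intercept a
      print("k_sw ~", np.sqrt(b / a), "1/s")
      print("f_r  ~", 1.0 / (T1w * np.sqrt(a * b)))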

  1. Numerical simulation on residual stress in Y-slit type cracking test of Q690E

    NASA Astrophysics Data System (ADS)

    Huang, Wenjian; Lin, Guozhen; Chen, Zhanglan; Chen, Wu

    2018-03-01

    Numerical simulation of the residual stress in the Y-slit type cracking test of Q690E steel is carried out using ANSYS. First, the dynamic distribution of the welding temperature field is calculated; second, the temperature-field results are converted into the corresponding stress by the indirect coupling method. The results show that the longitudinal residual stress in the weld is greater than the transverse residual stress, and that the peak transverse residual stress occurs at the weld groove.

  2. Conformational analysis of oligosaccharides and polysaccharides using molecular dynamics simulations.

    PubMed

    Frank, Martin

    2015-01-01

    Complex carbohydrates usually have a large number of rotatable bonds and consequently a large number of theoretically possible conformations can be generated (combinatorial explosion). The application of systematic search methods for conformational analysis of carbohydrates is therefore limited to disaccharides and trisaccharides in a routine analysis. An alternative approach is to use Monte-Carlo methods or (high-temperature) molecular dynamics (MD) simulations to explore the conformational space of complex carbohydrates. This chapter describes how to use MD simulation data to perform a conformational analysis (conformational maps, hydrogen bonds) of oligosaccharides and how to build realistic 3D structures of large polysaccharides using Conformational Analysis Tools (CAT).
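
    As an example of turning MD simulation data into a conformational map, the sketch below bins two glycosidic torsions into a 2D histogram and converts it to a free-energy-like surface. The torsion time series are placeholder arrays; in practice they would be extracted from the trajectory with a tool such as the Conformational Analysis Tools mentioned above.

      # Build a phi/psi conformational map from (placeholder) torsion time series
      # and locate the dominant conformer as the lowest free-energy bin.
      import numpy as np

      rng = np.random.default_rng(3)
      phi = np.concatenate([rng.normal(-70, 15, 8000), rng.normal(60, 10, 2000)])   # degrees
      psi = np.concatenate([rng.normal(120, 20, 8000), rng.normal(-100, 15, 2000)])

      H, xedges, yedges = np.histogram2d(phi, psi, bins=72, range=[[-180, 180], [-180, 180]])
      P = H / H.sum()
      with np.errstate(divide="ignore"):
          G = -0.008314 * 300 * np.log(P)          # kJ/mol at 300 K; empty bins become +inf
      G -= G[np.isfinite(G)].min()                 # shift so the global minimum is 0

      i, j = np.unravel_index(np.nanargmin(np.where(np.isfinite(G), G, np.nan)), G.shape)
      print("dominant conformer near phi ~", 0.5*(xedges[i]+xedges[i+1]),
            "psi ~", 0.5*(yedges[j]+yedges[j+1]), "degrees")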

  3. Space weathering of small Koronis family members

    NASA Astrophysics Data System (ADS)

    Thomas, Cristina A.; Rivkin, Andrew S.; Trilling, David E.; Enga, Marie-therese; Grier, Jennifer A.

    2011-03-01

    The space weathering process and its implications for the relationships between S- and Q-type asteroids and ordinary chondrite meteorites is an often debated topic in asteroid science. Q-type asteroids have been shown to display the best spectral match to ordinary chondrites (McFadden, L.A., Gaffey, M.J., McCord, T.B. [1985]. Science 229, 160-163). While the Q-types and ordinary chondrites share some spectral features with S-type asteroids, the S-types have significantly redder spectral slopes than the Q-types in visible and near-infrared wavelengths. This reddening of spectral slope is attributed to the effects of space weathering on the observed surface composition. The analysis by Binzel et al. (Binzel, R.P., Rivkin, A.S., Stuart, J.S., Harris, A.W., Bus, S.J., Burbine, T.H. [2004]. Icarus 170, 259-294) provided a missing link between the Q- and S-type bodies in near-Earth space by showing a reddening of spectral slope in objects from 0.1 to 5 km that corresponded to a transition from Q-type to S-type asteroid spectra, implying that size, and therefore surface age, is related to the relationship between S- and Q-types. The existence of Q-type asteroids in the main-belt was not confirmed until Mothé-Diniz and Nesvorny (Mothé-Diniz, T., Nesvorny, D. [2008]. Astron. Astrophys. 486, L9-L12) found them in young S-type clusters. The young age of these families suggest that the unweathered surface could date to the formation of the family. This leads to the question of whether older S-type main-belt families can contain Q-type objects and display evidence of a transition from Q- to S-type. To answer this question we have carried out a photometric survey of the Koronis family using the Kitt Peak 2.1 m telescope. This provides a unique opportunity to compare the effects of the space weathering process on potentially ordinary chondrite-like bodies within a population of identical initial conditions. We find a trend in spectral slope for objects 1-5 km that shows the

  4. Deletion of degQ gene enhances outer membrane vesicle production of Shewanella oneidensis cells.

    PubMed

    Ojima, Yoshihiro; Mohanadas, Thivagaran; Kitamura, Kosei; Nunogami, Shota; Yajima, Reiki; Taya, Masahito

    2017-04-01

    Shewanella oneidensis is a Gram-negative facultative anaerobe that can use a wide variety of terminal electron acceptors for anaerobic respiration. In this study, S. oneidensis degQ gene, encoding a putative periplasmic serine protease, was cloned and expressed. The activity of purified DegQ was inhibited by diisopropyl fluorophosphate, a typical serine protease-specific inhibitor, indicating that DegQ is a serine protease. In-frame deletion and subsequent complementation of the degQ were carried out to examine the effect of envelope stress on the production of outer membrane vesicles (OMVs). Analysis of periplasmic proteins from the resulting S. oneidensis strain showed that deletion of degQ induced protein accumulation and resulted in a significant decrease in protease activity within the periplasmic space. OMVs from the wild-type and mutant strains were purified and observed by transmission electron microscopy. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis of the OMVs showed a prominent band at ~37 kDa. Nanoliquid chromatography-tandem mass spectrometry analysis identified three outer membrane porins (SO3896, SO1821, and SO3545) as dominant components of the band, suggesting that these proteins could be used as indices for comparing OMV production by S. oneidensis strains. Quantitative evaluation showed that degQ-deficient cells had a fivefold increase in OMV production compared with wild-type cells. Thus, the increased OMV production following the deletion of DegQ in S. oneidensis may be responsible for the increase in envelope stress.

  5. Sixteenth Space Simulation Conference Confirming Spaceworthiness Into the Next Millennium

    NASA Technical Reports Server (NTRS)

    Stecher, Joseph L., III (Editor)

    1990-01-01

    The conference provided participants with a forum to acquire and exchange information on the state of the art in space simulation, test technology, thermal simulation and protection, contamination, and techniques of test measurements.

  6. Numerical simulation of airflow around the evaporator in the closed space

    NASA Astrophysics Data System (ADS)

    Puchor, Tomáš; Banovčan, Roman; Lenhard, Richard

    2018-06-01

    The article deals with a numerical simulation of the forced airflow around an evaporator with finned tubes in an electrotechnical box, using the finite volume method in ANSYS Workbench. The work contains an analysis of the impact of the forced airflow on the evaporator for various placements of the electrical components. The aim of the work is to find the most effective way of dissipating heat from the electrical components by forced convection in the closed space with the lowest pressure loss.

  7. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Concept document

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve both as a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will provide training for the payload experiments that will be onboard Space Station Freedom. The simulation will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies in order to develop candidate concepts and designs.

  8. Linkage analysis of the Fanconi anemia gene FACC with chromosome 9q markers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auerbach, A.D.; Shin, H.T.; Kaporis, A.G.

    1994-09-01

    Fanconi anemia (FA) is a genetically heterogeneous syndrome, with at least four different complementation groups as determined by cell fusion studies. The gene for complementation group C, FACC, has been cloned and mapped to chromosome 9q22.3 by in situ hybridization, while linkage analysis has supported the placement of another gene on chromosome 20q. We have analyzed five microsatellite markers and one RFLP on chromosome 9q in a panel of FA families from the International Fanconi Anemia Registry (IFAR) in order to place FACC on the genetic map. Polymorphisms were typed in 308 individuals from 51 families. FACC is tightly linked to both D9S151 [θ_max = 0.025, Z_max = 7.75] and to D9S196 [θ_max = 0.041, Z_max = 7.89]; multipoint analysis is in progress. We are currently screening a YAC clone that contains the entire FACC gene for additional microsatellite markers suitable for haplotype analysis of FA families.

  9. The Planetary and Space Simulation Facilities at DLR Cologne

    NASA Astrophysics Data System (ADS)

    Rabbow, Elke; Parpart, André; Reitz, Günther

    2016-06-01

    Astrobiology strives to increase our knowledge on the origin, evolution and distribution of life, on Earth and beyond. In the past centuries, life has been found on Earth in environments with extreme conditions that were expected to be uninhabitable. Scientific investigations of the underlying metabolic mechanisms and strategies that lead to the high adaptability of these extremophile organisms increase our understanding of evolution and distribution of life on Earth. Life as we know it depends on the availability of liquid water. Exposure of organisms to defined and complex extreme environmental conditions, in particular those that limit the water availability, allows the investigation of the survival mechanisms as well as an estimation of the possibility of the distribution to and survivability on other celestial bodies of selected organisms. Space missions in low Earth orbit (LEO) provide access for experiments to complex environmental conditions not available on Earth, but studies on the molecular and cellular mechanisms of adaption to these hostile conditions and on the limits of life cannot be performed exclusively in space experiments. Experimental space is limited and allows only the investigation of selected endpoints. An additional intensive ground based program is required, with easy to access facilities capable to simulate space and planetary environments, in particular with focus on temperature, pressure, atmospheric composition and short wavelength solar ultraviolet radiation (UV). DLR Cologne operates a number of Planetary and Space Simulation facilities (PSI) where microorganisms from extreme terrestrial environments or known for their high adaptability are exposed for mechanistic studies. Space or planetary parameters are simulated individually or in combination in temperature controlled vacuum facilities equipped with a variety of defined and calibrated irradiation sources. The PSI support basic research and were recurrently used for pre

  10. Performance analysis of MIMO wireless optical communication system with Q-ary PPM over correlated log-normal fading channel

    NASA Astrophysics Data System (ADS)

    Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua

    2018-06-01

    The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of un-coded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a single log-normal random variable. The analytical and simulation results confirm that increasing the correlation coefficients among sub-channels leads to system performance degradation. Moreover, receiver diversity is more effective at resisting the channel fading caused by spatial correlation.
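
    The abstract attributes the analysis to Wilkinson's method; the sketch below is a minimal moment-matching (Fenton-Wilkinson) implementation under the usual assumptions, approximating a sum of correlated log-normal sub-channel gains by a single log-normal. The 2x2 example parameters are hypothetical and not taken from the paper.

```python
import numpy as np

def fenton_wilkinson(mu, sigma, corr):
    """Approximate S = sum_i exp(Y_i), with Y ~ N(mu, Sigma) and correlation
    matrix `corr`, by a single log-normal exp(Z), Z ~ N(mu_z, sigma_z^2),
    by matching the first two moments of S (Wilkinson / Fenton approach)."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    corr = np.asarray(corr, float)

    # First moment: E[S] = sum_i exp(mu_i + sigma_i^2 / 2)
    m1 = np.sum(np.exp(mu + 0.5 * sigma ** 2))

    # Second moment: E[S^2] = sum_{i,j} E[exp(Y_i + Y_j)]
    mu_ij = mu[:, None] + mu[None, :]
    var_ij = sigma[:, None] ** 2 + sigma[None, :] ** 2 \
        + 2.0 * corr * np.outer(sigma, sigma)
    m2 = np.sum(np.exp(mu_ij + 0.5 * var_ij))

    # Match the two moments of the single log-normal
    sigma_z2 = np.log(m2) - 2.0 * np.log(m1)
    mu_z = 2.0 * np.log(m1) - 0.5 * np.log(m2)
    return mu_z, sigma_z2

# Hypothetical two-sub-channel example with correlation coefficient 0.3
mu_z, s2 = fenton_wilkinson([0.0, 0.0], [0.25, 0.25],
                            [[1.0, 0.3], [0.3, 1.0]])
print(mu_z, s2)
```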

  11. Test and Analysis Capabilities of the Space Environment Effects Team at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Finckenor, M. M.; Edwards, D. L.; Vaughn, J. A.; Schneider, T. A.; Hovater, M. A.; Hoppe, D. T.

    2002-01-01

    Marshall Space Flight Center has developed world-class space environmental effects testing facilities to simulate the space environment. The combined environmental effects test system exposes temperature-controlled samples to simultaneous protons, high- and low-energy electrons, vacuum ultraviolet (VUV) radiation, and near-ultraviolet (NUV) radiation. Separate chambers for studying the effects of NUV and VUV at elevated temperatures are also available. The Atomic Oxygen Beam Facility exposes samples to atomic oxygen of 5 eV energy to simulate low-Earth orbit (LEO). The LEO space plasma simulators are used to study current collection to biased spacecraft surfaces, arcing from insulators and electrical conductivity of materials. Plasma propulsion techniques are analyzed using the Marshall magnetic mirror system. The micro light gas gun simulates micrometeoroid and space debris impacts. Candidate materials and hardware for spacecraft can be evaluated for durability in the space environment with a variety of analytical techniques. Mass, solar absorptance, infrared emittance, transmission, reflectance, bidirectional reflectance distribution function, and surface morphology characterization can be performed. The data from the space environmental effects testing facilities, combined with analytical results from flight experiments, enable the Environmental Effects Group to determine optimum materials for use on spacecraft.

  12. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    NASA Astrophysics Data System (ADS)

    Wells, J. R.; Kim, J. B.

    2011-12-01

    Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from the published literature, and often a parameter value is estimated from a single published value. Further, the parameters are "tuned" using somewhat arbitrary, "trial-and-error" methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from the published literature and, where those were not available, using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
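
    The abstract describes simulated annealing over literature-derived parameter bounds; the sketch below is a generic, minimal annealing loop of that kind, assuming a box-bounded parameter vector and a user-supplied cost (for example, disagreement with the trial-and-error-calibrated vegetation map). It is not the BIOMAP code; the objective, cooling schedule and step sizes are placeholders.

```python
import math
import random

def simulated_annealing(objective, lower, upper, n_steps=5000,
                        t_start=1.0, t_end=1e-3, seed=0):
    """Generic simulated annealing over a box-bounded parameter vector.
    `objective` is a cost to be minimized; the bounds play the role of the
    literature-derived parameter ranges described in the abstract."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
    fx = objective(x)
    best_x, best_f = list(x), fx
    for step in range(n_steps):
        # Exponential cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / (n_steps - 1))
        # Perturb one randomly chosen parameter, keeping it within its bounds
        i = rng.randrange(len(x))
        cand = list(x)
        span = 0.1 * (upper[i] - lower[i])
        cand[i] = min(max(cand[i] + rng.gauss(0.0, span), lower[i]), upper[i])
        fc = objective(cand)
        # Metropolis acceptance: take improvements, sometimes accept worse moves
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
    return best_x, best_f

# Toy usage with a hypothetical 3-parameter cost surface
print(simulated_annealing(lambda p: sum((v - 0.3) ** 2 for v in p),
                          lower=[0, 0, 0], upper=[1, 1, 1]))
```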

  13. Q-operators for the open Heisenberg spin chain

    NASA Astrophysics Data System (ADS)

    Frassek, Rouven; Szécsényi, István M.

    2015-12-01

    We construct Q-operators for the open spin-1/2 XXX Heisenberg spin chain with diagonal boundary matrices. The Q-operators are defined as traces over an infinite-dimensional auxiliary space involving novel types of reflection operators derived from the boundary Yang-Baxter equation. We argue that the Q-operators defined in this way are polynomials in the spectral parameter and show that they commute with the transfer matrix. Finally, we prove that the Q-operators satisfy Baxter's TQ-equation and derive the explicit form of their eigenvalues in terms of the Bethe roots.
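
    For orientation, Baxter's TQ-equation referred to above has the schematic second-order difference form below; the precise dressing functions for the open chain with diagonal boundary matrices are derived in the paper and are not reproduced here.

```latex
% Schematic form of the TQ-relation referred to above: the transfer matrix
% T(u) and the Q-operator obey a second-order finite-difference equation
\[
  T(u)\,Q(u) = \Delta_{+}(u)\,Q(u+\eta) + \Delta_{-}(u)\,Q(u-\eta),
\]
% where \eta is the shift (crossing) parameter and the scalar dressing
% functions \Delta_{\pm}(u) depend on the quantum space and, for the open
% chain, on the diagonal boundary matrices; their explicit form is given
% in the paper and not reproduced here.
```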

  14. Fine mapping on chromosome 13q32-34 and brain expression analysis implicates MYO16 in schizophrenia.

    PubMed

    Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria

    2014-03-01

    We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32-34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32-34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case-control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case-control data sets of European descent highlighted a region across introns 2-6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia.

  15. Static critical behavior of the q-states Potts model: High-resolution entropic study

    NASA Astrophysics Data System (ADS)

    Caparica, A. A.; Leão, Salviano A.; DaSilva, Claudio J.

    2015-11-01

    Here we report a precise computer simulation study of the static critical properties of the two-dimensional q-state Potts model using very accurate data obtained from a modified Wang-Landau (WL) scheme proposed by Caparica and Cunha-Netto (2012). This algorithm is an extension of conventional WL sampling, but the authors changed the criterion for updating the density of states during the random walk and established a new procedure to wind up the simulation run. These few changes allow a more precise microcanonical averaging, which is essential for a reliable finite-size scaling analysis. In this work we used this new technique to determine the static critical exponents β, γ, and ν in an unambiguous fashion. The static critical exponents were determined as β = 0.10811(77), γ = 1.4459(31), and ν = 0.8197(17) for the q = 3 case, and β = 0.0877(37), γ = 1.3161(69), and ν = 0.7076(10) for the q = 4 Potts model. A comparison of the present results with conjectured values and with those obtained from other well-established approaches strengthens this new way of performing WL simulations.
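
    For context, the sketch below is a conventional (unmodified) Wang-Landau flat-histogram estimate of ln g(E) for the 2D q-state Potts model; it does not implement the modified update criterion and termination procedure of Caparica and Cunha-Netto that the paper relies on. Lattice size, flatness threshold and final modification factor are illustrative only.

```python
import math
import random

def wang_landau_potts(L=8, q=3, flatness=0.8, ln_f_final=1e-6, seed=1):
    """Conventional Wang-Landau estimate of ln g(E) for the 2D q-state Potts
    model, H = -sum_<ij> delta(s_i, s_j), on an L x L periodic lattice.
    Energies run from -2*L*L (ground state) up to 0."""
    rng = random.Random(seed)
    n_bonds = 2 * L * L
    spins = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]

    def local_energy(i, j):
        # Minus the number of satisfied bonds touching site (i, j)
        s, e = spins[i][j], 0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            e -= int(s == spins[(i + di) % L][(j + dj) % L])
        return e

    energy = sum(local_energy(i, j) for i in range(L) for j in range(L)) // 2
    ln_g = [0.0] * (n_bonds + 1)   # index = energy + 2*L*L
    hist = [0] * (n_bonds + 1)
    ln_f = 1.0
    while ln_f > ln_f_final:
        for _ in range(10000):
            i, j = rng.randrange(L), rng.randrange(L)
            old_spin, e_old = spins[i][j], local_energy(i, j)
            spins[i][j] = rng.randrange(q)
            trial = energy + (local_energy(i, j) - e_old)
            # Accept with probability min(1, g(E_old)/g(E_new))
            if math.log(rng.random()) < ln_g[energy + n_bonds] - ln_g[trial + n_bonds]:
                energy = trial
            else:
                spins[i][j] = old_spin
            ln_g[energy + n_bonds] += ln_f
            hist[energy + n_bonds] += 1
        visited = [h for h in hist if h > 0]
        if min(visited) > flatness * (sum(visited) / len(visited)):
            hist = [0] * (n_bonds + 1)
            ln_f *= 0.5        # conventional refinement f -> sqrt(f)
    return ln_g

ln_g = wang_landau_potts(L=4, q=3, ln_f_final=1e-3)  # small demo run
```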

  16. A design methodology for neutral buoyancy simulation of space operations

    NASA Technical Reports Server (NTRS)

    Akin, David L.

    1988-01-01

    Neutral buoyancy has often been used in the past for EVA development activities, but little has been done to provide an analytical understanding of the environment and its correlation with space. This paper covers a set of related research topics at the MIT Space Systems Laboratory, dealing with the modeling of the space and underwater environments, validation of the models through testing in neutral buoyancy, parabolic flight, and space flight experiments, and applications of the models to gain a better design methodology for creating meaningful neutral buoyancy simulations. Examples covered include simulation validation criteria for human body dynamics, and for applied torques in a beam rotation task, which is the pacing crew operation for EVA structural assembly. Extensions of the dynamics models are presented for powered vehicles in the underwater environment, and examples given from the MIT Space Telerobotics Research Program, including the Beam Assembly Teleoperator and the Multimode Proximity Operations Device. Future expansions of the modeling theory are also presented, leading to remote vehicles which behave in neutral buoyancy exactly as the modeled system would in space.

  17. Effect of q-nonextensive parameter and saturation time on electron density steepening in electron-positron-ion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hashemzadeh, M., E-mail: hashemzade@gmail.com

    2015-11-15

    The effect of the q-nonextensive parameter and saturation time on electron density steepening in electron-positron-ion plasmas is studied by the particle-in-cell method. Phase space diagrams show that the size of the holes and, consequently, the number of trapped particles depend strongly on the q-parameter and the saturation time. Furthermore, the mechanism of the instability and the exchange of energy between the electrons/positrons and the electric field is explained by the profiles of the energy density. Moreover, it is found that the q-parameter, saturation time, and electron and positron velocities affect the nonlinear evolution of the electron density, which leads to the steepening of its structure. The q-nonextensive parameter, or degree of nonextensivity, relates the temperature gradient to the potential energy of the system; the deviation of the q-parameter from unity therefore indicates the degree of temperature inhomogeneity or departure from equilibrium. Finally, using kinetic theory, a generalized q-dispersion relation is presented for electron-positron-ion plasma systems. The simulation results in the linear regime are found to be in good agreement with the growth rates obtained from the kinetic theory.
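
    As a reminder of what the q-parameter controls, one commonly used form of the q-nonextensive (Tsallis) velocity distribution for loading particles in PIC studies of this type is given below; the exact distribution and normalization used in the paper may differ.

```latex
% One commonly used q-nonextensive (Tsallis) velocity distribution for
% particle loading in PIC studies of this type (C_q is a normalization):
\[
  f_q(v) = C_q\left[1 - (q-1)\,\frac{m v^{2}}{2 k_B T}\right]^{\frac{1}{q-1}},
\]
% which reduces to the Maxwellian f(v) \propto \exp(-m v^{2}/2 k_B T) as
% q -> 1; with this convention the bracket vanishes at a finite cutoff
% velocity for q > 1, while q < 1 gives extended, power-law-like tails.
```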

  18. Space Flight Plasma Data Analysis

    NASA Technical Reports Server (NTRS)

    Wright, Kenneth H.; Minow, Joseph I.

    2009-01-01

    This slide presentation reviews a method to analyze the plasma data reported on board the International Space Station (ISS). The Floating Potential Measurement Unit (FPMU), whose role is to obtain floating potential and ionospheric plasma measurements for validating the ISS charging model, assessing photovoltaic array variability, and interpreting IRI predictions, is composed of four probes: the Floating Potential Probe (FPP), the Wide-sweep Langmuir Probe (WLP), the Narrow-sweep Langmuir Probe (NLP) and the Plasma Impedance Probe (PIP). This gives redundant measurements of each parameter. There are also many 'boxes' that the data must pass through before being captured by the ground station, which leads to telemetry noise. Methods of analysis for the various signals from the different probes are reviewed. There is also a brief discussion of Langmuir probe analysis of a low Earth orbit plasma simulation source.
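
    For the Langmuir probe sweeps mentioned above (WLP/NLP), the standard idealized relation used to extract the electron temperature is recalled below; the actual FPMU data reduction also has to handle telemetry noise and probe-specific corrections and may differ in detail.

```latex
% Standard electron-retardation relation for a Langmuir probe I-V sweep in
% an idealized Maxwellian plasma, below the plasma potential V_p:
\[
  I_e(V) = I_{e0}\,\exp\!\left(\frac{e\,(V - V_p)}{k_B T_e}\right),
  \qquad V < V_p
  \quad\Longrightarrow\quad
  \frac{d\,\ln I_e}{dV} = \frac{e}{k_B T_e},
\]
% so T_e follows from the slope of ln(I_e) versus V in the retardation
% region of the sweep.
```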

  19. A simulation facility for testing Space Station assembly procedures

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.; Wick, Daniel T.; Shehad, Nagy M.

    1994-01-01

    NASA plans to construct the Space Station Freedom (SSF) in one of the most hazardous environments known to mankind - space. It is of the utmost importance that the procedures to assemble and operate the SSF in orbit are both safe and effective. This paper describes a facility designed to test the integration of the telerobotic systems and to test assembly procedures using a real-world robotic arm grappling space hardware in a simulated microgravity environment.

  20. Galactic Cosmic Ray Simulator at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Slaba, Tony C.; Rusek, Adam

    2015-01-01

    The external Galactic Cosmic Ray (GCR) spectrum is significantly modified when it passes through spacecraft shielding and astronauts. One approach for simulating the GCR space radiation environment is to attempt to reproduce the unmodified, external GCR spectrum at a ground-based accelerator. A possibly better approach would use the modified, shielded tissue spectrum to select accelerator beams impinging on biological targets. NASA plans for implementation of a GCR simulator at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory will be discussed.

  1. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 6: Study issues report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload specialists and mission specialists to operate the wide variety of experiments that will be on board the Freedom Space Station. This SCS study issues report summarizes the analysis and study done as Task 1 (identify and analyze the SCS study issues) of the SCS study contract. This work was performed over the first three months of the SCS study, which began in August of 1988. First, issues were identified from all sources. These included the NASA SOW, the TRW proposal, and working groups, which focused the experience of NASA and the contractor team performing the study (TRW, Essex, and Grumman). The final list is organized into training-related issues and SCS-associated development issues. To begin the analysis of the issues, a list of all the functions for which the SCS could be used was created, i.e., when the computer is turned on, what will it be doing. Analysis was continued by creating an operational functions matrix of SCS users vs. SCS functions to ensure all the functions considered were valid, and to aid in identification of users as the analysis progressed. The functions will form the basis for the requirements, which are currently being developed under Task 3 of the SCS study.

  2. Exploring Space Physics Concepts Using Simulation Results

    NASA Astrophysics Data System (ADS)

    Gross, N. A.

    2008-05-01

    The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the National Science Foundation, has the goal of developing a suite of integrated, physics-based computer models of the space environment that can follow the evolution of a space weather event from the Sun to the Earth. In addition to its research goals, CISM is also committed to training the next generation of space weather professionals who are imbued with a system view of space weather. This view should include an understanding of both heliospheric and geospace phenomena. To this end, CISM offers a yearly Space Weather Summer School targeted at first-year graduate students, although advanced undergraduates and space weather professionals have also attended. This summer school uses a number of innovative pedagogical techniques, including devoting each afternoon to a computer lab exercise that uses results from research-quality simulations and visualization techniques, along with ground-based and satellite data, to explore concepts introduced during the morning lectures. These labs are suitable for use in a wide variety of educational settings, from formal classroom instruction to outreach programs. The goal of this poster is to outline the goals and content of the lab materials so that instructors may evaluate their potential use in the classroom or other settings.

  3. CFD Simulation of the Space Shuttle Launch Vehicle with Booster Separation Motor and Reaction Control Plumes

    NASA Technical Reports Server (NTRS)

    Gea, L. M.; Vicker, D.

    2006-01-01

    The primary objective of this paper is to demonstrate the capability of computational fluid dynamics (CFD) to simulate the very complicated flow field encountered during space shuttle ascent. The flow field features nozzle plumes from the booster separation motors (BSM) and reaction control system (RCS) jets with a supersonic incoming cross flow at a speed of Mach 4. The overset Navier-Stokes code OVERFLOW was used to simulate the flow field surrounding the entire space shuttle launch vehicle (SSLV) with high geometric fidelity. The variable-gamma option was chosen due to the high-temperature nature of the nozzle flows and the different plume species. CFD-predicted Mach contours are in good agreement with schlieren photos from the wind tunnel test. The flow fields are discussed in detail and the results are used to support the debris analysis for the space shuttle Return To Flight (RTF) task.

  4. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    NASA Astrophysics Data System (ADS)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios for JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight, as well as potential upgrades to JWST WFS&C, will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented-aperture telescopes. Beyond JWST, we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  5. Space Operations Analysis Using the Synergistic Engineering Environment

    NASA Technical Reports Server (NTRS)

    Angster, Scott; Brewer, Laura

    2002-01-01

    The Synergistic Engineering Environment (SEE) has been under development at the NASA Langley Research Center to aid in understanding the operations of spacecraft. This is accomplished through the integration of multiple data sets, analysis tools, spacecraft geometric models, and a visualization environment to create an interactive virtual simulation of the spacecraft. Initially designed to support the needs of the International Space Station, the SEE has broadened its scope to include spacecraft ranging from low Earth orbit to deep space missions. Analysis capabilities within the SEE include rigid body dynamics, kinematics, orbital mechanics, and payload operations. These provide the user the ability to perform real-time interactive engineering analyses in areas including flight attitudes and maneuvers, visiting vehicle docking scenarios, robotic operations, plume impingement, field-of-view obscuration, and alternative assembly configurations. The SEE has been used to aid in the understanding of several operational procedures related to the International Space Station. This paper addresses the capabilities of the first build of the SEE, presents several use cases, and discusses the next build of the SEE.

  6. Effects of incentives on psychosocial performances in simulated space-dwelling groups

    NASA Astrophysics Data System (ADS)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Gasior, Eric D.; Spence, Kevin R.; Emurian, Henry H.

    Prior research with individually isolated 3-person crews in a distributed, interactive, planetary exploration simulation examined the effects of communication constraints and crew configuration changes on crew performance and psychosocial self-report measures. The present report extends these findings to a model of performance maintenance that operationalizes conditions under which disruptive affective responses by crew participants might be anticipated to emerge. Experiments evaluated the effects of changes in incentive conditions on crew performance and self-report measures in simulated space-dwelling groups. Crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crew performance effectiveness was unaffected by either positive or negative incentive conditions, while self-report measures were differentially affected—negative incentive conditions produced pronounced increases in negative self-report ratings and decreases in positive self-report ratings, while positive incentive conditions produced increased positive self-report ratings only. Thus, incentive conditions associated with simulated spaceflight missions can significantly affect psychosocial adaptation without compromising task performance effectiveness in trained and experienced crews.

  7. A simulation system for Space Station extravehicular activity

    NASA Technical Reports Server (NTRS)

    Marmolejo, Jose A.; Shepherd, Chip

    1993-01-01

    America's next major step into space will be the construction of a permanently manned Space Station, which is currently under development and scheduled for full operation in the mid-1990s. Most of the construction of the Space Station will be performed over several flights by suited crew members during extravehicular activity (EVA) from the Space Shuttle. Once the station is fully operational, EVAs will be performed from the Space Station on a routine basis to provide, among other services, maintenance and repair of satellites currently in Earth orbit. Both voice recognition and helmet-mounted display technologies can improve the productivity of workers in space by potentially reducing the time, risk, and cost involved in performing EVA. NASA has recognized this potential and is currently developing a voice-controlled information system for Space Station EVA. Two bench-model helmet-mounted displays and an EVA simulation program have been developed to demonstrate the functionality and practicality of the system.

  8. A Simulation Base Investigation of High Latency Space Systems Operations

    NASA Technical Reports Server (NTRS)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    NASA's human space program has developed considerable experience with near-Earth space operations. Although NASA has experience with deep space robotic missions, it has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will involve durations measured in months and communication latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these mission durations and communication latencies for vehicle design, mission design, and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted at the 21st NASA Extreme Environment Mission Operations (NEEMO) mission between July 18th and Aug 3rd of 2016. The NEEMO

  9. Simulator evaluation of the final approach spacing tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.

    1990-01-01

    The design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course is described. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arrivals as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a 4-D trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST was implemented on a high performance workstation. It can be operated as a stand-alone in the Terminal Radar Approach Control (TRACON) Facility or as an element of a system integrated with automation tools in the Air Route Traffic Control Center (ARTCC). FAST was evaluated by experienced TRACON controllers in a real-time air traffic control simulation. Simulation results show that FAST significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  10. Q estimation of seismic data using the generalized S-transform

    NASA Astrophysics Data System (ADS)

    Hao, Yaju; Wen, Xiaotao; Zhang, Bo; He, Zhenhua; Zhang, Rui; Zhang, Jinming

    2016-12-01

    Quality factor, Q, is a parameter that characterizes the energy dissipation during seismic wave propagation. Reservoir pores are among the main factors that affect the value of Q; in particular, when the pore space is filled with oil or gas, the rock usually exhibits a relatively low Q value. Such a low Q value has been used as a direct hydrocarbon indicator by many researchers. The conventional Q estimation method based on spectral ratios suffers from the problem of waveform tuning; hence, many researchers have introduced time-frequency analysis techniques to tackle this problem. Unfortunately, the window functions adopted in time-frequency analysis algorithms such as the continuous wavelet transform (CWT) and the S-transform (ST) contaminate the amplitude spectra because the seismic signal is multiplied by the window functions during time-frequency decomposition. The basic assumption of the spectral ratio method is that there is a linear relationship between the natural logarithmic spectral ratio and frequency. However, this assumption does not hold if we take the influence of the window functions into consideration. In this paper, we first employ a recently developed two-parameter generalized S-transform (GST) to obtain the time-frequency spectra of seismic traces. We then deduce the non-linear relationship between the natural logarithmic spectral ratio and frequency. Finally, we obtain a linear relationship between the natural logarithmic spectral ratio and a newly defined parameter γ by ignoring the negligible second-order term. The gradient of this linear relationship is 1/Q. Here, the parameter γ is a function of frequency and the source wavelet. Numerical examples for VSP and post-stack reflection data confirm that our algorithm is capable of yielding accurate results. The Q values estimated from field data acquired in western China compare reasonably with oil-producing well locations.
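
    For reference, the conventional spectral-ratio relation that the method builds on (before the window-function correction introduced in the paper) is the constant-Q expression below.

```latex
% Conventional spectral-ratio relation (constant Q, frequency-independent
% attenuation), for amplitude spectra A(f) measured at two traveltimes
% t_1 < t_2 along the same raypath:
\[
  \ln\frac{A(f,t_2)}{A(f,t_1)} = -\,\frac{\pi\,(t_2 - t_1)}{Q}\,f + C,
\]
% so the slope of the log spectral ratio versus frequency f is
% -\pi (t_2 - t_1)/Q, from which Q follows. The paper's point is that
% windowing in CWT/ST/GST spectra breaks this linearity, motivating the
% replacement of f by the wavelet-dependent variable \gamma, for which the
% slope is 1/Q.
```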

  11. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Phillips, Robert; Dexter, Dan; Hasan, David; Crues, Edwin Z.

    2007-01-01

    This document describes the federate initialization process that was developed at the NASA Johnson Space Center with the H-II Transfer Vehicle (HTV) Flight Controller Trainer (HTV FCT) simulations and refined in the Distributed Space Exploration Simulation (DSES). These simulations use the High Level Architecture (HLA) IEEE 1516 standard to provide the communication and coordination between the distributed parts of the simulation. The purpose of the paper is to describe a generic initialization sequence that can be used to create a federate that can: (1) properly initialize all HLA objects, object instances, interactions, and time management; (2) check for the presence of all federates; (3) coordinate startup with other federates; and (4) robustly initialize and share initial object instance data with other federates.

  12. R248Q mutation--Beyond p53-DNA binding.

    PubMed

    Ng, Jeremy W K; Lama, Dilraj; Lukman, Suryani; Lane, David P; Verma, Chandra S; Sim, Adelene Y L

    2015-12-01

    R248 in the DNA binding domain (DBD) of p53 interacts directly with the minor groove of DNA. Earlier nuclear magnetic resonance (NMR) studies indicated that the R248Q mutation results in conformational changes in parts of the DBD far from the mutation site. However, how information propagates from the mutation site to the rest of the DBD is still not well understood. We performed a series of all-atom molecular dynamics (MD) simulations to dissect the steric and charge effects of R248 on p53-DBD conformation: (i) wild-type p53 DBD; (ii) p53 DBD with an electrically neutral arginine side-chain; (iii) p53 DBD with R248A; (iv) p53 DBD with R248W; and (v) p53 DBD with R248Q. Our results agree well with experimental observations of global conformational changes induced by the R248Q mutation. Our simulations suggest that both charge and sterics are important in the dynamics of the loop (L3) where the mutation resides. We show that helix 2 (H2) dynamics is altered as a result of a change in the hydrogen bonding partner of D281. In turn, neighboring L1 dynamics is altered: in the mutants, L1 predominantly adopts the recessed conformation and is unable to interact with the major groove of DNA. We focused our attention on the R248Q mutant, which is commonly found in a wide range of cancers, and observed changes at the zinc-binding pocket that might account for the dominant negative effects of R248Q. Furthermore, in our simulations, the S6/S7 turn was more frequently solvent-exposed in R248Q, suggesting a greater tendency of R248Q to partially unfold, possibly leading to an increased aggregation propensity. Finally, based on the observations made in our simulations, we propose strategies for the rescue of R248Q mutants. © 2015 Wiley Periodicals, Inc.

  13. Proximity Operations for Space Situational Awareness Spacecraft Rendezvous and Maneuvering using Numerical Simulations and Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Carrico, T.; Langster, T.; Carrico, J.; Alfano, S.; Loucks, M.; Vallado, D.

    The authors present several spacecraft rendezvous and close proximity maneuvering techniques modeled with a high-precision numerical integrator using full force models and closed-loop control with a fuzzy logic intelligent controller commanding the engines. The authors document and compare the maneuvers, fuel use, and other parameters. This paper presents an innovative application of an existing capability, already in use for operational satellites performing other maneuvers, to design, simulate and analyze proximity maneuvers. The system has been extended to demonstrate the capability to develop closed-loop control laws to maneuver one spacecraft in close proximity to another, including stand-off, docking, lunar landing and other operations applicable to space situational awareness, space-based surveillance, and operational satellite modeling. The fully integrated end-to-end trajectory ephemerides are available from the authors in electronic ASCII text by request. The benefits of this system include: a realistic physics-based simulation for the development and validation of control laws; a collaborative engineering environment for the design, development and tuning of spacecraft control law parameters, sizing of actuators (i.e., rocket engines), and sensor suite selection; an accurate simulation and visualization to communicate the complexity, criticality, and risk of spacecraft operations; a precise mathematical environment for research and development of future spacecraft maneuvering engineering tasks, operational planning and forensic analysis; and a closed-loop, knowledge-based control example for proximity operations. This proximity operations modeling and simulation environment will provide a valuable adjunct to programs in military space control, space situational awareness and civil space exploration engineering and decision-making processes.

  14. The limit distribution in the q-CLT for q ≥ 1 is unique and cannot have a compact support

    NASA Astrophysics Data System (ADS)

    Umarov, Sabir; Tsallis, Constantino

    2016-10-01

    In a paper by Umarov et al (2008 Milan J. Math. 76 307-28), a generalization of the Fourier transform, called the q-Fourier transform, was introduced and applied for the proof of a q-generalized central limit theorem (q-CLT). Subsequently, Hilhorst illustrated (2009 Braz. J. Phys. 39 371-9; 2010 J. Stat. Mech. P10023) that the q-Fourier transform for q > 1 is not invertible in the space of density functions. Indeed, using an invariance principle, he constructed a family of densities with the same q-Fourier transform and noted that ‘as a consequence, the q-CLT falls short of achieving its stated goal’. The distributions constructed there have compact support. We prove now that the limit distribution in the q-CLT is unique and cannot have a compact support. This result excludes all the possible counterexamples which can be constructed using the invariance principle and fills the gap mentioned by Hilhorst.
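
    For readers unfamiliar with the notation, the standard Tsallis q-exponential that underlies q-Gaussians is recalled below; the precise definition of the q-Fourier transform itself is given in the cited papers and is not reproduced here.

```latex
% Standard Tsallis q-exponential underlying q-Gaussians:
\[
  e_q^{x} = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
  \qquad e_1^{x} = e^{x},
\]
% where [\,\cdot\,]_{+} = \max(\cdot\,,0). A q-Gaussian is proportional to
% e_q^{-\beta x^{2}} and, for 1 < q < 3, has power-law tails and hence
% non-compact support, consistent with the uniqueness result stated above.
```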

  15. Feature-space-based FMRI analysis using the optimal linear transformation.

    PubMed

    Sun, Fengrong; Morris, Drew; Lee, Wayne; Taylor, Margot J; Mills, Travis; Babyn, Paul S

    2010-09-01

    The optimal linear transformation (OLT), an image analysis technique in feature space, was first presented in the field of MRI. This paper proposes a method of extending OLT from MRI to functional MRI (fMRI) to improve activation-detection performance over conventional approaches to fMRI analysis. In this method, first, ideal hemodynamic response time series for different stimuli were generated by convolving the theoretical hemodynamic response model with the stimulus timing. Second, hypothetical signature vectors for different activity patterns of interest were constructed from the ideal hemodynamic responses, and OLT was used to extract features of the fMRI data. The resultant feature space had particular geometric clustering properties. It was then classified into different groups, each pertaining to an activity pattern of interest; the applied signature vector for each group was obtained by averaging. Third, using the applied signature vectors, OLT was applied again to generate fMRI composite images with high SNRs for the desired activity patterns. Simulations and a blocked fMRI experiment were employed to verify the method and compare it with general linear model (GLM)-based analysis. The simulation studies and the experimental results indicated the superiority of the proposed method over GLM-based analysis in detecting brain activities.

  16. A Simulation Based Investigation of High Latency Space Systems Operations

    NASA Technical Reports Server (NTRS)

    Li, Zu Qun; Moore, Michael; Bielski, Paul; Crues, Edwin Z.

    2017-01-01

    This study was the first in a series of planned tests to use physics-based subsystem simulations to investigate the interactions between a spacecraft's crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation models the life support system of a deep space habitat. It contains models of an environmental control and life support system, an electrical power system, an active thermal control system, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; and 3) a real-time simulation test conductor interface used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted at the 21st NASA Extreme Environment Mission Operations (NEEMO) mission. The NEEMO crew and ground support team performed a number of relevant deep space mission scenarios that included both nominal activities and activities with system malfunctions. While this initial test sequence was focused on test infrastructure and procedures development, the data collected in the study already indicate that long communication delays have notable impacts on the operation of deep space systems. For future human missions beyond cis-lunar space, NASA will need to design systems and support tools to meet these challenges. These will be used to train the crew to handle critical malfunctions on their own, to predict malfunctions, and to assist with vehicle operations. Subsequent, more detailed and involved studies will be conducted to continue advancing NASA's understanding of space systems operations across long communication delays.

  17. Space tug geosynchronous mission simulation

    NASA Technical Reports Server (NTRS)

    Lang, T. J.

    1973-01-01

    Near-optimal three dimensional trajectories from a low earth park orbit inclined at 28.5 deg to a synchronous-equatorial mission orbit were developed for both the storable (thrust = 28,912 N (6,500 lbs), I_sp = 339 sec) and cryogenic (thrust = 44,480 N (10,000 lbs), I_sp = 470 sec) space tug using the iterative cost function minimization technique contained within the modularized vehicle simulation (MVS) program. The finite burn times, due to low thrust-to-weight ratios, and the associated gravity losses are accounted for in the trajectory simulation and optimization. The use of an ascent phasing orbit to achieve burnout in synchronous orbit at any longitude is investigated. The ascent phasing orbit is found to offer the additional advantage of significantly reducing the overall delta velocity by splitting the low altitude burn into two parts and thereby reducing gravity losses.

  18. A space debris simulation facility for spacecraft materials evaluation

    NASA Technical Reports Server (NTRS)

    Taylor, Roy A.

    1987-01-01

    A facility to simulate the effects of space debris striking an orbiting spacecraft is described. This facility was purchased in 1965 for use as a micrometeoroid simulation facility. Conversion to a Space Debris Simulation Facility began in July 1984 and it was placed in operation in February 1985. The facility consists of a light gas gun with a 12.7-mm launch tube capable of launching 2.5-12.7 mm projectiles with a mass of 4-300 mg at velocities of 2-8 km/sec, and three target tanks of 0.067 m³, 0.53 m³ and 28.5 m³. Projectile velocity measurements are accomplished via pulsed X-ray, laser diode detectors, and a Hall photographic station. This facility is being used to test developmental structural configurations and candidate materials for long-duration orbital spacecraft. A summary of test results is also presented.

  19. Cyber threat impact assessment and analysis for space vehicle architectures

    NASA Astrophysics Data System (ADS)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner that causes physical damage to the space platforms. Similar attacks on Earth-borne cyber-physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits, which have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.
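
    SimPy is named explicitly above; the fragment below is a minimal, hypothetical sketch of the kind of bus-level discrete-event model described, in which an attack injected at a chosen time inflates command latency. All names, rates and delay values are invented for illustration and are not taken from the paper's model.

```python
import random
import simpy

CMD_PERIOD = 1.0        # nominal seconds between bus commands (hypothetical)
NOMINAL_LATENCY = 0.05  # nominal bus transport delay (hypothetical)
ATTACK_START = 30.0     # attack vector injected at t = 30 s (hypothetical)
ATTACK_DELAY = 2.0      # mean extra latency imposed by the compromised node

def command_bus(env, log):
    """Periodic attitude commands crossing a (possibly compromised) bus."""
    seq = 0
    while True:
        yield env.timeout(CMD_PERIOD)
        latency = NOMINAL_LATENCY
        if env.now >= ATTACK_START:
            latency += random.expovariate(1.0 / ATTACK_DELAY)
        env.process(deliver(env, seq, latency, log))
        seq += 1

def deliver(env, seq, latency, log):
    # Record (sequence number, send time, end-to-end delay) on arrival
    sent = env.now
    yield env.timeout(latency)
    log.append((seq, sent, env.now - sent))

env = simpy.Environment()
log = []
env.process(command_bus(env, log))
env.run(until=60.0)

late = [d for _, t, d in log if t >= ATTACK_START]
print(f"mean latency after attack: {sum(late) / len(late):.2f} s "
      f"({len(log)} commands delivered)")
```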

  20. Sensor-scheduling simulation of disparate sensors for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Hobson, T.; Clarkson, I.

    2011-09-01

    The art and science of space situational awareness (SSA) has been practised and developed since the time of Sputnik. However, recent developments, such as the accelerating pace of satellite launches, the proliferation of launch-capable agencies, both commercial and sovereign, and recent well-publicised collisions involving man-made space objects, have further magnified the importance of timely and accurate SSA. The United States Strategic Command (USSTRATCOM) operates the Space Surveillance Network (SSN), a global network of sensors tasked with maintaining SSA. The rapidly increasing number of resident space objects will require commensurate improvements in the SSN. Sensors are scarce resources that must be scheduled judiciously to obtain measurements of maximum utility. Improvements in sensor scheduling and fusion can serve to reduce the number of additional sensors that may be required. Recently, Hill et al. [1] proposed and developed a simulation environment named TASMAN (Tasking Autonomous Sensors in a Multiple Application Network) to enable testing of alternative scheduling strategies within a simulated multi-sensor, multi-target environment. TASMAN simulates a high-fidelity, hardware-in-the-loop system by running multiple machines with different roles in parallel. At present, TASMAN is limited to simulations involving electro-optic sensors. Its high fidelity is at once a feature and a limitation, since supercomputing is required to run simulations of appreciable scale. In this paper, we describe an alternative, modular and scalable SSA simulation system that can extend the work of Hill et al. with reduced complexity, albeit also with reduced fidelity. The tool has been developed in MATLAB and can therefore be run on a very wide range of computing platforms. It can also make use of MATLAB's parallel processing capabilities to obtain considerable speed-up. The speed and flexibility so obtained can be used to quickly test scheduling algorithms even with a

  1. Linkage analysis of primary open-angle glaucoma excludes the juvenile glaucoma region on chromosome 1q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wirtz, M.K.; Acott, T.S.; Samples, J.R.

    1994-09-01

    The gene for one form of juvenile glaucoma has been mapped to chromosome 1q21-q31. This raises the possibility of primary open-angle glaucoma (POAG) also mapping to this region if the same defective gene causes both diseases. To address this question, linkage analysis was performed on a large POAG kindred. Blood samples or skin biopsies were obtained from 40 members of this family. Individuals were diagnosed as having POAG if they met two or more of the following criteria: (1) visual field defects compatible with glaucoma on automated perimetry; (2) optic nerve head and/or nerve fiber layer analysis compatible with glaucomatous damage; (3) high intraocular pressures (> 20 mm Hg). Patients were considered glaucoma suspects if they met only one criterion; these individuals were excluded from the analysis. Of the 40 members, seven were diagnosed with POAG and four were termed suspects. The earliest age of onset was 38 years, while the average age of onset was 65 years. We performed two-point and multipoint linkage analysis using five markers which encompass the region 1q21-q31: D1S194, D1S210, D1S212, D1S191 and LAMB2. Two-point LOD scores excluded tight linkage with all markers except D1S212 (maximum LOD score of 1.07 at θ = 0.0). In the multipoint analysis, including D1S210-D1S212-LAMB2 and POAG, the entire 11 cM region spanned by these markers was excluded for linkage with POAG; that is, LOD scores were < -2.0. In conclusion, POAG in this family does not map to chromosome 1q21-q31, and thus this family carries a gene that is distinct from the juvenile glaucoma gene.

  2. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., system simulations in aerospace, is very complicated and time-consuming due to the large parameter space and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test cases, automatically generated from models (e.g., UML, Simulink, Stateflow), improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
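
    As a rough illustration of combining n-factor combinatorial variation with Monte Carlo fill-in (here n = 2), the sketch below enumerates every pairwise combination of discrete parameter levels (not a minimal covering array) and adds random draws over continuous bounds. The parameter names and ranges are hypothetical and unrelated to the Trick-based tool itself.

```python
import itertools
import random

def two_factor_cases(levels):
    """Enumerate test cases covering every pairwise (2-factor) combination of
    parameter levels, leaving remaining parameters at their nominal values.
    `levels` maps parameter name -> list of levels (first entry = nominal)."""
    names = list(levels)
    nominal = {n: levels[n][0] for n in names}
    cases = []
    for a, b in itertools.combinations(names, 2):
        for va, vb in itertools.product(levels[a], levels[b]):
            case = dict(nominal)
            case[a], case[b] = va, vb
            cases.append(case)
    return cases

def monte_carlo_cases(bounds, n, seed=0):
    """Random draws over continuous parameter bounds to probe interactions
    the combinatorial set does not reach."""
    rng = random.Random(seed)
    return [{k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
            for _ in range(n)]

# Hypothetical GN&C-style parameters, purely for illustration
levels = {"mass_kg": [150.0, 120.0, 180.0],
          "thruster_bias_deg": [0.0, -0.5, 0.5],
          "sensor_noise": [1.0, 2.0]}
bounds = {"mass_kg": (120.0, 180.0),
          "thruster_bias_deg": (-0.5, 0.5),
          "sensor_noise": (1.0, 2.0)}
suite = two_factor_cases(levels) + monte_carlo_cases(bounds, n=50)
print(len(suite), "test cases generated")
```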

  3. Radiation effects control: Eyes, skin. [space environment simulation

    NASA Technical Reports Server (NTRS)

    Hightower, D.; Smathers, J. B.

    1974-01-01

    Adverse effects on the lens of the eye and the skin due to exposure to proton radiation during manned space flight were evaluated. Actual proton irradiation which might be encountered in space was simulated. Irradiation regimes included single acute exposures, daily fractionated exposures, and weekly fractionated exposures. Animals were exposed and then maintained and examined periodically until data sufficient to meet the objective were obtained. No significant skin effects were noted and no serious sight impairment was exhibited.

  4. Space environment simulation and sensor calibration facility

    NASA Astrophysics Data System (ADS)

    Engelhart, Daniel P.; Patton, James; Plis, Elena; Cooper, Russell; Hoffmann, Ryan; Ferguson, Dale; Hilmer, Robert V.; McGarity, John; Holeman, Ernest

    2018-02-01

    The Mumbo space environment simulation chamber discussed here comprises a set of tools to calibrate a variety of low flux, low energy electron and ion detectors used in satellite-mounted particle sensors. The chamber features electron and ion beam sources, a Lyman-alpha ultraviolet lamp, a gimbal table sensor mounting system, cryogenic sample mount and chamber shroud, and beam characterization hardware and software. The design of the electron and ion sources presented here offers a number of unique capabilities for space weather sensor calibration. Both sources create particle beams with narrow, well-characterized energetic and angular distributions with beam diameters that are larger than most space sensor apertures. The electron and ion sources can produce consistently low fluxes that are representative of quiescent space conditions. The particle beams are characterized by 2D beam mapping with several co-located pinhole aperture electron multipliers to capture relative variation in beam intensity and a large aperture Faraday cup to measure absolute current density.

  5. Space environment simulation and sensor calibration facility.

    PubMed

    Engelhart, Daniel P; Patton, James; Plis, Elena; Cooper, Russell; Hoffmann, Ryan; Ferguson, Dale; Hilmer, Robert V; McGarity, John; Holeman, Ernest

    2018-02-01

    The Mumbo space environment simulation chamber discussed here comprises a set of tools to calibrate a variety of low flux, low energy electron and ion detectors used in satellite-mounted particle sensors. The chamber features electron and ion beam sources, a Lyman-alpha ultraviolet lamp, a gimbal table sensor mounting system, cryogenic sample mount and chamber shroud, and beam characterization hardware and software. The design of the electron and ion sources presented here offers a number of unique capabilities for space weather sensor calibration. Both sources create particle beams with narrow, well-characterized energetic and angular distributions with beam diameters that are larger than most space sensor apertures. The electron and ion sources can produce consistently low fluxes that are representative of quiescent space conditions. The particle beams are characterized by 2D beam mapping with several co-located pinhole aperture electron multipliers to capture relative variation in beam intensity and a large aperture Faraday cup to measure absolute current density.

  6. Genetic Algorithm Optimizes Q-LAW Control Parameters

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; von Allmen, Paul; Petropoulos, Anastassios; Terrile, Richard

    2008-01-01

    A document discusses a multi-objective genetic algorithm designed to optimize Lyapunov feedback control law (Q-law) parameters in order to efficiently find Pareto-optimal solutions for low-thrust trajectories for electric propulsion systems. These would be propellant-optimal solutions for a given flight time, or flight-time-optimal solutions for a given propellant requirement. The approximate solutions are used as good initial solutions for high-fidelity optimization tools. When good initial solutions are used, the high-fidelity optimization tools quickly converge to a locally optimal solution near the initial solution. Q-law control parameters are represented as real-valued genes in the genetic algorithm. The performance of the Q-law control parameters is evaluated in the multi-objective space (flight time vs. propellant mass) and sorted by the non-dominated sorting method, which assigns a better fitness value to solutions that are dominated by fewer other solutions. With this ranking, the genetic algorithm encourages solutions with higher fitness values to participate in the reproduction process, improving the solutions over the course of evolution. The population of solutions converges to the Pareto front that is permitted within the Q-law control parameter space.
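
    The non-dominated sorting step described above can be sketched as below: a minimal Pareto-front peeling over (flight time, propellant mass) pairs, with both objectives minimized. Fitness assignment, crossover and the rest of the GA machinery are omitted; the example population is invented.

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better
    in at least one (flight time and propellant mass are both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_rank(points):
    """Rank 0 = Pareto front; higher ranks are dominated by earlier fronts.
    Simple O(n^2)-per-front peeling, adequate for small GA populations."""
    remaining = dict(enumerate(points))
    ranks = {}
    front = 0
    while remaining:
        current = [i for i in remaining
                   if not any(dominates(remaining[j], remaining[i])
                              for j in remaining if j != i)]
        for i in current:
            ranks[i] = front
            del remaining[i]
        front += 1
    return [ranks[i] for i in range(len(points))]

# Hypothetical (flight time [days], propellant mass [kg]) individuals
population = [(180, 410), (220, 300), (150, 520), (220, 320), (300, 290)]
print(non_dominated_rank(population))   # front 0 holds the trade-off curve
```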

  7. Simulation of realistic abnormal SPECT brain perfusion images: application in semi-quantitative analysis

    NASA Astrophysics Data System (ADS)

    Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.

    2005-11-01

    Simulation is useful in the validation of functional image analysis methods, particularly when considering the number of analysis techniques currently available that lack thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results, making it difficult to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF) and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to the system resolution rather than the slightly larger kernel used routinely. Significant correlation was found between the effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median rather than the region mean. The method and dataset provide an efficient methodology for the comparison and cross-validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
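
    A schematic of the insertion step described above (define an abnormality, blur it with the system PSF, remove it from the normal image) is sketched below, assuming a Gaussian stand-in for the measured PSF and skipping the stereotactic-to-subject transformation; all values are illustrative only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def insert_hypoperfusion(normal_img, mask, fraction, psf_sigma_vox):
    """Simulate a perfusion deficit: reduce counts by `fraction` inside the
    abnormality mask, blur the deficit with the system PSF (here a Gaussian
    stand-in for a measured PSF), and subtract it from the normal image."""
    deficit = fraction * normal_img * mask.astype(float)
    deficit = gaussian_filter(deficit, sigma=psf_sigma_vox)
    return normal_img - deficit

# Toy 3D example: 20% deficit in a small cubic region of a synthetic volume
rng = np.random.default_rng(0)
normal = gaussian_filter(rng.poisson(50.0, size=(64, 64, 64)).astype(float), 2.0)
mask = np.zeros_like(normal, dtype=bool)
mask[30:38, 25:35, 28:36] = True
abnormal = insert_hypoperfusion(normal, mask, fraction=0.2, psf_sigma_vox=1.5)
print(float((normal - abnormal)[mask].mean()))  # mean counts removed in the VOI
```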

  8. Cosmological Distortions in Redshift Space

    NASA Astrophysics Data System (ADS)

    Ryden, Barbara S.

    1995-05-01

    The long-sought value of q_0, the deceleration parameter, remains elusive. One method of finding q_0 is to measure the distortions of large scale structure in redshift space. If the Hubble constant changes with time, then the mapping between redshift space and real space is nonlinear, even in the absence of peculiar motions. When q_0 > -1, structures in redshift space will be distorted along the line of sight; the distortion is proportional to (1 + q_0) z in the limit that the redshift z is small. The cosmological distortions at z <= 0.2 can be found by measuring the shapes of voids in redshift surveys of galaxies (such as the upcoming Sloan Digital Sky Survey). The cosmological distortions are masked to some extent by the distortions caused by small-scale peculiar velocities; it is difficult to measure the shape of a void when the fingers of God are poking into it. The cosmological distortions at z ~ 1 can be found by measuring the correlation function of quasars as a function of redshift and of angle relative to the line of sight. Finding q_0 by measuring distortions in redshift space, like the classical methods of determining q_0, is simple and elegant in principle but complicated and messy in practice.
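
    The (1 + q_0)z scaling quoted above follows from the low-redshift expansion of the line-of-sight comoving distance, sketched below.

```latex
% Low-redshift expansion behind the quoted (1 + q_0)z scaling: the comoving
% distance along the line of sight is
\[
  D(z) = \frac{c}{H_0}\left[z - \tfrac{1}{2}(1+q_0)\,z^{2} + \mathcal{O}(z^{3})\right],
  \qquad
  \frac{dD}{dz} = \frac{c}{H_0}\left[1 - (1+q_0)\,z + \mathcal{O}(z^{2})\right],
\]
% so with the naive redshift-space coordinate s = cz/H_0, a line-of-sight
% interval appears stretched relative to its real-space extent by
% ds/dD \approx 1 + (1+q_0)z, the small-z distortion probed by the void-shape
% and quasar-correlation measurements described above.
```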

  9. An expert system for simulating electric loads aboard Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Kukich, George; Dolce, James L.

    1990-01-01

    Space Station Freedom will provide an infrastructure for space experimentation. This environment will feature regulated access to any resources required by an experiment. Automated systems are being developed to manage the electric power so that researchers can have the flexibility to modify their experiment plan for contingencies or for new opportunities. To define these flexible power management characteristics for Space Station Freedom, a simulation is required that captures the dynamic nature of space experimentation; namely, an investigator is allowed to restructure his experiment and to modify its execution. This changes the energy demands for the investigator's range of options. An expert system competent in the domain of cryogenic fluid management experimentation was developed. It will be used to help design and test automated power scheduling software for Freedom's electric power system. The expert system allows experiment planning and experiment simulation. The former evaluates experimental alternatives and offers advice on the details of the experiment's design. The latter provides a real-time simulation of the experiment replete with appropriate resource consumption.

  10. Effects of coenzyme Q10 on statin-induced myopathy: a meta-analysis of randomized controlled trials.

    PubMed

    Banach, Maciej; Serban, Corina; Sahebkar, Amirhossein; Ursoniu, Sorin; Rysz, Jacek; Muntner, Paul; Toth, Peter P; Jones, Steven R; Rizzo, Manfredi; Glasser, Stephen P; Lip, Gregory Y H; Dragan, Simona; Mikhailidis, Dimitri P

    2015-01-01

    To evaluate the efficacy of coenzyme Q10 (CoQ10) supplementation on statin-induced myopathy. We searched the MEDLINE, Cochrane Library, Scopus, and EMBASE databases (November 1, 1987, to May 1, 2014) to identify randomized controlled trials investigating the impact of CoQ10 on muscle pain and plasma creatine kinase (CK) activity as 2 measures of statin-induced myalgia. Two independent reviewers extracted data on study characteristics, methods, and outcomes. We included 6 studies with 302 patients receiving statin therapy: 5 studies with 226 participants evaluated the effect of CoQ10 supplementation on plasma CK activity, and 5 studies (4 used in the CK analysis and 1 other study) with 253 participants were included to assess the effect of CoQ10 supplementation on muscle pain. Compared with the control group, plasma CK activity was increased after CoQ10 supplementation, but this change was not significant (mean difference, 11.69 U/L [to convert to μkat/L, multiply by 0.0167]; 95% CI, -14.25 to 37.63 U/L; P=.38). Likewise, CoQ10 supplementation had no significant effect on muscle pain despite a trend toward a decrease (standardized mean difference, -0.53; 95% CI, -1.33 to 0.28; P=.20). No dose-effect association between changes in plasma CK activity (slope, -0.001; 95% CI, -0.004 to 0.001; P=.33) or in the indices of muscle pain (slope, 0.002; 95% CI, -0.005 to 0.010; P=.67) and administered doses of CoQ10 was observed. The results of this meta-analysis of available randomized controlled trials do not suggest any significant benefit of CoQ10 supplementation in improving statin-induced myopathy. Larger, well-designed trials are necessary to confirm the findings from this meta-analysis. Copyright © 2015 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  11. Space simulation facilities providing a stable thermal vacuum facility

    NASA Technical Reports Server (NTRS)

    Tellalian, Martin L.

    1990-01-01

    CBI has recently constructed the Intermediate Thermal Vacuum Facility. Built as a corporate facility, the installation will first be used on the Boost Surveillance and Tracking System (BSTS) program. It will also be used to develop and test other sensor systems. The horizontal chamber has a horseshoe-shaped cross section and is supported on pneumatic isolators for vibration isolation. The chamber structure was designed to meet stability and stiffness requirements. The design process included measurement of the ambient ground vibrations, analysis of various foundation test article support configurations, design and analysis of the chamber shell, and modal testing of the chamber shell. A detailed 3-D finite element analysis was made in the design stage to predict the lowest three natural frequencies and mode shapes and to identify local vibrating components. The design process is described, and the results of the finite element analysis are compared with those of the field modal testing and analysis for the three lowest natural frequencies and mode shapes. Concepts are also presented for stiffening large steel structures, along with methods to improve test article stability in large space simulation facilities.

  12. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
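
    The Monte Carlo idea described above can be illustrated with a toy example; this sketch is not the MSC.Robust Design tool, and the input scatter and surrogate response below are invented stand-ins for a finite element analysis.

      # Toy Monte Carlo study: sample scatter in design inputs, evaluate a surrogate
      # response, and rank inputs by correlation with the response.
      import numpy as np

      rng = np.random.default_rng(1)
      n_samples = 2000

      # Hypothetical input scatter: panel thickness [mm] and material modulus [GPa].
      thickness = rng.normal(5.0, 0.25, n_samples)
      modulus = rng.normal(70.0, 3.5, n_samples)

      # Stand-in response model; a real study would run the finite element analysis here.
      response = 1.0e3 / (modulus * thickness**3)

      for name, x in [("thickness", thickness), ("modulus", modulus)]:
          r = np.corrcoef(x, response)[0, 1]
          print(f"{name}: correlation with response = {r:+.2f}")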

  13. Y spaces and global smooth solution of fractional Navier-Stokes equations with initial value in the critical oscillation spaces

    NASA Astrophysics Data System (ADS)

    Yang, Qixiang; Yang, Haibo

    2018-04-01

    For fractional Navier-Stokes equations and critical initial spaces X, the well-posedness has usually been established in a solution space contained in C(R^+, X). In this paper, for the heat flow, we apply parameterized Meyer wavelets to introduce the Y spaces Y^{m,β}, where Y^{m,β} is not contained in C(R^+, Ḃ^{1-2β}_{∞,∞}). Consequently, for 1/2 < β < 1, we establish the global well-posedness of fractional Navier-Stokes equations with small initial data in all the critical oscillation spaces. The critical oscillation spaces may be any Besov-Morrey space (Ḃ^{γ1,γ2}_{p,q}(R^n))^n or any Triebel-Lizorkin-Morrey space (Ḟ^{γ1,γ2}_{p,q}(R^n))^n, where 1 ≤ p, q ≤ ∞, 0 ≤ γ2 ≤ n/p, and γ1 - γ2 = 1 - 2β. These critical spaces include many known spaces, for example Besov spaces, Sobolev spaces, Bloch spaces, Q-spaces, Morrey spaces, and Triebel-Lizorkin spaces.

  14. L718Q mutant EGFR escapes covalent inhibition by stabilizing a non-reactive conformation of the lung cancer drug osimertinib.

    PubMed

    Callegari, D; Ranaghan, K E; Woods, C J; Minari, R; Tiseo, M; Mor, M; Mulholland, A J; Lodola, A

    2018-03-14

    Osimertinib is a third-generation inhibitor approved for the treatment of non-small cell lung cancer. It overcomes resistance to first-generation inhibitors by incorporating an acrylamide group which alkylates Cys797 of EGFR T790M. The mutation of a residue in the P-loop (L718Q) was shown to cause resistance to osimertinib, but the molecular mechanism of this process is unknown. Here, we investigated the inhibitory process for EGFR T790M (susceptible to osimertinib) and EGFR T790M/L718Q (resistant to osimertinib), by modelling the chemical step (i.e., alkylation of Cys797) using QM/MM simulations and the recognition step by MD simulations coupled with free-energy calculations. The calculations indicate that L718Q has a negligible impact on both the activation energy for Cys797 alkylation and the free energy of binding for the formation of the non-covalent complex. The results show that Gln718 affects the conformational space of the EGFR-osimertinib complex, stabilizing a conformation of acrylamide which prevents reaction with Cys797.

  15. Compact Dual Ion Composition Experiment for space plasmas—CoDICE

    NASA Astrophysics Data System (ADS)

    Desai, M. I.; Ogasawara, K.; Ebert, R. W.; Allegrini, F.; McComas, D. J.; Livi, S.; Weidner, S. E.

    2016-07-01

    The Compact Dual Ion Composition Experiment—CoDICE—simultaneously provides high-quality plasma and energetic ion composition measurements over six decades in energy in a wide variety of space plasma environments. CoDICE measures two critical ion populations in space plasmas: (1) elemental and charge state composition, and 3-D velocity distributions of <10 eV/q to 40 keV/q plasma ions; and (2) elemental composition, energy spectra, and angular distributions of ~30 keV to >10 MeV energetic ions. CoDICE uses a novel, integrated, common time-of-flight subsystem that provides several advantages over the commonly used separate plasma and energetic ion sensors currently flying on several space missions. These advantages include reduced mass and volume compared to two separate instruments, reduced shielding in high-radiation environments, and simplified spacecraft interface and accommodation requirements. This paper describes the operating principles and electro-optic simulation results, and applies the CoDICE concept to measuring plasma and energetic ion populations in Jupiter's magnetosphere.

  16. A Simulation Based Investigation of High Latency Space Systems Operations

    NASA Technical Reports Server (NTRS)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    This study was the first in a series of planned tests to use physics-based subsystem simulations to investigate the interactions between a spacecraft's crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation models the life support system of a deep space habitat. It contains models of an environmental control and life support system, an electrical power system, an active thermal control system, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the subsystems; 2) a mission control center interface with data transport delays of up to 15 minutes each way; and 3) a real-time simulation test conductor interface used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission. The NEEMO crew and ground support team performed a number of relevant deep space mission scenarios that included both nominal activities and activities with system malfunctions. While this initial test sequence was focused on test infrastructure and procedures development, the data collected in the study already indicate that long communication delays have notable impacts on the operation of deep space systems. For future human missions beyond cis-lunar space, NASA will need to design systems and support tools to meet these challenges. These will be used to train the crew to handle critical malfunctions on their own, to predict malfunctions, and to assist with vehicle operations. Subsequent, more detailed and involved studies will be conducted to continue advancing NASA's understanding of space systems operations across long communication delays.

  17. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  18. [Genetic risk of families with t(1;2)(q42;q33) GTG, RHG, QFQ, FISH].

    PubMed

    Stasiewicz-Jarocka, B; Raczkiewicz, B; Kowalczyk, D; Zawada, M; Midro, A T

    2000-10-01

    A central concept in genetic counseling is the estimation of the probability of recurrence of unfavourable pregnancy outcomes (abortion, stillbirth and the birth of a malformed child). In the case of chromosomal changes, estimates are made on the basis of segregation analyses of the actual pedigree. If only a few pedigree members are available, the risk estimate should be based on a combination of our own data and empirical data from the literature. We present the individual genetic risk for carriers of the unique reciprocal translocation t(1;2)(q42;q33), detected through karyotyping of a patient with miscarriage. The pedigree, consisting of 5 families of t(1;2)(q42;q33) carriers with 15 progeny members, was evaluated according to Stene and Stengel-Rutkowski. Cytogenetic analysis of 7 persons from these families was performed on blood samples using GTG, RHG, QFQ and FISH techniques. An additional RCT pedigree analysis of the Stengel-Rutkowski et al. Collection, the Polish Collection, the Lithuanian Collection, the Belorussian Collection and available literature cases was performed. The translocation was classified as a translocation at risk for double-segment imbalances, i.e. trisomy 1q42-->qter together with monosomy 2q33-->qter, or monosomy 1q42-->qter together with trisomy 2q33-->qter, after 2:2 disjunction with adjacent-1 segregation of the meiotic chromosomes. Two improved risk values for RCT with segments 1q42-->qter and 2q33-->qter were obtained, i.e. 6/44 (13.6% +/- 5.2%) and 4/20 (20% +/- 8.9%). The probability of occurrence of unbalanced progeny for carriers of this translocation was estimated as 7% (medium risk). On the basis of direct analysis of the presented pedigree, the risk for miscarriage was estimated as 2/9. 1. Carriership of t(1;2)(q42;q33) increases the population risk value for unbalanced progeny at birth by 7% (medium risk) and gives a risk for miscarriage of 2/9. 2. A causative relation between the presence of t(1;2)(q42;q33) and miscarriages is suggested. 3. The updated genetic risk value for RCT at risk for single-segment 1q42-->qter imbalance is 6/44 (13

  19. International Collaboration for Galactic Cosmic Ray Simulation at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Slaba, Tony C.; Rusek, Adam; Durante, Marco; Reitz, Guenther

    2015-01-01

    An international collaboration on Galactic Cosmic Ray (GCR) simulation is being formed to make recommendations on how to best simulate the GCR spectrum at ground based accelerators. The external GCR spectrum is significantly modified when it passes through spacecraft shielding and astronauts. One approach for simulating the GCR space radiation environment at ground based accelerators would use the modified spectrum, rather than the external spectrum, in the accelerator beams impinging on biological targets. Two recent workshops have studied such GCR simulation. The first workshop was held at NASA Langley Research Center in October 2014. The second workshop was held at the NASA Space Radiation Investigators' workshop in Galveston, Texas in January 2015. The anticipated outcome of these and other studies may be a report or journal article, written by an international collaboration, making accelerator beam recommendations for GCR simulation. This poster describes the status of GCR simulation at the NASA Space Radiation Laboratory and encourages others to join the collaboration.

  20. Velopharyngeal Anatomy in 22q11.2 Deletion Syndrome: A Three-Dimensional Cephalometric Analysis

    PubMed Central

    Ruotolo, Rachel A.; Veitia, Nestor A.; Corbin, Aaron; McDonough, Joseph; Solot, Cynthia B.; McDonald-McGinn, Donna; Zackai, Elaine H.; Emanuel, Beverly S.; Cnaan, Avital; LaRossa, Don; Arens, Raanan; Kirschner, Richard E.

    2010-01-01

    Objective 22q11.2 deletion syndrome is the most common genetic cause of velopharyngeal dysfunction (VPD). Magnetic resonance imaging (MRI) is a promising method for noninvasive, three-dimensional (3D) assessment of velopharyngeal (VP) anatomy. The purpose of this study was to assess VP structure in patients with 22q11.2 deletion syndrome by using 3D MRI analysis. Design This was a retrospective analysis of magnetic resonance images obtained in patients with VPD associated with a 22q11.2 deletion compared with a normal control group. Setting This study was conducted at The Children’s Hospital of Philadelphia, a pediatric tertiary care center. Patients, Participants The study group consisted of 5 children between the ages of 2.9 and 7.9 years, with 22q11.2 deletion syndrome confirmed by fluorescence in situ hybridization analysis. All had VPD confirmed by nasendoscopy or videofluoroscopy. The control population consisted of 123 unaffected patients who underwent MRI for reasons other than VP assessment. Interventions Axial and sagittal T1- and T2-weighted magnetic resonance images with 3-mm slice thickness were obtained from the orbit to the larynx in all patients by using a 1.5T Siemens Visions system. Outcome Measures Linear, angular, and volumetric measurements of VP structures were obtained from the magnetic resonance images with VIDA image-processing software. Results The study group demonstrated greater anterior and posterior cranial base and atlanto-dental angles. They also demonstrated greater pharyngeal cavity volume and width and lesser tonsillar and adenoid volumes. Conclusion Patients with a 22q11.2 deletion demonstrate significant alterations in VP anatomy that may contribute to VPD. PMID:16854203

  1. Monte Carlo simulation of TrueBeam flattening-filter-free beams using varian phase-space files: comparison with experimental data.

    PubMed

    Belosi, Maria F; Rodriguez, Miguel; Fogliata, Antonella; Cozzi, Luca; Sempau, Josep; Clivio, Alessandro; Nicolini, Giorgia; Vanetti, Eugenio; Krauss, Harald; Khamphan, Catherine; Fenoglietto, Pascal; Puxeu, Josep; Fedele, David; Mancosu, Pietro; Brualla, Lorenzo

    2014-05-01

    Phase-space files for Monte Carlo simulation of the Varian TrueBeam beams have been made available by Varian. The aim of this study is to evaluate the accuracy of the distributed phase-space files for flattening filter free (FFF) beams, against experimental measurements from ten TrueBeam Linacs. The phase-space files have been used as input in PRIMO, a recently released Monte Carlo program based on the PENELOPE code. Simulations of 6 and 10 MV FFF were computed in a virtual water phantom for field sizes 3 × 3, 6 × 6, and 10 × 10 cm² using 1 × 1 × 1 mm³ voxels and for 20 × 20 and 40 × 40 cm² with 2 × 2 × 2 mm³ voxels. The particles contained in the initial phase-space files were transported downstream to a plane just above the phantom surface, where a subsequent phase-space file was tallied. Particles were transported downstream from this second phase-space file to the water phantom. Experimental data consisted of depth doses and profiles at five different depths acquired at SSD = 100 cm (seven datasets) and SSD = 90 cm (three datasets). Simulations and experimental data were compared in terms of dose difference. Gamma analysis was also performed using 1%, 1 mm and 2%, 2 mm criteria of dose-difference and distance-to-agreement, respectively. Additionally, the parameters characterizing the dose profiles of unflattened beams were evaluated for both measurements and simulations. Analysis of depth dose curves showed that dose differences increased with increasing field size and depth; this effect might be partly attributable to an underestimation of the primary beam energy used to compute the phase-space files. Average dose differences reached 1% for the largest field size. Lateral profiles presented dose differences well within 1% for fields up to 20 × 20 cm², while the discrepancy increased toward 2% in the 40 × 40 cm² cases. Gamma analysis resulted in an agreement of 100% when a 2%, 2 mm criterion was used, with the only exception of the 40 × 40
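
    The gamma analysis mentioned above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal one-dimensional version is sketched below for orientation; it is not the PRIMO/TrueBeam workflow, and the two profiles are synthetic.

      # Minimal 1-D gamma analysis: for each reference point, take the minimum combined
      # dose-difference / distance-to-agreement metric over all evaluated points.
      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.02, dist_crit_mm=2.0):
          d_norm = dose_crit * d_ref.max()          # global dose normalization
          gammas = []
          for xr, dr in zip(x_ref, d_ref):
              dose_term = (d_eval - dr) / d_norm
              dist_term = (x_eval - xr) / dist_crit_mm
              gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
          return np.array(gammas)

      # Synthetic profiles: a flat-ish FFF-like 'measurement' vs. a shifted, scaled 'simulation'.
      x = np.linspace(-50, 50, 201)                 # position [mm]
      measured = np.exp(-(x / 30.0)**4)
      simulated = 1.01 * np.exp(-((x - 0.5) / 30.0)**4)

      g = gamma_1d(x, measured, x, simulated, dose_crit=0.02, dist_crit_mm=2.0)
      print(f"pass rate (gamma <= 1): {100 * np.mean(g <= 1.0):.1f}%")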

  2. CFD Simulation of the Space Shuttle Launch Vehicle with Booster Separation Motor and Reaction Control System Plumes

    NASA Technical Reports Server (NTRS)

    Gea, L. M.; Vicker, D.

    2006-01-01

    The primary objective of this paper is to demonstrate the capability of computational fluid dynamics (CFD) to simulate a very complicated flow field encountered during the space shuttle ascent. The flow field features nozzle plumes from booster separation motor (BSM) and reaction control system (RCS) jets with a supersonic incoming cross flow at a speed of Mach 4. The overset Navier-Stokes code OVERFLOW was used to simulate the flow field surrounding the entire space shuttle launch vehicle (SSLV) with high geometric fidelity. The variable gamma option was chosen due to the high-temperature nature of the nozzle flows and the different plume species. CFD-predicted Mach contours are in good agreement with the schlieren photos from the wind tunnel test. The flow fields are discussed in detail, and the results are used to support the debris analysis for the space shuttle Return To Flight (RTF) task.

  3. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers where each participating center has a specific role and specialization. In this vision, the Constellation program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage the existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems with various parameter sets can be simulated. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), as well as orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In

  4. All (4,1): Sigma models with (4 , q) off-shell supersymmetry

    NASA Astrophysics Data System (ADS)

    Hull, Chris; Lindström, Ulf

    2017-03-01

    Off-shell (4, q) supermultiplets in two dimensions are constructed for q = 1, 2, 4. These are used to construct sigma models whose target spaces are hyperkähler with torsion. The off-shell supersymmetry implies that the three complex structures are simultaneously integrable and allows us to construct actions using extended superspace and projective superspace, giving an explicit construction of the target space geometries.

  5. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.

  6. Comparison of the mean quality factors for astronauts calculated using the Q-functions proposed by ICRP, ICRU, and NASA

    NASA Astrophysics Data System (ADS)

    Sato, T.; Endo, A.; Niita, K.

    2013-07-01

    For the estimation of the radiation risk for astronauts, not only the organ absorbed doses but also their mean quality factors must be evaluated. Three functions have been proposed by different organizations for expressing the radiation quality, including the Q(L), Q(y), and QNASA(Z, E) relationships as defined in International Commission on Radiological Protection (ICRP) Publication 60, International Commission on Radiation Units and Measurements (ICRU) Report 40, and National Aeronautics and Space Administration (NASA) TP-2011-216155, respectively. The Q(L) relationship is the simplest and the most widely used for space dosimetry, but the use of the latter two functions enables consideration of differences in the track structure of various charged particles during the risk estimation. Therefore, we calculated the mean quality factors in organs and tissues of the ICRP/ICRU reference voxel phantoms for isotropic exposure to various mono-energetic particles using the three Q-functions. The Particle and Heavy Ion Transport code System PHITS was employed to simulate the particle motions inside the phantoms. The effective dose equivalents and the phantom-averaged effective quality factors for the astronauts were then estimated from the calculated mean quality factors multiplied by the fluence-to-dose conversion coefficients and cosmic-ray fluxes inside a spacecraft. It was found from the calculations that QNASA generally gives the largest values of the phantom-averaged effective quality factors among the three Q-functions for neutron, proton, and lighter-ion irradiation, whereas Q(L) provides the largest values for heavier-ion irradiation. Overall, the introduction of QNASA instead of Q(L) or Q(y) in astronaut dosimetry results in an increase in the effective dose equivalents because the majority of the doses are composed of contributions from protons and neutrons, although this tendency may change depending on the calculation conditions.
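
    The dose-weighting behind a phantom-averaged effective quality factor can be sketched as below. The per-particle dose contributions and Q values are hypothetical placeholders, not results from the paper; the point is only that the effective quality factor is the dose-weighted mean of the particle-specific quality factors.

      # Combine per-particle absorbed-dose contributions with mean quality factors
      # to obtain a dose equivalent and an effective (dose-weighted) quality factor.
      particles = {
          # particle: (absorbed dose contribution [mGy], mean quality factor [-])
          "proton": (0.30, 1.8),
          "neutron": (0.10, 8.5),
          "He ion": (0.05, 3.0),
          "Fe ion": (0.02, 25.0),
      }

      absorbed_dose = sum(d for d, _ in particles.values())
      dose_equivalent = sum(d * q for d, q in particles.values())
      effective_q = dose_equivalent / absorbed_dose

      print(f"absorbed dose      = {absorbed_dose:.3f} mGy")
      print(f"dose equivalent    = {dose_equivalent:.3f} mSv")
      print(f"effective Q factor = {effective_q:.2f}")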

  7. Functional connectivity analysis in EEG source space: The choice of method

    PubMed Central

    Knyazeva, Maria G.

    2017-01-01

    Functional connectivity (FC) is among the most informative features derived from EEG. However, the most straightforward sensor-space analysis of FC is unreliable owing to volume conductance effects. An alternative—source-space analysis of FC—is optimal for high- and mid-density EEG (hdEEG, mdEEG); however, it is questionable for widely used low-density EEG (ldEEG) because of inadequate surface sampling. Here, using simulations, we investigate the performance of the two source FC methods, the inverse-based source FC (ISFC) and the cortical partial coherence (CPC). To examine the effects of localization errors of the inverse method on the FC estimation, we simulated an oscillatory source with varying locations and SNRs. To compare the FC estimations by the two methods, we simulated two synchronized sources with varying between-source distance and SNR. The simulations were implemented for hdEEG, mdEEG, and ldEEG. We showed that the performance of both methods deteriorates for deep sources owing to their inaccurate localization and smoothing. The accuracy of both methods improves with the increasing between-source distance. The best ISFC performance was achieved using hd/mdEEG, while the best CPC performance was observed with ldEEG. In conclusion, with hdEEG, ISFC outperforms CPC and therefore should be the preferred method. In the studies based on ldEEG, the CPC is a method of choice. PMID:28727750

  8. No Control Genes Required: Bayesian Analysis of qRT-PCR Data

    PubMed Central

    Matz, Mikhail V.; Wright, Rachel M.; Scott, James G.

    2013-01-01

    Background Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR) is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. Results In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC) algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts). Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the “classic” analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization) but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. Conclusions Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been implemented as the MCMC.qpcr package in R.

  9. No control genes required: Bayesian analysis of qRT-PCR data.

    PubMed

    Matz, Mikhail V; Wright, Rachel M; Scott, James G

    2013-01-01

    Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR) is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC) algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts). Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the "classic" analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization) but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been implemented as the MCMC.qpcr package in R.
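
    The count-based representation described in these two records can be illustrated with a short simulation; this is a conceptual sketch, not the MCMC.qpcr package, and the amplification efficiency and single-molecule Cq used for the conversion are assumed values.

      # Sketch of the Poisson-lognormal view of qPCR data: Cq values are converted to
      # approximate molecule counts, and counts are modelled as Poisson draws around a
      # lognormally scattered mean.
      import numpy as np

      rng = np.random.default_rng(42)

      def cq_to_count(cq, efficiency=1.95, cq_single_molecule=37.0):
          """Rough count estimate: one molecule corresponds to cq_single_molecule cycles."""
          return efficiency ** (cq_single_molecule - cq)

      # Two conditions, two replicates each: mean counts from assumed Cq values,
      # lognormal sample-to-sample scatter, then Poisson counting noise on top.
      true_mean = cq_to_count(np.array([28.0, 28.0, 31.0, 31.0]))
      scatter = rng.lognormal(mean=0.0, sigma=0.3, size=true_mean.size)
      counts = rng.poisson(true_mean * scatter)
      print(counts)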

  10. Galactic Cosmic Ray Simulation at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Slaba, Tony C.; Rusek, Adam

    2015-01-01

    The external Galactic Cosmic Ray (GCR) spectrum is significantly modified when it passes through spacecraft shielding and astronauts. One approach for simulating the GCR space radiation environment at ground based accelerators would use the modified spectrum, rather than the external spectrum, in the accelerator beams impinging on biological targets. Two recent workshops have studied such GCR simulation. The first workshop was held at NASA Langley Research Center in October 2014. The second workshop was held at the NASA Space Radiation Investigators' workshop in Galveston, Texas in January 2015. The results of these workshops will be discussed in this paper.

  11. Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2014-10-01

    The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems, we simulate and analyse the entire operational scenario, including: waveband options; various orbit heights and viewing angles; system design characteristics, e.g. pixel size and filter transmission; and atmospheric effects, e.g. different cloud types, climate zones and seasons. In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm, and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: first, an overview of the applied simulation and modelling software is given; then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (the launch of a long-range ballistic missile (BM)).

  12. Direct comparisons of X-ray scattering and atomistic molecular dynamics simulations for precise acid copolymers and ionomers

    DOE PAGES

    Buitrago, C. Francisco; Bolintineanu, Dan; Seitz, Michelle E.; ...

    2015-02-09

    Designing acid- and ion-containing polymers for optimal proton, ion, or water transport would benefit profoundly from predictive models or theories that relate polymer structures with ionomer morphologies. Recently, atomistic molecular dynamics (MD) simulations were performed to study the morphologies of precise poly(ethylene-co-acrylic acid) copolymer and ionomer melts. Here, we present the first direct comparisons between scattering profiles, I(q), calculated from these atomistic MD simulations and experimental X-ray data for 11 materials. This set of precise polymers has spacers of exactly 9, 15, or 21 carbons between acid groups and has been partially neutralized with Li, Na, Cs, or Zn. In these polymers, the simulations at 120 °C reveal ionic aggregates with a range of morphologies, from compact, isolated aggregates (type 1) to branched, stringy aggregates (type 2) to branched, stringy aggregates that percolate through the simulation box (type 3). Excellent agreement is found between the simulated and experimental scattering peak positions across all polymer types and aggregate morphologies. The shape of the amorphous halo in the simulated I(q) profile is in excellent agreement with experimental I(q). We found that the modified hard-sphere scattering model fits both the simulation and experimental I(q) data for type 1 aggregate morphologies, and the aggregate sizes and separations are in agreement. Given the stringy structure in types 2 and 3, we develop a scattering model based on cylindrical aggregates. Both the spherical and cylindrical scattering models fit I(q) data from the polymers with type 2 and 3 aggregates equally well, and the extracted aggregate radii and inter- and intra-aggregate spacings are in agreement between simulation and experiment. Furthermore, these dimensions are consistent with real-space analyses of the atomistic MD simulations. By combining simulations and experiments, the ionomer scattering peak can be associated with the

  13. Behavior of stem cells under outer-space microgravity and ground-based microgravity simulation.

    PubMed

    Zhang, Cui; Li, Liang; Chen, Jianling; Wang, Jinfu

    2015-06-01

    With rapid development of space engineering, research on life sciences in space is being conducted extensively, especially cellular and molecular studies on space medicine. Stem cells, undifferentiated cells that can differentiate into specialized cells, are considered a key resource for regenerative medicine. Research on stem cells under conditions of microgravity during a space flight or a ground-based simulation has generated several excellent findings. To help readers understand the effects of outer space and ground-based simulation conditions on stem cells, we reviewed recent studies on the effects of microgravity (as an obvious environmental factor in space) on morphology, proliferation, migration, and differentiation of stem cells. © 2015 International Federation for Cell Biology.

  14. Simulation of physical and chemical processes in support of space missions

    NASA Astrophysics Data System (ADS)

    Kochan, H.; Sears, D.; Colangeli, L.; Ehrenfreund, P.

    For many years, phenomena on planetary surfaces have been simulated under space conditions in Earth-bound laboratories. In a six-year program at the German Aerospace Center, Cologne, phenomena on cometary surfaces were studied and provided new insights that enhanced the data from space missions. Similar simulation techniques are being applied in a new research program at DLR in preparation for the rendezvous of the Rosetta spacecraft with comet Wirtanen at 3 AU and for the Mars Express mission with the British Beagle 2 lander, which will search for traces of life. The Arkansas-Oklahoma Center for Space and Planetary Sciences is preparing to conduct experiments that will aid in the interpretation of images from Mars orbiters in terms of fluid and dust storm processes and help design instrumentation for deployment on Mars. Of particular interest is the question of the present location of the water that was apparently once abundant on Mars. Additional experiments at the new U.S. facility will help interpret images of Eros obtained by the NEAR spacecraft and prepare for future sample return missions to near-Earth asteroids, while providing fundamental insights into regolith mechanics and regolith-atmosphere interactions. The activities in the Cosmic Physics Laboratory of Naples are focused on the simulation of materials and processes active in space, with the aim of studying how the physical and chemical properties of cosmically relevant species evolve depending on environmental conditions. This approach is complemented by investigations of actual extraterrestrial samples, such as meteorites and interplanetary dust particles. The approach is useful for characterizing the performance of space instruments for remote and/or in-situ exploration of Solar System bodies, also with a view to searching for features of exobiological relevance. One of the key objectives of the Soft matter/Astrobiology laboratory at Leiden University is to study the formation, evolution and survival of

  15. Prenatal diagnosis of de novo t(2;18;14)(q33.1;q12.2;q31.2), dup(5)(q34q34), del(7)(p21.1p21.1), and del(10)(q25.3q25.3) and a review of the prenatally ascertained de novo apparently balanced complex and multiple chromosomal rearrangements.

    PubMed

    Chen, Chih-Ping; Chern, Schu-Rern; Lee, Chen-Chi; Lin, Chyi-Chyang; Li, Yueh-Chun; Hsieh, Lie-Jiau; Chen, Wen-Lin; Wang, Wayseen

    2006-02-01

    To present the prenatal diagnosis of a de novo complex chromosomal rearrangement (CCR) associated with de novo interstitial deletions and duplication and to review the literature. Amniocentesis was performed at 18 weeks' gestation because of an increased risk for Down syndrome based on maternal serum alpha-fetoprotein and human chorionic gonadotrophin screening. Amniocentesis revealed a karyotype of 46,XY,t(2;18;14)(q33.1;q12.2;q31.2),dup(5)(q34q34),del(7)(p21.1p21.1), del(10)(q25.3q25.3). The parental karyotypes were normal. The pregnancy was terminated. The fetus manifested facial dysmorphism, clinodactyly of both hands, and hypoplasia of the left great toe. Spectral karyotyping (SKY), cytogenetic polymorphism, and polymorphic DNA markers were used to investigate the imbalances and the origin of the de novo aberrant chromosomes. SKY showed a three-way CCR. Cytogenetic polymorphism investigation of the derivative chromosome 14 of the fetus and the parental chromosomes 14 determined the maternal origin of the translocation. Polymorphic DNA marker analysis confirmed the maternal origin of the de novo interstitial deletions and duplication. No cryptic imbalance at or near the breakpoints of the CCR was detected by the molecular analysis. De novo apparently balanced CCRs may be associated with imbalances in other chromosomes. We suggest further investigation and re-evaluation of cryptic or subtle imbalances in all cases classified as de novo apparently balanced CCRs. Copyright 2006 John Wiley & Sons, Ltd.

  16. Design and analysis of a high Q MEMS passive RF filter

    NASA Astrophysics Data System (ADS)

    Rathee, Vishal; Pande, Rajesh

    2016-04-01

    Over the past few years, significant growth has been observed in the use of MEMS-based passive components in the RF microelectronics domain, especially in transceiver systems. This is due to some excellent properties of MEMS devices such as low loss, low cost and excellent isolation. This paper presents the design of a high-performance MEMS passive band-pass filter, consisting of L and C elements, with an improved quality factor and lower insertion loss than previously reported filters. We present the design of a second-order band-pass filter with a 2.4 GHz centre frequency and 83 MHz bandwidth for Bluetooth applications. The simulation results showed an improved Q-factor of 34 and an insertion loss of 1.7 dB to 1.9 dB. The simulation results need to be validated by fabricating the device, the fabrication flow of which is also presented in the paper.
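
    As a rough orientation for the numbers quoted above, a series L-C resonator sized for a 2.4 GHz centre frequency and 83 MHz bandwidth can be sketched as follows; the 100-ohm total series resistance is an assumption (50-ohm source plus 50-ohm load), and the resulting element values are illustrative, not those of the fabricated MEMS device.

      # Size a series L-C resonator for a 2.4 GHz band-pass response and check the
      # loaded Q implied by the 83 MHz bandwidth.
      import math

      f0 = 2.4e9        # centre frequency [Hz]
      bw = 83e6         # -3 dB bandwidth [Hz]
      r_total = 100.0   # assumed total series resistance [ohm]

      q_loaded = f0 / bw                     # ~29 from the quoted bandwidth
      w0 = 2 * math.pi * f0
      L = q_loaded * r_total / w0            # series inductance giving that loaded Q
      C = 1.0 / (w0**2 * L)                  # capacitance resonant at f0

      print(f"loaded Q ~ {q_loaded:.1f}")
      print(f"L ~ {L * 1e9:.1f} nH, C ~ {C * 1e15:.1f} fF")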

  17. A random Q-switched fiber laser

    PubMed Central

    Tang, Yulong; Xu, Jianqiu

    2015-01-01

    Extensive studies have been performed on random lasers in which multiple-scattering feedback is used to generate coherent emission. Q-switching and mode-locking are well-known routes for achieving high peak power output in conventional lasers. However, in random lasers, the ubiquitous random cavities that are formed by multiple scattering inhibit energy storage, making Q-switching impossible. In this paper, widespread Rayleigh scattering arising from the intrinsic micro-scale refractive-index irregularities of fiber cores is used to form random cavities along the fiber. The Q-factor of the cavity is rapidly increased by stimulated Brillouin scattering just after the spontaneous emission is enhanced by random cavity resonances, resulting in random Q-switched pulses with high brightness and high peak power. This report is the first observation of high-brightness random Q-switched laser emission and is expected to stimulate new areas of scientific research and applications, including encryption, remote three-dimensional random imaging and the simulation of stellar lasing. PMID:25797520

  18. Realistic Simulations of Coronagraphic Observations with Future Space Telescopes

    NASA Astrophysics Data System (ADS)

    Rizzo, M. J.; Roberge, A.; Lincowski, A. P.; Zimmerman, N. T.; Juanola-Parramon, R.; Pueyo, L.; Hu, M.; Harness, A.

    2017-11-01

    We present a framework to simulate realistic observations of future space-based coronagraphic instruments. This gathers state-of-the-art scientific and instrumental expertise allowing robust characterization of future instrument concepts.

  19. MeltMan: Optimization, Evaluation, and Universal Application of a qPCR System Integrating the TaqMan qPCR and Melting Analysis into a Single Assay

    PubMed Central

    Nagy, Alexander; Černíková, Lenka; Vitásková, Eliška; Křivda, Vlastimil; Dán, Ádám; Dirbáková, Zuzana; Jiřincová, Helena; Procházka, Bohumír; Sedlák, Kamil; Havlíčková, Martina

    2016-01-01

    In the present work, we optimised and evaluated a qPCR system integrating 6-FAM (6-carboxyfluorescein)-labelled TaqMan probes and melting analysis using the SYTO 82 (S82) DNA binding dye in a single reaction. We investigated the influence of the S82 on various TaqMan and melting analysis parameters and defined its optimal concentration. In the next step, the method was evaluated in 36 different TaqMan assays with a total of 729 paired reactions using various DNA and RNA templates, including field specimens. In addition, the melting profiles of interest were correlated with the electrophoretic patterns. We proved that the S82 is fully compatible with the FAM-TaqMan system. Further, the advantages of this approach in routine diagnostic TaqMan qPCR were illustrated with practical examples. These included solving problems with flat or other atypical amplification curves or even false negativity as a result of probe binding failure. Our data clearly show that the integration of the TaqMan qPCR and melting analysis into a single assay provides an additional control option as well as the opportunity to perform more complex analyses, get more data from the reactions, and obtain analysis results with higher confidence. PMID:27031831

  20. Design and Optimization of Low-thrust Orbit Transfers Using Q-law and Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; vonAllmen, Paul; Fink, Wolfgang; Petropoulos, Anastassios; Terrile, Richard

    2005-01-01

    Future space missions will depend more on low-thrust propulsion (such as ion engines) thanks to its high specific impulse. Yet, the design of low-thrust trajectories is complex and challenging. Third-body perturbations often dominate the thrust, and a significant change to the orbit requires a long duration of thrust. In order to guide the early design phases, we have developed an efficient and efficacious method to obtain approximate propellant and flight-time requirements (i.e., the Pareto front) for orbit transfers. A search for the Pareto-optimal trajectories is done in two levels: optimal thrust angles and locations are determined by Q-law, while the Q-law is optimized with two evolutionary algorithms: a genetic algorithm and a simulated-annealing-related algorithm. The examples considered are several types of orbit transfers around the Earth and the asteroid Vesta.

  1. Apollo experience report: Simulation of manned space flight for crew training

    NASA Technical Reports Server (NTRS)

    Woodling, C. H.; Faber, S.; Vanbockel, J. J.; Olasky, C. C.; Williams, W. K.; Mire, J. L. C.; Homer, J. R.

    1973-01-01

    Through space-flight experience and the development of simulators to meet the associated training requirements, several factors have been established as fundamental for providing adequate flight simulators for crew training. The development of flight simulators from Project Mercury through the Apollo 15 mission is described. The functional uses, characteristics, and development problems of the various simulators are discussed for the benefit of future programs.

  2. Implementation of an open-scenario, long-term space debris simulation approach

    NASA Astrophysics Data System (ADS)

    Stupl, J.; Nelson, B.; Faber, N.; Perez, A.; Carlino, R.; Yang, F.; Henze, C.; Karacalioglu, A.; O'Toole, C.; Swenson, J.

    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance scheme. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps in the order of several (5-15) days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions, space object parameters and orbital parameters of the conjunctions and take place in much smaller timeframes than 5-15 days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in LEO, propagates all objects with high precision, and advances with variable-sized time-steps as small as one second. It allows the assessment of the (potential) impact of changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space-track object catalog in LEO. We then use a high precision propagator to propagate all objects over the
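
    The propagation-with-adaptive-steps idea underlying such a simulation can be sketched with a single object under two-body gravity. This is an illustration only, not the LightForce tool chain (which uses high-precision propagators over the full catalog), and the initial state vector below is a made-up, roughly circular 700 km orbit.

      # Propagate one LEO object under two-body gravity with an adaptive-step integrator,
      # so the step size shrinks automatically where the dynamics demand it.
      import numpy as np
      from scipy.integrate import solve_ivp

      MU_EARTH = 3.986004418e14   # gravitational parameter [m^3/s^2]

      def two_body(_t, state):
          r = state[:3]
          accel = -MU_EARTH * r / np.linalg.norm(r)**3
          return np.concatenate([state[3:], accel])

      r0 = np.array([7.078e6, 0.0, 0.0])        # ~700 km altitude
      v0 = np.array([0.0, 7.504e3, 0.0])        # roughly circular speed
      sol = solve_ivp(two_body, (0.0, 6000.0), np.concatenate([r0, v0]),
                      rtol=1e-9, atol=1e-6, max_step=60.0)

      final_alt_km = (np.linalg.norm(sol.y[:3, -1]) - 6.371e6) / 1e3
      print(f"steps taken: {sol.t.size}, final altitude ~ {final_alt_km:.1f} km")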

  3. Space telescope neutral buoyancy simulations: The first two years

    NASA Technical Reports Server (NTRS)

    Sanders, F. G.

    1982-01-01

    Neutral buoyancy simulations conducted to validate the crew systems interface as it relates to space telescope on-orbit maintenance and contingency operations are discussed. The initial concept validation tests using low-fidelity mockups are described. The entire spectrum of proposed space telescope refurbishment and selected contingencies, using upgraded mockups which reflect flight hardware, is reported. Findings that may be applicable to future efforts of a similar nature are presented.

  4. Experimental identification of a comb-shaped chaotic region in multiple parameter spaces simulated by the Hindmarsh—Rose neuron model

    NASA Astrophysics Data System (ADS)

    Jia, Bing

    2014-03-01

    A comb-shaped chaotic region has been simulated in multiple two-dimensional parameter spaces using the Hindmarsh-Rose (HR) neuron model in many recent studies, which can interpret almost all of the previously simulated bifurcation processes with chaos in neural firing patterns. In the present paper, a comb-shaped chaotic region in a two-dimensional parameter space was reproduced, presenting different period-adding bifurcation processes with chaos as one parameter was varied while the other was fixed at different levels. In the biological experiments, different period-adding bifurcation scenarios with chaos, obtained by decreasing the extra-cellular calcium concentration, were observed from some neural pacemakers at different levels of extra-cellular 4-aminopyridine concentration and from other pacemakers at different levels of extra-cellular caesium concentration. By using the nonlinear time series analysis method, the deterministic dynamics of the experimental chaotic firings were investigated. The period-adding bifurcations with chaos observed in the experiments resembled those simulated in the comb-shaped chaotic region using the HR model. The experimental results show that period-adding bifurcations with chaos are preserved in different two-dimensional parameter spaces, which provides evidence of the existence of the comb-shaped chaotic region and a demonstration of the simulation results in different two-dimensional parameter spaces in the HR neuron model. The results also present relationships between different firing patterns in two-dimensional parameter spaces.
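
    For readers unfamiliar with the model, the standard Hindmarsh-Rose equations can be integrated in a few lines; the parameter values below are common textbook choices and are not necessarily the ones used in the paper, and the spike count is only a crude summary of the firing pattern.

      # Integrate the Hindmarsh-Rose neuron model and count spikes; sweeping I (or other
      # parameters) along a line in parameter space reveals period-adding bifurcations.
      import numpy as np
      from scipy.integrate import solve_ivp

      def hindmarsh_rose(_t, state, I=3.0, a=1.0, b=3.0, c=1.0, d=5.0,
                         r=0.006, s=4.0, x_rest=-1.6):
          x, y, z = state
          dx = y - a * x**3 + b * x**2 - z + I
          dy = c - d * x**2 - y
          dz = r * (s * (x - x_rest) - z)
          return [dx, dy, dz]

      sol = solve_ivp(hindmarsh_rose, (0.0, 2000.0), [-1.6, 0.0, 0.0],
                      max_step=0.1, rtol=1e-8)
      x = sol.y[0]
      spikes = np.sum((x[1:] > 1.0) & (x[:-1] <= 1.0))   # upward threshold crossings
      print(f"spikes in run: {spikes}")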

  5. Assessment of the Weather Research and Forecasting (WRF) model for simulation of extreme rainfall events in the upper Ganga Basin

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Osuri, Krishna K.; Mujumdar, Pradeep P.; Niyogi, Dev

    2018-02-01

    Reliable estimates of extreme rainfall events are necessary for an accurate prediction of floods. Most of the global rainfall products are available at a coarse resolution, rendering them less desirable for extreme rainfall analysis. Therefore, regional mesoscale models such as the advanced research version of the Weather Research and Forecasting (WRF) model are often used to provide rainfall estimates at fine grid spacing. Modelling heavy rainfall events is an enduring challenge, as such events depend on multi-scale interactions and on model configuration choices such as grid spacing, physical parameterization and initialization. With this background, the WRF model is implemented in this study to investigate the impact of different processes on extreme rainfall simulation, by considering a representative event that occurred during 15-18 June 2013 over the Ganga Basin in India, which is located at the foothills of the Himalayas. This event is simulated with ensembles involving four different microphysics (MP) schemes, two cumulus (CU) parameterizations, two planetary boundary layer (PBL) schemes and two land surface physics options, as well as different resolutions (grid spacing) within the WRF model. The simulated rainfall is evaluated against the observations from 18 rain gauges and the Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis (TMPA) 3B42RT version 7 data. From the analysis, it should be noted that the choice of MP scheme influences the spatial pattern of rainfall, while the choice of PBL and CU parameterizations influences the magnitude of rainfall in the model simulations. Further, the WRF run with the Goddard MP, Mellor-Yamada-Janjic PBL and Betts-Miller-Janjic CU schemes is found to perform "best" in simulating this heavy rain event. The selected configuration is evaluated for several heavy to extremely heavy rainfall events that occurred across different months of the monsoon season in the region. The model performance improved through

  6. Simulated Space Environment Effects on a Candidate Solar Sail Material

    NASA Technical Reports Server (NTRS)

    Kang, Jin Ho; Bryant, Robert G.; Wilkie, W. Keats; Wadsworth, Heather M.; Craven, Paul D.; Nehls, Mary K.; Vaughn, Jason A.

    2017-01-01

    For long duration missions of solar sails, the sail material needs to survive harsh space environments, and the degradation of the sail material controls operational lifetime. Therefore, understanding the effects of the space environment on the sail membrane is essential for mission success. In this study, we investigated the effects of simulated space environment exposure (ionizing radiation and thermal aging) and of simulated potential damage on the mechanical, thermal and optical properties of a commercial off-the-shelf (COTS) polyester solar sail membrane to assess the degradation mechanisms of a feasible solar sail. The solar sail membrane was exposed to high energy electrons (about 70 keV and 10 nA/cm2), and the physical properties were characterized. After a dose of about 8.3 Grad, the tensile modulus, tensile strength and failure strain of the sail membrane decreased by about 20-95%. The aluminum reflective layer was damaged and partially delaminated, but it did not show any significant change in solar absorbance or thermal emittance. The effect on mechanical properties of a pre-cracked sample, simulating potential impact damage of the sail membrane, as well as thermal aging effects on metallized PEN (polyethylene naphthalate) film will be discussed.

  7. Effect of coenzyme Q10 supplementation on heart failure: a meta-analysis123

    PubMed Central

    Thompson-Paul, Angela M; Bazzano, Lydia A

    2013-01-01

    Background: Coenzyme Q10 (CoQ10; also called ubiquinone) is an antioxidant that has been postulated to improve functional status in congestive heart failure (CHF). Several randomized controlled trials have examined the effects of CoQ10 on CHF with inconclusive results. Objective: The objective of this meta-analysis was to evaluate the impact of CoQ10 supplementation on the ejection fraction (EF) and New York Heart Association (NYHA) functional classification in patients with CHF. Design: A systematic review of the literature was conducted by using databases including MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, and manual examination of references from selected studies. Studies included were randomized controlled trials of CoQ10 supplementation that reported the EF or NYHA functional class as a primary outcome. Information on participant characteristics, trial design and duration, treatment, dose, control, EF, and NYHA classification was extracted by using a standardized protocol. Results: Supplementation with CoQ10 resulted in a pooled mean net change of 3.67% (95% CI: 1.60%, 5.74%) in the EF and −0.30 (95% CI: −0.66, 0.06) in the NYHA functional class. Subgroup analyses showed significant improvement in EF for crossover trials, trials with treatment duration ≤12 wk, studies published before 1994, studies with a dose ≤100 mg CoQ10/d, and in patients with less severe CHF. These subgroup analyses should be interpreted cautiously because of the small number of studies and patients included in each subgroup. Conclusions: Pooled analyses of available randomized controlled trials suggest that CoQ10 may improve the EF in patients with CHF. Additional well-designed studies that include more diverse populations are needed. PMID:23221577
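
    A pooled mean net change of the kind reported above is conventionally obtained by inverse-variance weighting of the per-trial effects. The sketch below shows the standard fixed-effect calculation with made-up numbers; it is not the data or the exact model used in this meta-analysis.

        import numpy as np

        # Hypothetical per-trial net changes in EF (%) and their standard errors.
        effects = np.array([4.1, 2.5, 6.0, 1.2])
        se      = np.array([1.8, 1.1, 2.4, 1.5])

        w = 1.0 / se**2                              # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)     # fixed-effect pooled estimate
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

        print(f"pooled net change = {pooled:.2f}% (95% CI: {lo:.2f}%, {hi:.2f}%)")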

  8. Phase space quantum mechanics - Direct

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasiri, S.; Sobouti, Y.; Taati, F.

    2006-09-15

    The conventional approach to quantum mechanics in phase space (q,p) is to take the operator-based quantum mechanics of Schroedinger, or an equivalent, and assign to it a c-number function in phase space. We propose to begin with a higher level of abstraction, in which the independence and the symmetric role of q and p are maintained throughout, and arrive at once at phase space state functions. Upon reduction to the q- or p-space, the proposed formalism gives conventional quantum mechanics, however, with a definite rule for ordering of factors of noncommuting observables. Further conceptual and practical merits of the formalism are demonstrated throughout the text.

  9. Data-Driven Learning of Q-Matrix

    PubMed Central

    Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2013-01-01

    The recent surge of interest in cognitive assessment has led to the development of novel statistical models for diagnostic classification. Central to many such models is the well-known Q-matrix, which specifies the item–attribute relationships. This article proposes a data-driven approach to identification of the Q-matrix and estimation of related model parameters. A key ingredient is a flexible T-matrix that relates the Q-matrix to response patterns. The flexibility of the T-matrix allows the construction of a natural criterion function as well as a computationally amenable algorithm. Simulation results are presented to demonstrate the usefulness and applicability of the proposed method. An extension to handling of the Q-matrix with partial information is presented. The proposed method also provides a platform on which important statistical issues, such as hypothesis testing and model selection, may be formally addressed. PMID:23926363
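
    To make the role of the Q-matrix concrete, the sketch below builds the ideal response patterns implied by a small hypothetical Q-matrix under the DINA-style rule (an examinee answers an item correctly only if all required attributes are mastered). This is a common construction underlying Q-matrix methods, not the specific T-matrix formulation of the article.

        import numpy as np

        # Hypothetical Q-matrix: 4 items x 3 attributes; Q[i, k] = 1 means item i
        # requires attribute k.
        Q = np.array([[1, 0, 0],
                      [0, 1, 0],
                      [1, 1, 0],
                      [0, 1, 1]])

        K = Q.shape[1]
        # All 2^K attribute-mastery profiles, one per row.
        profiles = np.array([[(m >> k) & 1 for k in range(K)] for m in range(2 ** K)])

        # DINA-style ideal responses: profile alpha answers item i correctly iff it
        # covers every attribute required by row i of Q.
        ideal = np.all(profiles[:, None, :] >= Q[None, :, :], axis=2).astype(int)

        print(ideal)   # rows: mastery profiles, columns: items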

  10. Microbiological assay of the Marshall Space Flight Center neutral buoyancy simulator

    NASA Technical Reports Server (NTRS)

    Beyerle, F. J.

    1973-01-01

    A neutral buoyancy simulator tank system is described in terms of microbiological and medical safety for astronauts. The system was designed to simulate a gravity-free state for evaluation of orbital operations in a microorganism-free environment. Methods for the identification and elimination of specific microorganisms are dealt with as measures for a pure system of space environment simulation.

  11. Nonlinear simulation of the fishbone instability

    NASA Astrophysics Data System (ADS)

    Idouakass, Malik; Faganello, Matteo; Berk, Herbert; Garbet, Xavier; Benkadda, Sadruddin; PIIM Team; IFS Team; IRFM Team

    2014-10-01

    We propose to extend the Odblom-Breizman precessional fishbone model to account for both the MagnetoHydroDynamic (MHD) nonlinearity at the q = 1 surface and the nonlinear response of the energetic particles contained within the q = 1 surface. This electromagnetic mode, whose excitation, damping and frequency chirping are determined by the self-consistent interaction between an energetic trapped particle population and the bulk plasma evolution, can induce effective transport and losses for the energetic particles, be they alpha particles in next-generation fusion devices or heated particles in present tokamaks. The model is reduced to its simplest form, assuming a reduced MHD description for the bulk plasma and a two-dimensional phase-space evolution (gyro and bounce averaged) for deeply trapped energetic particles. Numerical simulations have been performed in order to characterize the mode chirping and saturation, in particular looking at the interplay between the development of phase-space structures and the system dissipation associated with the MHD nonlinearities at the resonance locations.

  12. sghC1q, a novel C1q family member from half-smooth tongue sole (Cynoglossus semilaevis): identification, expression and analysis of antibacterial and antiviral activities.

    PubMed

    Zeng, Yan; Xiang, Jinsong; Lu, Yang; Chen, Yadong; Wang, Tianzi; Gong, Guangye; Wang, Lei; Li, Xihong; Chen, Songlin; Sha, Zhenxia

    2015-01-01

    The C1q family includes many proteins that contain a globular (gC1q) domain, and this family is widely conserved from bacteria to mammals. The family is divided into three subgroups: C1q, C1q-like and ghC1q. In this study, a novel C1q family member, sghC1q, was cloned and identified from Cynoglossus semilaevis (named CssghC1q). The full-length CssghC1q cDNA spans 905 bp, including an open reading frame (ORF) of 768 bp, a 5'-untranslated region (UTR) of 25 bp and a 3'-UTR of 112 bp. The ORF encodes a putative protein of 255 amino acids (aa) with a deduced molecular weight of 28 kDa. The predicted protein contains a signal peptide (aa 1-19), a coiled-coil region (aa 61-102) and a globular C1q (gC1q) domain (aa 117-255). Protein sequence alignment indicated that the C-terminus of CssghC1q is highly conserved across several species. Phylogenetic analysis indicated that CssghC1q is most closely related to Maylandia zebra C1q-like-2-like. The CssghC1q genomic sequence spanned 1562 bp, with three exons and two introns. CssghC1q is constitutively expressed in all evaluated tissues, with the highest expression in the liver and the weakest in the heart. After a challenge with Vibrio anguillarum, CssghC1q transcript levels exhibited distinct time-dependent response patterns in the blood, head kidney, skin, spleen, intestine and liver. Recombinant CssghC1q protein exhibited antimicrobial activities against Gram-negative bacteria, Gram-positive bacteria and viruses. The minimum inhibitory concentration (MIC) values against Vibrio harveyi, Vibrio anguillarum, Pseudomonas aeruginosa and Staphylococcus aureus were 0.043 mg/mL, 0.087 mg/mL, 0.174 mg/mL and 0.025 mg/mL, respectively. A low concentration (0.06 mg/mL) of CssghC1q showed significant antiviral activity in vitro against nervous necrosis virus (NNV). These results suggest that CssghC1q plays a vital role in immune defense against bacteria and viruses.

  13. Microsatellite analysis of loss of heterozygosity on chromosomes 9q, 11p and 17p in medulloblastomas.

    PubMed

    Albrecht, S; von Deimling, A; Pietsch, T; Giangaspero, F; Brandner, S; Kleihues, P; Wiestler, O D

    1994-02-01

    Medulloblastoma (MB) is a primitive neuroectodermal tumour of the cerebellum whose pathogenesis is poorly understood. Previous studies suggest a role for loci on chromosomes 11p and 17p in the pathogenesis of MB. Evidence for another potential MB locus has recently emerged from studies on Gorlin syndrome (GS), an autosomal dominant syndrome with multiple basal cell carcinomas, epithelial jaw cysts, and skeletal anomalies. Since GS can be associated with MB, we examined sporadic (non-GS) cases of MB for evidence of loss of heterozygosity (LOH) on chromosome 9 where a putative GS locus has been localized to band q31. Nineteen paired blood and MB DNA specimens from 16 patients (11 primary tumours, two primary with recurrent tumours, one primary tumour and cell line, two cell lines) were studied by PCR analysis of microsatellites at D9S55 (9p12), D9S15 (9q13-q21.1), D9S127 (9q21.1-21.3), D9S12 (9q22.3), D9S58 (9q22.3-q31), D9S109 (9q31), D9S53 (9q31), GSN (9q33), D9S60 (9q33-q34), D9S65 (9q33-q34), ASS (9q34), D9S67 (9q34.3), TH (11p15.5), D11S490 (11q23.3), D17S261 (17p11.2-12), D17S520 (17p12), TP53 (17p13.1), D17S5 (17p13.3), D17S515 (17q22-qter), and by RFLP analysis at the WT-1 locus (11p13). Only two tumours had LOH on 9q. One was non-informative at D9S15, D9S65, and GSN but showed LOH at D9S127, D9S12, D9S58, D9S109, D9S53, D9S60, ASS, and D9S67. The other was uninterpretable at D9S65 and non-informative at D9S15, D9S58, D9S53, and D9S67 but exhibited LOH at D9S127, D9S12, D9S109, GSN, D9S60, and ASS. Both these cases were informative at D9S55 without LOH.(ABSTRACT TRUNCATED AT 250 WORDS)

  14. The politics of space mining - An account of a simulation game

    NASA Astrophysics Data System (ADS)

    Paikowsky, Deganit; Tzezana, Roey

    2018-01-01

    Celestial bodies like the Moon and asteroids contain materials and precious metals, which are valuable for human activity on Earth and beyond. Space mining has mainly been relegated to the realm of science fiction and has not been treated seriously by the international community. Private industry, however, is starting to organize for space mining, and success on this front would have a major impact on all nations. We present in this paper a review of current space mining ventures and of the international legislation that could stand in their way - or aid them in their mission. Following that, we present the results of a role-playing simulation in which the roles of several important nations were played by students of international relations. The results of the simulation are used as a basis for forecasting the potential initial responses of the nations of the world to a successful space mining operation in the future.

  15. Experimental evidences for molecular origin of low-Q peak in neutron/x-ray scattering of 1-alkyl-3-methylimidazolium bis(trifluoromethanesulfonyl)amide ionic liquids

    NASA Astrophysics Data System (ADS)

    Fujii, Kenta; Kanzaki, Ryo; Takamuku, Toshiyuki; Kameda, Yasuo; Kohara, Shinji; Kanakubo, Mitsuhiro; Shibayama, Mitsuhiro; Ishiguro, Shin-ichi; Umebayashi, Yasuhiro

    2011-12-01

    Short- and long-range liquid structures of [CnmIm+][TFSA-] with n = 2, 4, 6, 8, 10, and 12 have been studied by high-energy x-ray diffraction (HEXRD) and small-angle neutron scattering (SANS) experiments with the aid of MD simulations. The observed x-ray structure factor, S(Q), for the ionic liquids with alkyl-chain length n > 6 exhibited a characteristic peak in the low-Q range of 0.2-0.4 Å⁻¹, indicating heterogeneity in these ionic liquids. SANS profiles IH(Q) and ID(Q) for the normal and the alkyl-group-deuterated ionic liquids, respectively, showed significant peaks for n = 10 and 12, without any form factor component from large spherical or spheroidal aggregates such as micelles in solution. The peaks for n = 10 and 12 essentially disappeared in the difference SANS profiles ΔI(Q) [=ID(Q) - IH(Q)], although that for n = 12 remained slightly. This suggests that the long-range correlations originating from the alkyl groups hardly contribute to the low-Q peak intensity in SANS. To reveal the molecular origin of the low-Q peak, we introduce a new function: the x-ray structure factor intensity at a given Q as a function of r, SQpeak(r). The SQpeak(r) function suggests that the observed low-Q peak intensity, which depends on n, originates from liquid structures in two r-regions, 5-8 and 8-15 Å, for all ionic liquids examined except n = 12. Atomistic MD simulations are consistent with the HEXRD and SANS experiments, and we discuss the relationship between the variation of the low-Q peak and the real-space structure as the alkyl group of CnmIm+ is lengthened.

  16. Flight simulator with spaced visuals

    NASA Technical Reports Server (NTRS)

    Gilson, Richard D. (Inventor); Thurston, Marlin O. (Inventor); Olson, Karl W. (Inventor); Ventola, Ronald W. (Inventor)

    1980-01-01

    A flight simulator arrangement wherein a conventional, movable base flight trainer is combined with a visual cue display surface spaced a predetermined distance from an eye position within the trainer. Thus, three degrees of motive freedom (roll, pitch and crab) are provided for a visual, proprioceptive, and vestibular cue system by the trainer, while the remaining geometric visual cue image alterations are developed by a video system. A geometric approach to computing the runway image eliminates the need to electronically compute trigonometric functions, while utilization of a line generator and a designated vanishing point at the video system raster permits facile development of the images of the longitudinal edges of the runway.

  17. Multiple-body simulation with emphasis on integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1993-01-01

    The program to obtain intergrid communications - Pegasus - was enhanced to make better use of computing resources. Periodic block tridiagonal and pentadiagonal routines in OVERFLOW were modified to use a better algorithm to speed up the calculation for grids with periodic boundary conditions. Several programs were added to the collar grid tools, and a user-friendly shell script was developed to help users generate collar grids. The user interface for HYPGEN was modified to cope with changes in HYPGEN. ET/SRB attach hardware grids were added to the computational model of the space shuttle and are currently incorporated into the refined shuttle model jointly developed at Johnson Space Center and Ames Research Center. Flow simulation for the integrated space shuttle vehicle at flight Reynolds number was carried out and compared with flight data as well as with the earlier simulation at wind tunnel Reynolds number.

  18. Simulation Results for Airborne Precision Spacing along Continuous Descent Arrivals

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Abbott, Terence S.; Capron, William R.; Baxley, Brian T.

    2008-01-01

    This paper describes the results of a fast-time simulation experiment and a high-fidelity simulator validation with merging streams of aircraft flying Continuous Descent Arrivals through generic airspace to a runway at Dallas-Ft Worth. Aircraft made small speed adjustments based on an airborne-based spacing algorithm, so as to arrive at the threshold exactly at the assigned time interval behind their Traffic-To-Follow. The 40 aircraft were initialized at different altitudes and speeds on one of four different routes, and then merged at different points and altitudes while flying Continuous Descent Arrivals. This merging and spacing using flight deck equipment and procedures to augment or implement Air Traffic Management directives is called Flight Deck-based Merging and Spacing, an important subset of a larger Airborne Precision Spacing functionality. This research indicates that Flight Deck-based Merging and Spacing initiated while at cruise altitude and well prior to the Terminal Radar Approach Control entry can significantly contribute to the delivery of aircraft at a specified interval to the runway threshold with a high degree of accuracy and at a reduced pilot workload. Furthermore, previously documented work has shown that using a Continuous Descent Arrival instead of a traditional step-down descent can save fuel, reduce noise, and reduce emissions. Research into Flight Deck-based Merging and Spacing is a cooperative effort between government and industry partners.

  19. Simulation of the Effect of Realistic Space Vehicle Environments on Binary Metal Alloys

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Poirier, D. R.; Heinrich, J. C.; Sung, P. K.; Felicelli, S. D.; Phelps, Lisa (Technical Monitor)

    2001-01-01

    Simulations that assess the effect of space vehicle acceleration environments on the solidification of Pb-Sb alloys are reported. Space microgravity missions are designed to provide a near zero-g acceleration environment for various types of scientific experiments. Realistically, these space missions cannot provide a perfect environment. Vibrations caused by crew activity, on-board experiments, support systems (pumps, fans, etc.), periodic orbital maneuvers, and water dumps can all cause perturbations to the microgravity environment. In addition, the drag on the space vehicle is a source of acceleration. Therefore, it is necessary to predict the impact of these vibration perturbations and the steady-state drag acceleration on the experiments. These predictions can be used to design mission timelines, so that the experiment is run during times when the impact of the acceleration environment is acceptable for the experiment of interest. The simulations reported herein were conducted using a finite element model that includes mass, species, momentum, and energy conservation. This model predicts the existence of "channels" within the processing mushy zone and subsequently "freckles" within the fully processed solid, which are the effects of thermosolutal convection. It is necessary to mitigate thermosolutal convection during space experiments on metal alloys in order to study and characterize diffusion-controlled transport phenomena (microsegregation) that are normally coupled with macrosegregation. The model allows simulation of steady-state and transient acceleration values ranging from no acceleration (0 g), to microgravity conditions (10^-6 to 10^-3 g), to terrestrial gravity conditions (1 g). The transient acceleration environments simulated were from the STS-89 SpaceHAB mission and from the STS-94 SpaceLAB mission, with on-orbit accelerometer data during different mission periods used as inputs for the simulation model. Periods of crew exercise

  20. Monte Carlo simulation of TrueBeam flattening-filter-free beams using Varian phase-space files: Comparison with experimental data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belosi, Maria F.; Fogliata, Antonella, E-mail: antonella.fogliata-cozzi@eoc.ch, E-mail: afc@iosi.ch; Cozzi, Luca

    2014-05-15

    Purpose: Phase-space files for Monte Carlo simulation of the Varian TrueBeam beams have been made available by Varian. The aim of this study is to evaluate the accuracy of the distributed phase-space files for flattening filter free (FFF) beams against experimental measurements from ten TrueBeam Linacs. Methods: The phase-space files have been used as input in PRIMO, a recently released Monte Carlo program based on the PENELOPE code. Simulations of 6 and 10 MV FFF were computed in a virtual water phantom for field sizes 3 × 3, 6 × 6, and 10 × 10 cm² using 1 × 1 × 1 mm³ voxels and for 20 × 20 and 40 × 40 cm² with 2 × 2 × 2 mm³ voxels. The particles contained in the initial phase-space files were transported downstream to a plane just above the phantom surface, where a subsequent phase-space file was tallied. Particles were transported downstream of this second phase-space file to the water phantom. Experimental data consisted of depth doses and profiles at five different depths acquired at SSD = 100 cm (seven datasets) and SSD = 90 cm (three datasets). Simulations and experimental data were compared in terms of dose difference. Gamma analysis was also performed using 1%, 1 mm and 2%, 2 mm criteria of dose-difference and distance-to-agreement, respectively. Additionally, the parameters characterizing the dose profiles of unflattened beams were evaluated for both measurements and simulations. Results: Analysis of depth dose curves showed that dose differences increased with increasing field size and depth; this effect might be partly motivated by an underestimation of the primary beam energy used to compute the phase-space files. Average dose differences reached 1% for the largest field size. Lateral profiles presented dose differences well within 1% for fields up to 20 × 20 cm², while the discrepancy increased toward 2% in the 40 × 40 cm² cases. Gamma analysis resulted in an agreement of 100% when a 2%, 2 mm
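
    The gamma analysis mentioned above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The sketch below implements the standard 1D global gamma index on toy profiles; it is illustrative only and is not the implementation used with the PRIMO simulations or the TrueBeam measurements.

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.02, dta=2.0):
            """Global 1D gamma index: dd is the dose-difference criterion as a
            fraction of the reference maximum, dta the distance criterion in mm."""
            d_norm = dd * d_ref.max()
            gamma = np.empty_like(d_ref, dtype=float)
            for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
                dist2 = ((x_eval - xr) / dta) ** 2
                dose2 = ((d_eval - dr) / d_norm) ** 2
                gamma[i] = np.sqrt(np.min(dist2 + dose2))
            return gamma

        # Toy profiles on a 1 mm grid (hypothetical, not the TrueBeam data).
        x = np.arange(0.0, 100.0, 1.0)
        reference = np.exp(-((x - 50.0) / 30.0) ** 2)
        evaluated = 1.01 * np.exp(-((x - 50.5) / 30.0) ** 2)
        g = gamma_1d(x, reference, x, evaluated)
        print(f"pass rate (gamma <= 1): {np.mean(g <= 1.0):.1%}")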

  1. Fine Mapping on Chromosome 13q32–34 and Brain Expression Analysis Implicates MYO16 in Schizophrenia

    PubMed Central

    Rodriguez-Murillo, Laura; Xu, Bin; Roos, J Louw; Abecasis, Gonçalo R; Gogos, Joseph A; Karayiorgou, Maria

    2014-01-01

    We previously reported linkage of schizophrenia and schizoaffective disorder to 13q32–34 in the European descent Afrikaner population from South Africa. The nature of genetic variation underlying linkage peaks in psychiatric disorders remains largely unknown and both rare and common variants may be contributing. Here, we examine the contribution of common variants located under the 13q32–34 linkage region. We used densely spaced SNPs to fine map the linkage peak region using both a discovery sample of 415 families and a meta-analysis incorporating two additional replication family samples. In a second phase of the study, we use one family-based data set with 237 families and independent case–control data sets for fine mapping of the common variant association signal using HapMap SNPs. We report a significant association with a genetic variant (rs9583277) within the gene encoding for the myosin heavy-chain Myr 8 (MYO16), which has been implicated in neuronal phosphoinositide 3-kinase signaling. Follow-up analysis of HapMap variation within MYO16 in a second set of Afrikaner families and additional case–control data sets of European descent highlighted a region across introns 2–6 as the most likely region to harbor common MYO16 risk variants. Expression analysis revealed a significant increase in the level of MYO16 expression in the brains of schizophrenia patients. Our results suggest that common variation within MYO16 may contribute to the genetic liability to schizophrenia. PMID:24141571

  2. The q-harmonic oscillators, q-coherent states and the q-symplecton

    NASA Technical Reports Server (NTRS)

    Biedenharn, L. C.; Lohe, M. A.; Nomura, Masao

    1993-01-01

    The recently introduced notion of a quantum group is discussed conceptually and then related to deformed harmonic oscillators ('q-harmonic oscillators'). Two developments in applying q-harmonic oscillators are reviewed: q-coherent states and the q-symplecton.

  3. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  4. Time-Accurate Unsteady Pressure Loads Simulated for the Space Launch System at Wind Tunnel Conditions

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, William L.; Glass, Christopher E.; Streett, Craig L.; Schuster, David M.

    2015-01-01

    A transonic flow field about a Space Launch System (SLS) configuration was simulated with the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics (CFD) code at wind tunnel conditions. Unsteady, time-accurate computations were performed using second-order Delayed Detached Eddy Simulation (DDES) for up to 1.5 physical seconds. The surface pressure time history was collected at 619 locations, 169 of which matched locations on a 2.5 percent wind tunnel model that was tested in the 11 ft. x 11 ft. test section of the NASA Ames Research Center's Unitary Plan Wind Tunnel. Comparisons between computation and experiment showed that the peak surface pressure RMS level occurs behind the forward attach hardware, and good agreement for frequency and power was obtained in this region. Computational domain, grid resolution, and time step sensitivity studies were performed. These included an investigation of pseudo-time sub-iteration convergence. Using these sensitivity studies and experimental data comparisons, a set of best practices to date has been established for FUN3D simulations for SLS launch vehicle analysis. To the authors' knowledge, this is the first time DDES has been used in a systematic approach to establish the simulation time needed to analyze unsteady pressure loads on a space launch vehicle such as the NASA SLS.

  5. Heavy-flavored tetraquark states with the QQQ̄Q̄ configuration

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Liu, Yan-Rui; Chen, Kan; Liu, Xiang; Zhu, Shi-Lin

    2018-05-01

    In the framework of the color-magnetic interaction, we systematically investigate in this work the mass spectrum of tetraquark states composed of four heavy quarks with the QQQ̄Q̄ configuration. We also show their strong decay patterns. Stable or narrow states in the bbb̄c̄ and bcb̄c̄ systems are found to be possible. We hope these studies will be helpful to the experimental search for fully heavy exotic tetraquark states.

  6. A position-dependent mass harmonic oscillator and deformed space

    NASA Astrophysics Data System (ADS)

    da Costa, Bruno G.; Borges, Ernesto P.

    2018-04-01

    We consider canonically conjugated generalized space and linear momentum operators x̂_q and p̂_q in quantum mechanics, associated with a generalized translation operator which produces infinitesimal deformed displacements controlled by a deformation parameter q. A canonical transformation (x̂, p̂) → (x̂_q, p̂_q) maps the Hamiltonian of a position-dependent mass particle in the usual space onto another Hamiltonian of a particle with constant mass in a conservative force field of the deformed space. The equation of motion for the classical phase space (x, p) may be expressed in terms of the deformed (dual) q-derivative. We revisit the problem of a q-deformed oscillator in both the classical and quantum formalisms. In particular, this canonical transformation maps a particle with position-dependent mass in a harmonic potential onto a particle with constant mass in a Morse potential. The trajectories in the phase spaces (x, p) and (xq, pq) are analyzed for different values of the deformation parameter. Finally, we compare the results of the problem in the classical and quantum formalisms through the correspondence principle and the WKB approximation.
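
    For orientation, one common convention for such a deformed calculus (a Borges-type q-algebra) defines the deformed displacement and the associated q-derivative as follows; the paper may use a different parameterization, so this is only an assumed illustration:

        x \oplus_q \varepsilon \equiv x + \varepsilon + (1-q)\,x\,\varepsilon, \qquad
        D_q f(x) \equiv \lim_{\varepsilon \to 0} \frac{f(x \oplus_q \varepsilon) - f(x)}{\varepsilon}
                 = \bigl[1 + (1-q)\,x\bigr] \frac{df}{dx}.

    In this convention the ordinary derivative is recovered as q → 1, which is the sense in which the generalized translation produces an infinitesimal deformed displacement.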

  7. Comparative characterization of short monomeric polyglutamine peptides by replica exchange molecular dynamics simulation.

    PubMed

    Nakano, Miki; Watanabe, Hirofumi; Rothstein, Stuart M; Tanaka, Shigenori

    2010-05-27

    Polyglutamine (polyQ) diseases are caused by an abnormal expansion of CAG repeats. While their detailed structure remains unclear, polyQ peptides assume beta-sheet structures when they aggregate. To investigate the conformational ensemble of short, monomeric polyQ peptides, which consist of 15 glutamine residues (Q(15)), we performed replica exchange molecular dynamics (REMD) simulations. We found that Q(15) can assume multiple configurations due to all of the residues affecting the formation of side-chain hydrogen bonds. Analysis of the free energy landscape reveals that Q(15) has a basin for random-coil structures and another for alpha-helix or beta-turn structures. To investigate properties of aggregated polyQ peptides, we performed multiple molecular dynamics (MMD) simulations for monomeric and oligomeric Q(15). MMD revealed that the formation of oligomers stabilizes the beta-turn structure by increasing the number of hydrogen bonds between the main chains.

  8. Essential energy space random walks to accelerate molecular dynamics simulations: Convergence improvements via an adaptive-length self-healing strategy

    NASA Astrophysics Data System (ADS)

    Zheng, Lianqing; Yang, Wei

    2008-07-01

    Recently, the accelerated molecular dynamics (AMD) technique was generalized to realize essential energy space random walks so that further sampling enhancement and effective localized enhanced sampling could be achieved. This method is especially meaningful when the essential coordinates of the target events are not known a priori; moreover, the energy space metadynamics method was also introduced so that biasing free energy functions can be robustly generated. Despite the promising features of this method, the nonequilibrium nature of the metadynamics recursion makes it challenging to rigorously use the data obtained at the recursion stage for equilibrium analysis, such as free energy surface mapping; therefore, a large amount of data has to be discarded. To resolve this problem and further improve simulation convergence, as promised in our original paper, we report an alternate approach: the adaptive-length self-healing (ALSH) strategy for AMD simulations; this development is based on a recent self-healing umbrella sampling method. Here, the unit simulation length for each self-healing recursion is increasingly updated based on the Wang-Landau flattening judgment. When the unit simulation length for each update is long enough, all the following unit simulations naturally run into the equilibrium regime. Thereafter, these unit simulations can serve the dual purposes of recursion and equilibrium analysis. As demonstrated in our model studies, applying ALSH allows fast recursion and a small amount of wasted nonequilibrium data to be balanced. As a result, combining all the data obtained from all the unit simulations that are in the equilibrium regime via the weighted histogram analysis method, efficient convergence can be robustly ensured, especially for the purpose of free energy surface mapping.
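
    The Wang-Landau flattening judgment mentioned above is essentially a histogram-flatness test. The sketch below shows one common form of that test together with a toy version of the adaptive-length update; the tolerance, bin range, doubling rule, and stand-in "energy" samples are all assumptions for illustration, not the ALSH prescription itself.

        import numpy as np

        def is_flat(histogram, tolerance=0.8):
            """Wang-Landau-style test: every visited bin holds at least
            `tolerance` times the mean count over visited bins."""
            visited = histogram[histogram > 0]
            return visited.size > 0 and visited.min() >= tolerance * visited.mean()

        rng = np.random.default_rng(0)
        unit_length = 1000                    # MD steps per recursion unit (toy value)
        for recursion in range(5):
            # Stand-in for the energies sampled during one recursion unit.
            energies = rng.normal(size=unit_length)
            hist, _ = np.histogram(energies, bins=20, range=(-4.0, 4.0))
            if is_flat(hist):
                print(f"recursion {recursion}: flat, unit length stays {unit_length}")
            else:
                unit_length *= 2              # lengthen the next self-healing unit
                print(f"recursion {recursion}: not flat, unit length -> {unit_length}")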

  9. Configural Scoring of Simulator Sickness, Cybersickness and Space Adaptation Syndrome: Similarities and Differences?

    NASA Technical Reports Server (NTRS)

    Kennedy, Robert S.; Drexler, Julie M.; Compton, Daniel E.; Stanney, Kay M.; Lanham, Susan; Harm, Deborah L.

    2001-01-01

    From a survey of ten U.S. Navy flight simulators, a large number (N > 1,600 exposures) of self-reports of motion sickness symptomatology were obtained. Using these data, scoring algorithms were derived which permit groups of individuals to be scored either for 1) their total sickness experience in a particular device; or 2) according to three separable symptom clusters which emerged from a factor analysis. The resulting total scores are found to be proportional to other global motion sickness symptom checklist scores, with which they correlate (r = 0.82). The factors that surfaced from the analysis include clusters of symptoms referable to nausea, oculomotor disturbances, and disorientation (N, O, and D). The factor scores may have utility in differentiating the source of symptoms in different devices. The present chapter describes our experience with the use of both of these types of scores and illustrates their use with examples from flight simulators, space sickness and virtual environments.

  10. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
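
    As a concrete illustration of the workflow described above, the canonical introductory ProjectQ program allocates a qubit, applies a Hadamard gate, measures, and lets the compiler chain (here the default local simulator backend) execute the circuit:

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()             # default compiler chain + local simulator backend
        qubit = eng.allocate_qubit()   # allocate one logical qubit

        H | qubit                      # put it into an equal superposition
        Measure | qubit                # measure in the computational basis
        eng.flush()                    # push the circuit through the compiler

        print("Measured:", int(qubit)) # 0 or 1, each with probability 1/2

    Retargeting the same program to hardware or to a high-performance simulator is done by passing a different backend to MainEngine, which is the sense in which the compiler separates the algorithm from the target system.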

  11. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  12. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  13. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error in electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises from the nature of the measurement mechanism, and the inherent space-time ambiguity of the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements, and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.

  14. Advanced Analysis and Visualization of Space Weather Phenomena

    NASA Astrophysics Data System (ADS)

    Murphy, Joshua J.

    As the world becomes more technologically reliant, society as a whole becomes more susceptible to adverse interactions with the Sun. This "space weather" can produce significant effects on modern technology, from interrupting satellite service to causing serious damage to Earth-side power grids. These concerns have, over the past several years, prompted an outpouring of research in an attempt to understand the processes governing, and to provide a means of forecasting, space weather events. The research presented in this thesis couples to current work aimed at understanding Coronal Mass Ejections (CMEs) and their influence on the evolution of Earth's magnetic field and the associated Van Allen radiation belts. To aid in the analysis of how these solar wind transients affect Earth's magnetic field, a system named the Geospace/Heliosphere Observation & Simulation Tool-kit (GHOSTkit), along with its Python analysis tools, GHOSTpy, has been devised to calculate the adiabatic invariants of trapped particle motion within Earth's magnetic field. These invariants aid scientists in ordering observations of the radiation belts, providing a more natural presentation of data, but can be computationally expensive to calculate. The GHOSTpy system, in the phase presented here, is aimed at providing invariant calculations based on LFM magnetic field simulation data. This research first examines an ideal dipole application to gain an understanding of system performance. Following this, the challenges of applying the algorithms to gridded LFM MHD data are examined. Performance profiles are then presented, followed by a real-world application of the system.

  15. Brownian dynamics simulations on a hypersphere in 4-space

    NASA Astrophysics Data System (ADS)

    Nissfolk, Jarl; Ekholm, Tobias; Elvingson, Christer

    2003-10-01

    We describe an algorithm for performing Brownian dynamics simulations of particles diffusing on S3, a hypersphere in four dimensions. The system is chosen due to recent interest in doing computer simulations in a closed space where periodic boundary conditions can be avoided. We specifically address the question how to generate a random walk on the 3-sphere, starting from the solution of the corresponding diffusion equation, and we also discuss an efficient implementation based on controlled approximations. Since S3 is a closed manifold (space), the average square displacement during a random walk is no longer proportional to the elapsed time, as in R3. Instead, its time rate of change is continuously decreasing, and approaches zero as time becomes large. We show, however, that the effective diffusion coefficient can still be obtained from the time dependence of the square displacement.
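
    One simple way to realize such a random walk numerically, assuming a small time step, is to draw a Gaussian displacement in the tangent space of the current point and move along the corresponding great circle. The sketch below is a first-order approximation for illustration; it is not the controlled approximation derived in the paper from the solution of the diffusion equation on S3.

        import numpy as np

        rng = np.random.default_rng(1)

        def brownian_step_s3(x, D, dt):
            """One small-step Brownian move on the unit 3-sphere embedded in R^4:
            Gaussian displacement in the tangent space at x (variance 2*D*dt per
            component), followed by motion along the corresponding geodesic."""
            g = rng.normal(scale=np.sqrt(2.0 * D * dt), size=4)
            v = g - np.dot(g, x) * x          # project onto the tangent space at x
            theta = np.linalg.norm(v)         # angular (geodesic) step length
            if theta == 0.0:
                return x
            return np.cos(theta) * x + np.sin(theta) * (v / theta)

        # Random walk starting from a pole of S^3.
        x = np.array([0.0, 0.0, 0.0, 1.0])
        for _ in range(10000):
            x = brownian_step_s3(x, D=1.0, dt=1.0e-4)
        print("still on the sphere:", bool(np.isclose(np.linalg.norm(x), 1.0)))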

  16. A SLAM II simulation model for analyzing space station mission processing requirements

    NASA Technical Reports Server (NTRS)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM II simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives and an 8087 coprocessor chip. Using a time-phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle) and ground facility databases, estimates of ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.

  17. Q-PCR based bioburden assessment of drinking water throughout treatment and delivery to the International Space Station

    NASA Technical Reports Server (NTRS)

    Newcombe, David; Stuecker, Tara; La Duc, Myron; Venkateswaran, Kasthuri

    2005-01-01

    Previous studies indicated evidence of opportunistic pathogens in samples obtained during missions to the International Space Station (ISS). This study utilized TaqMan quantitative PCR to determine specific gene abundance in potable and non-potable ISS waters. Probe and primer sets specific to the small subunit rRNA genes were used to elucidate overall bacterial rRNA gene numbers, while those specific for Burkholderia cepacia and Stenotrophomonas maltophilia were optimized and used to probe for the presence of these two opportunistic pathogens. This research builds upon previous microbial diversity studies of ISS water and demonstrates the utility of the Q-PCR tool for examining water quality.
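
    Quantification in TaqMan assays of this kind typically rests on a standard curve relating threshold cycle (Ct) to template copy number. The sketch below shows that standard calculation with hypothetical dilution and sample values; it is not the calibration used in the study.

        import numpy as np

        # Standard curve from serial dilutions of a known template:
        # Ct = slope * log10(copies) + intercept   (all values hypothetical).
        log10_copies_std = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
        ct_std = np.array([33.1, 29.7, 26.3, 22.8, 19.5])
        slope, intercept = np.polyfit(log10_copies_std, ct_std, 1)

        efficiency = 10.0 ** (-1.0 / slope) - 1.0   # amplification efficiency
        print(f"slope {slope:.2f}, efficiency {efficiency:.1%}")

        # Convert sample Ct values (e.g., 16S rRNA gene assays) to copy estimates.
        ct_samples = np.array([24.2, 27.9, 31.5])
        copies = 10.0 ** ((ct_samples - intercept) / slope)
        print(np.round(copies))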

  18. Characterization of C1q in Teleosts

    PubMed Central

    Hu, Yu-Lan; Pan, Xin-Min; Xiang, Li-Xin; Shao, Jian-Zhong

    2010-01-01

    C1qs are key components of the classical complement pathway. They have been well documented in human and mammals, but little is known about their molecular and functional characteristics in fish. In the present study, full-length cDNAs of c1qA, c1qB, and c1qC from zebrafish (Danio rerio) were cloned, revealing the conservation of their chromosomal synteny and organization between zebrafish and other species. For functional analysis, the globular heads of C1qA (ghA), C1qB (ghB), and C1qC (ghC) were expressed in Escherichia coli as soluble proteins. Hemolytic inhibitory assays showed that hemolytic activity in carp serum can be inhibited significantly by anti-C1qA, -C1qB, and -C1qC of zebrafish, respectively, indicating that C1qA, C1qB, and C1qC are involved in the classical pathway and are conserved functionally from fish to human. Zebrafish C1qs also could specifically bind to heat-aggregated zebrafish IgM, human IgG, and IgM. The involvement of globular head modules in the C1q-dependent classical pathway demonstrates the structural and functional conservation of these molecules in the classical pathway and their IgM or IgG binding sites during evolution. Phylogenetic analysis revealed that c1qA, c1qB, and c1qC may be formed by duplications of a single copy of c1qB and that the C1q family is, evolutionarily, closely related to the Emu family. This study improves current understanding of the evolutionary history of the C1q family and C1q-mediated immunity. PMID:20615881

  19. Demonstration of NICT Space Weather Cloud --Integration of Supercomputer into Analysis and Visualization Environment--

    NASA Astrophysics Data System (ADS)

    Watari, S.; Morikawa, Y.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Kato, H.; Shimojo, S.; Murata, K. T.

    2010-12-01

    In the Solar-Terrestrial Physics (STP) field, the spatio-temporal resolution of computer simulations is becoming higher and higher because of the tremendous advancement of supercomputers. A further advance is Grid computing, which integrates distributed computational resources to provide scalable computing power. In simulation research, it is effective for researchers to design their own physical models, perform the calculations on a supercomputer, and analyze and visualize the results with familiar methods. A supercomputer, however, is far removed from the analysis and visualization environment. In general, researchers analyze and visualize on workstations (WSs) managed locally, because installing and operating software on a WS is easy. It is therefore necessary to copy data from the supercomputer to the WS manually, and the time needed to transfer the data over a long-delay network in practice hinders high-accuracy simulations. In terms of usability, seamlessly integrating a supercomputer with an analysis and visualization environment, using methods familiar to the researcher, is important. NICT has been developing a cloud computing environment (the NICT Space Weather Cloud). In the NICT Space Weather Cloud, disk servers are located near its supercomputer and the WSs used for data analysis and visualization, and they are connected to JGN2plus, a high-speed network for research and development. Distributed virtual high-capacity storage is also constructed with Grid Datafarm (Gfarm v2). Large data sets output from the supercomputer are transferred to the virtual storage through JGN2plus, so a researcher can concentrate on the research, using familiar methods, without regard to the distance between the supercomputer and the analysis and visualization environment. At present, a total of 16 disk servers are set up at NICT headquarters (Koganei, Tokyo), the JGN2plus NOC (Otemachi, Tokyo), the Okinawa Subtropical Environment Remote-Sensing Center, and the Cybermedia Center, Osaka University. They are connected on

  20. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    NASA Astrophysics Data System (ADS)

    Silva, N.; Esper, A.

    2012-01-01

    The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system level RAMS analysis allowed the assignment of criticalities to the high level components, which was further refined by a tailored software level RAMS analysis. The importance of the software level RAMS analysis in the identification of new failure modes, and its impact on the system level RAMS analysis, is discussed. Changes to the software architecture have also been recommended in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  1. PRANAS: A New Platform for Retinal Analysis and Simulation.

    PubMed

    Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry

    2017-01-01

    The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new, freely accessible, user-end software that allows this coding to be better understood. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator that allows large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, allowing the effects of memory on the statistics to be analyzed. PRANAS also integrates a tool that computes and represents receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.

  2. Modal simulation analysis of novel 3D elliptical ultrasonic transducer

    NASA Astrophysics Data System (ADS)

    Kurniawan, R.; Ali, S.; Ko, T. J.

    2018-03-01

    This paper presents the modal simulation analysis results for a novel 3D elliptical ultrasonic transducer. The research aims to develop an elliptical transducer that operates at ultrasonic frequencies and is able to generate three-dimensional motion in Cartesian space. The concept of the transducer design is essentially to find a coupled longitudinal-bending-bending mode frequency. To achieve this, modal simulation analysis was performed to find proper dimensions of the transducer such that the natural frequency of the 1st longitudinal mode closely matches the two natural frequencies of the 3rd bending modes. Finite element modelling (FEM) was used to perform this work.

  3. Selection and validation of endogenous reference genes for qRT-PCR analysis in leafy spurge (Euphorbia esula)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is the most important tool in measuring levels of gene expression due to its accuracy, specificity, and sensitivity. However, the accuracy of qRT-PCR analysis strongly depends on transcript normalization using stably expressed reference gene...

  4. Constitutional del(19)(q12q13.1) in a three-year-old girl with severe phenotypic abnormalities affecting multiple organ systems.

    PubMed

    Kulharya, A S; Michaelis, R C; Norris, K S; Taylor, H A; Garcia-Heras, J

    1998-06-05

    We present the clinical, cytogenetic, and molecular studies on a constitutional deletion of 19q ascertained prenatally due to decreased fetal activity and IUGR. Chromosome analysis by GTG banding on amniocytes suggested a del(19)(q13.1q13.3), but the analysis of microsatellites by PCR demonstrated that the deletion involved the distal segment of q12 and the proximal segment of q13.1 (15 cM). The severely affected female infant born at 38 weeks has clinical findings that may be related to haploinsufficiency of specific genes within 19q12.1-->q13.1 that control important processes of normal development and cell function.

  5. The 11th Space Simulation Conference

    NASA Technical Reports Server (NTRS)

    Bond, A. C. (Editor)

    1980-01-01

    Subject areas range from specialized issues dealing with the space and entry environments to the environmental testing of systems and complete spacecraft of present-day vintage. Various papers consider: the test and development of several key systems of the orbiter vehicle; integrated tests of complete satellites; new and unique test facilities developed to meet the demanding requirements of high fidelity simulation of test environments; and contamination species, including the instrumentation for detection and measurement of such. Special topics include improved thermal protection methodologies and approaches, sophisticated sensor developments, and other related testing and development areas.

  6. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
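
    The flavor of the Markov chain and Monte Carlo calculation can be illustrated with a toy model: a two-state daily weather chain with persistence, where each ferry leg requires an acceptable-weather day. All probabilities and the number of legs below are hypothetical, not the constraints used in the study.

        import numpy as np

        rng = np.random.default_rng(42)

        # Two-state daily weather chain (0 = acceptable, 1 = inclement).  The
        # transition matrix encodes persistence of events; values are illustrative.
        P = np.array([[0.8, 0.2],
                      [0.4, 0.6]])

        def ferry_days(legs=4):
            """Days needed to complete `legs` legs, flying one leg per acceptable day."""
            state, days, flown = 0, 0, 0
            while flown < legs:
                days += 1
                if state == 0:
                    flown += 1
                state = rng.choice(2, p=P[state])
            return days

        samples = np.array([ferry_days() for _ in range(20000)])
        print(f"mean {samples.mean():.2f} days, "
              f"95th percentile {np.percentile(samples, 95):.0f} days")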

  7. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low earth orbit to destinations, such as to Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable to both inform research intended to mitigate potential risk as well as guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  8. A Fast-Time Simulation Environment for Airborne Merging and Spacing Research

    NASA Technical Reports Server (NTRS)

    Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon

    2005-01-01

    As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.

  9. A modern space simulation facility to accommodate high production acceptance testing

    NASA Technical Reports Server (NTRS)

    Glover, J. D.

    1986-01-01

    A space simulation laboratory that supports acceptance testing of spacecraft and associated subsystems at throughput rates as high as nine per year is discussed. The laboratory includes a computer-operated 27 ft by 30 ft space simulation chamber, a 20 ft by 20 ft by 20 ft thermal cycle chamber, and an eight-station thermal cycle/thermal vacuum test system. The design philosophy and unique features of each system are discussed. The development of operating procedures, test team requirements, test team integration, and other peripheral activation details are described. A discussion of special accommodations for the efficient utilization of the systems in support of high-rate production is presented.

  10. Heavy-Quark Symmetry Implies Stable Heavy Tetraquark Mesons Q_i Q_j q̄_k q̄_l

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichten, Estia J.; Quigg, Chris

    For very heavy quarks Q, relations derived from heavy-quark symmetry predict the existence of novel narrow doubly heavy tetraquark states of the form Q_i Q_j q̄_k q̄_l (subscripts label flavors), where q designates a light quark. By evaluating finite-mass corrections, we predict that double-beauty states composed of bbūd̄, bbūs̄, and bbd̄s̄ will be stable against strong decays, whereas the double-charm states ccq̄_k q̄_l, mixed beauty+charm states bcq̄_k q̄_l, and heavier bbq̄_k q̄_l states will dissociate into pairs of heavy-light mesons. Furthermore, observation of a new double-beauty state through its weak decays would establish the existence of tetraquarks and illuminate the role of heavy color-antitriplet diquarks as hadron constituents.

  11. Heavy-Quark Symmetry Implies Stable Heavy Tetraquark Mesons Q_i Q_j q̄_k q̄_l

    DOE PAGES

    Eichten, Estia J.; Quigg, Chris

    2017-11-15

    For very heavy quarks Q, relations derived from heavy-quark symmetry predict the existence of novel narrow doubly heavy tetraquark states of the form Q_i Q_j q̄_k q̄_l (subscripts label flavors), where q designates a light quark. By evaluating finite-mass corrections, we predict that double-beauty states composed of bbūd̄, bbūs̄, and bbd̄s̄ will be stable against strong decays, whereas the double-charm states ccq̄_k q̄_l, mixed beauty+charm states bcq̄_k q̄_l, and heavier bbq̄_k q̄_l states will dissociate into pairs of heavy-light mesons. Furthermore, observation of a new double-beauty state through its weak decays would establish the existence of tetraquarks and illuminate the role of heavy color-antitriplet diquarks as hadron constituents.

  12. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

    The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The fact that the Ada language is used for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features that enable these advantages are discussed.
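
    FAST itself is an Ada-based language whose API is not reproduced here; the sketch below only illustrates the generic event-calendar mechanism that any discrete event simulation language is built around, written in Python with hypothetical event names.

```python
import heapq

# Generic discrete-event simulation loop, shown only to illustrate the
# event-calendar concept referred to above; this is not FAST's design.
class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []          # (time, seq, callback) entries
        self._seq = 0             # tie-breaker for simultaneous events

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)

def packet_arrival(sim):
    # hypothetical recurring event: a payload packet reaching a ground node
    print(f"t={sim.now:5.2f}: payload packet arrives at ground node")
    sim.schedule(1.5, packet_arrival)

sim = Simulator()
sim.schedule(0.0, packet_arrival)
sim.run(until=6.0)
```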

  13. Preliminary analysis of one year long space climate simulation

    NASA Astrophysics Data System (ADS)

    Facsko, G.; Honkonen, I. J.; Juusola, L.; Viljanen, A.; Vanhamäki, H.; Janhunen, P.; Palmroth, M.; Milan, S. E.

    2013-12-01

    One full year (155 Cluster orbits, from January 29, 2002 to February 2, 2003) is simulated using the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS) in the European Cluster Assimilation Technology project (ECLAT). This enables us to study the performance of a global magnetospheric model on an unprecedented scale, both in terms of the amount of available observations and the length of the time series that can be compared. The solar wind for the simulated period, obtained from OMNIWeb, is used as input to GUMICS. We present an overview of various comparisons of GUMICS results to observations for the simulated year. Results along the Cluster reference spacecraft orbit are compared to Cluster measurements. The Cross Polar Cap Potential (CPCP) results are compared to SuperDARN measurements. The IMAGE electrojet indicators (IU, IL) calculated from the ionospheric currents of GUMICS are compared to observations. Finally, Geomagnetically Induced Currents (GIC) calculated from GUMICS results along the Finnish natural gas pipeline at Mäntsälä are also compared to measurements.

  14. Simulations of the MATROSHKA experiment at the international space station using PHITS.

    PubMed

    Sihver, L; Sato, T; Puchalska, M; Reitz, G

    2010-08-01

    Concerns about the biological effects of space radiation are increasing rapidly due to the perspective of long-duration manned missions, both in relation to the International Space Station (ISS) and to manned interplanetary missions to the Moon and Mars in the future. As a preparation for these long-duration space missions, it is important to ensure an excellent capability to evaluate the impact of space radiation on human health, in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radiation load on the personnel both inside and outside the space vehicles and certify that organ- and tissue-equivalent doses can be simulated as accurately as possible. In this paper, simulations are presented using the three-dimensional Monte Carlo Particle and Heavy-Ion Transport code System (PHITS) (Iwase et al. in J Nucl Sci Tech 39(11):1142-1151, 2002) of long-term dose measurements performed with the European Space Agency-supported MATROSHKA (MTR) experiment (Reitz and Berger in Radiat Prot Dosim 120:442-445, 2006). MATROSHKA is an anthropomorphic phantom containing over 6,000 radiation detectors, mimicking a human head and torso. The MTR experiment, led by the German Aerospace Center (DLR), was launched in January 2004 and has measured the absorbed doses from space radiation both inside and outside the ISS. Comparisons of simulations with measurements outside the ISS are presented. The results indicate that PHITS is a suitable tool for estimation of doses received from cosmic radiation and for study of the shielding of spacecraft against cosmic radiation.

  15. Finite element analysis of space debris removal by high-power lasers

    NASA Astrophysics Data System (ADS)

    Xue, Li; Jiang, Guanlei; Yu, Shuang; Li, Ming

    2015-08-01

    With the development of space station technologies, irradiation of space debris by space-based high-power lasers can locally generate high-temperature plasmas and a small momentum impulse, which may enable debris removal through sustained tracking. Typical square-shaped space debris made of Ti with a 5 cm × 5 cm size is considered, with thermal conductivity, density, specific heat capacity and emissivity of 7.62 W/(m·°C), 4500 kg/m^3, 0.52 J/(kg·°C) and 0.3, respectively. Based on finite element analysis in ANSYS, irradiation of the debris by a high-power laser with a power density of 10^6 W/m^2 and by a weapons-grade laser with a power density of 3000 W/m^2 is simulated under space-environment conditions, and the temperature curves due to laser thermal irradiation are obtained and compared. Results show that only 2 s is needed for the high-power laser to raise the debris temperature to about 10,000 K, the threshold temperature for conversion to the plasma state, whereas 13 min is needed for the weapons-grade laser. Using two-line elements (TLE), combined with the coordinate transformation from the celestial coordinate system to the site coordinate system, the visible period of the space debris is calculated as 5-10 min. Therefore, in order to remove space debris by laser plasmas, the laser power density should be further improved. The article provides an intuitive and visual feasibility analysis method for space debris removal, in which the debris material and shape, laser power density and spot characteristics are adjustable. This finite element analysis method is low-cost, repeatable and adaptable, and has prospective engineering applications.

  16. Variance Analysis of Unevenly Spaced Time Series Data

    NASA Technical Reports Server (NTRS)

    Hackman, Christine; Parker, Thomas E.

    1996-01-01

    We have investigated the effect of uneven data spacing on the computation of delta(sub chi)(gamma). Evenly spaced simulated data sets were generated for noise processes ranging from white phase modulation (PM) to random walk frequency modulation (FM). Delta(sub chi)(gamma) was then calculated for each noise type. Data were subsequently removed from each simulated data set using typical two-way satellite time and frequency transfer (TWSTFT) data patterns to create two unevenly spaced sets with average intervals of 2.8 and 3.6 days. Delta(sub chi)(gamma) was then calculated for each sparse data set using two different approaches. First, the missing data points were replaced by linear interpolation and delta(sub chi)(gamma) was calculated from this now-full data set. The second approach ignored the fact that the data were unevenly spaced and calculated delta(sub chi)(gamma) as if the data were equally spaced with an average spacing of 2.8 or 3.6 days. Both approaches have advantages and disadvantages, and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets.
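
    As a rough illustration of the two handling strategies described above (interpolating to an even grid versus treating the sparse data as evenly spaced at its average interval), the following Python sketch uses synthetic data; the noise model, gap pattern, and interval values are placeholders rather than actual TWSTFT records.

```python
import numpy as np

# Sketch of the two handling strategies for unevenly spaced data.
rng = np.random.default_rng(0)
t_full = np.arange(0, 300, 1.0)                    # nominal 1-day spacing
x_full = np.cumsum(rng.normal(size=t_full.size))   # random-walk-like series

keep = rng.random(t_full.size) > 0.65              # drop points to mimic gaps
t_sparse, x_sparse = t_full[keep], x_full[keep]

# Approach 1: restore an even grid by linear interpolation, then analyze.
x_interp = np.interp(t_full, t_sparse, x_sparse)

# Approach 2: ignore the gaps and treat the sparse data as evenly spaced
# with its average interval.
avg_interval = np.diff(t_sparse).mean()

print(f"average interval of sparse set: {avg_interval:.2f} days")
print(f"interpolated grid has {t_full.size} points, sparse set {t_sparse.size}")
```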

  17. A space-efficient quantum computer simulator suitable for high-speed FPGA implementation

    NASA Astrophysics Data System (ADS)

    Frank, Michael P.; Oniciuc, Liviu; Meyer-Baese, Uwe H.; Chiorescu, Irinel

    2009-05-01

    Conventional vector-based simulators for quantum computers are quite limited in the size of the quantum circuits they can handle, due to the worst-case exponential growth of even sparse representations of the full quantum state vector as a function of the number of quantum operations applied. However, this exponential-space requirement can be avoided by using general space-time tradeoffs long known to complexity theorists, which can be appropriately optimized for this particular problem in a way that also illustrates some interesting reformulations of quantum mechanics. In this paper, we describe the design and empirical space/time complexity measurements of a working software prototype of a quantum computer simulator that avoids excessive space requirements. Due to its space-efficiency, this design is well-suited to embedding in single-chip environments, permitting especially fast execution that avoids access latencies to main memory. We plan to prototype our design on a standard FPGA development board.

  18. Accuracy of diagnosing gastroesophageal reflux disease by GerdQ, esophageal impedance monitoring and histology.

    PubMed

    Zhou, Li Ya; Wang, Ye; Lu, Jing Jing; Lin, Lin; Cui, Rong Li; Zhang, He Jun; Xue, Yan; Ding, Shi Gang; Lin, San Ren

    2014-05-01

    To assess the performance of the self-assessment gastroesophageal reflux disease questionnaire (GerdQ), 24-h impedance monitoring, the proton pump inhibitor (PPI) test and the intercellular space of esophageal mucosal epithelial cells in the diagnosis of gastroesophageal reflux disease (GERD). Patients with symptoms suspected of GERD were administered the GerdQ and underwent endoscopy (measurement of intercellular space in a biopsy specimen sampled at 2 cm above the Z-line) and 24-h impedance pH monitoring, together with a 2-week experimental treatment with esomeprazole. A total of 636 patients were included in the final analysis, including 352 with GERD. The sensitivity and specificity of GerdQ and 24-h impedance monitoring for diagnosing GERD were 57.7% and 48.9%, and 66.4% and 43.3%, respectively. The sensitivity of 24-h impedance pH monitoring increased to 93.7%. The sensitivity and specificity of dilated intercellular spaces (DIS) (≥0.9 μm) for diagnosing GERD were 61.2% and 56.1%, respectively, whereas those for the PPI test were 70.5% and 44.4%. The GerdQ score or PPI test alone cannot accurately diagnose GERD in a Chinese population suspected of GERD. A definitive diagnosis of GERD still depends on endoscopy or 24-h pH monitoring. 24-h impedance pH monitoring may increase the sensitivity for diagnosing GERD by 20%; however, when used alone, it results in poor specificity in patients without acid suppressive therapy.

  19. Using Virtual Simulations in the Design of 21st Century Space Science Environments

    NASA Technical Reports Server (NTRS)

    Hutchinson, Sonya L.; Alves, Jeffery R.

    1996-01-01

    Space technology has been advancing rapidly in the past decade. This can be attributed to the future construction of the International Space Station (ISS). New innovations must constantly be engineered to make the ISS the safest, highest-quality research facility in space. Since space science data must often be gathered by crew members, more attention must be paid to crew safety and comfort. Virtual simulations are now being used to design environments that crew members can live in for long periods of time without harmful effects to their bodies. This paper gives a few examples of the ergonomic design problems that arise on manned space flights, and design solutions that follow NASA's strategic commitment to customer satisfaction. The conclusions show that virtual simulations are a great asset to 21st century design.

  20. NL(q) Theory: A Neural Control Framework with Global Asymptotic Stability Criteria.

    PubMed

    Vandewalle, Joos; De Moor, Bart L.R.; Suykens, Johan A.K.

    1997-06-01

    In this paper a framework for model-based neural control design is presented, consisting of nonlinear state space models and controllers, parametrized by multilayer feedforward neural networks. The models and closed-loop systems are transformed into so-called NL(q) system form. NL(q) systems represent a large class of nonlinear dynamical systems consisting of q layers with alternating linear and static nonlinear operators that satisfy a sector condition. For such NL(q)s, sufficient conditions for global asymptotic stability, input/output stability (dissipativity with finite L(2)-gain) and robust stability and performance are presented. The stability criteria are expressed as linear matrix inequalities. In the analysis problem it is shown how stability of a given controller can be checked. In the synthesis problem two methods for neural control design are discussed. In the first method, Narendra's dynamic backpropagation for tracking on a set of specific reference inputs is modified with an NL(q) stability constraint in order to ensure, e.g., closed-loop stability. In a second method, control design is done without tracking on specific reference inputs, but based on the input/output stability criteria themselves, within a standard plant framework as is done, for example, in H(infinity) control theory and μ theory.

  1. Comparison of High-Performance Fiber Materials Properties in Simulated and Actual Space Environments

    NASA Technical Reports Server (NTRS)

    Finckenor, M. M.

    2017-01-01

    A variety of high-performance fibers, including Kevlar, Nomex, Vectran, and Spectra, have been tested for durability in the space environment, mostly the low Earth orbital environment. These materials have been tested in yarn, tether/cable, and fabric forms. Some material samples were tested in a simulated space environment, such as the Atomic Oxygen Beam Facility and solar simulators in the laboratory. Other samples were flown on the International Space Station as part of the Materials on International Space Station Experiment. Mass loss due to atomic oxygen erosion and optical property changes due to ultraviolet radiation degradation are given. Tensile test results are also presented, including where moisture loss in a vacuum had an impact on tensile strength.

  2. [Development of fixed-base full task space flight training simulator].

    PubMed

    Xue, Liang; Chen, Shan-quang; Chang, Tian-chun; Yang, Hong; Chao, Jian-gang; Li, Zhi-peng

    2003-01-01

    The fixed-base full task flight training simulator is a very critical and important integrated training facility. It is mostly used for training integrated skills and tasks, such as running the flight program of manned space flight, dealing with faults, operating and controlling spacecraft flight, and communicating information between spacecraft and ground. The simulator is made up of several subsystems, including spacecraft simulation, the simulated cabin, visual imaging, acoustics, the main control computer, the instructor station, and assistant support. It implements many simulation functions, such as spacecraft environment, spacecraft movement, communication of information between spacecraft and ground, typical faults, manual control and operating training, training control, training monitoring, training database management, training data recording, system detection and so on.

  3. Compact Q-balls in the complex signum-Gordon model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arodz, H.; Lis, J.

    2008-05-15

    We discuss Q-balls in the complex signum-Gordon model in d-dimensional space for d = 1, 2, 3. The Q-balls have strictly finite size. Their total energy is a power-like function of the conserved U(1) charge, with the exponent equal to (d+2)/(d+3). In the cases d = 1 and d = 3, explicit analytic solutions are presented.
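
    Reading the quoted exponent as the ratio (d+2)/(d+3), the charge scaling of the total energy can be written out explicitly for the three spatial dimensionalities considered:

```latex
% Energy-charge scaling stated in the abstract, evaluated for d = 1, 2, 3.
E(Q) \propto Q^{\frac{d+2}{d+3}}, \qquad
\frac{d+2}{d+3} =
\begin{cases}
  3/4, & d = 1,\\
  4/5, & d = 2,\\
  5/6, & d = 3.
\end{cases}
```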

  4. STS_135_MaxQ

    NASA Image and Video Library

    2011-05-05

    JSC2011-E-059375 (4 May 2011) --- NASA astronaut Chris Ferguson, STS-135 commander, plays the drums with the all-astronaut band known as Max Q as the group performs on Innovation Day at NASA's Johnson Space Center in Houston May 4, 2011. Vocalist Tracy Caldwell Dyson is at left. Guitarist Drew Feustel is at right. Photo credit: NASA Photo/Houston Chronicle, Smiley N. Pool

  5. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.

  6. Simulated Space Environmental Effects on Thin Film Solar Array Components

    NASA Technical Reports Server (NTRS)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon

    2017-01-01

    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 100, 500, and 700 keV energy protons, and a fourth set were exposed to >2,000 hours of near ultraviolet (NUV) radiation. A final set was rapidly thermal cycled between -55 and +125 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.

  7. Simulated Space Environmental Effects on Thin Film Solar Array Components

    NASA Technical Reports Server (NTRS)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon

    2017-01-01

    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 100, 500, and 700 keV energy protons, and a fourth set were exposed to >2,000 hours of near ultraviolet (NUV) radiation. A final set was rapidly thermal cycled between -55 and +125 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.

  8. Simulated Space Environmental Effects on Thin Film Solar Array Components

    NASA Technical Reports Server (NTRS)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon

    2017-01-01

    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection afforded by typical thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 500, and 750 keV energy protons, and a fourth set were exposed to >2,000 hours of ultraviolet radiation. A final set was rapidly thermal cycled between -50 and +120 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.

  9. Dermatoglyphics and Reproductive Risk in a Family with Robertsonian Translocation 14q;21q.

    PubMed

    Kolgeci, Selim; Kolgeci, Jehona; Azemi, Mehmedali; Daka, Aferdita; Shala-Beqiraj, Ruke; Kurtishi, Ilir; Sopjani, Mentor

    2015-06-01

    The present study was carried out to evaluate the risk of giving birth to children with Down syndrome in a family with Robertsonian translocation 14q;21q, and to find the dermatoglyphic changes present in carriers of this translocation. Cytogenetic diagnosis was made according to the Moorhead and Seabright methods, while the analysis of prints (dermatoglyphic analysis) was made with the Cummins and Midlo method. Cytogenetic diagnosis was made in a couple who had suffered spontaneous miscarriages and had children with Down syndrome. A Robertsonian translocation between chromosomes 14 and 21 (45,XX,der(14;21)(q10;q10)) was found in the female partner, who had four pregnancies; in two of these a fetal karyotype with trisomy 21 was found and the pregnancies were terminated. The outcome of the fourth pregnancy was a twin birth, one twin with a normal karyotype and the other with Down syndrome due to the Robertsonian translocation inherited from the mother. Specific dermatoglyphic traits were found in the child with Down syndrome, whereas several dermatoglyphic traits characteristic of Down syndrome were displayed by the silent carriers of the Robertsonian translocation 14q;21q. The Robertsonian translocation found in the female partner was the cause of the spontaneous miscarriages, of the birth of a child with Down syndrome, and of translocation trisomy 21 in two pregnancies.

  10. Dermatoglyphics and Reproductive Risk in a Family with Robertsonian Translocation 14q;21q

    PubMed Central

    Kolgeci, Selim; Kolgeci, Jehona; Azemi, Mehmedali; Daka, Aferdita; Shala-Beqiraj, Ruke; Kurtishi, Ilir; Sopjani, Mentor

    2015-01-01

    Aim: The present study was carried out to evaluate the risk of giving birth to children with Down syndrome in a family with Robertsonian translocation 14q;21q, and to find the dermatoglyphic changes present in carriers of this translocation. Methods: Cytogenetic diagnosis was made according to the Moorhead and Seabright methods, while the analysis of prints (dermatoglyphic analysis) was made with the Cummins and Midlo method. Results: Cytogenetic diagnosis was made in a couple who had suffered spontaneous miscarriages and had children with Down syndrome. A Robertsonian translocation between chromosomes 14 and 21 (45,XX,der(14;21)(q10;q10)) was found in the female partner, who had four pregnancies; in two of these a fetal karyotype with trisomy 21 was found and the pregnancies were terminated. The outcome of the fourth pregnancy was a twin birth, one twin with a normal karyotype and the other with Down syndrome due to the Robertsonian translocation inherited from the mother. Specific dermatoglyphic traits were found in the child with Down syndrome, whereas several dermatoglyphic traits characteristic of Down syndrome were displayed by the silent carriers of the Robertsonian translocation 14q;21q. Conclusion: The Robertsonian translocation found in the female partner was the cause of the spontaneous miscarriages, of the birth of a child with Down syndrome, and of translocation trisomy 21 in two pregnancies. PMID:26236088

  11. Space headache on Earth: head-down-tilted bed rest studies simulating outer-space microgravity.

    PubMed

    van Oosterhout, W P J; Terwindt, G M; Vein, A A; Ferrari, M D

    2015-04-01

    Headache is a common symptom during space travel, both isolated and as part of space motion syndrome. Head-down-tilted bed rest (HDTBR) studies are used to simulate outer-space microgravity on Earth, and allow countermeasure interventions such as artificial gravity and training protocols, aimed at restoring microgravity-induced physiological changes. The objectives of this article are to assess headache incidence and characteristics during HDTBR, and to evaluate the effects of countermeasures. In a randomized cross-over design by the European Space Agency (ESA), 22 healthy male subjects, without primary headache history, underwent three periods of -6-degree HDTBR. In two of these episodes countermeasure protocols were added, with either centrifugation or aerobic exercise training protocols. Headache occurrence and characteristics were assessed daily using a specially designed questionnaire. In total, 14/22 (63.6%) subjects reported a headache during ≥1 of the three HDTBR periods: non-specific in 12/14 (85.7%) and migraine in 2/14 (14.4%). The occurrence of headache did not differ between HDTBR with and without countermeasures: 12/22 (54.5%) subjects vs. 8/22 (36.4%) subjects, p = 0.20; 13/109 (11.9%) headache days vs. 36/213 (16.9%) headache days, p = 0.24. During countermeasures, however, headaches were more often mild (p = 0.03) and had fewer associated symptoms (p = 0.008). Simulated microgravity during HDTBR induces headache episodes, mostly on the first day. Countermeasures are useful in reducing headache severity and associated symptoms. Reversible, microgravity-induced cephalic fluid shift may cause headache, also on Earth. HDTBR can be used to study space headache on Earth.

  12. Segregation of a paternal insertional translocation results in partial 4q monosomy or 4q trisomy in two siblings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hegmann, K.M.; Spikes, A.S.; Orr-Urtreger, A.

    A genetics evaluation was requested for a 6-week-old infant with multiple congenital malformations including mild craniofacial anomalies, truncal hypotonia, hypospadias, and a ventriculoseptal defect. Blood obtained for chromosome analysis revealed an abnormal chromosome 4. Paternal chromosome analysis showed a 46,XY, inv ins(3;4)(p21.32;q25q21.2), inv(4)(p15.3q21.2) karyotype. Therefore, the proband's chromosome 4 was the unbalanced product of this insertional translocation from the father, resulting in partial monosomy 4q. Additionally, the derivative 4 had a pericentric inversion which was also seen in the father's chromosome 4. During genetic counseling, the proband's 2-year-old brother was evaluated. He was not felt to be abnormal in appearance, but was described as having impulsive behavior. Chromosome analysis on this child revealed 46,XY, der(3) inv ins(3;4)(p21.32;q25q21.2)pat. This karyotype results in partial trisomy 4q. FISH using two-color "painting" probes for chromosomes 3 and 4 confirmed the G-banded interpretation in this family. The segregation seen in this family resulted in both reciprocal products being observed in the two children, with partial 4q monosomy showing multiple congenital anomalies, and partial 4q trisomy showing very few phenotypic abnormalities. 13 refs., 5 figs.

  13. Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach

    NASA Technical Reports Server (NTRS)

    Nelson, Bron; Yang Yang, Fan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Henze, Chris; Karacalioglu, Arif Goktug; O'Toole, Conor; Swenson, Jason; Stupl, Jan

    2015-01-01

    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time-steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space

  14. Simulating Autonomous Telecommunication Networks for Space Exploration

    NASA Technical Reports Server (NTRS)

    Segui, John S.; Jennings, Esther H.

    2008-01-01

    Currently, most interplanetary telecommunication systems require human intervention for command and control. However, considering the range from near Earth to deep space missions, combined with the increase in the number of nodes and advancements in processing capabilities, the benefits from communication autonomy will be immense. Likewise, greater mission science autonomy brings the need for unscheduled, unpredictable communication and network routing. While the terrestrial Internet protocols are highly developed, their suitability for space exploration has been questioned. JPL has developed the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to help characterize network designs and protocols. The results will allow future mission planners to better understand the trade-offs of communication protocols. This paper discusses various issues with interplanetary networking and presents simulation results for interplanetary networking protocols.

  15. Leak rate measurements for satellite subsystems and residual gas analysis during space environment tests. [thermal vacuum and solar simulation tests

    NASA Technical Reports Server (NTRS)

    Nuss, H. E.

    1975-01-01

    The measuring and evaluation procedure for the determination of leak rates of satellite subsystems with a quadrupole mass spectrometer, and the results of the residual gas analysis are described. The method selected for leak rate determination was placing the system into a vacuum chamber and furnishing the chamber with a mass spectrometer and calibrated leaks. The residual gas of a thermal vacuum test facility, in which the thermal balance test radiation input was simulated by a heated canister, was analyzed with the mass spectrometer in the atomic mass unit range up to 300 amu. In addition to the measurements during the space environment tests, mass spectrometric studies were performed with samples of spacecraft materials. The studies were carried out during tests for the projects HELIOS, AEROS B and SYMPHONIE.

  16. The promise of the state space approach to time series analysis for nursing research.

    PubMed

    Levy, Janet A; Elser, Heather E; Knobel, Robin B

    2012-01-01

    Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of the study was to introduce the state space approach to time series analysis and demonstrate potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models; each one describing a process that generates a variable of interest over time. Each model is presented algebraically and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, slowly varying level, faster varying periodic, and irregular components. State space models potentially simulate developmental processes where a phenomenon emerges and disappears before stabilizing, where the periodic component may become more regular with time, or where the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.
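
    A minimal simulation of the decomposition described above (a slowly varying level plus periodic plus irregular components) can make concrete what such a model generates; the variances and period below are arbitrary illustrative choices, not values from the cited nursing projects.

```python
import numpy as np

# Illustrative generation of a series from level + periodic + irregular parts.
rng = np.random.default_rng(1)
n = 500

level = np.cumsum(rng.normal(scale=0.05, size=n))          # slowly varying level
periodic = 1.0 * np.sin(2 * np.pi * np.arange(n) / 60.0)   # 60-sample cycle
irregular = rng.normal(scale=0.3, size=n)                   # observation noise

y = level + periodic + irregular   # observed series, e.g. physiological samples
print("simulated series length:", y.size)
print("component variances (level, periodic, irregular):",
      round(float(level.var()), 3),
      round(float(periodic.var()), 3),
      round(float(irregular.var()), 3))
```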

  17. Applying simulation model to uniform field space charge distribution measurements by the PEA method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y.; Salama, M.M.A.

    1996-12-31

    Signals measured under uniform fields by the Pulsed Electroacoustic (PEA) method have been processed by a deconvolution procedure to obtain space charge distributions since 1988. To simplify data processing, a direct method has been proposed recently in which the deconvolution is eliminated. However, the surface charge cannot be represented well by this method because the surface charge has a bandwidth extending from zero to infinity. The bandwidth of the charge distribution must be much narrower than the bandwidth of the PEA system transfer function in order to apply the direct method properly. When surface charges cannot be distinguished from space charge distributions, the accuracy and the resolution of the obtained space charge distributions decrease. To overcome this difficulty, a simulation model is therefore proposed. This paper presents the authors' attempts to apply the simulation model to obtain space charge distributions under plane-plane electrode configurations. Due to the page limitation for the paper, the charge distribution generated by the simulation model is compared only to that obtained by the direct method with a set of simulated signals.

  18. Simulation test beds for the space station electrical power system

    NASA Technical Reports Server (NTRS)

    Sadler, Gerald G.

    1988-01-01

    NASA Lewis Research Center and its prime contractor are responsible for developing the electrical power system on the space station. The power system will be controlled by a network of distributed processors. Control software will be verified, validated, and tested in hardware and software test beds. Current plans for the software test bed involve using real time and nonreal time simulations of the power system. This paper will discuss the general simulation objectives and configurations, control architecture, interfaces between simulator and controls, types of tests, and facility configurations.

  19. Comparative study of standard space and real space analysis of quantitative MR brain data.

    PubMed

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space.

  20. Development and testing of a mouse simulated space flight model

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, Gerald

    1987-01-01

    The development and testing of a mouse model for simulating some aspects of the weightlessness that occurs during space flight, and the carrying out of immunological experiments on animals undergoing space flight, are examined. The mouse model developed was an antiorthostatic, hypokinetic, hypodynamic suspension model similar to one used with rats. The study was divided into two parts. The first involved determination of which immunological parameters should be observed in animals flown during space flight or studied in the suspension model. The second involved suspending mice and determining which of those immunological parameters were altered by the suspension. Rats that were actually flown on Space Shuttle mission SL-3 were used to test the hypotheses.

  1. Design and simulation of a lighting system for a shadowless space

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Fang, Li

    2017-10-01

    This paper describes implementing a shadowless space by two kinds of methods. The first method implements the shadowless space using principles similar to those of an integrating sphere. The rays from a built-in light source eventually evolve into uniform lighting through numerous diffuse reflections, given the spherical cavity structure and the highly reflective inner surface. It is thus possible to create a shadowless space through diffuse reflections; over a 27.4 m^2 area, the illuminance uniformity reached 88.2% in this model. The other method is analogous to that used in medical shadowless lamps. Light falls on the object from different angles and each source generates a shadow. By changing the positions of multiple lights and increasing the number of light sources, the possibility of obtaining a shadowless area gradually increases. Based on these two approaches, two simple models of the optical system designed for the shadowless space are proposed, and both systems are simulated using the TracePro software as the design platform.

  2. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    NASA Technical Reports Server (NTRS)

    Li, Zuqun

    2011-01-01

    Modeling and simulation plays a very important role in mission design. It not only reduces design cost, but also prepares astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from all around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team from NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing attributes, and packing and unpacking fixed record data. The dynamics model of the lunar shuttle was modeled with three degrees of freedom, and the state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descending orbit in 2D space, and then defining a unique orbit in 3D space with the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would start descending one second after joining the execution. VPN software from SonicWall was used to connect federates with the RTI during testing.
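
    The two-body propagation assumption mentioned above can be sketched compactly; the Python fragment below is only an illustration of that dynamics model (the actual federate used Trick/TrickHLA), and the lunar gravitational parameter and initial orbit are example values.

```python
import numpy as np

# Minimal two-body state propagation sketch (RK4 integration).
MU_MOON = 4.9028e12  # lunar GM in m^3/s^2 (approximate example value)

def deriv(state):
    # state = [x, y, z, vx, vy, vz]; returns its time derivative
    r = state[:3]
    return np.concatenate([state[3:], -MU_MOON * r / np.linalg.norm(r) ** 3])

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Example: circular 100-km lunar orbit (radius ~1837 km), propagated for 600 s
r0 = 1.837e6
v0 = np.sqrt(MU_MOON / r0)
state = np.array([r0, 0.0, 0.0, 0.0, v0, 0.0])
for _ in range(600):
    state = rk4_step(state, 1.0)
print("position after 600 s (km):", (state[:3] / 1e3).round(1))
```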

  3. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.

  4. Association of allelic loss on 1q, 4p, 7q, 9p, 9q, and 16q with postoperative death in papillary thyroid carcinoma.

    PubMed

    Kitamura, Y; Shimizu, K; Tanaka, S; Ito, K; Emi, M

    2000-05-01

    Papillary thyroid carcinomas, most of which are characterized by slow growth and good prognosis, account for the majority of thyroid carcinomas. To provide appropriate postoperative management, it is important to classify them by prediction of their prognosis. To find genetic markers associated with poor prognosis, allelic loss at all 39 nonacrocentric chromosome arms was compared between 24 deceased cases and 45 age-, sex-, stage-, and type-matched surviving cases. Allelic loss was examined in primary tumors from both groups using highly polymorphic microsatellite markers on the 39 nonacrocentric autosomal arms. Age at diagnosis, sex, stage, and type of tumor were matched between the two groups. No recurrent tumor was used for DNA analysis. Mean fractional allelic loss in the deceased and surviving cases was 0.10 +/- 0.08 and 0.03 +/- 0.05, respectively (P < 0.001). The surviving cases showed marginal frequencies of allelic loss throughout all chromosome arms except 22q. The deceased cases showed frequent allelic losses on chromosomes 1q (37%), 4p (21%), 7q (20%), 9p (36%), 9q (31%), and 16q (29%), with significant differences (P < 0.05). These chromosome regions may include tumor suppressor genes whose inactivation is associated with aggressive phenotypes of papillary thyroid carcinoma.

  5. Vector-valued Lizorkin-Triebel spaces and sharp trace theory for functions in Sobolev spaces with mixed L_p-norm for parabolic problems

    NASA Astrophysics Data System (ADS)

    Weidemaier, P.

    2005-06-01

    The trace problem on the hypersurface y_n = 0 is investigated for a function u = u(y,t) \in L_q(0,T; W_{\underline p}^{\underline m}(\mathbb{R}_+^n)) with \partial_t u \in L_q(0,T; L_{\underline p}(\mathbb{R}_+^n)), that is, Sobolev spaces with mixed Lebesgue norm L_{\underline p,q}(\mathbb{R}^n_+ \times (0,T)) = L_q(0,T; L_{\underline p}(\mathbb{R}_+^n)) are considered; here \underline p = (p_1,\dots,p_n) is a vector and \mathbb{R}^n_+ = \mathbb{R}^{n-1} \times (0,\infty). Such function spaces are useful in the context of parabolic equations. They allow, in particular, different exponents of summability in space and time. It is shown that the sharp regularity of the trace in the time variable is characterized by the Lizorkin-Triebel space F_{q,p_n}^{1-1/(p_n m_n)}(0,T; L_{\widetilde{\underline p}}(\mathbb{R}^{n-1})), \underline p = (\widetilde{\underline p}, p_n). A similar result is established for first-order spatial derivatives of u. These results allow one to determine the exact spaces for the data in the inhomogeneous Dirichlet and Neumann problems for parabolic equations of the second order if the solution is in the space L_q(0,T; W_p^2(\Omega)) \cap W_q^1(0,T; L_p(\Omega)) with p \le q.

  6. An adaptively refined phase-space element method for cosmological simulations and collisionless dynamics

    NASA Astrophysics Data System (ADS)

    Hahn, Oliver; Angulo, Raul E.

    2016-01-01

    N-body simulations are essential for understanding the formation and evolution of structure in the Universe. However, the discrete nature of these simulations affects their accuracy when modelling collisionless systems. We introduce a new approach to simulate the gravitational evolution of cold collisionless fluids by solving the Vlasov-Poisson equations in terms of adaptively refineable `Lagrangian phase-space elements'. These geometrical elements are piecewise smooth maps between Lagrangian space and Eulerian phase-space and approximate the continuum structure of the distribution function. They allow for dynamical adaptive splitting to accurately follow the evolution even in regions of very strong mixing. We discuss in detail various one-, two- and three-dimensional test problems to demonstrate the performance of our method. Its advantages compared to N-body algorithms are: (I) explicit tracking of the fine-grained distribution function, (II) natural representation of caustics, (III) intrinsically smooth gravitational potential fields, thus (IV) eliminating the need for any type of ad hoc force softening. We show the potential of our method by simulating structure formation in a warm dark matter scenario. We discuss how spurious collisionality and large-scale discreteness noise of N-body methods are both strongly suppressed, which eliminates the artificial fragmentation of filaments. Therefore, we argue that our new approach improves on the N-body method when simulating self-gravitating cold and collisionless fluids, and is the first method that allows us to explicitly follow the fine-grained evolution in six-dimensional phase-space.

  7. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events that have a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  8. Multi-Spectral Image Analysis for Improved Space Object Characterization

    NASA Astrophysics Data System (ADS)

    Duggin, M.; Riker, J.; Glass, W.; Bush, K.; Briscoe, D.; Klein, M.; Pugh, M.; Engberg, B.

    The Air Force Research Laboratory (AFRL) is studying the application and utility of various ground based and space-based optical sensors for improving surveillance of space objects in both Low Earth Orbit (LEO) and Geosynchronous Earth Orbit (GEO). At present, ground-based optical and radar sensors provide the bulk of remotely sensed information on satellites and space debris, and will continue to do so into the foreseeable future. However, in recent years, the Space Based Visible (SBV) sensor was used to demonstrate that a synthesis of space-based visible data with ground-based sensor data could provide enhancements to information obtained from any one source in isolation. The incentives for space-based sensing include improved spatial resolution due to the absence of atmospheric effects and cloud cover and increased flexibility for observations. Though ground-based optical sensors can use adaptive optics to somewhat compensate for atmospheric turbulence, cloud cover and absorption are unavoidable. With recent advances in technology, we are in a far better position to consider what might constitute an ideal system to monitor our surroundings in space. This work has begun at the AFRL using detailed optical sensor simulations and analysis techniques to explore the trade space involved in acquiring and processing data from a variety of hypothetical space-based and ground-based sensor systems. In this paper, we briefly review the phenomenology and trade space aspects of what might be required in order to use multiple band-passes, sensor characteristics, and observation and illumination geometries to increase our awareness of objects in space.

  9. A statistical method for the conservative adjustment of false discovery rate (q-value).

    PubMed

    Lai, Yinglei

    2017-03-14

    q-value is a widely used statistical method for estimating false discovery rate (FDR), which is a conventional significance measure in the analysis of genome-wide expression data. q-value is a random variable and it may underestimate FDR in practice. An underestimated FDR can lead to unexpected false discoveries in the follow-up validation experiments. This issue has not been well addressed in literature, especially in the situation when the permutation procedure is necessary for p-value calculation. We proposed a statistical method for the conservative adjustment of q-value. In practice, it is usually necessary to calculate p-value by a permutation procedure. This was also considered in our adjustment method. We used simulation data as well as experimental microarray or sequencing data to illustrate the usefulness of our method. The conservativeness of our approach has been mathematically confirmed in this study. We have demonstrated the importance of conservative adjustment of q-value, particularly in the situation that the proportion of differentially expressed genes is small or the overall differential expression signal is weak.
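
    For context, the quantity being adjusted is the standard q-value-style FDR estimate computed from ordered p-values. The sketch below shows a baseline Benjamini-Hochberg-type computation on synthetic p-values; it is not the conservative adjustment proposed in the paper, and the simulated signal fractions are arbitrary.

```python
import numpy as np

# Baseline step-up q-value estimate from a vector of p-values.
def q_values(pvals):
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)      # p_(i) * m / i
    # enforce monotonicity from the largest p-value downwards
    q = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(q, 0, 1)
    return out

rng = np.random.default_rng(7)
pvals = np.concatenate([rng.uniform(0, 0.001, 50),   # "true" signals
                        rng.uniform(0, 1, 950)])     # nulls
q = q_values(pvals)
print("discoveries at q < 0.05:", int((q < 0.05).sum()))
```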

  10. Clinical report of a 17q12 microdeletion with additionally unreported clinical features.

    PubMed

    Roberts, Jennifer L; Gandomi, Stephanie K; Parra, Melissa; Lu, Ira; Gau, Chia-Ling; Dasouki, Majed; Butler, Merlin G

    2014-01-01

    Copy number variations involving the 17q12 region have been associated with developmental and speech delay, autism, aggression, self-injury, biting and hitting, oppositional defiance, inappropriate language, and auditory hallucinations. We present a tall-appearing 17-year-old boy with marfanoid habitus, hypermobile joints, mild scoliosis, pectus deformity, widely spaced nipples, pes cavus, autism spectrum disorder, intellectual disability, and psychiatric manifestations including physical and verbal aggression, obsessive-compulsive behaviors, and oppositional defiance. An echocardiogram showed borderline increased aortic root size. An abdominal ultrasound revealed a small pancreas, mild splenomegaly with a 1.3 cm accessory splenule, and normal kidneys and liver. A testing panel for Marfan, aneurysm, and related disorders was negative. Subsequently, a 400 K array-based comparative genomic hybridization (aCGH) + SNP analysis was performed which identified a de novo suspected pathogenic deletion on chromosome 17q12 encompassing 28 genes. Despite the limited number of cases described in the literature with 17q12 rearrangements, our proband's phenotypic features both overlap and expand on previously reported cases. Since syndrome-specific DNA sequencing studies failed to provide an explanation for this patient's unusual habitus, we postulate that this case represents an expansion of the 17q12 microdeletion phenotype. Further analysis of the deleted interval is recommended for new genotype-phenotype correlations.

  11. CAD-based stand-alone spacecraft radiation exposure analysis system: An application of the early man-tended Space Station

    NASA Technical Reports Server (NTRS)

    Appleby, M. H.; Golightly, M. J.; Hardy, A. C.

    1993-01-01

    Major improvements have been completed in the approach to analyses and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, pressurized lunar rover, and the redesigned International Space Station. Results of analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.
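
    The core of such a CAD-based analysis is the ray-tracing step: from a dose point, many directions are sampled and the areal density of material crossed by each ray is accumulated, and the resulting shielding distribution is then folded with dose-depth data. The sketch below shows only that ray-sampling step for an assumed thin-walled cubic aluminium module standing in for a real CAD model; the wall thickness and density are illustrative values.

      import numpy as np

      def areal_density_box(n_rays=5000, wall_cm=0.3, density_g_cm3=2.7, seed=0):
          """Toy ray-traced shielding distribution for a dose point at the centre
          of a thin-walled cubic module (aluminium assumed).  A CAD-based tool
          would intersect each ray with the full vehicle geometry; here the slant
          path through the exit wall is simply wall_cm / max(|dx|, |dy|, |dz|).
          Returns the areal density (g/cm^2) seen along each sampled direction."""
          rng = np.random.default_rng(seed)
          u = rng.normal(size=(n_rays, 3))             # isotropic directions
          u /= np.linalg.norm(u, axis=1, keepdims=True)
          slant_cm = wall_cm / np.abs(u).max(axis=1)   # path length through the wall
          return slant_cm * density_g_cm3

      shield = areal_density_box()
      print(f"mean {shield.mean():.2f} g/cm^2, 95th percentile {np.percentile(shield, 95):.2f} g/cm^2")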

  12. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel jitter analysis routine, which determines jitter and stability values from time simulations very efficiently, has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in MATLAB script language. A graphical user interface is developed in the package to provide convenient access to its various features.
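
    The expensive step in such a frequency domain analysis is evaluating the frequency response H(jw) = C (jwI - A)^-1 B + D over many frequencies for a large, sparse plant. The sketch below shows the textbook baseline for that computation, one sparse solve per frequency on a toy lightly damped modal model; it is not the paper's efficient algorithm, only the reference computation such an algorithm improves on, and all modal parameters are invented.

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      def frequency_response(A, B, C, D, omegas):
          """H(jw) = C (jwI - A)^-1 B + D for a sparse state-space model,
          solving one sparse linear system per frequency point."""
          n = A.shape[0]
          I = sp.identity(n, format="csc")
          H = np.empty((len(omegas), C.shape[0], B.shape[1]), dtype=complex)
          for k, w in enumerate(omegas):
              X = spla.spsolve((1j * w * I - A).tocsc(), sp.csc_matrix(B))
              H[k] = C @ X.toarray() + D
          return H

      # toy lightly damped modal model (block-diagonal, hence very sparse)
      wn, zeta = np.array([1.0, 5.0, 20.0]), 0.005
      blocks = [np.array([[0.0, 1.0], [-w**2, -2.0 * zeta * w]]) for w in wn]
      A = sp.block_diag(blocks, format="csc")
      B = np.kron(np.ones((len(wn), 1)), np.array([[0.0], [1.0]]))
      C = np.kron(np.ones((1, len(wn))), np.array([[1.0, 0.0]]))
      D = np.zeros((1, 1))
      H = frequency_response(A, B, C, D, np.logspace(-1, 2, 200))
      print(np.abs(H).max())   # peak response near the lightly damped resonances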

  13. Simulation from endpoint-conditioned, continuous-time Markov chains on a finite state space, with applications to molecular evolution.

    PubMed

    Hobolth, Asger; Stone, Eric A

    2009-09-01

    Analyses of serially-sampled data often begin with the assumption that the observations represent discrete samples from a latent continuous-time stochastic process. The continuous-time Markov chain (CTMC) is one such generative model whose popularity extends to a variety of disciplines ranging from computational finance to human genetics and genomics. A common theme among these diverse applications is the need to simulate sample paths of a CTMC conditional on realized data that is discretely observed. Here we present a general solution to this sampling problem when the CTMC is defined on a discrete and finite state space. Specifically, we consider the generation of sample paths, including intermediate states and times of transition, from a CTMC whose beginning and ending states are known across a time interval of length T. We first unify the literature through a discussion of the three predominant approaches: (1) modified rejection sampling, (2) direct sampling, and (3) uniformization. We then give analytical results for the complexity and efficiency of each method in terms of the instantaneous transition rate matrix Q of the CTMC, its beginning and ending states, and the length of sampling time T. In doing so, we show that no method dominates the others across all model specifications, and we give explicit proof of which method prevails for any given Q, T, and endpoints. Finally, we introduce and compare three applications of CTMCs to demonstrate the pitfalls of choosing an inefficient sampler.
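
    As a concrete starting point, the sketch below implements the simplest (and least efficient) relative of the samplers discussed above: naive forward Gillespie simulation with rejection of paths that do not end in the required state. The 3-state rate matrix is a made-up example; the modified rejection, direct, and uniformization samplers analysed in the paper are not reproduced here.

      import numpy as np

      def sample_ctmc_path(Q, a, b, T, rng, max_tries=100000):
          """Naive rejection sampler for a finite-state CTMC path conditioned to
          start in state a at time 0 and end in state b at time T.  Q is the
          instantaneous rate matrix (rows sum to zero).  Forward paths are drawn
          with the Gillespie algorithm and rejected unless X(T) == b."""
          n = Q.shape[0]
          for _ in range(max_tries):
              t, state = 0.0, a
              states, times = [a], [0.0]
              while True:
                  rate = -Q[state, state]
                  t += rng.exponential(1.0 / rate) if rate > 0 else np.inf
                  if t >= T:
                      break
                  jump_probs = Q[state].copy()
                  jump_probs[state] = 0.0
                  state = rng.choice(n, p=jump_probs / jump_probs.sum())
                  states.append(state)
                  times.append(t)
              if state == b:
                  return states, times
          raise RuntimeError("endpoint pair too unlikely for naive rejection sampling")

      # toy 3-state chain
      Q = np.array([[-1.0, 0.7, 0.3],
                    [0.5, -1.0, 0.5],
                    [0.2, 0.8, -1.0]])
      path, jump_times = sample_ctmc_path(Q, a=0, b=2, T=1.5, rng=np.random.default_rng(1))
      print(path, np.round(jump_times, 3))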

  14. TID Simulation of Advanced CMOS Devices for Space Applications

    NASA Astrophysics Data System (ADS)

    Sajid, Muhammad

    2016-07-01

    This paper focuses on Total Ionizing Dose (TID) effects caused by charge accumulation in the silicon dioxide, at the substrate/silicon dioxide interface, and in the Shallow Trench Isolation (STI) of scaled bulk CMOS devices, as well as at the Buried Oxide (BOX) layer of Silicon-On-Insulator (SOI) devices, for operation in the space radiation environment. The radiation-induced leakage current and the corresponding electron density/concentration along the leakage path were presented for 180 nm, 130 nm, and 65 nm NMOS and PMOS transistors in both bulk CMOS and SOI process technologies on board LEO and GEO satellites. On the basis of the simulation results, a TID robustness analysis for these advanced deep-submicron technologies was carried out up to 500 krad. The correlation between technology scaling and the magnitude of the leakage current at a given total dose was established using the Visual TCAD Genius program.

  15. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to support simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the orbital flight test (OFT) ascent configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  16. Mathematical modeling and simulation of the space shuttle imaging radar antennas

    NASA Technical Reports Server (NTRS)

    Campbell, R. W.; Melick, K. E.; Coffey, E. L., III

    1978-01-01

    Simulations of space shuttle synthetic aperture radar antennas under the influence of space environmental conditions were carried out at L, C, and X-band. Mathematical difficulties in modeling large, non-planar array antennas are discussed, and an approximate modeling technique is presented. Results for several antenna error conditions are illustrated in far-field profile patterns, earth surface footprint contours, and summary graphs.
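
    A minimal version of such an antenna error study is the array factor of a linear array in which per-element phase errors stand in for structural distortion of the aperture. The sketch below compares an ideal and a randomly perturbed 64-element L-band array; the element count, spacing, frequency, and error level are invented for illustration and are unrelated to the actual shuttle imaging radar apertures.

      import numpy as np

      def array_pattern_db(theta_rad, element_x_m, freq_hz, phase_err_rad=None):
          """Normalised far-field array factor (dB) of a linear array along x,
          with optional per-element phase errors; theta is measured from broadside."""
          k = 2.0 * np.pi * freq_hz / 3e8
          if phase_err_rad is None:
              phase_err_rad = np.zeros_like(element_x_m)
          phase = k * np.outer(np.sin(theta_rad), element_x_m) + phase_err_rad
          af = np.exp(1j * phase).sum(axis=1)
          return 20.0 * np.log10(np.abs(af) / len(element_x_m))

      theta = np.radians(np.linspace(-30.0, 30.0, 601))
      x = np.arange(64) * 0.10                      # 64 elements, ~half-wave spacing at L-band
      rng = np.random.default_rng(0)
      ideal = array_pattern_db(theta, x, 1.275e9)
      distorted = array_pattern_db(theta, x, 1.275e9, phase_err_rad=rng.normal(0.0, 0.5, x.size))
      print(ideal.max(), distorted.max())           # boresight gain loss due to phase errors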

  17. Q-Thruster Breadboard Campaign Project

    NASA Technical Reports Server (NTRS)

    White, Harold

    2014-01-01

    Dr. Harold "Sonny" White has developed the physics theory basis for utilizing the quantum vacuum to produce thrust. The engineering implementation of the theory is known as Q-thrusters. During FY13, three test campaigns were conducted that conclusively demonstrated tangible evidence of Q-thruster physics with measurable thrust bringing the TRL up from TRL 2 to early TRL 3. This project will continue with the development of the technology to a breadboard level by leveraging the most recent NASA/industry test hardware. This project will replace the manual tuning process used in the 2013 test campaign with an automated Radio Frequency (RF) Phase Lock Loop system (precursor to flight-like implementation), and will redesign the signal ports to minimize RF leakage (improves efficiency). This project will build on the 2013 test campaign using the above improvements on the test implementation to get ready for subsequent Independent Verification and Validation testing at Glenn Research Center (GRC) and Jet Propulsion Laboratory (JPL) in FY 2015. Q-thruster technology has a much higher thrust-to-power ratio than current forms of electric propulsion (about 7 times that of Hall thrusters), and can significantly reduce the total power required for either Solar Electric Propulsion (SEP) or Nuclear Electric Propulsion (NEP). Also, due to the high thrust and high specific impulse, Q-thruster technology will greatly relax the specific mass requirements for in-space nuclear reactor systems. Q-thrusters can reduce transit times for a power-constrained architecture.

  18. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  19. Neutral Buoyancy Simulator-NB32-Large Space Structure Assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. In light of all the specific needs for this project, an environment on Earth had to be developed that could simulate a low gravity environment. A Neutral Buoyancy Simulator (NBS) was constructed by NASA Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also provide information about the different kinds of structures that can be built. As part of this experimentation, the Experimental Assembly of Structures in Extravehicular Activity (EASE) project was developed as a joint effort between MSFC and the Massachusetts Institute of Technology (MIT). The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. Pictured is an entire unit that has been constructed and is sitting in the bottom of a mock-up shuttle cargo bay pallet.

  20. Multi-spectral image analysis for improved space object characterization

    NASA Astrophysics Data System (ADS)

    Glass, William; Duggin, Michael J.; Motes, Raymond A.; Bush, Keith A.; Klein, Meiling

    2009-08-01

    The Air Force Research Laboratory (AFRL) is studying the application and utility of various ground-based and space-based optical sensors for improving surveillance of space objects in both Low Earth Orbit (LEO) and Geosynchronous Earth Orbit (GEO). This information can be used to improve our catalog of space objects and will be helpful in the resolution of satellite anomalies. At present, ground-based optical and radar sensors provide the bulk of remotely sensed information on satellites and space debris, and will continue to do so into the foreseeable future. However, in recent years, the Space-Based Visible (SBV) sensor was used to demonstrate that a synthesis of space-based visible data with ground-based sensor data could provide enhancements to information obtained from any one source in isolation. The incentives for space-based sensing include improved spatial resolution due to the absence of atmospheric effects and cloud cover and increased flexibility for observations. Though ground-based optical sensors can use adaptive optics to somewhat compensate for atmospheric turbulence, cloud cover and absorption are unavoidable. With recent advances in technology, we are in a far better position to consider what might constitute an ideal system to monitor our surroundings in space. This work has begun at the AFRL using detailed optical sensor simulations and analysis techniques to explore the trade space involved in acquiring and processing data from a variety of hypothetical space-based and ground-based sensor systems. In this paper, we briefly review the phenomenology and trade space aspects of what might be required in order to use multiple band-passes, sensor characteristics, and observation and illumination geometries to increase our awareness of objects in space.

  1. Deep Space Optical Link ARQ Performance Analysis

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Miles, Gregory

    2016-01-01

    Substantial advancements have been made toward the use of optical communications for deep space exploration missions, promising a much higher volume of data to be communicated in comparison with present-day Radio Frequency (RF) based systems. One or more ground-based optical terminals are assumed to communicate with the spacecraft. Both short-term and long-term link outages will arise due to weather at the ground station(s), space platform pointing stability, and other effects. To mitigate these outages, an Automatic Repeat Query (ARQ) retransmission method is assumed, together with a reliable back channel for acknowledgement traffic. Specifically, the Licklider Transmission Protocol (LTP) is used, which is a component of the Disruption-Tolerant Networking (DTN) protocol suite that is well suited for high bandwidth-delay product links subject to disruptions. We provide an analysis of envisioned deep space mission scenarios and quantify buffering, latency and throughput performance, using a simulation in which long-term weather effects are modeled with a Gilbert-Elliott Markov chain, short-term outages occur as a Bernoulli process, and scheduled outages arising from geometric visibility or operational constraints are represented. We find that both short- and long-term effects impact throughput, but long-term weather effects dominate buffer sizing and overflow losses as well as latency performance.
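
    The sketch below is a much-simplified, per-second version of that kind of link simulation: a two-state Gilbert-Elliott weather process produces long outages, an independent Bernoulli process produces short outages, and data that cannot be sent waits in an onboard buffer until the link returns. The transition probabilities, data rates, and durations are illustrative placeholders rather than mission parameters, and scheduled visibility outages and LTP details are omitted.

      import numpy as np

      def simulate_link(days=7, p_good_to_bad=1e-5, p_bad_to_good=1e-4,
                        p_short_outage=0.01, link_gbit_s=1.0, gen_gbit_s=0.3, seed=0):
          """Per-second toy simulation of a weather-limited downlink with
          retransmission: undelivered data accumulates in a buffer.
          Returns the delivered fraction and the peak buffer occupancy (Gbit)."""
          rng = np.random.default_rng(seed)
          seconds = days * 86400
          good, buffer_gbit, delivered, peak = True, 0.0, 0.0, 0.0
          for _ in range(seconds):
              # Gilbert-Elliott weather state (long outages)
              good = (rng.random() >= p_good_to_bad) if good else (rng.random() < p_bad_to_good)
              short_ok = rng.random() >= p_short_outage     # short outages (pointing, etc.)
              buffer_gbit += gen_gbit_s
              if good and short_ok:
                  sent = min(buffer_gbit, link_gbit_s)
                  buffer_gbit -= sent
                  delivered += sent
              peak = max(peak, buffer_gbit)
          return delivered / (gen_gbit_s * seconds), peak

      frac, peak_gbit = simulate_link()
      print(f"delivered fraction {frac:.3f}, peak buffer {peak_gbit:.0f} Gbit")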

  2. An interstitial 15q11-q14 deletion: expanded Prader-Willi syndrome phenotype.

    PubMed

    Butler, Merlin G; Bittel, Douglas C; Kibiryeva, Nataliya; Cooley, Linda D; Yu, Shihui

    2010-02-01

    We present an infant girl with a de novo interstitial deletion of the chromosome 15q11-q14 region, larger than the typical deletion seen in Prader-Willi syndrome (PWS). She presented with features seen in PWS including hypotonia, a poor suck, feeding problems, and mild micrognathia. She also presented with features not typically seen in PWS such as preauricular ear tags, a high-arched palate, edematous feet, coarctation of the aorta, a PDA, and a bicuspid aortic valve. G-banded chromosome analysis showed a large de novo deletion of the proximal long arm of chromosome 15 confirmed using FISH probes (D15S11 and GABRB3). Methylation testing was abnormal and consistent with the diagnosis of PWS. Because of the large appearing deletion by karyotype analysis, an array comparative genomic hybridization (aCGH) was performed. A 12.3 Mb deletion was found which involved the 15q11-q14 region containing approximately 60 protein coding genes. This rare deletion was approximately twice the size of the typical deletion seen in PWS and involved the proximal breakpoint BP1 and the distal breakpoint was located in the 15q14 band between previously recognized breakpoints BP5 and BP6. The deletion extended slightly distal to the AVEN gene including the neighboring CHRM5 gene. There is no evidence that the genes in the 15q14 band are imprinted; therefore, their potential contribution in this patient's expanded PWS phenotype must be a consequence of dosage sensitivity of the genes or due to altered expression of intact neighboring genes from a position effect. Copyright 2010 Wiley-Liss, Inc.

  3. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

    Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP are presented to implement the SEA and MTF/STI models. An appendix provides an explanation of the operation of the program along with details of the program structure and code.
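
    The MTF/STI step can be summarised compactly: each modulation transfer value is converted to an apparent signal-to-noise ratio, clipped to +/-15 dB, mapped to a 0-1 transmission index, and averaged over bands and modulation frequencies. The sketch below implements that classic Houtgast-Steeneken scheme (the standardised STI procedure adds band weights and corrections) on an assumed reverberation-plus-noise MTF; it is an illustration of the method, not SSINAP code, and the T60 and SNR values are invented.

      import numpy as np

      def sti_from_mtf(mtf):
          """Simplified Speech Transmission Index from a matrix of modulation
          transfer values m (octave bands x modulation frequencies)."""
          m = np.clip(np.asarray(mtf, dtype=float), 1e-6, 1.0 - 1e-6)
          snr_apparent = 10.0 * np.log10(m / (1.0 - m))   # apparent SNR in dB
          snr_apparent = np.clip(snr_apparent, -15.0, 15.0)
          return ((snr_apparent + 15.0) / 30.0).mean()    # per-cell transmission index, averaged

      # toy MTF: exponential-decay reverberation (T60 = 0.8 s) plus a 5 dB noise floor,
      # assumed identical in all seven octave bands
      f_mod = np.array([0.63, 0.8, 1.0, 1.25, 1.6, 2.0, 2.5,
                        3.15, 4.0, 5.0, 6.3, 8.0, 10.0, 12.5])
      t60_s, snr_db = 0.8, 5.0
      m_reverb = 1.0 / np.sqrt(1.0 + (2.0 * np.pi * f_mod * t60_s / 13.8) ** 2)
      m = m_reverb / (1.0 + 10.0 ** (-snr_db / 10.0))
      print(f"STI ~ {sti_from_mtf(np.tile(m, (7, 1))):.2f}")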

  4. Combination of classical test theory (CTT) and item response theory (IRT) analysis to study the psychometric properties of the French version of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF).

    PubMed

    Bourion-Bédès, Stéphanie; Schwan, Raymund; Epstein, Jonathan; Laprevote, Vincent; Bédès, Alex; Bonnet, Jean-Louis; Baumann, Cédric

    2015-02-01

    The study aimed to examine the construct validity and reliability of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) according to both classical test and item response theories. The psychometric properties of the French version of this instrument were investigated in a cross-sectional, multicenter study. A total of 124 outpatients with a substance dependence diagnosis participated in the study. Psychometric evaluation included descriptive analysis, internal consistency, test-retest reliability, and validity. The dimensionality of the instrument was explored using a combination of the classical test, confirmatory factor analysis (CFA), and an item response theory analysis, the Person Separation Index (PSI), in a complementary manner. The results revealed that the Q-LES-Q-SF questionnaire was easy to administer and its acceptability was good. The internal consistency and the test-retest reliability were 0.9 and 0.88, respectively. All items were significantly correlated with the total score and the SF-12 used in the study. The fit of the one-factor CFA model was good, and for the unidimensional construct the PSI was found to be 0.902. The French version of the Q-LES-Q-SF yielded valid and reliable clinical assessments of the quality of life for future research and clinical practice involving French substance abusers. In response to recent questioning regarding the unidimensionality or bidimensionality of the instrument and according to the underlying theoretical unidimensional construct used for its development, this study suggests the Q-LES-Q-SF as a one-dimension questionnaire in French QoL studies.

  5. Development and testing of a mouse simulated space flight model

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, G.

    1985-01-01

    The development and testing of a mouse model for simulating some aspects of weightlessness that occur during space flight, and the carrying out of immunological flight experiments on animals, are discussed. The mouse model is an antiorthostatic, hypokinetic, hypodynamic suspension model similar to the one used with rats. It is shown that this murine model yields results similar to the rat model of antiorthostatic suspension for simulating some aspects of weightlessness. It is also shown that mice suspended in this model have decreased interferon-alpha/beta production as compared to control, nonsuspended mice or to orthostatically suspended mice. It is suggested that the conditions occurring during space flight could possibly affect interferon production. The regulatory role of interferon in nonviral diseases, including several bacterial and protozoan infections, is demonstrated, indicating the great significance of interferon in resistance to many types of infectious diseases.

  6. Results of Small-scale Solid Rocket Combustion Simulator testing at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Goldberg, Benjamin E.; Cook, Jerry

    1993-01-01

    The Small-scale Solid Rocket Combustion Simulator (SSRCS) program was established at the Marshall Space Flight Center (MSFC), and used a government/industry team consisting of Hercules Aerospace Corporation, Aerotherm Corporation, United Technology Chemical Systems Division, Thiokol Corporation and MSFC personnel to study the feasibility of simulating the combustion species, temperatures and flow fields of a conventional solid rocket motor (SRM) with a versatile simulator system. The SSRCS design is based on hybrid rocket motor principles. The simulator uses a solid fuel and a gaseous oxidizer. Verification of the feasibility of an SSRCS system as a test bed was completed using flow field and system analyses, as well as empirical test data. A total of 27 hot firings of a subscale SSRCS motor were conducted at MSFC. Testing under the SSRCS program was completed in October 1992. This paper, a compilation of reports from the above team members and additional analysis of the instrumentation results, will discuss the final results of the analyses and test programs.

  7. Quantitative Component Analysis of Solid Mixtures by Analyzing Time Domain 1H and 19F T1 Saturation Recovery Curves (qSRC).

    PubMed

    Stueber, Dirk; Jehle, Stefan

    2017-07-01

    Prevalent polymorphism and complicated phase behavior of active pharmaceutical ingredients (APIs) often result in remarkable differences in the respective biochemical and physical API properties. Consequently, API form characterization and quantification play a central role in the pharmaceutical industry from early drug development to manufacturing. Here we present a novel and proficient quantification protocol for solid mixtures (qSRC) based on the measurement and mathematical fitting of T1 nuclear magnetic resonance (NMR) saturation recovery curves collected on a bench top time-domain NMR instrument. The saturation recovery curves of the relevant pure components are used as fingerprints. Employing a bench top NMR instrument possesses clear benefits. These instruments have a small footprint, place no special requirements on lab space, and require only simple and fast sample handling. The qSRC analysis can easily be conducted in a conventional laboratory setting as well as in an industrial production environment, making it a versatile tool with the potential for widespread application. The accuracy and efficiency of the qSRC method is illustrated using 1H and 19F T1 data of selected pharmaceutical model compounds, as well as utilizing 1H T1 data of an actual binary API anhydrous polymorph system of a Merck & Co., Inc. compound formerly developed as a hepatitis C virus drug. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  8. Genomic analysis of the chromosome 15q11-q13 Prader-Willi syndrome region and characterization of transcripts for GOLGA8E and WHDC1L1 from the proximal breakpoint region.

    PubMed

    Jiang, Yong-Hui; Wauki, Kekio; Liu, Qian; Bressler, Jan; Pan, Yanzhen; Kashork, Catherine D; Shaffer, Lisa G; Beaudet, Arthur L

    2008-01-28

    Prader-Willi syndrome (PWS) is a neurobehavioral disorder characterized by neonatal hypotonia, childhood obesity, dysmorphic features, hypogonadism, mental retardation, and behavioral problems. Although PWS is most often caused by a paternal interstitial deletion of a 6-Mb region of chromosome 15q11-q13, the identity of the exact protein coding or noncoding RNAs whose deficiency produces the PWS phenotype is uncertain. There are also reports describing a PWS-like phenotype in a subset of patients with full mutations in the FMR1 (fragile X mental retardation 1) gene. Taking advantage of the human genome sequence, we have performed extensive sequence analysis and molecular studies for the PWS candidate region. We have characterized transcripts for the first time for two UCSC Genome Browser predicted protein-coding genes, GOLGA8E (golgin subfamily a, 8E) and WHDC1L1 (WAS protein homology region 2 domain containing 1-like 1) and have further characterized two previously reported genes, CYFIP1 and NIPA2; all four genes are in the region close to the proximal/centromeric deletion breakpoint (BP1). GOLGA8E belongs to the golgin subfamily of coiled-coil proteins associated with the Golgi apparatus. Six out of 16 golgin subfamily proteins in the human genome have been mapped in the chromosome 15q11-q13 and 15q24-q26 regions. We have also identified more than 38 copies of GOLGA8E-like sequence in the 15q11-q14 and 15q23-q26 regions which supports the presence of a GOLGA8E-associated low copy repeat (LCR). Analysis of the 15q11-q13 region by PFGE also revealed a polymorphic region between BP1 and BP2. WHDC1L1 is a novel gene with similarity to mouse Whdc1 (WAS protein homology region 2 domain containing 1) and human JMY protein (junction-mediating and regulatory protein). Expression analysis of cultured human cells and brain tissues from PWS patients indicates that CYFIP1 and NIPA2 are biallelically expressed. However, we were not able to determine the allele-specific expression

  9. Investigation of Techniques for Simulating Communications and Tracking Subsystems on Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Deacetis, Louis A.

    1991-01-01

    The need to reduce the costs of Space Station Freedom has resulted in a major redesign and downsizing of the Station in general, and its Communications and Tracking (C&T) components in particular. Earlier models and simulations of the C&T Space-to-Ground Subsystem (SGS) in particular are no longer valid. There thus exists a general need for updated, high fidelity simulations of C&T subsystems. This project explored simulation techniques and methods that might be used in developing new simulations of C&T subsystems, including the SGS. Three requirements were placed on the simulations to be developed: (1) they run on IBM PC/XT/AT compatible computers; (2) they be written in Ada as much as possible; and (3) since control and monitoring of the C&T subsystems will involve communication via a MIL-STD-1553B serial bus, that the possibility of commanding the simulator and monitoring its sensors via that bus be included in the design of the simulator. The result of the project is a prototype of a simulation of the Assembly/Contingency Transponder of the SGS, written in Ada, which can be controlled from another PC via a MIL-STD-1553B bus.

  10. The Xenon Test Chamber Q-SUN® for testing realistic tolerances of fungi exposed to simulated full spectrum solar radiation.

    PubMed

    Dias, Luciana P; Araújo, Claudinéia A S; Pupin, Breno; Ferreira, Paulo C; Braga, Gilberto Ú L; Rangel, Drauzio E N

    2018-06-01

    The low survival of insect-pathogenic fungi when used for insect control in agriculture is mainly due to the deleterious effects of ultraviolet radiation and heat from solar irradiation. In this study, conidia of 15 species of entomopathogenic fungi were exposed to simulated full-spectrum solar radiation emitted by a Xenon Test Chamber Q-SUN XE-3-HC 340S (Q-LAB® Corporation, Westlake, OH, USA), which very closely simulates full-spectrum solar radiation. A dendrogram obtained from cluster analyses, based on lethal times for 50% and 90% mortality (LT50 and LT90) calculated by Probit analyses, separated the fungi into three clusters: cluster 3 contains the species with the highest tolerance to simulated full-spectrum solar radiation, including Metarhizium acridum, Cladosporium herbarum, and Trichothecium roseum, with LT50 > 200 min of irradiation. Cluster 2 contains eight species with moderate UV tolerance: Aschersonia aleyrodis, Isaria fumosorosea, Mariannaea pruinosa, Metarhizium anisopliae, Metarhizium brunneum, Metarhizium robertsii, Simplicillium lanosoniveum, and Torrubiella homopterorum, with LT50 between 120 and 150 min of irradiation. The four species in cluster 1 had the lowest UV tolerance: Lecanicillium aphanocladii, Beauveria bassiana, Tolypocladium cylindrosporum, and Tolypocladium inflatum, with LT50 < 120 min of irradiation. The Q-SUN Xenon Test Chamber XE-3 is often used by the pharmaceutical and automotive industries to test light stability and weathering, respectively, but it had never before been used to evaluate fungal tolerance to full-spectrum solar radiation. We conclude that the equipment provides an excellent tool for testing realistic tolerances to full-spectrum solar radiation of fungi used as microbial agents for insect biological control in agriculture. Copyright © 2018 British Mycological Society. Published by Elsevier Ltd. All rights reserved.

  11. Simulating Space Capsule Water Landing with Explicit Finite Element Method

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Lyle, Karen H.

    2007-01-01

    A study of using an explicit nonlinear dynamic finite element code for simulating the water landing of a space capsule was performed. The finite element model contains Lagrangian shell elements for the space capsule and Eulerian solid elements for the water and air. An Arbitrary Lagrangian Eulerian (ALE) solver and a penalty coupling method were used for predicting the fluid and structure interaction forces. The space capsule was first assumed to be rigid, so the numerical results could be correlated with closed form solutions. The water and air meshes were continuously refined until the solution was converged. The converged maximum deceleration predicted is bounded by the classical von Karman and Wagner solutions and is considered to be an adequate solution. The refined water and air meshes were then used in the models for simulating the water landing of a capsule model that has a flexible bottom. For small pitch angle cases, the maximum deceleration from the flexible capsule model was found to be significantly greater than the maximum deceleration obtained from the corresponding rigid model. For large pitch angle cases, the difference between the maximum deceleration of the flexible model and that of its corresponding rigid model is smaller. Test data of Apollo space capsules with a flexible heat shield qualitatively support the findings presented in this paper.

  12. Feedback-Assisted Extension of the Tokamak Operating Space to Low Safety Factor

    NASA Astrophysics Data System (ADS)

    Hanson, J. M.

    2013-10-01

    Recent DIII-D experiments have demonstrated stable operation at very low edge safety factor, q95 ≲ 2, through the use of magnetic feedback to control the n = 1 resistive wall mode (RWM) instability. The performance of tokamak fusion devices may benefit from increased plasma current, and thus, decreased q. However, disruptive stability limits are commonly encountered in experiments at qedge ~ 2 (limited plasmas) and q95 ~ 2 (diverted plasmas), limiting exploration of low q regimes. In the recent DIII-D experiments, the impact and control of key disruptive instabilities was studied. Locked n = 1 modes with exponential growth times on the order of the wall eddy current decay timescale τw preceded disruptions at q95 = 2. The instabilities have a poloidal structure that is consistent with VALEN simulations of the RWM mode structure at q95 = 2. Applying proportional gain magnetic feedback control of the n = 1 mode resulted in stabilized operation with q95 reaching 1.9, and an extension of the discharge lifetime for > 100 τw. Loss of feedback control was accompanied by power supply saturation, followed by a rapidly growing n = 1 mode and disruption. Comparisons of the feedback dynamics with VALEN simulations will be presented. The DIII-D results complement and will be discussed alongside recent RFX-MOD demonstrations of RWM control using magnetic feedback in limited tokamak discharges with qedge < 2. These results call attention to the utility of magnetic feedback in significantly extending the tokamak operational space and potentially opening a new route to economical fusion power production. Supported by the US Department of Energy under DE-FG02-04ER54761 and DE-FC02-04ER54698.
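
    As a cartoon of why proportional-gain magnetic feedback extends the operating space, the sketch below integrates a single unstable mode amplitude whose control-coil request is clipped at a power-supply limit: with enough gain the mode is held near the noise floor, while with too little gain, or once the supply saturates, it grows on the wall time. The growth rate, gains, and noise level are illustrative numbers, not DIII-D parameters or VALEN output.

      import numpy as np

      def rwm_toy(gain, growth_rate=0.5, i_max=1.0, b0=0.05, noise=0.02,
                  dt=0.05, steps=4000, seed=0):
          """Toy proportional feedback on an unstable mode amplitude b(t):
          db/dt = growth_rate * (b - i_coil), with the coil request
          i_coil = gain * b clipped at the supply limit i_max.
          Time is in units of the wall time tau_w."""
          rng = np.random.default_rng(seed)
          b, history = b0, np.empty(steps)
          for k in range(steps):
              i_coil = np.clip(gain * b, -i_max, i_max)   # power-supply saturation
              b += dt * growth_rate * (b - i_coil) + noise * np.sqrt(dt) * rng.normal()
              history[k] = b
          return history

      stabilized = rwm_toy(gain=2.0)
      uncontrolled = rwm_toy(gain=0.8)                    # insufficient gain: mode still grows
      print(np.abs(stabilized[-100:]).max(), np.abs(uncontrolled).max())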

  13. q-bosons and the q-analogue quantized field

    NASA Technical Reports Server (NTRS)

    Nelson, Charles A.

    1995-01-01

    The q-analogue coherent states are used to identify physical signatures for the presence of a q-analogue quantized radiation field in the q-CS classical limits where |z| is large. In this quantum-optics-like limit, the fractional uncertainties of most physical quantities (momentum, position, amplitude, phase) which characterize the quantum field are O(1). They only vanish as O(1/|z|) when q = 1. However, for the number operator, N, and the N-Hamiltonian for a free q-boson gas, H_N = ħω(N + 1/2), the fractional uncertainties do still approach zero. A signature for q-boson counting statistics is that (ΔN)²/⟨N⟩ approaches 0 as |z| approaches infinity. Except for its O(1) fractional uncertainty, the q-generalization of the Hermitian phase operator of Pegg and Barnett, φ_q, still exhibits normal classical behavior. The standard number-phase uncertainty relation, ΔN Δφ_q = 1/2, and the approximate commutation relation, [N, φ_q] = i, still hold for the single-mode q-analogue quantized field. So, N and φ_q are almost canonically conjugate operators in the q-CS classical limit. The q-analogue CS's minimize this uncertainty relation for moderate |z|².

  14. The Study of Simulated Space Radiation Environment Effect on Conductive Properties of ITO Thermal Control Materials

    NASA Astrophysics Data System (ADS)

    Wei-Quan, Feng; Chun-Qing, Zhao; Zi-Cai, Shen; Yi-Gang, Ding; Fan, Zhang; Yu-Ming, Liu; Hui-Qi, Zheng; Xue, Zhao

    In order to prevent detrimental effects of ESD caused by differential surface charging of spacecraft under space environments, an ITO transparent conductive coating is often deposited on the thermal control materials on the outside of spacecraft. Since the ITO coating is exposed to the space environment, the effects of the environment on the electrical properties of ITO coatings are of deep concern to spacecraft designers. This paper introduces ground tests to simulate space radiation environmental effects on the conductive properties of ITO coatings. Samples are made of ITO/OSR, ITO/Kapton/Al and ITO/FEP/Ag thermal control coatings. Simulated space radiation environment conditions are NUV of 500 ESH, 40 keV electrons at 2 × 10^16 e/cm^2, and 40 keV protons at 2.5 × 10^15 p/cm^2. The conductive property monitored is the surface resistivity, measured in situ in vacuum. Test results showed that the surface resistivity of all ITO coatings decreases suddenly at the beginning of the environment test. The likely reasons are oxygen vacancies caused by vacuum and decaying radiation-induced conductivity (RIC) caused by radiation. Degradation of the conductive properties caused by irradiation was found. ITO/FEP/Ag exhibits more degradation than the other two kinds. The conductive property of ITO/Kapton/Al is stable under irradiation in vacuum. SEM and XPS analysis found more cracks and lower Sn and In concentrations after irradiation, which may be the reason for the degradation of the conductive properties.

  15. Heavy-Quark Symmetry Implies Stable Heavy Tetraquark Mesons Q_iQ_j q̄_k q̄_l.

    PubMed

    Eichten, Estia J; Quigg, Chris

    2017-11-17

    For very heavy quarks Q, relations derived from heavy-quark symmetry predict the existence of novel narrow doubly heavy tetraquark states of the form Q_iQ_j q̄_k q̄_l (subscripts label flavors), where q designates a light quark. By evaluating finite-mass corrections, we predict that double-beauty states composed of bbūd̄, bbūs̄, and bbd̄s̄ will be stable against strong decays, whereas the double-charm states ccq̄_kq̄_l, mixed beauty+charm states bcq̄_kq̄_l, and heavier bbq̄_kq̄_l states will dissociate into pairs of heavy-light mesons. Observation of a new double-beauty state through its weak decays would establish the existence of tetraquarks and illuminate the role of heavy color-antitriplet diquarks as hadron constituents.

  16. Quasi-optical analysis of a far-infrared spatio-spectral space interferometer concept

    NASA Astrophysics Data System (ADS)

    Bracken, C.; O'Sullivan, C.; Murphy, J. A.; Donohoe, A.; Savini, G.; Lightfoot, J.; Juanola-Parramon, R.; Fisica Consortium

    2016-07-01

    FISICA (Far-Infrared Space Interferometer Critical Assessment) was a three year study of a far-infrared spatio-spectral double-Fourier interferometer concept. One of the aims of the FISICA study was to set out a baseline optical design for such a system, and to use a model of the system to simulate realistic telescope beams for use with an end-to-end instrument simulator. This paper describes a two-telescope (and hub) baseline optical design that fulfils the requirements of the FISICA science case, while minimising the optical mass of the system. A number of different modelling techniques were required for the analysis: fast approximate simulation tools such as ray tracing and Gaussian beam methods were employed for initial analysis, with GRASP physical optics used for higher accuracy in the final analysis. Results are shown for the predicted far-field patterns of the telescope primary mirrors under illumination by smooth walled rectangular feed horns. Far-field patterns for both on-axis and off-axis detectors are presented and discussed.

  17. Comparative performance of different scale-down simulators of substrate gradients in Penicillium chrysogenum cultures: the need of a biological systems response analysis.

    PubMed

    Wang, Guan; Zhao, Junfei; Haringa, Cees; Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang; Deshmukh, Amit T; van Gulik, Walter; Heijnen, Joseph J; Noorman, Henk J

    2018-05-01

    In a 54 m³ large-scale penicillin fermentor, the cells experience substrate gradient cycles at the timescale of the global mixing time, about 20-40 s. Here, we used an intermittent feeding regime (IFR) and a two-compartment reactor (TCR) to mimic these substrate gradients in laboratory-scale continuous cultures. The IFR was applied to simulate substrate dynamics experienced by the cells at full scale at timescales of tens of seconds to minutes (30 s, 3 min and 6 min), while the TCR was designed to simulate substrate gradients at an applied mean residence time (τc) of 6 min. A biological systems analysis of the response of an industrial high-yielding P. chrysogenum strain has been performed in these continuous cultures. Compared to an undisturbed continuous feeding regime in a single reactor, the penicillin productivity (q_PenG) was reduced in all scale-down simulators. The dynamic metabolomics data indicated that in the IFRs, the cells accumulated high levels of the central metabolites during the feast phase to actively cope with external substrate deprivation during the famine phase. In contrast, in the TCR system, the storage pool (e.g. mannitol and arabitol) constituted a large contribution of carbon supply in the non-feed compartment. Further, transcript analysis revealed that all scale-down simulators gave different expression levels of the glucose/hexose transporter genes and the penicillin gene clusters. The results showed that q_PenG did not correlate well with exposure to the substrate regimes (excess, limitation and starvation), but there was a clear inverse relation between q_PenG and the intracellular glucose level. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  18. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    PubMed

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically hierarchically nested cellular compartments and entities. Our approach ML-Space combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt the spatial resolution of models.
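
    The non-spatial building block underneath such hybrid simulators is the Gillespie stochastic simulation algorithm; the sketch below shows a plain well-mixed SSA on a toy reversible dimerisation, which the compartmental and spatial machinery described above extends. It is a generic illustration, not ML-Space code, and the rate constants are arbitrary.

      import numpy as np

      def gillespie(x0, stoich, propensity_fns, t_end, rng):
          """Plain well-mixed Gillespie SSA.  stoich[i] is the state change of
          reaction i and propensity_fns[i](x) its propensity."""
          t, x = 0.0, np.array(x0, dtype=float)
          times, states = [t], [x.copy()]
          while t < t_end:
              a = np.array([f(x) for f in propensity_fns])
              a_total = a.sum()
              if a_total <= 0.0:
                  break                                   # no reaction can fire
              t += rng.exponential(1.0 / a_total)         # time to next reaction
              x += stoich[rng.choice(len(a), p=a / a_total)]
              times.append(t)
              states.append(x.copy())
          return np.array(times), np.array(states)

      # toy reversible dimerisation:  A + A -> B  and  B -> A + A
      stoich = [np.array([-2.0, 1.0]), np.array([2.0, -1.0])]
      rates = [lambda x: 1e-3 * x[0] * (x[0] - 1.0), lambda x: 5e-2 * x[1]]
      t, s = gillespie([200, 0], stoich, rates, t_end=50.0, rng=np.random.default_rng(2))
      print(s[-1])                                        # final copy numbers of A and B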

  19. Q-Method Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Zanetti, Renato; Ainscough, Thomas; Christian, John; Spanos, Pol D.

    2012-01-01

    A new algorithm is proposed that smoothly integrates non-linear estimation of the attitude quaternion using Davenport's q-method and estimation of non-attitude states through an extended Kalman filter. The new method is compared to a similar existing algorithm showing its similarities and differences. The validity of the proposed approach is confirmed through numerical simulations.
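
    For context, Davenport's q-method itself reduces to a 4x4 symmetric eigenproblem: build the Davenport K matrix from weighted vector observations and take the eigenvector belonging to its largest eigenvalue as the optimal attitude quaternion (vector part first, scalar last). The sketch below shows that core step with a synthetic two-vector example; the integration with an extended Kalman filter proposed in the paper is not reproduced.

      import numpy as np

      def davenport_q(body_vecs, ref_vecs, weights):
          """Davenport's q-method for Wahba's problem: the optimal quaternion is
          the eigenvector of the 4x4 Davenport matrix K with the largest eigenvalue."""
          B = sum(w * np.outer(b, r) for w, b, r in zip(weights, body_vecs, ref_vecs))
          sigma = np.trace(B)
          z = np.array([B[1, 2] - B[2, 1], B[2, 0] - B[0, 2], B[0, 1] - B[1, 0]])
          K = np.zeros((4, 4))
          K[:3, :3] = B + B.T - sigma * np.eye(3)
          K[:3, 3] = K[3, :3] = z
          K[3, 3] = sigma
          vals, vecs = np.linalg.eigh(K)                   # symmetric eigenproblem
          return vecs[:, np.argmax(vals)]

      def attitude_matrix(q):
          """Direction cosine matrix A(q) mapping reference vectors into the body frame."""
          qv, q4 = q[:3], q[3]
          qx = np.array([[0.0, -qv[2], qv[1]], [qv[2], 0.0, -qv[0]], [-qv[1], qv[0], 0.0]])
          return (q4**2 - qv @ qv) * np.eye(3) + 2.0 * np.outer(qv, qv) - 2.0 * q4 * qx

      # toy check with two noisy unit-vector observations
      rng = np.random.default_rng(0)
      q_true = np.array([0.2, -0.1, 0.3, 0.927])
      q_true /= np.linalg.norm(q_true)
      refs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
      body = [attitude_matrix(q_true) @ r + 1e-3 * rng.normal(size=3) for r in refs]
      q_est = davenport_q(body, refs, weights=[1.0, 1.0])
      print(np.abs(attitude_matrix(q_est) - attitude_matrix(q_true)).max())   # small residual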

  20. Simulated Space Environment Effects on a Candidate Solar Sail Material

    NASA Technical Reports Server (NTRS)

    Kang, Jin Ho; Bryant, Robert G.; Wilkie, W. Keats; Wadsworth, Heather M.; Craven, Paul D.; Nehls, Mary K.; Vaughn, Jason A.

    2017-01-01

    For long duration missions of solar sail vehicles, the sail material needs to survive the harsh space environment, as the degradation of the sail material determines its operational lifetime. Therefore, understanding the effects of the space environment on the sail membrane is essential for mission success. In this study, the effects of simulated space environments of ionizing radiation and thermal aging were investigated in order to assess some of the potential damage effects on the mechanical, thermal, and optical properties of a commercial off-the-shelf (COTS) polyester solar sail membrane. The solar sail membrane was exposed to high energy electrons (about 70 keV, 10 nA/cm^2), and the physical properties were characterized. After a dose of about 8.3 Grad, the tensile modulus, tensile strength and failure strain of the sail membrane decreased by 20 to 95%. The aluminum reflective layer was damaged and partially delaminated but it did not show any significant change in solar absorbance or thermal emittance. The mechanical properties of a precracked sample, simulating potential impact damage of the sail membrane, as well as thermal aging effects on metallized PEN (polyethylene naphthalate) film, will be discussed.

  1. Frontal dysconnectivity in 22q11.2 deletion syndrome: an atlas-based functional connectivity analysis.

    PubMed

    Mattiaccio, Leah M; Coman, Ioana L; Thompson, Carlie A; Fremont, Wanda P; Antshel, Kevin M; Kates, Wendy R

    2018-01-20

    22q11.2 deletion syndrome (22q11DS) is a neurodevelopmental syndrome associated with deficits in cognitive and emotional processing. This syndrome represents one of the highest risk factors for the development of schizophrenia. Previous studies of functional connectivity (FC) in 22q11DS report aberrant connectivity patterns in large-scale networks that are associated with the development of psychotic symptoms. In this study, we performed a functional connectivity analysis using the CONN toolbox to test for differential connectivity patterns between 54 individuals with 22q11DS and 30 healthy controls, aged 17 to 25 years. We mapped resting-state fMRI data onto 68 atlas-based regions of interest (ROIs) generated by the Desikan-Killiany atlas in FreeSurfer, resulting in 2278 ROI-to-ROI connections for which we determined total linear temporal associations between each. Within the group with 22q11DS only, we further tested the association between prodromal symptoms of psychosis and FC. We observed that relative to controls, individuals with 22q11DS displayed increased FC in lobar networks involving the frontal-frontal, frontal-parietal, and frontal-occipital ROIs. In contrast, FC between ROIs in the parietal-temporal and occipital lobes was reduced in the 22q11DS group relative to healthy controls. Moreover, positive psychotic symptoms were positively associated with increased functional connections between the left precuneus and right superior frontal gyrus, as well as reduced functional connectivity between the bilateral pericalcarine. Positive symptoms were negatively associated with increased functional connectivity between the right pericalcarine and right postcentral gyrus. Our results suggest that functional organization may be altered in 22q11DS, leading to disruption in connectivity between frontal and other lobar substructures, and potentially increasing risk for prodromal psychosis.
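
    The basic ROI-to-ROI computation behind such an analysis is a correlation matrix over atlas-averaged time series, usually Fisher z-transformed before group statistics. The sketch below shows that step for 68 ROIs, which yields the 2278 unique connections mentioned above; the random time series stand in for preprocessed fMRI data, and this is not the CONN toolbox pipeline.

      import numpy as np

      def roi_connectivity(timeseries):
          """ROI-to-ROI functional connectivity: Fisher z-transformed Pearson
          correlations between every pair of ROI time series.
          timeseries has shape (n_timepoints, n_rois)."""
          r = np.corrcoef(timeseries, rowvar=False)     # n_rois x n_rois correlations
          np.fill_diagonal(r, 0.0)                      # ignore self-connections
          return np.arctanh(r)                          # Fisher z transform

      # toy data: 200 volumes, 68 atlas ROIs -> 68 * 67 / 2 = 2278 unique connections
      ts = np.random.default_rng(0).normal(size=(200, 68))
      fc = roi_connectivity(ts)
      upper = np.triu_indices(68, k=1)
      print(fc[upper].shape)                            # (2278,)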

  2. MitoQ regulates autophagy by inducing a pseudo-mitochondrial membrane potential

    PubMed Central

    Sun, Chao; Liu, Xiongxiong; Di, Cuixia; Wang, Zhenhua; Mi, Xiangquan; Liu, Yang; Zhao, Qiuyue; Mao, Aihong; Chen, Weiqiang; Gan, Lu; Zhang, Hong

    2017-01-01

    During the process of oxidative phosphorylation, protons are pumped into the mitochondrial intermembrane space to establish a mitochondrial membrane potential (MMP). The electrochemical gradient generated allows protons to return to the matrix through the ATP synthase complex and generates ATP in the process. MitoQ is a lipophilic cationic drug that is adsorbed to the inner mitochondrial membrane; however, the cationic moiety of MitoQ remains in the intermembrane space. We found that the positive charges in MitoQ inhibited the activity of respiratory chain complexes I, III, and IV, reduced proton production, and decreased oxygen consumption. Therefore, a pseudo-MMP (PMMP) was formed via maintenance of exogenous positive charges. Proton backflow was severely impaired, leading to a decrease in ATP production and an increase in AMP production. Excess AMP activates AMP kinase, which inhibits the MTOR (mechanistic target of rapamycin) pathway and induces macroautophagy/autophagy. Therefore, we conclude that MitoQ increases PMMP via proton displacement with exogenous positive charges. In addition, PMMP triggered autophagy in hepatocellular carcinoma HepG2 cells via modification of mitochondrial bioenergetics pathways. PMID:28121478

  3. MitoQ regulates autophagy by inducing a pseudo-mitochondrial membrane potential.

    PubMed

    Sun, Chao; Liu, Xiongxiong; Di, Cuixia; Wang, Zhenhua; Mi, Xiangquan; Liu, Yang; Zhao, Qiuyue; Mao, Aihong; Chen, Weiqiang; Gan, Lu; Zhang, Hong

    2017-04-03

    During the process of oxidative phosphorylation, protons are pumped into the mitochondrial intermembrane space to establish a mitochondrial membrane potential (MMP). The electrochemical gradient generated allows protons to return to the matrix through the ATP synthase complex and generates ATP in the process. MitoQ is a lipophilic cationic drug that is adsorbed to the inner mitochondrial membrane; however, the cationic moiety of MitoQ remains in the intermembrane space. We found that the positive charges in MitoQ inhibited the activity of respiratory chain complexes I, III, and IV, reduced proton production, and decreased oxygen consumption. Therefore, a pseudo-MMP (PMMP) was formed via maintenance of exogenous positive charges. Proton backflow was severely impaired, leading to a decrease in ATP production and an increase in AMP production. Excess AMP activates AMP kinase, which inhibits the MTOR (mechanistic target of rapamycin) pathway and induces macroautophagy/autophagy. Therefore, we conclude that MitoQ increases PMMP via proton displacement with exogenous positive charges. In addition, PMMP triggered autophagy in hepatocellular carcinoma HepG2 cells via modification of mitochondrial bioenergetics pathways.

  4. Molecular Simulation Uncovers the Conformational Space of the λ Cro Dimer in Solution

    PubMed Central

    Ahlstrom, Logan S.; Miyashita, Osamu

    2011-01-01

    The significant variation among solved structures of the λ Cro dimer suggests its flexibility. However, contacts in the crystal lattice could have stabilized a conformation which is unrepresentative of its dominant solution form. Here we report on the conformational space of the Cro dimer in solution using replica exchange molecular dynamics in explicit solvent. The simulated ensemble shows remarkable correlation with available x-ray structures. Network analysis and a free energy surface reveal the predominance of closed and semi-open dimers, with a modest barrier separating these two states. The fully open conformation lies higher in free energy, indicating that it requires stabilization by DNA or crystal contacts. Most NMR models are found to be unstable conformations in solution. Intersubunit salt bridging between Arg4 and Glu53 during simulation stabilizes closed conformations. Because a semi-open state is among the low-energy conformations sampled in simulation, we propose that Cro-DNA binding may not entail a large conformational change relative to the dominant dimer forms in solution. PMID:22098751

  5. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation

    NASA Astrophysics Data System (ADS)

    Alton, G. D.; Bilheux, H.

    2004-05-01

    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j⁺ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j⁺ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects.
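
    The elementary analytical theory referred to above centres on the planar space-charge-limited (Child-Langmuir) current density, j = (4*eps0/9) * sqrt(2qe/m) * V^(3/2) / d^2, and on comparing it with the plasma emission current density, which sets the shape of the emission boundary (matching the two gives roughly a flat boundary). The sketch below evaluates both relations for an assumed Ar8+ beam; the voltage, gap, and plasma current density are example numbers, not values from the article.

      import numpy as np

      EPS0 = 8.8541878128e-12     # F/m
      AMU = 1.66053906660e-27     # kg
      E_CHARGE = 1.602176634e-19  # C

      def child_langmuir_j(q, mass_amu, volts, gap_m):
          """Planar space-charge-limited ion current density (A/m^2) for charge state q."""
          return ((4.0 * EPS0 / 9.0) * np.sqrt(2.0 * q * E_CHARGE / (mass_amu * AMU))
                  * volts**1.5 / gap_m**2)

      def matched_gap(q, mass_amu, volts, j_plasma):
          """Gap for which the plasma emission current density equals the
          space-charge limit -- roughly the flat emission boundary condition."""
          return np.sqrt((4.0 * EPS0 / 9.0) * np.sqrt(2.0 * q * E_CHARGE / (mass_amu * AMU))
                         * volts**1.5 / j_plasma)

      # example: an Ar8+ beam extracted at 20 kV
      print(child_langmuir_j(q=8, mass_amu=40.0, volts=20e3, gap_m=0.03), "A/m^2")
      print(matched_gap(q=8, mass_amu=40.0, volts=20e3, j_plasma=50.0) * 1e3, "mm")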

  6. Biallelic losses of 13q do not confer a poorer outcome in chronic lymphocytic leukaemia: analysis of 627 patients with isolated 13q deletion.

    PubMed

    Puiggros, Anna; Delgado, Julio; Rodriguez-Vicente, Ana; Collado, Rosa; Aventín, Anna; Luño, Elisa; Grau, Javier; Hernandez, José Ángel; Marugán, Isabel; Ardanaz, Maite; González, Teresa; Valiente, Alberto; Osma, Mar; Calasanz, Maria José; Sanzo, Carmen; Carrió, Ana; Ortega, Margarita; Santacruz, Rodrigo; Abrisqueta, Pau; Abella, Eugènia; Bosch, Francesc; Carbonell, Félix; Solé, Francesc; Hernández, Jesús Maria; Espinet, Blanca

    2013-10-01

    Losses in 13q as a sole abnormality confer a good prognosis in chronic lymphocytic leukaemia (CLL). Nevertheless, its heterogeneity has been demonstrated and the clinical significance of biallelic 13q deletions remains controversial. We compared the clinico-biological characteristics of a series of 627 patients harbouring isolated 13q deletions by fluorescence in situ hybridization (FISH), either monoallelic (13q × 1), biallelic (13q × 2), or the coexistence of both clones (13qM). The most frequent 13q deletion was 13q × 1 (82·1%), while 13q × 2 and 13qM represented 8·6% and 9·3% of patients respectively. The median percentage of altered nuclei significantly differed across groups: 55%, 72·5% and 80% in 13q × 1, 13q × 2 and 13qM (P < 0·001). However, no significant differences in the clinical outcome among 13q groups were found. From 84 patients with sequential FISH studies, eight patients lost the remaining allele of 13q whereas none of them changed from 13q × 2 to the 13q × 1 group. The percentage of abnormal cells detected by FISH had a significant impact on the five-year cumulative incidence of treatment and the overall survival, 90% being the highest predictive power cut-off. In conclusion, loss of the remaining 13q allele is not enough to entail a worse prognosis in CLL. The presence of isolated 13q deletion can be risk-stratified according to the percentage of altered cells. © 2013 John Wiley & Sons Ltd.

  7. Integrated Clinical Training for Space Flight Using a High-Fidelity Patient Simulator in a Simulated Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Hurst, Victor; Doerr, Harold K.; Polk, J. D.; Schmid, Josef; Parazynski, Scott; Kelly, Scott

    2007-01-01

    This viewgraph presentation reviews the use of telemedicine in a simulated microgravity environment using a patient simulator. For decades, telemedicine techniques have been used in terrestrial environments by many cohorts with varied clinical experience. The success of these techniques has been recently expanded to include microgravity environments aboard the International Space Station (ISS). In order to investigate how an astronaut crew medical officer will execute medical tasks in a microgravity environment while being remotely guided by a flight surgeon, the Medical Operations Support Team (MOST) used the simulated microgravity environment provided aboard DC-9 aircraft; teams of crew medical officers and remote flight surgeons performed several medical tasks on a patient simulator.

  8. k-space and q-space: combining ultra-high spatial and angular resolution in diffusion imaging using ZOOPPA at 7 T.

    PubMed

    Heidemann, Robin M; Anwander, Alfred; Feiweier, Thorsten; Knösche, Thomas R; Turner, Robert

    2012-04-02

    There is ongoing debate whether using a higher spatial resolution (sampling k-space) or a higher angular resolution (sampling q-space angles) is the better way to improve diffusion MRI (dMRI) based tractography results in living humans. In both cases, the limiting factor is the signal-to-noise ratio (SNR), due to the restricted acquisition time. One possible way to increase the spatial resolution without sacrificing either SNR or angular resolution is to move to a higher magnetic field strength. Nevertheless, dMRI has not been the preferred application for ultra-high field strength (7 T). This is because single-shot echo-planar imaging (EPI) has been the method of choice for human in vivo dMRI. EPI faces several challenges related to the use of a high resolution at high field strength, for example, distortions and image blurring. These problems can easily compromise the expected SNR gain with field strength. In the current study, we introduce an adapted EPI sequence in conjunction with a combination of ZOOmed imaging and Partially Parallel Acquisition (ZOOPPA). We demonstrate that the method can produce high quality diffusion-weighted images with high spatial and angular resolution at 7 T. We provide examples of in vivo human dMRI with isotropic resolutions of 1 mm and 800 μm. These data sets are particularly suitable for resolving complex and subtle fiber architectures, including fiber crossings in the white matter, anisotropy in the cortex and fibers entering the cortex. Copyright © 2011 Elsevier Inc. All rights reserved.

  9. Simulations of SSLV Ascent and Debris Transport

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart; Aftosmis, Michael; Murman, Scott; Chan, William; Gomez, Ray; Vicker, Darby; Stuart, Phil

    2006-01-01

    A viewgraph presentation on Computational Fluid Dynamics (CFD) simulation of Space Shuttle Launch Vehicle (SSLV) ascent and debris transport analysis is shown. The topics include: 1) CFD simulations of the Space Shuttle Launch Vehicle ascent; 2) Debris transport analysis; 3) Debris aerodynamic modeling; and 4) Other applications.

  10. Active illuminated space object imaging and tracking simulation

    NASA Astrophysics Data System (ADS)

    Yue, Yufang; Xie, Xiaogang; Luo, Wen; Zhang, Feizhou; An, Jianzhu

    2016-10-01

    Ground-based optical imaging of a space target in orbit, and the extraction of that target under laser-illumination conditions, are discussed. Based on the orbit and corresponding attitude of a satellite, a 3D rendering of the satellite was built. A general simulation platform was developed that accommodates different 3D satellite models and the relative geometry between the satellite and the Earth-based detector system; a unified parallel-projection technique is proposed in this paper. We further note that the random optical distribution under laser illumination is a challenge for object discrimination, the strong randomness of the active-illumination laser speckle being the primary factor. The combined effect of multi-frame accumulation and several tracking methods, such as mean-shift tracking, contour poid, and filter deconvolution, was simulated. Comparison of the results shows that the combination of multi-frame accumulation and contour poid is recommended for actively laser-illuminated images, providing high tracking precision and stability over a range of object attitudes.
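    As a rough illustration of why multi-frame accumulation helps here: averaging N statistically independent speckle realizations reduces speckle contrast roughly as 1/sqrt(N), giving the tracker a steadier object silhouette to work with. The sketch below uses synthetic, exponentially distributed intensities (a common model for fully developed speckle); it is not data or code from the study.

        # Illustrative multi-frame accumulation on synthetic laser speckle.
        # Fully developed speckle intensity is modelled as exponentially distributed;
        # averaging N independent frames lowers speckle contrast roughly as 1/sqrt(N).
        import numpy as np

        rng = np.random.default_rng(0)
        frame_shape = (128, 128)

        def speckle_contrast(image: np.ndarray) -> float:
            return float(image.std() / image.mean())

        for n_frames in (1, 4, 16, 64):
            frames = rng.exponential(1.0, size=(n_frames, *frame_shape))
            accumulated = frames.mean(axis=0)  # multi-frame accumulation
            print(f"{n_frames:3d} frames: contrast ~ {speckle_contrast(accumulated):.3f} "
                  f"(expected ~ {1 / n_frames ** 0.5:.3f})")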

  11. The space physics analysis network

    NASA Astrophysics Data System (ADS)

    Green, James L.

    1988-04-01

    The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for space and Earth scientists and has been operational for nearly 7 years. SPAN and its extension into Europe utilize computer-to-computer communications, allowing mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used to successfully transfer real-time data to remote researchers for rapid data analysis, but its primary function is for non-real-time applications. One of the major advantages of using SPAN is its spacecraft mission independence. Space science researchers using SPAN are located in universities, industries and government institutions all across the United States and Europe. These researchers work in such fields as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science data bases, mission planning and information systems, and computational facilities for the purposes of facilitating correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, provides facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN has interconnections with several national and international networks, such as HEPNET and TEXNET, forming a transparent DECnet network. The total number of computers now reachable over these combined networks is about 2000. In addition, SPAN supports full function capabilities over the international public packet switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  12. Sensitivity analysis of the space shuttle to ascent wind profiles

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Austin, L. D., Jr.

    1982-01-01

    A parametric sensitivity analysis of the space shuttle ascent flight to the wind profile is presented. Engineering systems parameters are obtained by flight simulations using wind profile models and samples of detailed (Jimsphere) wind profile measurements. The wind models used are the synthetic vector wind model, with and without the design gust, and a model of the vector wind change with respect to time. These comparative analyses give insight into the contribution of winds to ascent subsystem flight parameters.

  13. Rosetta CONSERT operations and data analysis preparation: simulation software tools.

    NASA Astrophysics Data System (ADS)

    Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

    2014-05-01

    The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the dielectric properties of the nucleus interior. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a limited set of opportunities to acquire data. We therefore need a fine analysis of the impact of the Rosetta trajectory, the Philae position and the comet shape on CONSERT measurements, in order to make optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return from the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation that takes the signal polarization into account. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation, which allows computation over domains that are large relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.
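    To make the travel-time idea concrete: for a straight ray of geometric length d through a homogeneous medium of relative permittivity eps_r, the one-way propagation time is t = d * sqrt(eps_r) / c, so the measured delay constrains the interior dielectric properties. The sketch below is a deliberately simplified stand-in for the paper's polarized ray-tracing simulation: straight-line paths, a uniform nucleus, and invented path lengths and permittivities.

        # Minimal straight-ray travel-time estimate through a homogeneous comet nucleus.
        # Simplification only: no refraction, no polarization, one uniform permittivity.
        C = 299_792_458.0  # speed of light in vacuum, m/s

        def travel_time(path_length_m: float, rel_permittivity: float) -> float:
            """One-way propagation time along a straight path in a uniform dielectric."""
            return path_length_m * rel_permittivity ** 0.5 / C

        path_km = 2.0                       # assumed path length through the nucleus
        for eps_r in (1.0, 1.5, 2.0, 3.0):  # candidate interior permittivities
            t_us = travel_time(path_km * 1e3, eps_r) * 1e6
            print(f"eps_r = {eps_r:.1f}: one-way delay ~ {t_us:.2f} microseconds")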

  14. Characterization of a new qQq-FTICR mass spectrometer for post-translational modification analysis and top-down tandem mass spectrometry of whole proteins.

    PubMed

    Jebanathirajah, Judith A; Pittman, Jason L; Thomson, Bruce A; Budnik, Bogdan A; Kaur, Parminder; Rape, Michael; Kirschner, Marc; Costello, Catherine E; O'Connor, Peter B

    2005-12-01

    The use of a new electrospray qQq Fourier transform ion cyclotron resonance mass spectrometer (qQq-FTICR MS) instrument for biological applications is described. This qQq-FTICR mass spectrometer was designed for the study of post-translationally modified proteins and for top-down analysis of biologically relevant protein samples. The utility of the instrument for the analysis of phosphorylation, a common and important post-translational modification, was investigated. Phosphorylation was chosen as an example because it is ubiquitous and challenging to analyze. In addition, the use of the instrument for top-down sequencing of proteins was explored, since this instrument offers particular advantages for this approach. Top-down sequencing was performed on different proteins, including commercially available proteins and biologically derived samples such as the human E2 ubiquitin conjugating enzyme, UbCH10. A good sequence tag was obtained for the human UbCH10, allowing the unambiguous identification of the protein. The instrument was built with a commercially produced front end: a focusing rf-only quadrupole (Q0), followed by a resolving quadrupole (Q1), and a LINAC quadrupole collision cell (Q2), in combination with an FTICR mass analyzer. It has utility in the analysis of samples found in substoichiometric concentrations, as ions can be isolated in the mass-resolving Q1 and accumulated in Q2 before analysis in the ICR cell. The speed and efficacy of the Q2 cooling and fragmentation was demonstrated on an LCMS-compatible time scale, and detection limits for phosphopeptides in the 10 amol/μL (pM) range were demonstrated. The instrument was designed to make several fragmentation methods available, including nozzle-skimmer fragmentation, Q2 collisionally activated dissociation (Q2 CAD), multipole storage assisted dissociation (MSAD), electron capture dissociation (ECD), infrared multiphoton induced dissociation (IRMPD), and sustained off resonance irradiation (SORI) CAD, thus

  15. Effects of inspired CO2, hyperventilation, and time on VA/Q inequality in the dog

    NASA Technical Reports Server (NTRS)

    Tsukimoto, K.; Arcos, J. P.; Schaffartzik, W.; Wagner, P. D.; West, J. B.

    1992-01-01

    In a recent study by Tsukimoto et al. (J. Appl. Physiol. 68: 2488-2493, 1990), CO2 inhalation appeared to reduce the size of the high ventilation-perfusion ratio (VA/Q) mode commonly observed in anesthetized mechanically air-ventilated dogs. In that study, large tidal volumes (VT) were used during CO2 inhalation to preserve normocapnia. To separate the influences of CO2 and high VT on the VA/Q distribution in the present study, we examined the effect of inspired CO2 on the high VA/Q mode using eight mechanically ventilated dogs (4 given CO2, 4 controls). The VA/Q distribution was measured first with normal VT and then with increased VT. In the CO2 group at high VT, data were collected before, during, and after CO2 inhalation. With normal VT, there was no difference in the size of the high VA/Q mode between groups [10.5 +/- 3.5% (SE) of ventilation in the CO2 group, 11.8 +/- 5.2% in the control group]. Unexpectedly, the size of the high VA/Q mode decreased similarly in both groups over time, independently of the inspired PCO2, at a rate similar to the fall in cardiac output over time. The reduction in the high VA/Q mode together with a simultaneous increase in alveolar dead space (estimated by the difference between inert gas dead space and Fowler dead space) suggests that poorly perfused high VA/Q areas became unperfused over time. A possible mechanism is that elevated alveolar pressure and decreased cardiac output eliminate blood flow from corner vessels in nondependent high VA/Q regions.
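    The alveolar dead-space estimate mentioned above is a simple difference: alveolar dead space ≈ inert-gas (physiological) dead space − Fowler (anatomic) dead space, so a growing difference alongside a shrinking high VA/Q mode is what suggests that poorly perfused units became unperfused. The sketch below only encodes that subtraction; the example volumes are invented for illustration and are not data from the study.

        # Alveolar dead space as the difference between inert-gas and Fowler dead space.
        # Example volumes (mL per breath) are invented for illustration only.

        def alveolar_dead_space(inert_gas_ds_ml: float, fowler_ds_ml: float) -> float:
            return inert_gas_ds_ml - fowler_ds_ml

        early = alveolar_dead_space(inert_gas_ds_ml=120.0, fowler_ds_ml=90.0)
        late  = alveolar_dead_space(inert_gas_ds_ml=150.0, fowler_ds_ml=90.0)
        print(f"alveolar dead space: {early:.0f} mL -> {late:.0f} mL per breath")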

  16. Thermo-electrochemical analysis of lithium ion batteries for space applications using Thermal Desktop

    NASA Astrophysics Data System (ADS)

    Walker, W.; Ardebili, H.

    2014-12-01

    Lithium-ion batteries (LIBs) are replacing the Nickel-Hydrogen batteries used on the International Space Station (ISS). Because LIB efficiency and survivability are greatly influenced by temperature, this study focuses on the thermo-electrochemical analysis of LIBs in space orbit. Current finite element modeling software allows advanced simulation of the thermo-electrochemical processes; however, the heat transfer simulation capabilities of these software suites cannot capture the extreme complexities of orbital-space environments like those experienced by the ISS. In this study, we have coupled existing thermo-electrochemical models representing heat generation in LIBs during discharge cycles with specialized orbital-thermal software, Thermal Desktop (TD). Our model's parameters were obtained from a previous thermo-electrochemical model of a 185 Amp-Hour (Ah) LIB with discharge rates of 1-3 C, for both forced and natural convection environments at 300 K. Our TD model successfully simulates the temperature vs. depth-of-discharge (DOD) profiles and temperature ranges for all discharge and convection variations with minimal deviation, by programming FORTRAN logic that represents each variable as a function of DOD. Additional parametric studies were considered in a second and third set of cases, whose results provide data that advance accurate thermal modeling of LIBs.
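    A minimal sketch of the kind of coupling described above: each quantity is tabulated against depth of discharge (DOD) and interpolated inside a thermal time loop, with a lumped thermal mass standing in for the orbital Thermal Desktop model. The 185 Ah capacity comes from the abstract; the DOD breakpoints, heat-generation values, thermal mass, and convective conductance below are invented placeholders, not the study's parameters.

        # Lumped-parameter illustration of driving cell heat generation as a function of
        # depth of discharge (DOD) inside a thermal time loop. Values are placeholders.
        import numpy as np

        dod_breakpoints = np.array([0.0, 0.25, 0.5, 0.75, 1.0])     # fraction of capacity used
        heat_w          = np.array([20.0, 25.0, 30.0, 45.0, 70.0])  # assumed heat generation, W

        capacity_ah  = 185.0       # cell capacity quoted in the abstract
        discharge_a  = 185.0       # 1C discharge current (assumption)
        mass_kg, cp  = 5.0, 900.0  # assumed cell mass and specific heat, J/(kg*K)
        ua_w_per_k   = 2.0         # assumed convective conductance to a 300 K sink
        temp_k, dt_s = 300.0, 10.0

        t_s, dod = 0.0, 0.0
        while dod < 1.0:
            q_gen = float(np.interp(dod, dod_breakpoints, heat_w))  # heat as a function of DOD
            temp_k += (q_gen - ua_w_per_k * (temp_k - 300.0)) * dt_s / (mass_kg * cp)
            dod += discharge_a * dt_s / 3600.0 / capacity_ah
            t_s += dt_s
        print(f"end of 1C discharge after {t_s/60:.0f} min: cell temperature ~ {temp_k:.1f} K")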

  17. Neonatal liver failure and Leigh syndrome possibly due to CoQ-responsive OXPHOS deficiency.

    PubMed

    Leshinsky-Silver, E; Levine, A; Nissenkorn, A; Barash, V; Perach, M; Buzhaker, E; Shahmurov, M; Polak-Charcon, S; Lev, D; Lerman-Sagie, T

    2003-08-01

    CoQ transfers electrons from complexes I and II of the mitochondrial respiratory chain to complex III. There are very few reports on human CoQ deficiency. The clinical presentation is usually characterized by epilepsy, muscle weakness, ataxia, cerebellar atrophy, migraine, myoglobinuria and developmental delay. We describe a patient who presented with neonatal liver and pancreatic insufficiency, tyrosinemia and hyperammonemia, and later developed sensorineural hearing loss and Leigh syndrome. Liver biopsy revealed markedly reduced complex I+III and II+III activities. Addition of CoQ to the liver homogenate restored the activities, suggesting CoQ depletion. Histological staining showed prominent bridging septal fibrosis and widening of the portal spaces, with a prominent mixed inflammatory infiltrate associated with interface hepatitis and bile duct proliferation with numerous bile plugs. Electron microscopy revealed a large number of mitochondria that were altered in shape and size, with widened and disordered intercristal spaces. This may be the first case of Leigh syndrome with liver and pancreas insufficiency, possibly caused by CoQ-responsive OXPHOS deficiency.

  18. Space simulation test for thermal control materials

    NASA Technical Reports Server (NTRS)

    Hardgrove, W. R.

    1990-01-01

    Tests were run in TRW's Combined Environment Facility to examine the degradation of thermal control materials in a simulated space environment. Thermal control materials selected for the test were those presently being used on spacecraft or predicted to be used within the next few years. The geosynchronous orbit environment was selected as the most interesting. One of the goals was to match degradation of those materials with available flight data. Another aim was to determine if degradation can adequately be determined with accelerated or short term ground tests.

  19. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on

  20. Sperm FISH analysis of a 44,X,der(Y),t(Y;15)(q12;q10)pat,rob(13;14)(q10;q10)mat complex chromosome rearrangement.

    PubMed

    Ferfouri, F; Boitrelle, F; Clement, P; Molina Gomes, D; Selva, J; Vialard, F

    2014-06-01

    Complex chromosome rearrangements (CCR) with two independent chromosome rearrangements are rare. Although CCRs lead to high rates of unbalanced gametes, data on meiotic segregation in this context are scarce. A male patient was referred to our clinic as part of a family screening programme prompted by the observation of a 44,X,der(Y),t(Y;15)(q12;q10)pat,rob(13;14)(q10;q10)mat karyotype in his brother. Karyotyping identified the same CCR. Sperm FISH (with locus-specific probes for the segments involved in the translocations and for nine chromosomes not involved in either rearrangement) was used to investigate the rearrangements' meiotic segregation products and to establish whether or not an inter-chromosomal effect was present. Sperm nuclear DNA fragmentation was also evaluated. For rob(13;14) and der(Y), the proportions of unbalanced products were, respectively, 26.4% and 60.6%. Overall, 70.3% of the meiotic segregation products were unbalanced. No evidence of an inter-chromosomal effect was found, and the sperm nuclear DNA fragmentation rate was similar to our laboratory's normal cut-off value. In view of previously published sperm FISH analyses of Robertsonian translocations (and even though the mechanism is still unknown), we hypothesise that cosegregation of der(Y) and rob(13;14) could modify rob(13;14) meiotic segregation. © 2013 Blackwell Verlag GmbH.
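    A quick consistency check of the figures above: if the two rearrangements segregated independently, the expected overall unbalanced rate would be 1 − (1 − 0.264) × (1 − 0.606) ≈ 71.0%, close to the reported 70.3%. The calculation below simply restates the abstract's percentages; it is an illustrative recomputation, not an analysis performed in the study.

        # Expected overall unbalanced rate if rob(13;14) and der(Y) segregated
        # independently, compared with the reported overall rate (abstract figures).
        p_rob_unbalanced = 0.264   # rob(13;14) unbalanced products
        p_der_unbalanced = 0.606   # der(Y) unbalanced products
        observed_overall = 0.703   # reported overall unbalanced rate

        expected_overall = 1.0 - (1.0 - p_rob_unbalanced) * (1.0 - p_der_unbalanced)
        print(f"expected under independence: {expected_overall:.1%}")
        print(f"observed:                    {observed_overall:.1%}")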