Sample records for model space generation

  1. Distributed state-space generation of discrete-state stochastic models

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Gluckman, Joshua; Nicol, David

    1995-01-01

    High-level formalisms such as stochastic Petri nets can be used to model complex systems. Analysis of logical and numerical properties of these models often requires the generation and storage of the entire underlying state space. This imposes practical limitations on the types of systems which can be modeled. Because of the vast amount of memory consumed, we investigate distributed algorithms for the generation of state space graphs. The distributed construction allows us to take advantage of the combined memory readily available on a network of workstations. The key technical problem is to find effective methods for on-the-fly partitioning, so that the state space is evenly distributed among processors. In this paper we report on the implementation of a distributed state-space generator that may be linked to a number of existing system modeling tools. We discuss partitioning strategies in the context of Petri net models, and report on performance observed on a network of workstations, as well as on a distributed memory multi-computer.
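
    A minimal sketch of the core idea described above (hash-partitioned, breadth-first state-space generation), not the authors' implementation: each state is "owned" by the worker given by a hash of the state, and newly discovered states are forwarded to their owners. The successor function and worker count are hypothetical placeholders; the paper's on-the-fly partitioning strategies are considerably more refined.

      # Illustrative sketch: hash-partitioned state-space generation,
      # simulated sequentially. `successors(state)` is a stand-in for a
      # Petri-net firing function supplied by the modeling tool.
      from collections import deque

      def owner(state, n_workers):
          """Assign each state to a worker by hashing."""
          return hash(state) % n_workers

      def generate(initial_state, successors, n_workers=4):
          visited = [set() for _ in range(n_workers)]   # one state store per worker
          queues = [deque() for _ in range(n_workers)]  # one work queue per worker
          queues[owner(initial_state, n_workers)].append(initial_state)
          while any(queues):
              for w in range(n_workers):
                  if not queues[w]:
                      continue
                  s = queues[w].popleft()
                  if s in visited[w]:
                      continue
                  visited[w].add(s)
                  for t in successors(s):
                      # "send" the successor to the worker that owns it
                      queues[owner(t, n_workers)].append(t)
          return visited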

  2. [A dynamic model of the extravehicular activity space suit].

    PubMed

    Yang, Feng; Yuan, Xiu-gan

    2002-12-01

    Objective. To establish a dynamic model of the space suit based on the particular configuration of the space suit. Method. The mass of the space suit components, moment of inertia, mobility of the joints of the space suit, as well as the suit-generated torques, were considered in this model. The expressions to calculate the moment of inertia were developed by simplifying the geometry of the space suit. A modified Preisach model was used to mathematically describe the hysteretic torque characteristics of joints in a pressurized space suit, and it was implemented numerically based on the observed suit parameters. Result. A dynamic model considering mass, moment of inertia and suit-generated torques was established. Conclusion. This dynamic model provides some elements for the dynamic simulation of the astronaut extravehicular activity.
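
    The hysteretic joint-torque behaviour mentioned above can be illustrated with a toy discrete Preisach model; the hysteron thresholds and weights below are invented, whereas the paper identifies them from measured suit parameters.

      # Toy discrete Preisach hysteresis sketch for a joint angle-torque relation.
      import numpy as np

      class Preisach:
          def __init__(self, thresholds, weights):
              # thresholds: (alpha, beta) switching pairs with alpha >= beta
              self.thresholds = thresholds
              self.weights = np.asarray(weights, dtype=float)
              self.state = np.full(len(thresholds), -1.0)  # all relays start "down"

          def torque(self, angle):
              for i, (alpha, beta) in enumerate(self.thresholds):
                  if angle >= alpha:
                      self.state[i] = 1.0
                  elif angle <= beta:
                      self.state[i] = -1.0
                  # otherwise the relay keeps its previous state (memory)
              return float(self.weights @ self.state)

      # Sweeping the angle up and then back down gives different torques at
      # the same angle, i.e. a hysteresis loop.
      model = Preisach([(10, 2), (20, 8), (30, 15)], [0.5, 0.3, 0.2])
      up = [model.torque(a) for a in range(0, 40, 5)]
      down = [model.torque(a) for a in range(35, -5, -5)]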

  3. Between the Rock and a Hard Place: The CCMC as a Transit Station Between Modelers and Forecasters

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the latter element. Specifically, we will discuss the process of transitioning research models, or information generated by research models, to Space Weather Forecasting organizations. We will analyze successes as well as obstacles to further progress, and we will suggest avenues for increased transitioning success.

  4. An evaluation of behavior inferences from Bayesian state-space models: A case study with the Pacific walrus

    USGS Publications Warehouse

    Beatty, William; Jay, Chadwick V.; Fischbach, Anthony S.

    2016-01-01

    State-space models offer researchers an objective approach to modeling complex animal location data sets, and state-space model behavior classifications are often assumed to have a link to animal behavior. In this study, we evaluated the behavioral classification accuracy of a Bayesian state-space model in Pacific walruses using Argos satellite tags with sensors to detect animal behavior in real time. We fit a two-state discrete-time continuous-space Bayesian state-space model to data from 306 Pacific walruses tagged in the Chukchi Sea. We matched predicted locations and behaviors from the state-space model (resident, transient behavior) to true animal behavior (foraging, swimming, hauled out) and evaluated classification accuracy with kappa statistics (κ) and root mean square error (RMSE). In addition, we compared biased random bridge utilization distributions generated with resident behavior locations to true foraging behavior locations to evaluate differences in space use patterns. Results indicated that the two-state model fairly classified true animal behavior (0.06 ≤ κ ≤ 0.26, 0.49 ≤ RMSE ≤ 0.59). Kernel overlap metrics indicated utilization distributions generated with resident behavior locations were generally smaller than utilization distributions generated with true foraging behavior locations. Consequently, we encourage researchers to carefully examine parameters and priors associated with behaviors in state-space models, and reconcile these parameters with the study species and its expected behaviors.
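
    The evaluation step described above can be sketched in a few lines; the behaviour labels and arrays below are invented placeholders, not walrus data.

      # Hedged sketch: compare model behavior classes against sensor-derived
      # "true" behaviors using Cohen's kappa and RMSE.
      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      true_behavior = np.array([1, 0, 0, 1, 1, 0, 1, 0])  # e.g. 1 = foraging
      predicted     = np.array([1, 1, 0, 0, 1, 0, 1, 1])  # e.g. 1 = resident
      kappa = cohen_kappa_score(true_behavior, predicted)
      rmse = float(np.sqrt(np.mean((predicted - true_behavior) ** 2)))
      print(f"kappa = {kappa:.2f}, RMSE = {rmse:.2f}")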

  5. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  6. An economic analysis of disaggregation of space assets: Application to GPS

    NASA Astrophysics Data System (ADS)

    Hastings, Daniel E.; La Tour, Paul A.

    2017-05-01

    New ideas, technologies and architectural concepts are emerging with the potential to reshape the space enterprise. One of those new architectural concepts is the idea that rather than aggregating payloads onto large very high performance buses, space architectures should be disaggregated with smaller numbers of payloads (as small as one) per bus and the space capabilities spread across a correspondingly larger number of systems. The primary rationale is increased survivability and resilience. The concept of disaggregation is examined from an acquisition cost perspective. A mixed system dynamics and trade space exploration model is developed to look at long-term trends in the space acquisition business. The model is used to examine the question of how different disaggregated GPS architectures compare in cost to the well-known current GPS architecture. A generation-over-generation examination of policy choices is made possible through the application of soft systems modeling of experience and learning effects. The assumptions that are allowed to vary are: design lives, production quantities, non-recurring engineering and time between generations. The model shows that there is always a premium in the first generation to be paid to disaggregate the GPS payloads. However, it is possible to construct survivable architectures where the premium after two generations is relatively low.

  7. Towards the Next Generation of Space Environment Prediction Capabilities.

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.

    2015-12-01

    Since its establishment more than 15 years ago, the Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) has served as an access point to an expanding collection of state-of-the-art space environment models and frameworks, as well as a hub for collaborative development of next generation space weather forecasting systems. In partnership with model developers and international research and operational communities, the CCMC integrates new data streams and models from diverse sources into end-to-end space weather impacts predictive systems, identifies weak links in data-model and model-model coupling, and leads community efforts to fill those gaps. The presentation will highlight the latest developments, progress in CCMC-led community-wide projects on testing, prototyping, and validation of models, forecasting techniques and procedures, and outline ideas on accelerating the implementation of new capabilities in space weather operations.

  8. Explorations in Space and Time: Computer-Generated Astronomy Films

    ERIC Educational Resources Information Center

    Meeks, M. L.

    1973-01-01

    Discusses the use of the computer animation technique to travel through space and time and watch models of astronomical systems in motion. Included is a list of eight computer-generated demonstration films entitled "Explorations in Space and Time."

  9. 3D Modelling of an Indoor Space Using a Rotating Stereo Frame Camera System

    NASA Astrophysics Data System (ADS)

    Kang, J.; Lee, I.

    2016-06-01

    Sophisticated indoor design and growing development in urban architecture make indoor spaces more complex, and indoor spaces are increasingly connected to public transportation such as subway and train stations. These phenomena allow outdoor activities to move into indoor spaces. Constant technological development has a significant impact on people's awareness of services such as location-aware services in indoor spaces. Thus, a low-cost system is required to create 3D models of indoor spaces for services based on those models. In this paper, we therefore introduce a rotating stereo frame camera system with two cameras and generate an indoor 3D model using it. First, we selected a test site and acquired images eight times during one day with different positions and heights of the system. Measurements were complemented by object control points obtained from a total station. As the data were obtained from the different positions and heights of the system, it was possible to make various combinations of data and choose several suitable combinations for input data. Next, we generated the 3D model of the test site using commercial software with the previously chosen input data. The last part of the process is to evaluate the accuracy of the indoor model generated from the selected input data. In summary, this paper introduces a low-cost system to acquire indoor spatial data and generate a 3D model from the images it acquires. Through these experiments, we show that the introduced system is suitable for generating indoor spatial information. The proposed low-cost system will be applied to indoor services based on the indoor spatial information.

  10. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED; in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.

  11. International Space Station Centrifuge Rotor Models: A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach

    NASA Technical Reports Server (NTRS)

    Nguyen, Louis H.; Ramakrishnan, Jayant; Granda, Jose J.

    2006-01-01

    The assembly and operation of the International Space Station (ISS) require extensive testing and engineering analysis to verify that the Space Station system of systems would work together without any adverse interactions. Since the dynamic behavior of an entire Space Station cannot be tested on earth, math models of the Space Station structures and mechanical systems have to be built and integrated in computer simulations and analysis tools to analyze and predict what will happen in space. The ISS Centrifuge Rotor (CR) is one of many mechanical systems that need to be modeled and analyzed to verify the ISS integrated system performance on-orbit. This study investigates using Bond Graph modeling techniques as quick and simplified ways to generate models of the ISS Centrifuge Rotor. This paper outlines the steps used to generate simple and more complex models of the CR using Bond Graph Computer Aided Modeling Program with Graphical Input (CAMP-G). Comparisons of the Bond Graph CR models with those derived from Euler-Lagrange equations in MATLAB and those developed using multibody dynamic simulation at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) are presented to demonstrate the usefulness of the Bond Graph modeling approach for aeronautics and space applications.

  12. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.

  13. Space Object Radiometric Modeling for Hardbody Optical Signature Database Generation

    DTIC Science & Technology

    2009-09-01

    Introduction: This presentation summarizes recent activity in monitoring spacecraft health status using passive remote optical nonimaging ... It is beneficial to the observer/analyst to understand the fundamental optical signature variability associated with these detection and ...

  14. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  15. Analysis of Direct Solar Illumination on the Backside of Space Station Solar Cells

    NASA Technical Reports Server (NTRS)

    Delleur, Ann M.; Kerslake, Thomas W.; Scheiman, David A.

    1999-01-01

    The International Space Station (ISS) is a complex spacecraft that will take several years to assemble in orbit. During many of the assembly and maintenance procedures, the space station's large solar arrays must be locked, which can significantly reduce power generation. To date, power generation analyses have not included power generation from the backside of the solar cells, in a desire to produce a conservative analysis. This paper describes the testing of ISS solar cell backside power generation, analytical modeling and analysis results on an ISS assembly mission.

  16. Next Generation NASA Initiative for Space Geodesy

    NASA Technical Reports Server (NTRS)

    Merkowitz, S. M.; Desai, S.; Gross, R. S.; Hilliard, L.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Murphy, D.; Noll, C. E.

    2012-01-01

    Space geodesy measurement requirements have become more and more stringent as our understanding of the physical processes and our modeling techniques have improved. In addition, current and future spacecraft will have ever-increasing measurement capability and will lead to increasingly sophisticated models of changes in the Earth system. Ground-based space geodesy networks with enhanced measurement capability will be essential to meeting these oncoming requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. These requirements have been articulated by the Global Geodetic Observing System (GGOS). The NASA Space Geodesy Project (SGP) is developing a prototype core site as the basis for a next generation Space Geodetic Network (SGN) that would be NASA's contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Each of the sites in the SGN would include co-located, state-of-the-art systems from all four space geodetic observing techniques (GNSS, SLR, VLBI, and DORIS). The prototype core site is being developed at NASA's Geophysical and Astronomical Observatory at Goddard Space Flight Center. The project commenced in 2011 and is scheduled for completion in late 2013. In January 2012, two multiconstellation GNSS receivers, GODS and GODN, were established at the prototype site as part of the local geodetic network. Development and testing are also underway on the next generation SLR and VLBI systems along with a modern DORIS station. An automated survey system is being developed to measure inter-technique vector ties, and network design studies are being performed to define the appropriate number and distribution of these next generation space geodetic core sites that are required to achieve the driving ITRF requirements. We present the status of this prototype next generation space geodetic core site, results from the analysis of data from the established geodetic stations, and results from the ongoing network design studies.

  17. A growing social network model in geographical space

    NASA Astrophysics Data System (ADS)

    Antonioni, Alberto; Tomassini, Marco

    2017-09-01

    In this work we propose a new model for the generation of social networks that includes their often ignored spatial aspects. The model is a growing one and links are created either taking space into account, or disregarding space and only considering the degree of target nodes. These two effects can be mixed linearly in arbitrary proportions through a parameter. We numerically show that for a given range of the combination parameter, and for given mean degree, the generated network class shares many important statistical features with those observed in actual social networks, including the spatial dependence of connections. Moreover, we show that the model provides a good qualitative fit to some measured social networks.
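
    A minimal sketch of a growth rule in the same spirit (not the authors' exact formulation): each new node attaches to existing nodes with probability given by a linear mix of a distance kernel and the targets' degrees, controlled by a mixing parameter.

      # Illustrative growing spatial network with a space/degree mixing parameter.
      import numpy as np

      def grow_network(n_nodes, m_links=2, mix=0.5, seed=0):
          rng = np.random.default_rng(seed)
          pos = [rng.random(2)]            # nodes placed in the unit square
          degree = [0]
          edges = []
          for _ in range(1, n_nodes):
              new_pos = rng.random(2)
              d = np.array([np.linalg.norm(new_pos - p) for p in pos])
              space_score = 1.0 / (d + 1e-9)                 # prefer nearby nodes
              degree_score = np.array(degree, dtype=float) + 1.0
              score = mix * space_score / space_score.sum() \
                    + (1 - mix) * degree_score / degree_score.sum()
              targets = rng.choice(len(pos), size=min(m_links, len(pos)),
                                   replace=False, p=score / score.sum())
              new_id = len(pos)
              for t in targets:
                  edges.append((new_id, int(t)))
                  degree[t] += 1
              pos.append(new_pos)
              degree.append(len(targets))
          return pos, edges

      pos, edges = grow_network(200, mix=0.7)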

  18. Space Weather Products at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Kuznetsova, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; MacNeice, P.

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of space weather forecasting tools. Owing to the pace of development in the science community, new model capabilities emerge frequently. Consequently, space weather products and tools involve not only increased validity, but often entirely new capabilities. This presentation will review the present state of space weather tools as well as point out emerging future capabilities.

  19. Photodynamic therapy: computer modeling of diffusion and reaction phenomena

    NASA Astrophysics Data System (ADS)

    Hampton, James A.; Mahama, Patricia A.; Fournier, Ronald L.; Henning, Jeffery P.

    1996-04-01

    We have developed a transient, one-dimensional mathematical model for the reaction and diffusion phenomena that occur during photodynamic therapy (PDT). This model is referred to as the PDTmodem program. The model is solved by the Crank-Nicolson finite difference technique and can be used to predict the fates of important molecular species within the intercapillary tissue undergoing PDT. The following factors govern molecular oxygen consumption and singlet oxygen generation within a tumor: (1) photosensitizer concentration; (2) fluence rate; and (3) intercapillary spacing. In an effort to maximize direct tumor cell killing, the model allows educated decisions to be made to ensure the uniform generation and exposure of singlet oxygen to tumor cells across the intercapillary space. Based on predictions made by the model, we have determined that the singlet oxygen concentration profile within the intercapillary space is controlled by the product of the drug concentration and light fluence rate. The model predicts that at high levels of this product, within seconds singlet oxygen generation is limited to a small core of cells immediately surrounding the capillary. The remainder of the tumor tissue in the intercapillary space is anoxic and protected from the generation and toxic effects of singlet oxygen. However, at lower values of this product, the PDT-induced anoxic regions are not observed. An important finding is that an optimal value of this product can be defined that maintains the singlet oxygen concentration throughout the intercapillary space at a near constant level. Direct tumor cell killing is therefore postulated to depend on the singlet oxygen exposure, defined as the product of the uniform singlet oxygen concentration and the time of exposure, and not on the total light dose.
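
    A bare-bones Crank-Nicolson step for a one-dimensional diffusion problem is sketched below to make the numerical scheme concrete; the geometry, diffusion coefficient, and boundary values are hypothetical, and the real model also includes the reaction (consumption/generation) terms.

      # Minimal 1-D Crank-Nicolson diffusion sketch (reaction terms omitted).
      import numpy as np

      def crank_nicolson(c0, D, dx, dt, steps, c_capillary):
          n = len(c0)
          r = D * dt / (2 * dx**2)
          A = (np.diag((1 + 2*r) * np.ones(n)) + np.diag(-r * np.ones(n - 1), 1)
               + np.diag(-r * np.ones(n - 1), -1))
          B = (np.diag((1 - 2*r) * np.ones(n)) + np.diag(r * np.ones(n - 1), 1)
               + np.diag(r * np.ones(n - 1), -1))
          c = c0.copy()
          for _ in range(steps):
              rhs = B @ c
              rhs[0] += 2 * r * c_capillary   # fixed concentration at the capillary wall
              # far boundary held at zero concentration (purely illustrative)
              c = np.linalg.solve(A, rhs)
          return c

      profile = crank_nicolson(np.zeros(50), D=2e-5, dx=1e-3, dt=0.01,
                               steps=1000, c_capillary=1.0)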

  20. Method of grid generation

    DOEpatents

    Barnette, Daniel W.

    2002-01-01

    The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.

  1. Space Weather Models at the CCMC And Their Capabilities

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2007-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. In this presentation, we will provide an overview of the community-provided, space weather-relevant, model suite, which resides at CCMC. We will discuss current capabilities, and analyze expected future developments of space weather related modeling.

  2. Simulation of a cascaded longitudinal space charge amplifier for coherent radiation generation

    DOE PAGES

    Halavanau, A.; Piot, P.

    2016-03-03

    Longitudinal space charge (LSC) effects are generally considered as harmful in free-electron lasers as they can seed unfavorable energy modulations that can result in density modulations with associated emittance dilution. It was pointed out, however, that such "micro-bunching instabilities" could be potentially useful to support the generation of broadband coherent radiation. Therefore there has been an increasing interest in devising accelerator beam lines capable of controlling LSC induced density modulations. In the present paper we augment these previous investigations by combining a grid-less space charge algorithm with the popular particle-tracking program elegant. This high-fidelity model of the space charge is used to benchmark conventional LSC models. We then employ the developed model to optimize the performance of a cascaded longitudinal space charge amplifier using beam parameters comparable to the ones achievable at the Fermilab Accelerator Science & Technology (FAST) facility currently under commissioning at Fermilab.

  3. Elliptic surface grid generation on minimal and parametrized surfaces

    NASA Technical Reports Server (NTRS)

    Spekreijse, S. P.; Nijhuis, G. H.; Boerstoel, J. W.

    1995-01-01

    An elliptic grid generation method is presented which generates excellent boundary conforming grids in domains in 2D physical space. The method is based on the composition of an algebraic and elliptic transformation. The composite mapping obeys the familiar Poisson grid generation system with control functions specified by the algebraic transformation. New expressions are given for the control functions. Grid orthogonality at the boundary is achieved by modification of the algebraic transformation. It is shown that grid generation on a minimal surface in 3D physical space is in fact equivalent to grid generation in a domain in 2D physical space. A second elliptic grid generation method is presented which generates excellent boundary conforming grids on smooth surfaces. It is assumed that the surfaces are parametrized and that the grid only depends on the shape of the surface and is independent of the parametrization. Concerning surface modeling, it is shown that bicubic Hermite interpolation is an excellent method to generate a smooth surface which is passing through a given discrete set of control points. In contrast to bicubic spline interpolation, there is extra freedom to model the tangent and twist vectors such that spurious oscillations are prevented.
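
    As a deliberately simplified illustration of the elliptic idea (without the control functions or boundary-orthogonality modifications described above), interior grid points can be relaxed toward the average of their neighbours, i.e. a Laplace smoothing of the grid with fixed boundaries.

      # Simplified elliptic (Laplace) grid smoothing sketch.
      import numpy as np

      def smooth_grid(x, y, iterations=200):
          """x, y: 2-D coordinate arrays whose boundary rows/columns stay fixed."""
          for _ in range(iterations):
              x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1]
                                      + x[1:-1, 2:] + x[1:-1, :-2])
              y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1]
                                      + y[1:-1, 2:] + y[1:-1, :-2])
          return x, y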

  4. A bootstrap based space-time surveillance model with an application to crime occurrences

    NASA Astrophysics Data System (ADS)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
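
    The significance test implied above can be sketched as a simple resampling comparison; the window counts below are invented, and the published model is defined on space-time windows rather than a single cell.

      # Sketch: is the current count in a cell unusually high relative to
      # bootstrap resamples of past counts for that cell?
      import numpy as np

      def hotspot_p_value(observed, past_counts, n_boot=999, seed=0):
          rng = np.random.default_rng(seed)
          boot = rng.choice(past_counts, size=n_boot, replace=True)
          return (1 + np.sum(boot >= observed)) / (n_boot + 1)

      # Example: 14 incidents this week vs. weekly counts from past weeks.
      p = hotspot_p_value(14, past_counts=np.array([3, 5, 4, 6, 2, 7, 5, 4]))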

  5. CityGML and the Streets of New York - A Proposal for Detailed Street Space Modelling

    NASA Astrophysics Data System (ADS)

    Beil, C.; Kolbe, T. H.

    2017-10-01

    Three-dimensional semantic city models are increasingly used for the analysis of large urban areas. Until now the focus has mostly been on buildings. Nonetheless many applications could also benefit from detailed models of public street space for further analysis. However, there are only few guidelines for representing roads within city models. Therefore, related standards dealing with street modelling are examined and discussed. Nearly all street representations are based on linear abstractions. However, there are many use cases that require or would benefit from the detailed geometrical and semantic representation of street space. A variety of potential applications for detailed street space models are presented. Subsequently, based on related standards as well as on user requirements, a concept for a CityGML-compliant representation of street space in multiple levels of detail is developed. In the course of this process, the CityGML Transportation model of the currently valid OGC standard CityGML2.0 is examined to discover possibilities for further developments. Moreover, a number of improvements are presented. Finally, based on open data sources, the proposed concept is implemented within a semantic 3D city model of New York City generating a detailed 3D street space model for the entire city. As a result, 11 thematic classes, such as roadbeds, sidewalks or traffic islands are generated and enriched with a large number of thematic attributes.

  6. Reduced order modeling of head related transfer functions for virtual acoustic displays

    NASA Astrophysics Data System (ADS)

    Willhite, Joel A.; Frampton, Kenneth D.; Grantham, D. Wesley

    2003-04-01

    The purpose of this work is to improve the computational efficiency in acoustic virtual applications by creating and testing reduced order models of the head related transfer functions used in localizing sound sources. State space models of varying order were generated from zero-elevation Head Related Impulse Responses (HRIRs) using Kung's Singular Value Decomposition (SVD) technique. The inputs to the models are the desired azimuths of the virtual sound sources (from minus 90 deg to plus 90 deg, in 10 deg increments) and the outputs are the left and right ear impulse responses. Trials were conducted in an anechoic chamber in which subjects were exposed to real sounds that were emitted by individual speakers across a numbered speaker array, phantom sources generated from the original HRIRs, and phantom sound sources generated with the different reduced order state space models. The error in the perceived direction of the phantom sources generated from the reduced order models was compared to errors in localization using the original HRIRs.
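
    A hedged sketch of the identification step, a Kung/ERA-style state-space realization from an impulse response via a Hankel-matrix SVD, is given below; the impulse response, block sizes, and model order are synthetic placeholders rather than measured HRIRs.

      # Kung/ERA-style reduced-order realization from impulse-response samples.
      import numpy as np

      def realize(h, order, rows=40, cols=40):
          """h: impulse response h[0], h[1], ...; returns (A, B, C, D)."""
          H0 = np.array([[h[i + j + 1] for j in range(cols)] for i in range(rows)])
          H1 = np.array([[h[i + j + 2] for j in range(cols)] for i in range(rows)])
          U, s, Vt = np.linalg.svd(H0, full_matrices=False)
          U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
          s_half = np.diag(np.sqrt(s))
          s_half_inv = np.diag(1.0 / np.sqrt(s))
          A = s_half_inv @ U.T @ H1 @ Vt.T @ s_half_inv
          B = (s_half @ Vt)[:, :1]
          C = (U @ s_half)[:1, :]
          D = np.array([[h[0]]])
          return A, B, C, D

      # Synthetic impulse response (sum of two decaying modes), exactly rank 2.
      h = [0.0] + [0.8**k + 0.5 * 0.5**k for k in range(100)]
      A, B, C, D = realize(h, order=2)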

  7. A theory of the n-i-p silicon solar cell

    NASA Technical Reports Server (NTRS)

    Goradia, C.; Weinberg, I.; Baraona, C.

    1981-01-01

    A computer model has been developed, based on an analytical theory of the high base resistivity BSF n(+)(pi)p(+) or p(+)(nu)n(+) silicon solar cell. The model makes very few assumptions and accounts for nonuniform optical generation, generation and recombination in the junction space charge region, and bandgap narrowing in the heavily doped regions. The paper presents calculated results based on this model and compares them to available experimental data. Also discussed is radiation damage in high base resistivity n(+)(pi)p(+) space solar cells.

  8. Space-based laser-driven MHD generator: Feasibility study

    NASA Technical Reports Server (NTRS)

    Choi, S. H.

    1986-01-01

    The feasibility of a laser-driven MHD generator, as a candidate receiver for a space-based laser power transmission system, was investigated. On the basis of reasonable parameters obtained in the literature, a model of the laser-driven MHD generator was developed with the assumptions of a steady, turbulent, two-dimensional flow. These assumptions were based on the continuous and steady generation of plasmas by the exposure of the continuous wave laser beam thus inducing a steady back pressure that enables the medium to flow steadily. The model considered here took the turbulent nature of plasmas into account in the two-dimensional geometry of the generator. For these conditions with the plasma parameters defining the thermal conductivity, viscosity, electrical conductivity for the plasma flow, a generator efficiency of 53.3% was calculated. If turbulent effects and nonequilibrium ionization are taken into account, the efficiency is 43.2%. The study shows that the laser-driven MHD system has potential as a laser power receiver for space applications because of its high energy conversion efficiency, high energy density and relatively simple mechanism as compared to other energy conversion cycles.

  9. The Geometry of Generations

    NASA Astrophysics Data System (ADS)

    He, Yang-Hui; Jejjala, Vishnu; Matti, Cyril; Nelson, Brent D.; Stillman, Michael

    2015-10-01

    We present an intriguing and precise interplay between algebraic geometry and the phenomenology of generations of particles. Using the electroweak sector of the MSSM as a testing ground, we compute the moduli space of vacua as an algebraic variety for multiple generations of Standard Model matter and Higgs doublets. The space is shown to have Calabi-Yau, Grassmannian, and toric signatures, which sensitively depend on the number of generations of leptons, as well as inclusion of Majorana mass terms for right-handed neutrinos. We speculate as to why three generations is special.

  10. Finite Element Modeling of a Semi-Rigid Hybrid Mirror and a Highly Actuated Membrane Mirror as Candidates for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Craig, Larry; Jacobson, Dave; Mosier, Gary; Nein, Max; Page, Timothy; Redding, Dave; Sutherlin, Steve; Wilkerson, Gary

    2000-01-01

    Advanced space telescopes, which will eventually replace the Hubble Space Telescope (HST), will have apertures of 8 - 20 m. Primary mirrors of these dimensions will have to be foldable to fit into the space launcher. By necessity these mirrors will be extremely lightweight and flexible, and the historical approaches to mirror designs, where the mirror is made as rigid as possible to maintain figure and to serve as the anchor for the entire telescope, cannot be applied any longer. New design concepts and verifications will depend entirely on analytical methods to predict optical performance. Finite element modeling of the structural and thermal behavior of such mirrors is becoming the tool for advanced space mirror designs. This paper discusses some of the preliminary tasks and study results, which are currently the basis for the design studies of the Next Generation Space Telescope.

  11. A nonuniform popularity-similarity optimization (nPSO) model to efficiently generate realistic complex networks with communities

    NASA Astrophysics Data System (ADS)

    Muscoloni, Alessandro; Vittorio Cannistraci, Carlo

    2018-05-01

    The investigation of the hidden metric space behind complex network topologies is a fervid topic in current network science, and the hyperbolic space is one of the most studied, because it seems associated with the structural organization of many real complex systems. The popularity-similarity-optimization (PSO) model simulates how random geometric graphs grow in the hyperbolic space, generating realistic networks with clustering, small-worldness, scale-freeness and rich-clubness. However, it fails to reproduce an important feature of real complex networks, which is the community organization. The geometrical-preferential-attachment (GPA) model was recently developed in order to confer on the PSO also a soft community structure, which is obtained by forcing different angular regions of the hyperbolic disk to have a variable level of attractiveness. However, the number and size of the communities cannot be explicitly controlled in the GPA, which is a clear limitation for real applications. Here, we introduce the nonuniform PSO (nPSO) model. Differently from GPA, the nPSO generates synthetic networks in the hyperbolic space where heterogeneous angular node attractiveness is forced by sampling the angular coordinates from a tailored nonuniform probability distribution (for instance a mixture of Gaussians). The nPSO differs from GPA in three other aspects: it allows one to explicitly fix the number and size of communities; it allows one to tune their mixing property by means of the network temperature; and it efficiently generates networks with high clustering. Several tests on the detectability of the community structure in nPSO synthetic networks and wide investigations on their structural properties confirm that the nPSO is a valid and efficient model to generate realistic complex networks with communities.
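
    The distinguishing ingredient of the nPSO, sampling angular coordinates from a nonuniform mixture of Gaussians so that communities emerge at chosen angular positions, can be sketched as follows; the community count and spread are arbitrary example values.

      # Sketch: nonuniform angular sampling (one Gaussian component per community).
      import numpy as np

      def sample_angles(n_nodes, n_communities, spread=0.2, seed=0):
          rng = np.random.default_rng(seed)
          centers = np.linspace(0, 2 * np.pi, n_communities, endpoint=False)
          community = rng.integers(0, n_communities, size=n_nodes)
          theta = rng.normal(loc=centers[community], scale=spread) % (2 * np.pi)
          return theta, community

      theta, community = sample_angles(n_nodes=500, n_communities=8)
      # A full nPSO generator would combine these angles with radial
      # (popularity) coordinates and hyperbolic-distance attachment.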

  12. Algebraic Structure of tt* Equations for Calabi-Yau Sigma Models

    NASA Astrophysics Data System (ADS)

    Alim, Murad

    2017-08-01

    The tt* equations define a flat connection on the moduli spaces of 2d, N = 2 quantum field theories. For conformal theories with c = 3d, which can be realized as nonlinear sigma models into Calabi-Yau d-folds, this flat connection is equivalent to special geometry for threefolds and to its analogs in other dimensions. We show that the non-holomorphic content of the tt* equations, restricted to the conformal directions, in the cases d = 1, 2, 3 is captured in terms of finitely many generators of special functions, which close under derivatives. The generators are understood as coordinates on a larger moduli space. This space parameterizes a freedom in choosing representatives of the chiral ring while preserving a constant topological metric. Geometrically, the freedom corresponds to a choice of forms on the target space respecting the Hodge filtration and having a constant pairing. Linear combinations of vector fields on that space are identified with the generators of a Lie algebra. This Lie algebra replaces the non-holomorphic derivatives of tt* and provides these with a finer and algebraic meaning. For sigma models into lattice polarized K3 manifolds, the differential ring of special functions on the moduli space is constructed, extending known structures for d = 1 and 3. The generators of the differential rings of special functions are given by quasi-modular forms for d = 1 and their generalizations in d = 2, 3. Some explicit examples are worked out, including the case of the mirror of the quartic in P^3, where due to further algebraic constraints, the differential ring coincides with quasi-modular forms.

  13. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
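
    One of the simpler "linear model forms" alluded to above can be illustrated with a dynamic-mode-decomposition style fit of a linear operator to snapshots of (possibly lifted) observables; the lifting map and data below are placeholders, not the paper's construction.

      # Sketch: fit a linear operator K so that X[:, 1:] is approximately K @ X[:, :-1].
      import numpy as np

      def fit_linear_operator(X):
          return X[:, 1:] @ np.linalg.pinv(X[:, :-1])

      t = np.linspace(0, 10, 200)
      x = np.exp(-0.1 * t) * np.cos(2 * t)      # toy time series
      X = np.vstack([x, x**2])                  # hypothetical observable map
      K = fit_linear_operator(X)
      eigvals = np.linalg.eigvals(K)            # Koopman-style spectral summary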

  14. Realistic facial animation generation based on facial expression mapping

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Garrod, Oliver; Jack, Rachael; Schyns, Philippe

    2014-01-01

    Facial expressions reflect a character's internal emotional states or responses to social communication. Though much effort has been devoted to generating realistic facial expressions, it remains a challenging topic due to human beings' sensitivity to subtle facial movements. In this paper, we present a method for facial animation generation which reflects true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static AU peak frames, based on FACS, to the conformed target face. Dynamic parameters derived using a psychophysics method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.

  15. DOUBLE SHELL TANK (DST) HYDROXIDE DEPLETION MODEL FOR CARBON DIOXIDE ABSORPTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OGDEN DM; KIRCH NW

    2007-10-31

    This document generates a supernatant hydroxide ion depletion model based on mechanistic principles. The carbon dioxide absorption mechanistic model is developed in this report. The report also benchmarks the model against historical tank supernatant hydroxide data and vapor space carbon dioxide data. A comparison of the newly generated mechanistic model with previously applied empirical hydroxide depletion equations is also performed.

  16. Probabilistic 3D data fusion for multiresolution surface generation

    NASA Technical Reports Server (NTRS)

    Manduchi, R.; Johnson, A. E.

    2002-01-01

    In this paper we present an algorithm for adaptive resolution integration of 3D data collected from multiple distributed sensors. The input to the algorithm is a set of 3D surface points and associated sensor models. Using a probabilistic rule, a surface probability function is generated that represents the probability that a particular volume of space contains the surface. The surface probability function is represented using an octree data structure; regions of space with samples of large covariance are stored at a coarser level than regions of space containing samples with smaller covariance. The algorithm outputs an adaptive resolution surface generated by connecting points that lie on the ridge of surface probability with triangles scaled to match the local discretization of space given by the algorithm. We present results from 3D data generated by scanning lidar and structure from motion.

  17. An improved empirical model for diversity gain on Earth-space propagation paths

    NASA Technical Reports Server (NTRS)

    Hodge, D. B.

    1981-01-01

    An empirical model was generated to estimate diversity gain on Earth-space propagation paths as a function of Earth terminal separation distance, link frequency, elevation angle, and angle between the baseline and the path azimuth. The resulting model reproduces the entire experimental data set with an RMS error of 0.73 dB.

  18. Ground Vibration Generated by a Load Moving Along a Railway Track

    NASA Astrophysics Data System (ADS)

    SHENG, X.; JONES, C. J. C.; PETYT, M.

    1999-11-01

    The propagation of vibration generated by a harmonic or a constant load moving along a layered beam resting on the layered half-space is investigated theoretically in this paper. The solution to this problem can be used to study the ground vibration generated by the motion of a train axle load on a railway track. In this application, the ground is modelled as a number of parallel viscoelastic layers overlying an elastic half-space or a rigid foundation. The track, including the rails, rail pad, sleepers and ballast, is modelled as an infinite, layered beam structure. The modal nature of propagation in the ground for a chosen set of ground parameters is discussed and the results of the model are presented showing the characteristics of the vibration generated by a constant load and an oscillatory load at speeds below, near to, and above the lowest ground wave speed.

  19. A probabilistic and continuous model of protein conformational space for template-free modeling.

    PubMed

    Zhao, Feng; Peng, Jian; Debartolo, Joe; Freed, Karl F; Sosnick, Tobin R; Xu, Jinbo

    2010-06-01

    One of the major challenges with protein template-free modeling is an efficient sampling algorithm that can explore a huge conformation space quickly. The popular fragment assembly method constructs a conformation by stringing together short fragments extracted from the Protein Data Bank (PDB). The discrete nature of this method may limit generated conformations to a subspace in which the native fold does not belong. Another worry is that a protein with a truly new fold may contain some fragments not in the PDB. This article presents a probabilistic model of protein conformational space to overcome the above two limitations. This probabilistic model employs directional statistics to model the distribution of backbone angles and second-order Conditional Random Fields (CRFs) to describe the sequence-angle relationship. Using this probabilistic model, we can sample protein conformations in a continuous space, as opposed to the widely used fragment assembly and lattice model methods that work in a discrete space. We show that when coupled with a simple energy function, this probabilistic method compares favorably with the fragment assembly method in the blind CASP8 evaluation, especially on alpha or small beta proteins. To our knowledge, this is the first probabilistic method that can search conformations in a continuous space and achieve favorable performance. Our method also generated three-dimensional (3D) models better than template-based methods for a couple of CASP8 hard targets. The method described in this article can also be applied to protein loop modeling, model refinement, and even RNA tertiary structure prediction.
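
    The directional-statistics ingredient can be illustrated by sampling backbone torsion angles from von Mises distributions; in the actual method the distribution parameters are predicted per residue by the CRF, whereas the means and concentration below are invented.

      # Toy sampling of (phi, psi) backbone angles from von Mises distributions.
      import numpy as np

      rng = np.random.default_rng(0)
      phi_mu, psi_mu = np.deg2rad(-63.0), np.deg2rad(-43.0)  # helix-like region
      concentration = 8.0
      n_residues = 20
      phi = rng.vonmises(phi_mu, concentration, size=n_residues)
      psi = rng.vonmises(psi_mu, concentration, size=n_residues)
      # A full sampler would convert the angles to 3-D coordinates and score
      # each conformation with an energy function.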

  20. Development of a preprototype trace contaminant control system. [for space stations

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The steady state contaminant load model based on shuttle equipment and material test programs, and on the current space station studies was revised. An emergency upset contaminant load model based on anticipated emergency upsets that could occur in an operational space station was defined. Control methods for the contaminants generated by the emergency upsets were established by test. Preliminary designs of both steady state and emergency contaminant control systems for the space station application are presented.

  1. Visitors Center Exhibits

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A child enjoys building his own LEGO model at a play table which was included in the exhibit 'Travel in Space' World Show. The exhibit consisted of 21 displays designed to teach children about flight and space travel from the Wright brothers to future generations of space vehicles.

  2. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is the standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is the development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.

  3. COI Structural Analysis Presentation

    NASA Technical Reports Server (NTRS)

    Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2001-01-01

    This report discusses the structural analysis of the Next Generation Space Telescope Mirror System Demonstrator (NMSD) developed by Composite Optics Incorporated (COI) in support of the Next Generation Space Telescope (NGST) project. The mirror was submitted to Marshall Space Flight Center (MSFC) for cryogenic testing and evaluation. Once at MSFC, the mirror was lowered to approximately 40 K and the optical surface distortions were measured. Alongside this experiment, an analytical model was developed and used to compare to the test results. A NASTRAN finite element model was provided by COI and a thermal model was developed from it. Using the thermal model, steady state nodal temperatures were calculated based on the predicted environment of the large cryogenic test chamber at MSFC. This temperature distribution was applied in the structural analysis to solve for the deflections of the optical surface. Finally, these deflections were submitted for optical analysis and comparison to the interferometer test data.

  4. Cross-talk between Rho and Rac GTPases drives deterministic exploration of cellular shape space and morphological heterogeneity.

    PubMed

    Sailem, Heba; Bousgouni, Vicky; Cooper, Sam; Bakal, Chris

    2014-01-22

    One goal of cell biology is to understand how cells adopt different shapes in response to varying environmental and cellular conditions. Achieving a comprehensive understanding of the relationship between cell shape and environment requires a systems-level understanding of the signalling networks that respond to external cues and regulate the cytoskeleton. Classical biochemical and genetic approaches have identified thousands of individual components that contribute to cell shape, but it remains difficult to predict how cell shape is generated by the activity of these components using bottom-up approaches because of the complex nature of their interactions in space and time. Here, we describe the regulation of cellular shape by signalling systems using a top-down approach. We first exploit the shape diversity generated by systematic RNAi screening and comprehensively define the shape space a migratory cell explores. We suggest a simple Boolean model involving the activation of Rac and Rho GTPases in two compartments to explain the basis for all cell shapes in the dataset. Critically, we also generate a probabilistic graphical model to show how cells explore this space in a deterministic, rather than a stochastic, fashion. We validate the predictions made by our model using live-cell imaging. Our work explains how cross-talk between Rho and Rac can generate different cell shapes, and thus morphological heterogeneity, in genetically identical populations.
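
    A toy enumeration of a two-compartment Boolean model in this spirit is sketched below; the mapping from Rac/Rho activity states to shape labels is invented for illustration and is not the published model.

      # Enumerate Rac/Rho on-off states in "front" and "back" compartments.
      from itertools import product

      def shape_label(rac_front, rho_front, rac_back, rho_back):
          if rac_front and not rho_back:
              return "spread / protrusive"
          if rho_back and not rac_front:
              return "contracted / rounded"
          if rac_front and rho_back:
              return "elongated / polarized"
          return "intermediate"

      for state in product([0, 1], repeat=4):
          print(state, "->", shape_label(*state))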

  5. Space market model development project

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1987-01-01

    The objectives of the research program, Space Market Model Development Project, (Phase 1) were: (1) to study the need for business information in the commercial development of space; and (2) to propose a design for an information system to meet the identified needs. Three simultaneous research strategies were used in proceeding toward this goal: (1) to describe the space business information which currently exists; (2) to survey government and business representatives on the information they would like to have; and (3) to investigate the feasibility of generating new economical information about the space industry.

  6. Individual-based models for adaptive diversification in high-dimensional phenotype spaces.

    PubMed

    Ispolatov, Iaroslav; Madhok, Vaibhav; Doebeli, Michael

    2016-02-07

    Most theories of evolutionary diversification are based on equilibrium assumptions: they are either based on optimality arguments involving static fitness landscapes, or they assume that populations first evolve to an equilibrium state before diversification occurs, as exemplified by the concept of evolutionary branching points in adaptive dynamics theory. Recent results indicate that adaptive dynamics may often not converge to equilibrium points and instead generate complicated trajectories if evolution takes place in high-dimensional phenotype spaces. Even though some analytical results on diversification in complex phenotype spaces are available, to study this problem in general we need to reconstruct individual-based models from the adaptive dynamics generating the non-equilibrium dynamics. Here we first provide a method to construct individual-based models such that they faithfully reproduce the given adaptive dynamics attractor without diversification. We then show that a propensity to diversify can be introduced by adding Gaussian competition terms that generate frequency dependence while still preserving the same adaptive dynamics. For sufficiently strong competition, the disruptive selection generated by frequency-dependence overcomes the directional evolution along the selection gradient and leads to diversification in phenotypic directions that are orthogonal to the selection gradient. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    NASA Technical Reports Server (NTRS)

    Fabanich, William A., Jr.

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These so-called "mark-ups" control how finite element (FE) meshes are to be generated through the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine the objects each time as one would if using TDMesher. The use of SpaceClaim/TD Direct helps simplify the process for importing existing geometries and in the creation of high fidelity FE meshes to represent complex parts. It also saves time and effort in the subsequent analysis.

  8. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    NASA Technical Reports Server (NTRS)

    Fabanich, William

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the mark-up of that geometry. These so-called mark-ups control how finite element (FE) meshes were generated and allowed the tagging of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. Domain-tags were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine these objects each time as one would if using TD Mesher. The use of SpaceClaim/TD Direct has helped simplify the process for importing existing geometries and in the creation of high fidelity FE meshes to represent complex parts. It has also saved time and effort in the subsequent analysis.

  9. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  10. Numerical computation of complex multi-body Navier-Stokes flows with applications for the integrated Space Shuttle launch vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1993-01-01

    An enhanced grid system for the Space Shuttle Orbiter was built by integrating CAD definitions from several sources and then generating the surface and volume grids. The new grid system contains geometric components not modeled previously, plus significant enhancements to geometry that had been modeled in the old grid system. The new orbiter grids were then integrated with new grids for the rest of the launch vehicle. Enhancements were made to the hyperbolic grid generator HYPGEN, and new tools were developed for grid projection, manipulation, and modification; Cartesian box grid and far-field grid generation; and post-processing of flow solver data.

  11. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.

  12. Minimizing Actuator-Induced Residual Error in Active Space Telescope Primary Mirrors

    DTIC Science & Technology

    2010-09-01

    ... actuator geometry, and rib-to-facesheet intersection geometry are exploited to achieve improved performance in silicon carbide (SiC) mirrors ... A parametric finite element model is used to explore the trade space ... (MOST) finite element model. The move to lightweight actively-controlled silicon carbide (SiC) mirrors is traced back to previous generations of space ...

  13. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

    Pearlman, M. R.; Frey, H. V.; Gross, R. S.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry J. F.; Merkowitz, S. M.; Noll, C. E.; Pavilis, E. C.; hide

    2012-01-01

    Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1 mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  14. NASA's Next Generation Space Geodesy Program

    NASA Technical Reports Server (NTRS)

    Merkowitz, S. M.; Desai, S. D.; Gross, R. S.; Hillard, L. M.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Murphy, D.; Noll, C. E.; hide

    2012-01-01

    Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm, and 0.1mm/yr stability, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern survey system to measure inter-technique vectors for co-location; and (5) Develop an Implementation Plan to build, deploy and operate a next-generation integrated NASA SGN that will serve as NASA's contribution to the international global geodetic network. An envisioned Phase 2 (which is not currently funded) would include the replication of up to ten such stations to be deployed either as integrated units or as a complement to already in-place components provided by other organizations. This talk will give an update on the activities underway and the plans for completion.

  15. State-space reduction and equivalence class sampling for a molecular self-assembly model.

    PubMed

    Packwood, Daniel M; Han, Patrick; Hitosugi, Taro

    2016-07-01

    Direct simulation of a model with a large state space will generate enormous volumes of data, much of which is not relevant to the questions under study. In this paper, we consider a molecular self-assembly model as a typical example of a large state-space model, and present a method for selectively retrieving 'target information' from this model. This method partitions the state space into equivalence classes, as identified by an appropriate equivalence relation. The set of equivalence classes H, which serves as a reduced state space, contains none of the superfluous information of the original model. After construction and characterization of a Markov chain with state space H, the target information is efficiently retrieved via Markov chain Monte Carlo sampling. This approach represents a new breed of simulation techniques which are highly optimized for studying molecular self-assembly and, moreover, serves as a valuable guideline for analysis of other large state-space models.
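
    A schematic sketch of the general pattern (a made-up lattice-adsorption surrogate, not the authors' self-assembly model): map microstates to equivalence classes, weight each class by its size and energy, and run Metropolis sampling directly on the reduced state space H.

        import math
        import random

        # Toy surrogate: N adsorption sites, each empty (0) or occupied (1).
        # Equivalence class of a configuration = number of occupied sites, so the
        # reduced state space is H = {0, 1, ..., N}.
        N, EPS, KT = 30, -1.0, 0.5            # sites, binding energy per site, temperature

        def class_energy(k):                  # energy depends only on the class label k
            return EPS * k

        def class_weight(k):                  # unnormalized stationary probability on H
            return math.comb(N, k) * math.exp(-class_energy(k) / KT)   # size * Boltzmann factor

        def sample_classes(steps=200_000, start=0):
            """Metropolis sampling on the reduced state space H."""
            k, counts = start, [0] * (N + 1)
            for _ in range(steps):
                proposal = k + random.choice((-1, 1))
                if 0 <= proposal <= N and random.random() < min(
                        1.0, class_weight(proposal) / class_weight(k)):
                    k = proposal
                counts[k] += 1
            return [c / steps for c in counts]

        coverage_distribution = sample_classes()   # target information: occupancy statistics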

  16. Forward modeling of gravity data using geostatistically generated subsurface density variations

    USGS Publications Warehouse

    Phelps, Geoffrey

    2016-01-01

    Using geostatistical models of density variations in the subsurface, constrained by geologic data, forward models of gravity anomalies can be generated by discretizing the subsurface and calculating the cumulative effect of each cell (pixel). The results of such stochastically generated forward gravity anomalies can be compared with the observed gravity anomalies to find density models that match the observed data. These models have an advantage over forward gravity anomalies generated using polygonal bodies of homogeneous density because generating numerous realizations explores a larger region of the solution space. The stochastic modeling can be thought of as dividing the forward model into two components: that due to the shape of each geologic unit and that due to the heterogeneous distribution of density within each geologic unit. The modeling demonstrates that the internally heterogeneous distribution of density within each geologic unit can contribute significantly to the resulting calculated forward gravity anomaly. Furthermore, the stochastic models match observed statistical properties of geologic units, the solution space is more broadly explored by producing a suite of successful models, and the likelihood of a particular conceptual geologic model can be compared. The Vaca Fault near Travis Air Force Base, California, can be successfully modeled as a normal or strike-slip fault, with the normal fault model being slightly more probable. It can also be modeled as a reverse fault, although this structural geologic configuration is highly unlikely given the realizations we explored.
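
    A condensed sketch of the forward step only (each cell treated as an equivalent point mass, with uncorrelated density perturbations standing in for a geostatistical simulation; grid sizes and densities are illustrative, not the study's values):

        import numpy as np

        G = 6.674e-11                                     # m^3 kg^-1 s^-2

        def forward_gravity(cell_centers, cell_volumes, densities, stations):
            """Downward-positive vertical attraction (m/s^2) at each station from a
            discretized density model, treating every cell as a point mass."""
            gz = np.zeros(len(stations))
            for center, volume, rho in zip(cell_centers, cell_volumes, densities):
                d = stations - center                     # station-minus-cell offsets
                r = np.linalg.norm(d, axis=1)
                gz += G * rho * volume * d[:, 2] / r**3   # z-offset is positive for cells below
            return gz

        # One stochastic realization of the density field, to be compared with observations.
        rng = np.random.default_rng(1)
        nx, ny, nz, h = 20, 20, 10, 100.0                 # cells and cell size in metres
        xs, ys, zs = (np.arange(n) * h + h / 2 for n in (nx, ny, nz))
        centers = np.array([(x, y, -z) for x in xs for y in ys for z in zs])
        volumes = np.full(len(centers), h**3)
        densities = 2670.0 + rng.normal(0.0, 50.0, len(centers))      # kg/m^3
        stations = np.array([(x, y, 1.0) for x in xs for y in ys])    # observation points
        anomaly_mgal = forward_gravity(centers, volumes, densities, stations) * 1e5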

  17. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    NASA Technical Reports Server (NTRS)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  18. Fault diagnosis based on continuous simulation models

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  19. Fuzzy logic application for modeling man-in-the-loop space shuttle proximity operations. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Brown, Robert B.

    1994-01-01

    A software pilot model for Space Shuttle proximity operations is developed, utilizing fuzzy logic. The model is designed to emulate a human pilot during the terminal phase of a Space Shuttle approach to the Space Station. The model uses the same sensory information available to a human pilot and is based upon existing piloting rules and techniques determined from analysis of human pilot performance. Such a model is needed to generate numerous rendezvous simulations to various Space Station assembly stages for analysis of current NASA procedures and plume impingement loads on the Space Station. The advantages of a fuzzy logic pilot model are demonstrated by comparing its performance with NASA's man-in-the-loop simulations and with a similar model based upon traditional Boolean logic. The fuzzy model is shown to respond well from a number of initial conditions, with results typical of an average human. In addition, the ability to model different individual piloting techniques and new piloting rules is demonstrated.
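
    A toy sketch of the fuzzy-logic pattern such a pilot model relies on (the membership functions and the two-rule base are invented for illustration and are not the thesis rule set): fuzzify a sensed closing rate, fire the rules, and defuzzify to a braking command.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b on support [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def pilot_braking_command(closing_rate_mps):
            """Map a sensed closing rate (m/s) to a braking-jet command in [0, 1]."""
            # Fuzzification of the input.
            slow = tri(closing_rate_mps, -0.05, 0.0, 0.10)
            fast = tri(closing_rate_mps, 0.05, 0.30, 0.60)
            # Rule base: IF closing rate is slow THEN braking is light (0.1);
            #            IF closing rate is fast THEN braking is heavy (0.9).
            rules = [(slow, 0.1), (fast, 0.9)]
            num = sum(w * out for w, out in rules)
            den = sum(w for w, _ in rules)
            return num / den if den > 0 else 0.0   # weighted-average defuzzification

        print(pilot_braking_command(0.02))   # mostly "slow" -> light braking
        print(pilot_braking_command(0.25))   # mostly "fast" -> heavy braking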

  20. Model-driven development of covariances for spatiotemporal environmental health assessment.

    PubMed

    Kolovos, Alexander; Angulo, José Miguel; Modis, Konstantinos; Papantonopoulos, George; Wang, Jin-Feng; Christakos, George

    2013-01-01

    Known conceptual and technical limitations of mainstream environmental health data analysis have directed research to new avenues. The goal is to deal more efficiently with the inherent uncertainty and composite space-time heterogeneity of key attributes, account for multi-sourced knowledge bases (health models, survey data, empirical relationships etc.), and generate more accurate predictions across space-time. Based on a versatile, knowledge synthesis methodological framework, we introduce new space-time covariance functions built by integrating epidemic propagation models and we apply them in the analysis of existing flu datasets. Within the knowledge synthesis framework, the Bayesian maximum entropy theory is our method of choice for the spatiotemporal prediction of the ratio of new infectives (RNI) for a case study of flu in France. The space-time analysis is based on observations during a period of 15 weeks in 1998-1999. We present general features of the proposed covariance functions, and use these functions to explore the composite space-time RNI dependency. We then implement the findings to generate sufficiently detailed and informative maps of the RNI patterns across space and time. The predicted distributions of RNI suggest substantive relationships in accordance with the typical physiographic and climatologic features of the country.

  1. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window.

    PubMed

    Onorante, Luca; Raftery, Adrian E

    2016-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.
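
    A compact sketch of DMA with a dynamic Occam's window on synthetic data (a simplified per-model recursive least-squares update and invented tuning constants; not the authors' estimator): propagate model probabilities with a forgetting factor, keep only models whose posterior probability stays within a fraction of the best model's, and average forecasts over that surviving subset.

        import itertools
        import numpy as np

        rng = np.random.default_rng(2)
        T, K, noise_sd = 300, 6, 0.3
        X = rng.normal(size=(T, K))
        y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=noise_sd, size=T)

        # Candidate models = small subsets of the K available predictors.
        models = [m for r in (1, 2) for m in itertools.combinations(range(K), r)]
        alpha, cutoff = 0.99, 0.01               # forgetting factor, Occam's-window cutoff

        state = {m: (np.zeros(len(m)), 10.0 * np.eye(len(m))) for m in models}  # per-model RLS
        prob = {m: 1.0 / len(models) for m in models}
        active = set(models)

        for t in range(T):
            prior = {m: prob[m] ** alpha for m in active}        # "dynamic" forgetting step
            prior_sum = sum(prior.values())
            preds, liks = {}, {}
            for m in active:
                beta, P = state[m]
                x = X[t, list(m)]
                mean, var = x @ beta, x @ P @ x + noise_sd**2
                preds[m] = mean
                liks[m] = np.exp(-0.5 * (y[t] - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)
                gain = P @ x / var                               # Kalman/RLS coefficient update
                state[m] = (beta + gain * (y[t] - mean), P - np.outer(gain, x @ P))
            post = {m: prior[m] / prior_sum * liks[m] for m in active}
            total = sum(post.values())
            prob = {m: p / total for m, p in post.items()}
            # Dynamic Occam's window: keep only models close to the current best one.
            best = max(prob.values())
            active = {m for m in active if prob[m] >= cutoff * best}
            weight = sum(prob[m] for m in active)
            forecast = sum(prob[m] * preds[m] for m in active) / weight
        # (The full method also lets models outside the window re-enter later; omitted here.)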

  2. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam’s Window*

    PubMed Central

    Onorante, Luca; Raftery, Adrian E.

    2015-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam’s window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods. PMID:26917859

  3. Rocket Fuel R and D at AFRL: Recent Activities and Future Direction

    DTIC Science & Technology

    2017-04-12

    Briefing slide excerpts (Clearance Number 17163): rocket cycles and environments (SpaceX Merlin 1D, 190 klbf; Russian RD-180, 860 klbf; gas-generator cycle; oxygen-rich staged combustion); affordability and reusability; modeling and simulation as a key to development that requires accurate models ("CFD simulations … shorten the test-fail-fix loop", per SpaceX).

  4. Model verification of large structural systems. [space shuttle model response

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1978-01-01

    A computer program for the application of parameter identification on the structural dynamic models of space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model and a linear estimator is used in an iterative fashion on highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
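
    A bare-bones sketch of a single linearized Bayesian correction of scaling parameters of the kind described (the sensitivity matrix, prior covariance, and noise covariance are assumed given; this is not the program itself), iterated because the frequency-parameter relationship is nonlinear:

        import numpy as np

        def bayesian_update(theta_prior, P_prior, freq_meas, freq_model, S, R):
            """One linearized Bayesian (Kalman-style) correction of mass/stiffness
            scaling parameters. S is the sensitivity matrix d(frequencies)/d(parameters)
            evaluated at theta_prior; R is the measurement-noise covariance."""
            residual = freq_meas - freq_model
            K = P_prior @ S.T @ np.linalg.inv(S @ P_prior @ S.T + R)
            theta_post = theta_prior + K @ residual
            P_post = (np.eye(len(theta_prior)) - K @ S) @ P_prior
            return theta_post, P_post

        # Iterate: re-run the finite element model and re-linearize at each new estimate.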

  5. Towards a New Generation of Agricultural System Data, Models and Knowledge Products: Design and Improvement

    NASA Technical Reports Server (NTRS)

    Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia

    2016-01-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  6. Towards a new generation of agricultural system data, models and knowledge products: Design and improvement.

    PubMed

    Antle, John M; Basso, Bruno; Conant, Richard T; Godfray, H Charles J; Jones, James W; Herrero, Mario; Howitt, Richard E; Keating, Brian A; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R

    2017-07-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  7. Using Bond Graphs for Articulated, Flexible Multi-bodies, Sensors, Actuators, and Controllers with Application to the International Space Station

    NASA Technical Reports Server (NTRS)

    Montgomery, Raymond C.; Granda, Jose J.

    2003-01-01

    Conceptually, modeling of flexible, multi-body systems involves a formulation as a set of time-dependent partial differential equations. However, for practical, engineering purposes, this modeling is usually done using the method of Finite Elements, which approximates the set of partial differential equations, thus generalizing the approach to all continuous media. This research investigates the links between the Bond Graph method and the classical methods used to develop system models and advocates the Bond Graph Methodology and current bond graph tools as alternate approaches that will lead to a quick and precise understanding of a flexible multi-body system under automatic control. For long endurance, complex spacecraft, because of articulation and mission evolution the model of the physical system may change frequently. So a method of automatic generation and regeneration of system models that does not lead to implicit equations, as does the Lagrange equation approach, is desirable. The bond graph method has been shown to be amenable to automatic generation of equations with appropriate consideration of causality. Indeed human-interactive software now exists that automatically generates both symbolic and numeric system models and evaluates causality as the user develops the model, e.g. the CAMP-G software package. In this paper the CAMP-G package is used to generate a bond graph model of the International Space Station (ISS) at an early stage in its assembly, Zvezda. The ISS is an ideal example because it is a collection of bodies that are articulated, many of which are highly flexible. Also many reaction jets are used to control translation and attitude, and many electric motors are used to articulate appendages, which consist of photovoltaic arrays and composite assemblies. The Zvezda bond graph model is compared to an existing model, which was generated by the NASA Johnson Space Center during the Verification and Analysis Cycle of Zvezda.

  8. Countdown to the Future

    NASA Technical Reports Server (NTRS)

    Cheng-Campbell, Meg; Scott, Ryan T.; Torres, Samantha; Murray, Matthew; Moyer, Eric

    2017-01-01

    At the NASA Ames Research Center in California, the next generation of space biologists are working to understand the effects of long duration space flight on model organisms, and are developing ways to protect the health of future astronauts.

  9. Final Report from The University of Texas at Austin for DEGAS: Dynamic Global Address Space programming environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erez, Mattan; Yelick, Katherine; Sarkar, Vivek

    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges: Programmability: Rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. Scalability: Hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). Performance Portability: Just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). Energy Efficiency: Communication-optimal code generation to optimize energy efficiency by reducing data movement. Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms. Interoperability: Runtime and language interoperability with MPI and OpenMP to encourage broad adoption.

  10. Future Market Share of Space Solar Electric Power Under Open Competition

    NASA Astrophysics Data System (ADS)

    Smith, S. J.; Mahasenan, N.; Clarke, J. F.; Edmonds, J. A.

    2002-01-01

    This paper assesses the value of Space Solar Power deployed under market competition with a full suite of alternative energy technologies over the 21st century. Our approach is to analyze the future energy system under a number of different scenarios that span a wide range of possible future demographic, socio-economic, and technological developments. Scenarios both with, and without, carbon dioxide concentration stabilization policies are considered. We use the comprehensive set of scenarios created for the Intergovernmental Panel on Climate Change Special Report on Emissions Scenarios (Nakicenovic and Swart 2000). The focus of our analysis will be the cost of electric generation. Cost is particularly important when considering electric generation since the type of generation is, from a practical point of view, largely irrelevant to the end-user. This means that different electricity generation technologies must compete on the basis of price. It is important to note, however, that even a technology that is more expensive than average can contribute to the overall generation mix due to geographical and economic heterogeneity (Clarke and Edmonds 1993). This type of competition is a central assumption of the modeling approach used here. Our analysis suggests that, under conditions of full competition of all available technologies, Space Solar Power at 7 cents per kW-hr could comprise 5-10% of global electric generation by the end of the century, with a global total generation of 10,000 TW-hr. The generation share of Space Solar Power is limited due to competition with lower-cost nuclear, biomass, and terrestrial solar PV and wind. The imposition of a carbon constraint does not significantly increase the total amount of power generated by Space Solar Power in cases where a full range of advanced electric generation technologies are also available. Potential constraints on the availability of these other electric generation options can increase the amount of electricity generated by Space Solar Power. In agreement with previous work on this subject, we note that launch costs are a significant impediment for the widespread implementation of Space Solar Power. KEY WORDS: space satellite power, advanced electric generation, electricity price, climate change
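
    A minimal sketch of cost-based market-share competition of the logit type alluded to above (technology names and costs are purely illustrative, and the exponent is a made-up heterogeneity parameter rather than a calibrated value):

        import numpy as np

        def logit_market_shares(costs_cents_per_kwh, heterogeneity=3.0):
            """Share of each generation technology under cost-based competition with
            heterogeneity: cheaper options dominate, but costlier ones keep a share."""
            c = np.asarray(costs_cents_per_kwh, dtype=float)
            weights = c ** (-heterogeneity)        # logit-style cost weighting
            return weights / weights.sum()

        techs = ["nuclear", "biomass", "terrestrial PV", "wind", "space solar"]
        costs = [4.5, 5.0, 5.5, 5.0, 7.0]          # illustrative cents per kW-hr
        shares = dict(zip(techs, logit_market_shares(costs)))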

  11. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  12. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

    2011-01-01

    In this paper, we describe an approach to integrate a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model has been developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although it is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.

  13. CCMC: bringing space weather awareness to the next generation

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Muglach, K.; Zheng, Y.; Mays, M. L.; Kuznetsova, M. M.; Taktakishvili, A.; Collado-Vega, Y. M.; Rastaetter, L.; Mendoza, A. M. M.; Thompson, B. J.; Pulkkinen, A. A.; Pembroke, A. D.

    2017-12-01

    Making space weather an element of core education is critical for the future of the young field of space weather. The Community Coordinated Modeling Center (CCMC) is an interagency partnership established to aid the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable our small group to serve as a hub for rising generations of young space scientists and engineers. CCMC offers a variety of educational tools and resources publicly available online, providing access to the largest collection of modern space science models developed by the international research community. CCMC has revolutionized the way these simulations are utilized in classroom settings, student projects, and scientific labs. Every year, this online system serves hundreds of students, educators and researchers worldwide. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unique capabilities and experiences, the team also provides in-depth space weather training to hundreds of students and professionals. One training module offers undergraduates an opportunity to actively engage in real-time space weather monitoring, analysis, forecasting, tools development and research, eventually serving remotely as NASA space weather forecasters. In yet another project, CCMC is collaborating with Hayden Planetarium and Linkoping University on creating a visualization platform for planetariums (and classrooms) to provide simulations of dynamic processes in the large domain stretching from the solar corona to the Earth's upper atmosphere, for near real-time and historical space weather events.

  14. Nonholonomic relativistic diffusion and exact solutions for stochastic Einstein spaces

    NASA Astrophysics Data System (ADS)

    Vacaru, S. I.

    2012-03-01

    We develop an approach to the theory of nonholonomic relativistic stochastic processes in curved spaces. The Itô and Stratonovich calculus are formulated for spaces with conventional horizontal (holonomic) and vertical (nonholonomic) splitting defined by nonlinear connection structures. Geometric models of the relativistic diffusion theory are elaborated for nonholonomic (pseudo) Riemannian manifolds and phase velocity spaces. Applying the anholonomic deformation method, the field equations in Einstein's gravity and various modifications are formally integrated in general forms, with generic off-diagonal metrics depending on some classes of generating and integration functions. Choosing random generating functions we can construct various classes of stochastic Einstein manifolds. We show how stochastic gravitational interactions with mixed holonomic/nonholonomic and random variables can be modelled in explicit form and study their main geometric and stochastic properties. Finally, the conditions when non-random classical gravitational processes transform into stochastic ones and inversely are analyzed.

  15. Creating Body Shapes From Verbal Descriptions by Linking Similarity Spaces.

    PubMed

    Hill, Matthew Q; Streuber, Stephan; Hahn, Carina A; Black, Michael J; O'Toole, Alice J

    2016-11-01

    Brief verbal descriptions of people's bodies (e.g., "curvy," "long-legged") can elicit vivid mental images. The ease with which these mental images are created belies the complexity of three-dimensional body shapes. We explored the relationship between body shapes and body descriptions and showed that a small number of words can be used to generate categorically accurate representations of three-dimensional bodies. The dimensions of body-shape variation that emerged in a language-based similarity space were related to major dimensions of variation computed directly from three-dimensional laser scans of 2,094 bodies. This relationship allowed us to generate three-dimensional models of people in the shape space using only their coordinates on analogous dimensions in the language-based description space. Human descriptions of photographed bodies and their corresponding models matched closely. The natural mapping between the spaces illustrates the role of language as a concise code for body shape that captures perceptually salient global and local body features. © The Author(s) 2016.

  16. Geant4 hadronic physics for space radiation environment.

    PubMed

    Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L

    2012-01-01

    To test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models with focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. Binary (BIC), its extension for incident light ions (BIC-ion) and Bertini (BERT) cascades were used as main Monte Carlo generators. For comparisons purposes, some other models were tested too. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent version of Geant4 9.4 and were compared with experimental data from thin and thick target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.

  17. A three-finger multisensory hand for dexterous space robotic tasks

    NASA Technical Reports Server (NTRS)

    Murase, Yuichi; Komada, Satoru; Uchiyama, Takashi; Machida, Kazuo; Akita, Kenzo

    1994-01-01

    The National Space Development Agency of Japan will launch ETS-7 in 1997 as a test bed for next-generation space technologies of RV&D and space robotics. MITI has been developing a three-finger multisensory hand for complex space robotic tasks. The hand can be operated under remote control or autonomously. This paper describes the design and development of the hand and the performance of a breadboard model.

  18. The structure and development of streamwise vortex arrays embedded in a turbulent boundary layer. Ph.D. Thesis - Case Western Reserve Univ.

    NASA Technical Reports Server (NTRS)

    Wendt, Bruce J.; Greber, Isaac; Hingst, Warren R.

    1991-01-01

    An investigation of the structure and development of streamwise vortices embedded in a turbulent boundary layer was conducted. The vortices were generated by a single spanwise row of rectangular vortex generator blades. A single embedded vortex was examined, as well as arrays of embedded counter rotating vortices produced by equally spaced vortex generators. Measurements of the secondary velocity field in the crossplane provided the basis for characterization of vortex structure. Vortex structure was characterized by four descriptors. The center of each vortex core was located at the spanwise and normal position of peak streamwise vorticity. Vortex concentration was characterized by the magnitude of the peak streamwise vorticity, and the vortex strength by its circulation. Measurements of the secondary velocity field were conducted at two crossplane locations to examine the streamwise development of the vortex arrays. Large initial spacings of the vortex generators produced pairs of strong vortices which tended to move away from the wall region while smaller spacings produced tight arrays of weak vortices close to the wall. A model of vortex interaction and development is constructed using the experimental results. The model is based on the structure of the Oseen Vortex. Vortex trajectories are modelled by including the convective effects of neighbors.

  19. Vibrations and structureborne noise in space station

    NASA Technical Reports Server (NTRS)

    Vaicaitis, R.; Lyrintzis, C. S.; Bofilios, D. A.

    1987-01-01

    Analytical models were developed to predict vibrations and structureborne noise generation of cylindrical and rectangular acoustic enclosures. These models are then used to determine structural vibration levels and interior noise in response to random point input forces. The guidelines developed could provide preliminary information on acoustical and vibrational environments in space station habitability modules under orbital operations. The structural models include a single wall monocoque shell, a double wall shell, a stiffened orthotropic shell, discretely stiffened flat panels, and a coupled system composed of a cantilever beam structure and a stiffened sidewall. Aluminum and fiber reinforced composite materials are considered for single and double wall shells. The end caps of the cylindrical enclosures are modeled either as single or double wall circular plates. Sound generation in the interior space is calculated by coupling the structural vibrations to the acoustic field in the enclosure. Modal methods and transfer matrix techniques are used to obtain structural vibrations. Parametric studies are performed to determine the sensitivity of the interior noise environment to changes in input, geometric and structural conditions.

  20. Space Weather Model Testing And Validation At The Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Kuznetsova, M.; Rastaetter, L.; Falasca, A.; Keller, K.; Reitan, P.

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership aimed at the creation of next generation space weather models. The goal of the CCMC is to undertake the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of NASA's Living With a Star initiative, of the National Space Weather Program Implementation Plan, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate.

  1. Generation of topographic terrain models utilizing synthetic aperture radar and surface level data

    NASA Technical Reports Server (NTRS)

    Imhoff, Marc L. (Inventor)

    1991-01-01

    Topographical terrain models are generated by digitally delineating the boundary of the region under investigation from the data obtained from an airborne synthetic aperture radar image and surface elevation data concurrently acquired either from an airborne instrument or at ground level. A set of coregistered boundary maps thus generated are then digitally combined in three dimensional space with the acquired surface elevation data by means of image processing software stored in a digital computer. The method is particularly applicable for generating terrain models of flooded regions covered entirely or in part by foliage.

  2. Temporal models for the episodic volcanism of Campi Flegrei caldera (Italy) with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Flandoli, Franco; Neri, Augusto; Isaia, Roberto; Vitale, Stefano

    2016-11-01

    After the large-scale event of the Neapolitan Yellow Tuff (15 ka B.P.), intense and mostly explosive volcanism has occurred within and along the boundaries of the Campi Flegrei caldera (Italy). Eruptions occurred closely spaced in time, over periods from a few centuries to a few millennia, and alternated with periods of quiescence lasting up to several millennia. Often events also occurred closely in space, thus generating a cluster of events. This study had two main objectives: (1) to describe the uncertainty in the geologic record by using a quantitative model and (2) to develop, based on the uncertainty assessment, a long-term subdomain-specific temporal probability model that describes the temporal and spatial eruptive behavior of the caldera. In particular, the study adopts a space-time doubly stochastic nonhomogeneous Poisson-type model with a local self-excitation feature able to generate clustering of events which are consistent with the reconstructed record of Campi Flegrei. Results allow the evaluation of similarities and differences between the three epochs of activity as well as to derive the eruptive base rate of the caldera and its capacity to generate clusters of events. The temporal probability model is also used to investigate the effect of the most recent eruption of Monte Nuovo (A.D. 1538) on a possible reactivation of the caldera and to estimate the time to the next eruption under different volcanological and modeling assumptions.
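
    A stripped-down temporal sketch of the self-excitation idea (a plain Hawkes process with an exponential kernel and invented rate constants, simulated by Ogata-style thinning; the paper's model is doubly stochastic and spatially resolved):

        import math
        import random

        def simulate_self_exciting(mu=0.002, a=0.5, decay=0.01, t_end=5000.0, seed=3):
            """Event times (e.g., years) of a self-exciting Poisson (Hawkes) process with
            base rate mu, jump size a per event, and exponential decay of the excitation."""
            random.seed(seed)
            events, t = [], 0.0

            def intensity(time):
                return mu + sum(a * decay * math.exp(-decay * (time - ti))
                                for ti in events if ti < time)

            while t < t_end:
                lam_bar = intensity(t)                     # upper bound until the next event
                t += random.expovariate(lam_bar)           # candidate waiting time
                if t < t_end and random.random() < intensity(t) / lam_bar:
                    events.append(t)                       # accepted: a new eruption
            return events

        # Clusters of closely spaced events separated by long quiescence emerge naturally.
        eruption_times = simulate_self_exciting()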

  3. Evaluation of Private Sector Roles in Space Resource Development

    NASA Astrophysics Data System (ADS)

    Lamassoure, Elisabeth S.; Blair, Brad R.; Diaz, Javier; Oderman, Mark; Duke, Michael B.; Vaucher, Marc; Manvi, Ramachandra; Easter, Robert W.

    2003-01-01

    An integrated engineering and financial modeling approach has been developed and used to evaluate the potential for private sector investment in space resource development, and to assess possible roles of the public sector in fostering private interest. This paper presents the modeling approach and its results for a transportation service using propellant extracted from lunar regolith. The analysis starts with careful case study definition, including an analysis of the customer base and market requirements, which are the basis for design of a modular, scalable space architecture. The derived non-recurring, recurring and operations costs become inputs for a 'standard' financial model, as used in any commercial business plan. This model generates pro forma financial statements, calculates the amount of capitalization required, and generates return on equity calculations using two valuation metrics of direct interest to private investors: market enterprise value and multiples of key financial measures. Use of this model on an architecture to sell transportation services in Earth orbit based on lunar propellants shows how to rapidly test various assumptions and identify interesting architectural options, key areas for investment in exploration and technology, or innovative business approaches that could produce an economically viable industry. The same approach can be used to evaluate any other possible private ventures in space, and to draw conclusions on the respective roles of NASA and the private sector in space resource development and solar system exploration.

  4. Bas-Relief Modeling from Normal Images with Intuitive Styles.

    PubMed

    Ji, Zhongping; Ma, Weiyin; Sun, Xianfang

    2014-05-01

    Traditional 3D model-based bas-relief modeling methods are often limited to model-dependent and monotonic relief styles. This paper presents a novel method for digital bas-relief modeling with intuitive style control. Given a composite normal image, the problem discussed in this paper involves generating a discontinuity-free depth field with high compression of depth data while preserving or even enhancing fine details. In our framework, several layers of normal images are composed into a single normal image. The original normal image on each layer is usually generated from 3D models or through other techniques as described in this paper. The bas-relief style is controlled by choosing a parameter and setting a targeted height for them. Bas-relief modeling and stylization are achieved simultaneously by solving a sparse linear system. Different from previous work, our method can be used to freely design bas-reliefs in normal image space instead of in object space, which makes it possible to use any popular image editing tools for bas-relief modeling. Experiments with a wide range of 3D models and scenes show that our method can effectively generate digital bas-reliefs.
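
    A simplified stand-in for the core numerical step (recovering a compressed, discontinuity-free depth field from a normal image by solving a sparse least-squares system; the uniform gradient attenuation used here is a crude substitute for the paper's style control and detail preservation):

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        def bas_relief_depth(normals, compression=0.2):
            """normals: (H, W, 3) unit-normal image. Returns a depth field whose
            gradients match the attenuated gradients implied by the normals."""
            h, w, _ = normals.shape
            nz = np.clip(normals[..., 2], 1e-3, None)
            p = -normals[..., 0] / nz * compression      # target dz/dx, attenuated
            q = -normals[..., 1] / nz * compression      # target dz/dy, attenuated

            idx = np.arange(h * w).reshape(h, w)
            rows, cols, vals, rhs = [], [], [], []

            def add_eq(i_a, i_b, target):                # finite-difference equation z_b - z_a = target
                r = len(rhs)
                rows += [r, r]
                cols += [i_a, i_b]
                vals += [-1.0, 1.0]
                rhs.append(target)

            for y in range(h):
                for x in range(w - 1):
                    add_eq(idx[y, x], idx[y, x + 1], p[y, x])
            for y in range(h - 1):
                for x in range(w):
                    add_eq(idx[y, x], idx[y + 1, x], q[y, x])
            r = len(rhs)
            rows.append(r); cols.append(0); vals.append(1.0); rhs.append(0.0)   # pin one pixel

            A = sp.csr_matrix((vals, (rows, cols)), shape=(len(rhs), h * w))
            z = spla.lsqr(A, np.asarray(rhs))[0]         # least-squares gradient integration
            return z.reshape(h, w)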

  5. Architecture for the silver generation: exploring the meaning of appropriate space for ageing in a Swedish municipality.

    PubMed

    Andersson, Jonas E

    2011-03-01

    This paper focuses on an architecture competition for the silver generation, namely those aged 65 years and older. Twenty-seven Swedish informants were interviewed using an interviewing guide that included a photographic survey. The informants emphasised aesthetic dimensions in architecture for the prolongation of ageing in place and independent living in a residential home. This study highlights the individual adjustment of space, and the integrated location in existing urban settings near nature. Based on the findings, a habitational model for exploring the appropriate space for ageing is formulated. It suggests that architecture through location and spatial features needs to generate positive associations with the users. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Many Molecular Properties from One Kernel in Chemical Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Raghunathan; von Lilienfeld, O. Anatole

    We introduce property-independent kernels for machine learning modeling of arbitrarily many molecular properties. The kernels encode molecular structures for training sets of varying size, as well as similarity measures sufficiently diffuse in chemical space to sample over all training molecules. Corresponding molecular reference properties provided, they enable the instantaneous generation of ML models which can systematically be improved through the addition of more data. This idea is exemplified for single kernel based modeling of internal energy, enthalpy, free energy, heat capacity, polarizability, electronic spread, zero-point vibrational energy, energies of frontier orbitals, HOMO-LUMO gap, and the highest fundamental vibrational wavenumber. Models of these properties are trained and tested using 112 kilo organic molecules of similar size. Resulting models are discussed as well as the kernels' use for generating and using other property models.
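
    A minimal sketch of the "one kernel, many properties" idea using kernel ridge regression (a generic Gaussian kernel over precomputed molecular descriptor vectors; descriptor choice, kernel width, and regularization are illustrative assumptions): the kernel matrix is factored once and reused for every property column.

        import numpy as np

        def gaussian_kernel(A, B, sigma=50.0):
            """Property-independent similarity between molecular descriptor vectors."""
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma**2))

        def train_many_properties(X_train, Y_train, lam=1e-8):
            """X_train: (n, d) descriptors; Y_train: (n, n_properties) reference values.
            Factor the shared kernel once, then solve for every property column."""
            K = gaussian_kernel(X_train, X_train)
            K[np.diag_indices_from(K)] += lam
            L = np.linalg.cholesky(K)
            return np.linalg.solve(L.T, np.linalg.solve(L, Y_train))   # regression weights

        def predict(X_train, X_test, alphas):
            return gaussian_kernel(X_test, X_train) @ alphas           # (n_test, n_properties)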

  7. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  8. Recent Enhancements to the Development of CFD-Based Aeroelastic Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    2007-01-01

    Recent enhancements to the development of CFD-based unsteady aerodynamic and aeroelastic reduced-order models (ROMs) are presented. These enhancements include the simultaneous application of structural modes as CFD input, static aeroelastic analysis using a ROM, and matched-point solutions using a ROM. The simultaneous application of structural modes as CFD input enables the computation of the unsteady aerodynamic state-space matrices with a single CFD execution, independent of the number of structural modes. The responses obtained from a simultaneous excitation of the CFD-based unsteady aerodynamic system are processed using system identification techniques in order to generate an unsteady aerodynamic state-space ROM. Once the unsteady aerodynamic state-space ROM is generated, a method for computing the static aeroelastic response using this unsteady aerodynamic ROM and a state-space model of the structure is presented. Finally, a method is presented that enables the computation of matched-point solutions using a single ROM that is applicable over a range of dynamic pressures and velocities for a given Mach number. These enhancements represent a significant advancement of unsteady aerodynamic and aeroelastic ROM technology.
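
    One standard route from response data to a state-space ROM is the Eigensystem Realization Algorithm, sketched below on impulse-response (Markov) parameters; the paper's specific identification procedure, inputs, and scaling may differ, so this is only an illustrative stand-in.

        import numpy as np

        def era(markov_params, n_states):
            """Eigensystem Realization Algorithm: identify a discrete-time state-space
            model (A, B, C) from Markov parameters Y_k = C A^(k-1) B, supplied as an
            array of shape (N, n_outputs, n_inputs)."""
            Y = np.asarray(markov_params)
            N, q, m = Y.shape
            r = N // 2                                     # Hankel block dimension
            H0 = np.block([[Y[i + j] for j in range(r)] for i in range(r)])
            H1 = np.block([[Y[i + j + 1] for j in range(r)] for i in range(r)])
            U, s, Vt = np.linalg.svd(H0, full_matrices=False)
            U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states]
            S_half = np.diag(np.sqrt(s))
            S_half_inv = np.diag(1.0 / np.sqrt(s))
            A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv  # reduced-order dynamics
            B = (S_half @ Vt)[:, :m]                       # input matrix
            C = (U @ S_half)[:q, :]                        # output matrix
            return A, B, C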

  9. Peripersonal Space and Margin of Safety around the Body: Learning Visuo-Tactile Associations in a Humanoid Robot with Artificial Skin.

    PubMed

    Roncone, Alessandro; Hoffmann, Matej; Pattacini, Ugo; Fadiga, Luciano; Metta, Giorgio

    2016-01-01

    This paper investigates a biologically motivated model of peripersonal space through its implementation on a humanoid robot. Guided by the present understanding of the neurophysiology of the fronto-parietal system, we developed a computational model inspired by the receptive fields of polymodal neurons identified, for example, in brain areas F4 and VIP. The experiments on the iCub humanoid robot show that the peripersonal space representation i) can be learned efficiently and in real-time via a simple interaction with the robot, ii) can lead to the generation of behaviors like avoidance and reaching, and iii) can contribute to understanding the biological principle of motor equivalence. More specifically, with respect to i) the present model contributes to hypothesizing a learning mechanism for peripersonal space. In relation to point ii) we show how a relatively simple controller can exploit the learned receptive fields to generate either avoidance or reaching of an incoming stimulus, and for iii) we show how the robot can select arbitrary body parts as the controlled end-point of an avoidance or reaching movement.

  10. The TimeGeo modeling framework for urban mobility without travel surveys

    PubMed Central

    Jiang, Shan; Yang, Yingxiang; Gupta, Siddharth; Veneziano, Daniele; Athavale, Shounak; González, Marta C.

    2016-01-01

    Well-established fine-scale urban mobility models today depend on detailed but cumbersome and expensive travel surveys for their calibration. Not much is known, however, about the set of mechanisms needed to generate complete mobility profiles if only using passive datasets with mostly sparse traces of individuals. In this study, we present a mechanistic modeling framework (TimeGeo) that effectively generates urban mobility patterns with resolution of 10 min and hundreds of meters. It ties together the inference of home and work activity locations from data, with the modeling of flexible activities (e.g., other) in space and time. The temporal choices are captured by only three features: the weekly home-based tour number, the dwell rate, and the burst rate. These combined generate for each individual: (i) stay duration of activities, (ii) number of visited locations per day, and (iii) daily mobility networks. These parameters capture how an individual deviates from the circadian rhythm of the population, and generate the wide spectrum of empirically observed mobility behaviors. The spatial choices of visited locations are modeled by a rank-based exploration and preferential return (r-EPR) mechanism that incorporates space in the EPR model. Finally, we show that a hierarchical multiplicative cascade method can measure the interaction between land use and generation of trips. In this way, urban structure is directly related to the observed distance of travels. This framework allows us to fully embrace the massive amount of individual data generated by information and communication technologies (ICTs) worldwide to comprehensively model urban mobility without travel surveys. PMID:27573826
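
    The exploration and preferential-return choice described above can be sketched as follows; the rank-based kernel, the parameter values (rho, gamma, alpha), and the location set are illustrative assumptions rather than TimeGeo's calibrated mechanisms.

```python
# Schematic exploration / preferential-return (EPR-style) location choice.
# With probability p_new = rho * S**(-gamma) the individual explores a new
# location, drawn here by a rank-based kernel P(rank) ~ rank**(-alpha);
# otherwise it returns to a known location with probability proportional to
# past visit counts. rho, gamma, alpha are illustrative values.
import numpy as np

rng = np.random.default_rng(1)
rho, gamma, alpha = 0.6, 0.21, 0.86
n_locations, n_steps = 1000, 500

visits = {0: 1}                       # location id -> visit count (start at home, id 0)
trajectory = [0]

for _ in range(n_steps):
    S = len(visits)                   # number of distinct locations visited so far
    if rng.random() < rho * S ** (-gamma):
        # explore: pick an unvisited location with rank-based probability
        candidates = np.array([i for i in range(n_locations) if i not in visits])
        ranks = np.arange(1, len(candidates) + 1)
        p = ranks ** (-alpha)
        loc = int(rng.choice(candidates, p=p / p.sum()))
    else:
        # preferential return: probability proportional to visit frequency
        locs = np.array(list(visits))
        counts = np.array([visits[l] for l in locs], dtype=float)
        loc = int(rng.choice(locs, p=counts / counts.sum()))
    visits[loc] = visits.get(loc, 0) + 1
    trajectory.append(loc)

print(len(visits), "distinct locations visited in", n_steps, "moves")
```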

  11. The TimeGeo modeling framework for urban mobility without travel surveys.

    PubMed

    Jiang, Shan; Yang, Yingxiang; Gupta, Siddharth; Veneziano, Daniele; Athavale, Shounak; González, Marta C

    2016-09-13

    Well-established fine-scale urban mobility models today depend on detailed but cumbersome and expensive travel surveys for their calibration. Not much is known, however, about the set of mechanisms needed to generate complete mobility profiles if only using passive datasets with mostly sparse traces of individuals. In this study, we present a mechanistic modeling framework (TimeGeo) that effectively generates urban mobility patterns with resolution of 10 min and hundreds of meters. It ties together the inference of home and work activity locations from data, with the modeling of flexible activities (e.g., other) in space and time. The temporal choices are captured by only three features: the weekly home-based tour number, the dwell rate, and the burst rate. These combined generate for each individual: (i) stay duration of activities, (ii) number of visited locations per day, and (iii) daily mobility networks. These parameters capture how an individual deviates from the circadian rhythm of the population, and generate the wide spectrum of empirically observed mobility behaviors. The spatial choices of visited locations are modeled by a rank-based exploration and preferential return (r-EPR) mechanism that incorporates space in the EPR model. Finally, we show that a hierarchical multiplicative cascade method can measure the interaction between land use and generation of trips. In this way, urban structure is directly related to the observed distance of travels. This framework allows us to fully embrace the massive amount of individual data generated by information and communication technologies (ICTs) worldwide to comprehensively model urban mobility without travel surveys.

  12. The DOE/NASA SRG110 Program Overview

    NASA Astrophysics Data System (ADS)

    Shaltens, R. K.; Richardson, R. L.

    2005-12-01

    The Department of Energy is developing the Stirling Radioisotope Generator (SRG110) for NASA's Science Mission Directorate for potential surface and deep space missions. The SRG110 is one of two new radioisotope power systems (RPSs) currently being developed for NASA space missions, and is capable of operating in a range of planetary atmospheres and in deep space environments. It has a mass of approximately 27 kg and produces more than 125 We (dc) at beginning of mission (BOM), with a design lifetime of fourteen years. Electrical power is produced by two free-piston Stirling convertors heated by two General Purpose Heat Source (GPHS) modules. The complete SRG110 system is approximately 38 cm x 36 cm and 76 cm long. The SRG110 generator is being designed in three stages: Engineering Model, Qualification Generator, and Flight Generator. Current plans call for the Engineering Model to be fabricated and tested by October 2006. Completion of testing of the Qualification Generator is scheduled for mid-2009. This development is being performed by Lockheed Martin, Valley Forge, PA and Infinia Corporation, Kennewick, WA under contract to the Department of Energy, Germantown, Md. Glenn Research Center, Cleveland, Ohio, is providing independent testing and support for the technology transition for the SRG110 Program.

  13. Space Weather Modeling Services at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2006-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership, which aims at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the Rapid Prototyping Centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of the National Space Weather Program Implementation Plan, of NASA's Living With a Star (LWS) initiative, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide a description of the current CCMC status, discuss current plans, research and development accomplishments and goals, and describe the model testing and validation process undertaken as part of the CCMC mandate. Special emphasis will be on solar and heliospheric models currently residing at CCMC, and on plans for validation and verification.

  14. Space Weather Modeling at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, M.

    2005-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership, which aims at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of the National Space Weather Program Implementation Plan, of NASA's Living With a Star (LWS) initiative, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the US Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate. Special emphasis will be on solar and heliospheric models currently residing at CCMC, and on plans for validation and verification.

  15. Numerical analysis of ion wind flow using space charge for optimal design

    NASA Astrophysics Data System (ADS)

    Ko, Han Seo; Shin, Dong Ho; Baek, Soo Hong

    2014-11-01

    Ion wind flow has been widely studied for its advantages as a microfluidic device. However, it is very difficult to predict the performance of the ion wind flow under various conditions because of its complicated electrohydrodynamic phenomena. Thus, a reliable numerical model is required to design an optimal ion wind generator and to calculate the ion wind velocity for the proper performance. In this study, the numerical modeling of the ion wind has been modified and newly defined to calculate the velocity of the ion wind flow by combining three basic models: electrostatics, electrodynamics, and fluid dynamics. The model includes the presence of initial space charges to calculate the energy transfer between space charges and air molecules using a developed space-charge correlation. The simulation has been performed for a pin-to-parallel-plate electrode geometry. Finally, the results of the simulation have been compared with the experimental data for the ion wind velocity to confirm the accuracy of the modified numerical modeling and to obtain the optimal design of the ion wind generator. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean government (MEST) (No. 2013R1A2A2A01068653).

  16. The Space Weather Modeling Framework (SWMF): Models and Validation

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV

    In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SE), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics, as well as some validation studies.

  17. A unified 3D default space consciousness model combining neurological and physiological processes that underlie conscious experience

    PubMed Central

    Jerath, Ravinder; Crawford, Molly W.; Barnes, Vernon A.

    2015-01-01

    The Global Workspace Theory and Information Integration Theory are two of the most currently accepted consciousness models; however, these models do not address many aspects of conscious experience. We compare these models to our previously proposed consciousness model in which the thalamus fills-in processed sensory information from corticothalamic feedback loops within a proposed 3D default space, resulting in the recreation of the internal and external worlds within the mind. This 3D default space is composed of all cells of the body, which communicate via gap junctions and electrical potentials to create this unified space. We use 3D illustrations to explain how both visual and non-visual sensory information may be filled-in within this dynamic space, creating a unified seamless conscious experience. This neural sensory memory space is likely generated by baseline neural oscillatory activity from the default mode network, other salient networks, brainstem, and reticular activating system. PMID:26379573

  18. Architecture for spacecraft operations planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1991-01-01

    A system which generates plans for the dynamic environment of space operations is discussed. This system synthesizes plans by combining known operations under a set of physical, functional, and temporal constraints from various plan entities, which are modeled independently but combine in a flexible manner to suit dynamic planning needs. This independence allows the generation of a single plan source which can be compiled and applied to a variety of agents. The architecture blends elements of temporal logic, nonlinear planning, and object-oriented constraint modeling to achieve its flexibility. This system was applied to the domain of Intravehicular Activity (IVA) maintenance and repair aboard the Space Station Freedom testbed.

  19. Computer Model Of Fragmentation Of Atomic Nuclei

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.

    1995-01-01

    High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.

  20. Compilation of Abstracts for SC12 Conference Proceedings

    NASA Technical Reports Server (NTRS)

    Morello, Gina Francine (Compiler)

    2012-01-01

    1 A Breakthrough in Rotorcraft Prediction Accuracy Using Detached Eddy Simulation; 2 Adjoint-Based Design for Complex Aerospace Configurations; 3 Simulating Hypersonic Turbulent Combustion for Future Aircraft; 4 From a Roar to a Whisper: Making Modern Aircraft Quieter; 5 Modeling of Extended Formation Flight on High-Performance Computers; 6 Supersonic Retropropulsion for Mars Entry; 7 Validating Water Spray Simulation Models for the SLS Launch Environment; 8 Simulating Moving Valves for Space Launch System Liquid Engines; 9 Innovative Simulations for Modeling the SLS Solid Rocket Booster Ignition; 10 Solid Rocket Booster Ignition Overpressure Simulations for the Space Launch System; 11 CFD Simulations to Support the Next Generation of Launch Pads; 12 Modeling and Simulation Support for NASA's Next-Generation Space Launch System; 13 Simulating Planetary Entry Environments for Space Exploration Vehicles; 14 NASA Center for Climate Simulation Highlights; 15 Ultrascale Climate Data Visualization and Analysis; 16 NASA Climate Simulations and Observations for the IPCC and Beyond; 17 Next-Generation Climate Data Services: MERRA Analytics; 18 Recent Advances in High-Resolution Global Atmospheric Modeling; 19 Causes and Consequences of Turbulence in the Earth's Protective Shield; 20 NASA Earth Exchange (NEX): A Collaborative Supercomputing Platform; 21 Powering Deep Space Missions: Thermoelectric Properties of Complex Materials; 22 Meeting NASA's High-End Computing Goals Through Innovation; 23 Continuous Enhancements to the Pleiades Supercomputer for Maximum Uptime; 24 Live Demonstrations of 100-Gbps File Transfers Across LANs and WANs; 25 Untangling the Computing Landscape for Climate Simulations; 26 Simulating Galaxies and the Universe; 27 The Mysterious Origin of Stellar Masses; 28 Hot-Plasma Geysers on the Sun; 29 Turbulent Life of Kepler Stars; 30 Modeling Weather on the Sun; 31 Weather on Mars: The Meteorology of Gale Crater; 32 Enhancing Performance of NASA's High-End Computing Applications; 33 Designing Curiosity's Perfect Landing on Mars; 34 The Search Continues: Kepler's Quest for Habitable Earth-Sized Planets.

  1. Draft Forecasts from Real-Time Runs of Physics-Based Models - A Road to the Future

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  2. Deriving Tools from Real-Time Runs: A New CCMC Support for SEC and AFWA

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2007-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. In particular, the CCMC provides to the research community the execution of "runs-on-request" for specific events of interest to space science researchers. Through this activity and the concurrent development of advanced visualization tools, CCMC provides, to the general science community, unprecedented access to a large number of state-of-the-art research models. CCMC houses models that cover the entire domain from the Sun to the Earth. In this presentation, we will provide an overview of CCMC modeling services that are available to support activities at the Space Environment Center, or at the Air Force Weather Agency.

  3. Maximizing photovoltaic power generation of a space-dart configured satellite

    NASA Astrophysics Data System (ADS)

    Lee, Dae Young; Cutler, James W.; Mancewicz, Joe; Ridley, Aaron J.

    2015-06-01

    Many small satellites are power constrained due to their minimal solar panel area and the eclipse environment of low-Earth orbit. As with larger satellites, these small satellites, including CubeSats, use deployable power arrays to increase power production. This presents a design opportunity to develop various objective functions related to energy management and methods for optimizing these functions over a satellite design. A novel power generation model was created, and a simulation system was developed to evaluate various objective functions describing energy management for complex satellite designs. The model uses a spacecraft-body-fixed spherical coordinate system to analyze the complex geometry of a satellite's self-induced shadowing with computation provided by the Open Graphics Library. As an example design problem, a CubeSat configured as a space-dart with four deployable panels is optimized. Due to the fast computation speed of the solution, an exhaustive search over the design space is used to find the solar panel deployment angles which maximize total power generation. Simulation results are presented for a variety of orbit scenarios. The method is extendable to a variety of complex satellite geometries and power generation systems.
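
    A stripped-down version of the exhaustive search over deployment angles is sketched below. Power is modeled as the clipped cosine of the solar incidence angle on four deployable panels, the sun-vector sweep is a placeholder for a real orbit and attitude model, and the self-shadowing computation (handled with the Open Graphics Library in the paper) is ignored.

```python
# Toy exhaustive search for a solar-panel deployment angle that maximizes
# orbit-averaged power. Power per panel is max(0, cos(incidence)); the panel
# geometry and the sun-vector sweep are illustrative placeholders.
import numpy as np

def panel_normals(theta):
    """Normals of four deployable panels hinged at the body +Z face, opened by theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ s, 0, c],     # panel hinged on the +X edge
                     [-s, 0, c],     # -X edge
                     [0,  s, c],     # +Y edge
                     [0, -s, c]])    # -Y edge

# Illustrative sweep of unit sun vectors in the body frame over one orbit segment
angles = np.linspace(0, np.pi, 60)
sun_vecs = np.column_stack([np.sin(angles), np.zeros_like(angles), np.cos(angles)])

best_theta, best_power = None, -np.inf
for theta in np.radians(np.arange(0, 91, 1)):            # exhaustive 1-degree grid
    normals = panel_normals(theta)
    # cosine of incidence per panel per time step; negative values mean no illumination
    cosines = np.clip(sun_vecs @ normals.T, 0.0, None)
    avg_power = cosines.sum(axis=1).mean()                # arbitrary units
    if avg_power > best_power:
        best_theta, best_power = theta, avg_power

print(f"best deployment angle ~ {np.degrees(best_theta):.0f} deg, mean power {best_power:.2f}")
```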

  4. Vertebrate development in the environment of space: models, mechanisms, and use of the medaka

    NASA Technical Reports Server (NTRS)

    Wolgemuth, D. J.; Herrada, G.; Kiss, S.; Cannon, T.; Forsstrom, C.; Pranger, L. A.; Weismann, W. P.; Pearce, L.; Whalon, B.; Phillips, C. R.

    1997-01-01

    With the advent of space travel, it is of immediate interest and importance to study the effects of exposure to various aspects of the altered environment of space, including microgravity, on Earth-based life forms. Initial studies of space travel have focused primarily on the short-term effects of radiation and microgravity on adult organisms. However, with the potential for increased lengths of time in space, it is critical to now address the effects of space on all phases of an organism's life cycle, from embryogenesis to post-natal development to reproduction. It is already possible for certain species to undergo multiple generations within the confines of the Mir Space Station. The possibility now exists for scientists to consider the consequences of even potentially subtle defects in development through multiple phases of an organism's life cycle, or even through multiple generations. In this discussion, we highlight a few of the salient observations on the effects of the space environment on vertebrate development and reproductive function. We discuss some of the many unanswered questions, in particular, in the context of the choice of appropriate models in which to address these questions, as well as an assessment of the availability of hardware already existing or under development which would be useful in addressing these questions.

  5. Deriving Tools from Real-time Runs: A New CCMC Support for SEC and AFWA

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  6. Simulated Wake Characteristics Data for Closely Spaced Parallel Runway Operations Analysis

    NASA Technical Reports Server (NTRS)

    Guerreiro, Nelson M.; Neitzke, Kurt W.

    2012-01-01

    A simulation experiment was performed to generate and compile wake characteristics data relevant to the evaluation and feasibility analysis of closely spaced parallel runway (CSPR) operational concepts. While the experiment in this work is not tailored to any particular operational concept, the generated data apply to the broader class of CSPR concepts, where a trailing aircraft on a CSPR approach is required to stay ahead of the wake vortices generated by a lead aircraft on an adjacent CSPR. Data for wake age, circulation strength, and wake altitude change at various lateral offset distances from the wake-generating lead aircraft approach path were compiled for a set of nine aircraft spanning the full range of FAA and ICAO wake classifications. A total of 54 scenarios were simulated to generate data related to key parameters that determine wake behavior. Of particular interest are wake age characteristics that can be used to evaluate both time- and distance-based in-trail separation concepts for all aircraft wake-class combinations. A simple first-order difference model was developed to enable the computation of wake parameter estimates for aircraft models having weight, wingspan and speed characteristics similar to those of the nine aircraft modeled in this work.

  7. Online Community and User-Generated Content: Understanding the Role of Social Networks

    ERIC Educational Resources Information Center

    Oh, Jeong Ha

    2010-01-01

    Models of user generated content (UGC) creation such as Facebook, MySpace, and YouTube are facing robust growth accelerated by the adoption of Web 2.0 technologies and standards. These business models offer a fascinating avenue for exploring the role of social influence online. This dissertation is motivated by the success of YouTube, which is…

  8. Stirling Convertor Performance Mapping Test Results for Future Radioisotope Power Systems

    NASA Astrophysics Data System (ADS)

    Qiu, Songgang; Peterson, Allen A.; Faultersack, Franklyn D.; Redinger, Darin L.; Augenblick, John E.

    2004-02-01

    Long-life radioisotope-fueled generators based on free-piston Stirling convertors are an energy-conversion solution for future space applications. The high efficiency of Stirling machines makes them more attractive than the thermoelectric generators currently used in space. Stirling Technology Company (STC) has been performance-testing its Stirling generators to provide data for potential system integration contractors. This paper describes the most recent test results from the STC RemoteGen™ 55 W-class Stirling generators (RG-55). Comparisons are made between the new data and previous Stirling thermodynamic simulation models. Performance-mapping tests are presented including variations in: internal charge pressure, cold end temperature, hot end temperature, alternator temperature, input power, and variation of control voltage.

  9. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.

  10. Earth-from-Luna Limb Imager (ELLI) for Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Gorkavyi, N.; DeLand, M.

    2018-02-01

    A new type of limb imager with high-frequency imaging is proposed for the Deep Space Gateway. Each day, this CubeSat-scale imager will generate a global 3D model of the aerosol component of the Earth's atmosphere and of Polar Mesospheric Clouds.

  11. An algorithm to generate input data from meteorological and space shuttle observations to validate a CH4-CO model

    NASA Technical Reports Server (NTRS)

    Peters, L. K.; Yamanis, J.

    1981-01-01

    Objective procedures to analyze data from meteorological and space shuttle observations to validate a three dimensional model were investigated. The transport and chemistry of carbon monoxide and methane in the troposphere were studied. Four aspects were examined: (1) detailed evaluation of the variational calculus procedure, with the equation of continuity as a strong constraint, for adjustment of global tropospheric wind fields; (2) reduction of the National Meteorological Center (NMC) data tapes for data input to the OSTA-1/MAPS Experiment; (3) interpolation of the NMC Data for input to the CH4-CO model; and (4) temporal and spatial interpolation procedures of the CO measurements from the OSTA-1/MAPS Experiment to generate usable contours of the data.

  12. A Parallel Saturation Algorithm on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Ezekiel, Jonathan; Siminiceanu

    2007-01-01

    Symbolic state-space generators are notoriously hard to parallelize. However, the Saturation algorithm implemented in the SMART verification tool differs from other sequential symbolic state-space generators in that it exploits the locality of firing events in asynchronous system models. This paper explores whether event locality can be utilized to efficiently parallelize Saturation on shared-memory architectures. Conceptually, we propose to parallelize the firing of events within a decision diagram node, which is technically realized via a thread pool. We discuss the challenges involved in our parallel design and conduct experimental studies on its prototypical implementation. On a dual-processor, dual-core PC, our studies show speed-ups for several example models, e.g., of up to 50% for a Kanban model, when compared to running our algorithm only on a single core.
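
    As a much-simplified illustration of farming work out to a thread pool, the sketch below expands an explicit state space breadth-first, computing successors of frontier states in parallel. It only conveys the "parallel work within one expansion step" idea; the Saturation algorithm in SMART operates symbolically on decision-diagram nodes and exploits event locality, which is not reproduced here.

```python
# Much-simplified explicit-state sketch: breadth-first state-space generation
# with successor computation farmed out to a thread pool. The toy asynchronous
# model is a tuple of three bounded counters; each "event" increments a single
# component (event locality). SMART's symbolic Saturation algorithm on decision
# diagrams is NOT reproduced here.
from concurrent.futures import ThreadPoolExecutor

BOUND = 4
EVENTS = [lambda s, i=i: s[:i] + (s[i] + 1,) + s[i + 1:] if s[i] < BOUND else None
          for i in range(3)]

def successors(state):
    return [t for e in EVENTS if (t := e(state)) is not None]

def generate_states(initial, workers=4):
    seen, frontier = {initial}, [initial]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while frontier:
            # expand the whole frontier in parallel, then merge sequentially
            new = {s for group in pool.map(successors, frontier) for s in group} - seen
            seen |= new
            frontier = list(new)
    return seen

print(len(generate_states((0, 0, 0))))   # (BOUND + 1) ** 3 = 125 reachable states
```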

  13. Quantifying Astronaut Tasks: Robotic Technology and Future Space Suit Design

    NASA Technical Reports Server (NTRS)

    Newman, Dava

    2003-01-01

    The primary aim of this research effort was to advance the current understanding of astronauts' capabilities and limitations in space-suited EVA by developing models of the constitutive and compatibility relations of a space suit, based on experimental data gained from human test subjects as well as a 12 degree-of-freedom human-sized robot, and utilizing these fundamental relations to estimate a human factors performance metric for space suited EVA work. The three specific objectives are to: 1) Compile a detailed database of torques required to bend the joints of a space suit, using realistic, multi- joint human motions. 2) Develop a mathematical model of the constitutive relations between space suit joint torques and joint angular positions, based on experimental data and compare other investigators' physics-based models to experimental data. 3) Estimate the work envelope of a space suited astronaut, using the constitutive and compatibility relations of the space suit. The body of work that makes up this report includes experimentation, empirical and physics-based modeling, and model applications. A detailed space suit joint torque-angle database was compiled with a novel experimental approach that used space-suited human test subjects to generate realistic, multi-joint motions and an instrumented robot to measure the torques required to accomplish these motions in a space suit. Based on the experimental data, a mathematical model is developed to predict joint torque from the joint angle history. Two physics-based models of pressurized fabric cylinder bending are compared to experimental data, yielding design insights. The mathematical model is applied to EVA operations in an inverse kinematic analysis coupled to the space suit model to calculate the volume in which space-suited astronauts can work with their hands, demonstrating that operational human factors metrics can be predicted from fundamental space suit information.

  14. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-12-18

    This paper presents four algorithms to generate random forecast error time series, including a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, used for variable generation integration studies. A comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
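
    Two of the four generator families mentioned above (truncated-normal and ARMA) are easy to sketch; the parameter values below are illustrative and are not fitted to any historical wind or load forecast data set.

```python
# Minimal sketch of two forecast-error generators: truncated-normal draws and
# an ARMA(1,1) process. Parameter values are illustrative, not fitted to any
# historical wind/load forecast data.
import numpy as np

rng = np.random.default_rng(42)
n = 24 * 7                                     # one week of hourly errors

# 1) Truncated-normal errors: redraw samples that fall outside +/- bound
def truncated_normal_errors(n, sigma=0.05, bound=0.10):
    out = np.empty(n)
    for i in range(n):
        x = rng.normal(0.0, sigma)
        while abs(x) > bound:
            x = rng.normal(0.0, sigma)
        out[i] = x
    return out

# 2) ARMA(1,1) errors: e_t = phi*e_{t-1} + w_t + theta*w_{t-1}
def arma_errors(n, phi=0.8, theta=0.3, sigma_w=0.02):
    e, w_prev = np.zeros(n), 0.0
    for t in range(1, n):
        w = rng.normal(0.0, sigma_w)
        e[t] = phi * e[t - 1] + w + theta * w_prev
        w_prev = w
    return e

da_forecast_error = truncated_normal_errors(n)   # e.g. day-ahead load error
ha_forecast_error = arma_errors(n)               # e.g. hour-ahead wind error
print(da_forecast_error.std(), ha_forecast_error.std())
```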

  15. Space Weather Modeling at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Falasca, A.; Johnson, J.; Keller, K.; Kuznetsova, M.; Rastaetter, L.

    2003-04-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership aimed at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of NASA's Living With a Star (LWS) initiative, of the National Space Weather Program Implementation Plan, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the US Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate. We will demonstrate the capabilities of models resident at CCMC via the analysis of a geomagnetic storm, driven by a shock in the solar wind.

  16. Equations of motion for a spectrum-generating algebra: Lipkin Meshkov Glick model

    NASA Astrophysics Data System (ADS)

    Rosensteel, G.; Rowe, D. J.; Ho, S. Y.

    2008-01-01

    For a spectrum-generating Lie algebra, a generalized equations-of-motion scheme determines numerical values of excitation energies and algebra matrix elements. In the approach to the infinite particle number limit or, more generally, whenever the dimension of the quantum state space is very large, the equations-of-motion method may achieve results that are impractical to obtain by diagonalization of the Hamiltonian matrix. To test the method's effectiveness, we apply it to the well-known Lipkin-Meshkov-Glick (LMG) model to find its low-energy spectrum and associated generator matrix elements in the eigenenergy basis. When the dimension of the LMG representation space is 10^6, computation time on a notebook computer is a few minutes. For a large particle number in the LMG model, the low-energy spectrum makes a quantum phase transition from a nondegenerate harmonic vibrator to a twofold degenerate harmonic oscillator. The equations-of-motion method computes critical exponents at the transition point.
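
    For modest dimensions, the LMG low-energy spectrum can also be obtained by direct diagonalization in the collective-spin basis, which is the brute-force baseline that the equations-of-motion scheme is meant to avoid when the state space is very large. The Hamiltonian form and parameter values below are illustrative, not necessarily those used in the paper.

```python
# Direct diagonalization of an LMG-type Hamiltonian, H = -h*Jz - (k/N)*Jx^2, in
# the collective-spin basis |j = N/2, m>. This is the brute-force baseline that
# equations-of-motion methods aim to bypass for very large state spaces; the
# Hamiltonian form and parameters here are illustrative.
import numpy as np

def lmg_spectrum(N, h=1.0, k=2.0, n_levels=6):
    j = N / 2.0
    m = np.arange(-j, j + 1.0)                      # basis dimension N+1, not 2^N
    Jz = np.diag(m)
    jp = np.diag(np.sqrt(j * (j + 1) - m[:-1] * (m[:-1] + 1)), -1)   # <m+1|J+|m>
    Jx = 0.5 * (jp + jp.T)
    H = -h * Jz - (k / N) * (Jx @ Jx)
    return np.linalg.eigvalsh(H)[:n_levels]

# For k > h (symmetry-broken regime) the lowest levels pair into near-degenerate
# doublets, consistent with the twofold-degenerate oscillator limit at large N.
print(lmg_spectrum(1000))
```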

  17. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
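
    The synthetic validity test can be illustrated compactly: data are simulated from one data-generating regression model of single-trial amplitudes, noise is added, and candidate models are compared. The sketch below uses BIC as a simple stand-in for the Bayesian model selection via exceedance probabilities used in the study, and the predictors are random placeholders.

```python
# Compact synthetic validity test: data are generated from one regression model
# of single-trial amplitudes, noise is added, and candidate models are compared.
# BIC stands in for the study's Bayesian model selection (exceedance
# probabilities); predictors are random placeholders.
import numpy as np

rng = np.random.default_rng(7)

def fit_bic(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + k * np.log(n)

n_trials, noise_sd = 200, 1.0
x1, x2 = rng.normal(size=n_trials), rng.normal(size=n_trials)

# Data-generating model: amplitude depends on x1 only
y = 2.0 * x1 + noise_sd * rng.normal(size=n_trials)

candidates = {
    "intercept only": np.column_stack([np.ones(n_trials)]),
    "x1 (true)":      np.column_stack([np.ones(n_trials), x1]),
    "x1 + x2":        np.column_stack([np.ones(n_trials), x1, x2]),
}
scores = {name: fit_bic(X, y) for name, X in candidates.items()}
print(min(scores, key=scores.get))   # with enough trials and moderate noise: "x1 (true)"
```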

  18. Fusion of laser and image sensory data for 3-D modeling of the free navigation space

    NASA Technical Reports Server (NTRS)

    Mass, M.; Moghaddamzadeh, A.; Bourbakis, N.

    1994-01-01

    A fusion technique which combines two different types of sensory data for 3-D modeling of a navigation space is presented. The sensory data are generated by a vision camera and a laser scanner. The problem of different resolutions for these sensory data was solved by reducing the image resolution, fusing the different data, and using a fuzzy image segmentation technique.

  19. Deep Generative Models of Galaxy Images for the Calibration of the Next Generation of Weak Lensing Surveys

    NASA Astrophysics Data System (ADS)

    Lanusse, Francois; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Schneider, Jeff; Poczos, Barnabas

    2017-01-01

    Weak gravitational lensing has long been identified as one of the most powerful probes to investigate the nature of dark energy. As such, weak lensing is at the heart of the next generation of cosmological surveys such as LSST, Euclid, or WFIRST. One particularly critical source of systematic errors in these surveys comes from the shape measurement algorithms tasked with estimating galaxy shapes. GREAT3, the last community challenge to assess the quality of state-of-the-art shape measurement algorithms, has in particular demonstrated that all current methods are biased to various degrees and, more importantly, that these biases depend on the details of the galaxy morphologies. These biases can be measured and calibrated by generating mock observations where a known lensing signal has been introduced and comparing the resulting measurements to the ground-truth. Producing these mock observations, however, requires input galaxy images of higher resolution and S/N than the simulated survey, which typically implies acquiring extremely expensive space-based observations. The goal of this work is to train a deep generative model on already available Hubble Space Telescope data which can then be used to sample new galaxy images conditioned on parameters such as magnitude, size, or redshift, and exhibiting complex morphologies. Such a model allows us to inexpensively produce large sets of realistic images for calibration purposes. We implement a conditional generative model based on state-of-the-art deep learning methods and fit it to deep galaxy images from the COSMOS survey. The quality of the model is assessed by computing an extensive set of galaxy morphology statistics on the generated images. Beyond simple second moment statistics such as size and ellipticity, we apply more complex statistics specifically designed to be sensitive to disturbed galaxy morphologies. We find excellent agreement between the morphologies of real and model-generated galaxies. Our results suggest that such deep generative models represent a reliable alternative to the acquisition of expensive high-quality observations for generating the calibration data needed by the next generation of weak lensing surveys.

  20. The space-dependent model and output characteristics of intra-cavity pumped dual-wavelength lasers

    NASA Astrophysics Data System (ADS)

    He, Jin-Qi; Dong, Yuan; Zhang, Feng-Dong; Yu, Yong-Ji; Jin, Guang-Yong; Liu, Li-Da

    2016-01-01

    The intra-cavity pumping scheme used to simultaneously generate dual-wavelength lasers was previously proposed and published by us, and the space-independent model of quasi-three-level and four-level intra-cavity pumped dual-wavelength lasers was constructed based on this scheme. In this paper, to make the previous study more rigorous, a space-dependent model is adopted. As an example, the output characteristics of 946 nm and 1064 nm dual-wavelength lasers under different output mirror transmittances are numerically simulated using the derived formulas, and the results are nearly identical to what was previously reported.

  1. Laplacian scale-space behavior of planar curve corners.

    PubMed

    Zhang, Xiaohong; Qu, Ying; Yang, Dan; Wang, Hongxing; Kymer, Jeff

    2015-11-01

    Scale-space behavior of corners is important for developing an efficient corner detection algorithm. In this paper, we analyze the scale-space behavior with the Laplacian of Gaussian (LoG) operator on a planar curve, which constructs the Laplacian Scale Space (LSS). The analytical expression of a Laplacian Scale-Space map (LSS map) is obtained, demonstrating the Laplacian Scale-Space behavior of planar curve corners, based on a newly defined unified corner model. With this formula, some Laplacian Scale-Space behavior is summarized. Although LSS demonstrates some similarities to Curvature Scale Space (CSS), there are still some differences. First, no new extreme points are generated in the LSS. Second, the behavior of different cases of a corner model is consistent and simple. This makes it easy to trace the corner in a scale space. Finally, the behavior of LSS is verified in an experiment on a digital curve.
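
    A small numerical illustration of the LoG scale-space idea is given below: the coordinates of a synthetic right-angle corner are filtered with second-derivative-of-Gaussian kernels at increasing sigma, and the response magnitudes are stacked into an LSS-style map. This is only a sketch of the construction, not the paper's analytical unified corner model.

```python
# Sketch: Laplacian-of-Gaussian (LoG) scale-space responses for a synthetic
# right-angle corner. Each coordinate is filtered with a second-derivative-of-
# Gaussian kernel at increasing sigma; stacking the response magnitudes gives
# an LSS-style (scale x arclength) map. Illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter1d

n = 200
t = np.linspace(-1.0, 1.0, n)
x = np.where(t < 0, t, 0.0)          # corner: along the x-axis, then up the y-axis
y = np.where(t < 0, 0.0, t)

sigmas = np.linspace(1.0, 15.0, 30)
lss_map = np.empty((len(sigmas), n))
for i, sigma in enumerate(sigmas):
    xs = gaussian_filter1d(x, sigma, order=2, mode='nearest')
    ys = gaussian_filter1d(y, sigma, order=2, mode='nearest')
    lss_map[i] = np.hypot(xs, ys)    # LoG response magnitude along the curve

# The response extremum should stay near the corner (index ~ n // 2) at every
# scale, i.e. no new extrema are generated as sigma grows.
print(np.argmax(lss_map, axis=1)[::10])
```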

  2. Secondary electron generation, emission and transport: Effects on spacecraft charging and NASCAP models

    NASA Technical Reports Server (NTRS)

    Katz, Ira; Mandell, Myron; Roche, James C.; Purvis, Carolyn

    1987-01-01

    Secondary electrons control a spacecraft's response to a plasma environment. To accurately simulate spacecraft charging, the NASA Charging Analyzer Program (NASCAP) has mathematical models of the generation, emission and transport of secondary electrons. The importance of each of the processes and the physical basis for each of the NASCAP models are discussed. Calculations are presented which show that the NASCAP formulations are in good agreement with both laboratory and space experiments.

  3. Peripersonal Space and Margin of Safety around the Body: Learning Visuo-Tactile Associations in a Humanoid Robot with Artificial Skin

    PubMed Central

    Roncone, Alessandro; Fadiga, Luciano; Metta, Giorgio

    2016-01-01

    This paper investigates a biologically motivated model of peripersonal space through its implementation on a humanoid robot. Guided by the present understanding of the neurophysiology of the fronto-parietal system, we developed a computational model inspired by the receptive fields of polymodal neurons identified, for example, in brain areas F4 and VIP. The experiments on the iCub humanoid robot show that the peripersonal space representation i) can be learned efficiently and in real-time via a simple interaction with the robot, ii) can lead to the generation of behaviors like avoidance and reaching, and iii) can contribute to understanding the biological principle of motor equivalence. More specifically, with respect to i), the present model contributes to hypothesizing a learning mechanism for peripersonal space. In relation to point ii), we show how a relatively simple controller can exploit the learned receptive fields to generate either avoidance or reaching of an incoming stimulus, and for iii) we show how the robot can select arbitrary body parts as the controlled end-point of an avoidance or reaching movement. PMID:27711136

  4. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
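
    One small piece of the kind of physics module such a framework chains together is sketched below: a two-body orbit propagator with a fixed-step RK4 integrator. It is an illustrative stand-in only; the framework's perturbation models, signature calculations, and parallel infrastructure are not represented.

```python
# Two-body orbit propagation with a fixed-step RK4 integrator (no J2, drag,
# sensor models, or parallelism). Units are km and seconds.
import numpy as np

MU_EARTH = 398600.4418                     # km^3/s^2

def two_body(state):
    r, v = state[:3], state[3:]
    accel = -MU_EARTH * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, accel])

def rk4_step(state, dt):
    k1 = two_body(state)
    k2 = two_body(state + 0.5 * dt * k1)
    k3 = two_body(state + 0.5 * dt * k2)
    k4 = two_body(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Circular orbit at ~700 km altitude, propagated for one period in 10 s steps
r0 = 7078.0
state = np.array([r0, 0.0, 0.0, 0.0, np.sqrt(MU_EARTH / r0), 0.0])
period = 2.0 * np.pi * np.sqrt(r0 ** 3 / MU_EARTH)
dt = 10.0
for _ in range(int(period / dt)):
    state = rk4_step(state, dt)

print(np.linalg.norm(state[:3]) - r0)      # radius drift after one orbit (km, small)
```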

  5. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: (i) the PUS services tailoring activity for a specific mission; (ii) the mission-specific PUS services configuration; and (iii) the generation of the UML model of the software building block implementing the mission-specific PUS services and the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the characterization of the reference avionics architecture and of the reference on-board software high-level architecture.

  6. Entanglement Holographic Mapping of Many-Body Localized System by Spectrum Bifurcation Renormalization Group

    NASA Astrophysics Data System (ADS)

    You, Yi-Zhuang; Qi, Xiao-Liang; Xu, Cenke

    We introduce the spectrum bifurcation renormalization group (SBRG) as a generalization of the real-space renormalization group for the many-body localized (MBL) system without truncating the Hilbert space. Starting from a disordered many-body Hamiltonian in the full MBL phase, the SBRG flows to the MBL fixed-point Hamiltonian, and generates the local conserved quantities and the matrix product state representations for all eigenstates. The method is applicable to both spin and fermion models with arbitrary interaction strength on any lattice in all dimensions, as long as the models are in the MBL phase. In particular, we focus on the 1D interacting Majorana chain with strong disorder, and map out its phase diagram using the entanglement entropy. The SBRG flow also generates an entanglement holographic mapping, which maps the MBL state to a fragmented holographic space decorated with small black holes.

  7. Planning Inmarsat's second generation of spacecraft

    NASA Astrophysics Data System (ADS)

    Williams, W. P.

    1982-09-01

    Studies for the next generation of the Inmarsat service are outlined, covering traffic forecasting, communications capacity estimates, space segment design, cost estimates, and financial analysis. Traffic forecasting will require future demand estimates, and a computer model has been developed which estimates demand over the Atlantic, Pacific, and Indian ocean regions. Communications estimates are based on traffic estimates, as a model converts traffic demand into a required capacity figure for a given area. The Erlang formula is used, requiring additional data such as peak hour ratios and distribution estimates. Basic space segment technical requirements are outlined (communications payload, transponder arrangements, etc.), and further design studies involve such areas as space segment configuration, launcher and spacecraft studies, transmission planning, and earth segment configurations. Cost estimates of proposed design parameters will be performed, but options must be reduced to make construction feasible. Finally, a financial analysis will be carried out in order to calculate financial returns.

  8. Fracture prediction using modified Mohr-Coulomb theory for non-linear strain paths using AA3104-H19

    NASA Astrophysics Data System (ADS)

    Dick, Robert; Yoon, Jeong Whan

    2016-08-01

    Experimental results from uniaxial tensile tests, bi-axial bulge tests, and disk compression tests for a beverage can AA3104-H19 material are presented. The results from the experimental tests are used to determine material coefficients for both Yld2000 and Yld2004 models. Finite element simulations are developed to study the influence of the material model on the predicted earing profile. It is shown that only the Yld2004 model is capable of accurately predicting the earing profile, as the Yld2000 model predicts only four ears. Excellent agreement with the experimental data for earing is achieved using the AA3104-H19 material data and the Yld2004 constitutive model. Mechanical tests are also conducted on the AA3104-H19 to generate fracture data under different stress triaxiality conditions. Tensile tests are performed on specimens with a central hole and notched specimens. Torsion of a double bridge specimen is conducted to generate points near pure shear conditions. The Nakajima test is utilized to produce points in bi-axial tension. The data from the experiments are used to develop the fracture locus in the principal strain space. Mapping from principal strain space to stress triaxiality space, principal stress space, and polar effective plastic strain space is accomplished using a generalized mapping technique. Finite element modeling is used to validate the Modified Mohr-Coulomb (MMC) fracture model in the polar space. Models of a hole expansion during cup drawing and a cup draw/reverse redraw/expand forming sequence demonstrate the robustness of the modified PEPS fracture theory under nonlinear forming paths and accurately predict the onset of failure. The proposed methods can be widely used to predict failure in applications that undergo nonlinear strain paths, including rigid packaging and automotive forming.

  9. Application of Hyperspectral Techniques to Monitoring and Management of Invasive Plant Species Infestation

    DTIC Science & Technology

    2008-01-01

    [Abstract fragment] ... the sensor is a data cloud in multi-dimensional space, with each band generating an axis of dimension. When the data cloud is viewed in two or three ... endmember of interest is not a true endmember in the data space. [Figure 8: Linear mixture models; (A) two-dimensional ... (B) multi-dimensional space.] A classifier is a computer algorithm that takes ...

  10. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 1: Overall approach and data generation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.

  11. Vibrations and structureborne noise in space station

    NASA Technical Reports Server (NTRS)

    Vaicaitis, R.

    1985-01-01

    The related literature was reviewed and a preliminary analytical model was developed for simplified acoustic and structural geometries for pressurized and unpressurized space station modules. In addition to the analytical work, an experimental program on structureborne noise generation and transmission was started. A brief review of those accomplishments is given.

  12. Adaptive Automation Design and Implementation

    DTIC Science & Technology

    2015-09-17

    [Abstract fragment] Study: Space Navigator. This section demonstrates the player modeling paradigm, focusing specifically on the response generation section of the player ... human-machine system, a real-time player modeling framework for imitating a specific person's task performance, and the Adaptive Automation System ... [contents fragments: Model; Clustering-Based Real-Time Player Modeling; An ...]

  13. Modeling AWSoM CMEs with EEGGL: A New Approach for Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Jin, M.; Manchester, W.; van der Holst, B.; Sokolov, I.; Toth, G.; Vourlidas, A.; de Koning, C. A.; Gombosi, T. I.

    2015-12-01

    The major source of destructive space weather is coronal mass ejections (CMEs). However, our understanding of CMEs and their propagation in the heliosphere is limited by insufficient observations. Therefore, the development of first-principles numerical models plays a vital role in both theoretical investigation and providing space weather forecasts. Here, we present results of the simulation of CME propagation from the Sun to 1 AU by combining the analytical Gibson & Low (GL) flux rope model with the state-of-the-art solar wind model AWSoM. We also provide an approach for transferring this research model to a space weather forecasting tool by demonstrating how the free parameters of the GL flux rope can be prescribed based on remote observations via the new Eruptive Event Generator by Gibson-Low (EEGGL) toolkit. This capability allows us to predict the long-term evolution of the CME in interplanetary space. We perform proof-of-concept case studies to show the capability of the model to capture physical processes that determine CME evolution while also reproducing many observed features both in the corona and at 1 AU. We discuss the potential and limitations of this model as a future space weather forecasting tool.

  14. Advancing Space Sciences through Undergraduate Research Experiences at UC Berkeley's Space Sciences Laboratory - a novel approach to undergraduate internships for first generation community college students

    NASA Astrophysics Data System (ADS)

    Raftery, C. L.; Davis, H. B.; Peticolas, L. M.; Paglierani, R.

    2015-12-01

    The Space Sciences Laboratory at UC Berkeley launched an NSF-funded Research Experience for Undergraduates (REU) program in the summer of 2015. The "Advancing Space Sciences through Undergraduate Research Experiences" (ASSURE) program recruited heavily from local community colleges and universities, and provided a multi-tiered mentorship program for students in the fields of space science and engineering. The program was focused on providing a supportive environment for 2nd and 3rd year undergraduates, many of whom were first-generation and underrepresented students. This model provides three levels of mentorship support for the participating interns: 1) the primary research advisor provides academic and professional support. 2) The program coordinator, who meets with the interns multiple times per week, provides personal support and helps the interns to assimilate into the highly competitive environment of the research laboratory. 3) Returning undergraduate interns provided peer support and guidance to the new cohort of students. The impacts of this program on the first-generation students and the research mentors, as well as the lessons learned, will be discussed.

  15. Scientific and Technical Development of the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Burg, Richard

    2003-01-01

    The Next Generation Space Telescope (NGST) is part of the Origins program and is the key mission to discover the origins of galaxies in the Universe. It is essential that scientific requirements be translated into technical specifications at the beginning of the program and that there is technical participation by astronomers in the design and modeling of the observatory. During the active time period of this grant, the PI took part in the NGST program at GSFC by participating in the development of the Design Reference Mission, the development of the full end-to-end model of the observatory, the design trade-offs based on the modeling, the Science Instrument Module definition and modeling, the study of proto-mission and test-bed development, and by attending meetings including quarterly reviews and support of the NGST SWG. This work was documented in a series of NGST Monographs that are available on the NGST web site.

  16. Fermion Systems in Discrete Space-Time Exemplifying the Spontaneous Generation of a Causal Structure

    NASA Astrophysics Data System (ADS)

    Diethert, A.; Finster, F.; Schiefeneder, D.

    As toy models for space-time at the Planck scale, we consider examples of fermion systems in discrete space-time which are composed of one or two particles defined on two to nine space-time points. We study the self-organization of the particles as described by a variational principle both analytically and numerically. We find an effect of spontaneous symmetry breaking which leads to the emergence of a discrete causal structure.

  17. Turing instability in reaction-diffusion models on complex networks

    NASA Astrophysics Data System (ADS)

    Ide, Yusuke; Izuhara, Hirofumi; Machida, Takuya

    2016-09-01

    In this paper, the Turing instability in reaction-diffusion models defined on complex networks is studied. Here, we focus on three types of models which generate complex networks, i.e. the Erdős-Rényi, the Watts-Strogatz, and the threshold network models. From analysis of the Laplacian matrices of graphs generated by these models, we numerically reveal that stable and unstable regions of a homogeneous steady state on the parameter space of two diffusion coefficients completely differ, depending on the network architecture. In addition, we theoretically discuss the stable and unstable regions in the cases of regular enhanced ring lattices which include regular circles, and networks generated by the threshold network model when the number of vertices is large enough.
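
    To illustrate the kind of analysis this record describes, the following is a minimal sketch (not the authors' code) that computes the Laplacian spectrum of an Erdős-Rényi graph and flags the Turing-unstable modes for an assumed activator-inhibitor linearization; the Jacobian values and diffusion coefficients are illustrative placeholders.

```python
import numpy as np
import networkx as nx

# Linearized reaction Jacobian at the homogeneous steady state (illustrative values).
J = np.array([[0.5, -1.0],
              [1.0, -1.2]])

def unstable_modes(graph, du, dv, J):
    """Return the Laplacian eigenvalues whose linear growth rate is positive."""
    lam = np.linalg.eigvalsh(nx.laplacian_matrix(graph).toarray().astype(float))
    growth = []
    for k in lam:
        # Stability matrix of the mode associated with Laplacian eigenvalue k.
        M = J - np.diag([du, dv]) * k
        growth.append(np.max(np.linalg.eigvals(M).real))
    growth = np.array(growth)
    return lam[growth > 0]

G = nx.erdos_renyi_graph(n=200, p=0.05, seed=1)
print(unstable_modes(G, du=0.02, dv=1.0, J=J))
```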

  18. High-resolution stochastic downscaling of climate models: simulating wind advection, cloud cover and precipitation

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Burlando, Paolo

    2015-04-01

    A new stochastic approach to generate wind advection, cloud cover and precipitation fields is presented with the aim of formulating a space-time weather generator characterized by fields with high spatial and temporal resolution (e.g., 1 km x 1 km and 5 min). Its use is suitable for stochastic downscaling of climate scenarios in the context of hydrological, ecological and geomorphological applications. The approach is based on concepts from the Advanced WEather GENerator (AWE-GEN) presented by Fatichi et al. (2011, Adv. Water Resour.), the Space-Time Realizations of Areal Precipitation model (STREAP) introduced by Paschalis et al. (2013, Water Resour. Res.), and the High-Resolution Synoptically conditioned Weather Generator (HiReS-WG) presented by Peleg and Morin (2014, Water Resour. Res.). Advection fields are generated on the basis of the 500 hPa u and v wind variables derived from global or regional climate models. The advection velocity and direction are parameterized using Kappa and von Mises distributions, respectively. Random Gaussian fields are generated using a fast Fourier transform to preserve the spatial correlation of advection. The cloud cover area, total precipitation area and mean advection of the field are coupled using a multi-autoregressive model. The approach is relatively parsimonious in terms of computational demand and, in the context of climate change, allows generating many stochastic realizations of current and projected climate in a fast and efficient way. A preliminary test of the approach is presented with reference to a case study in complex orographic terrain in the Swiss Alps.
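
    As a small illustration of one ingredient mentioned above, the sketch below generates a spatially correlated Gaussian random field with a fast Fourier transform; the power-law spectrum is an assumption chosen for illustration and is not the parameterization used in AWE-GEN, STREAP or HiReS-WG.

```python
import numpy as np

def gaussian_random_field(n=256, beta=3.0, seed=0):
    """Generate an n x n Gaussian field with a power-law spectrum ~ k**(-beta) via FFT."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                      # avoid division by zero at the DC component
    amplitude = k ** (-beta / 2.0)
    amplitude[0, 0] = 0.0              # zero-mean field
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    field = np.fft.ifft2(noise * amplitude).real
    return (field - field.mean()) / field.std()

advection_field = gaussian_random_field()
print(advection_field.shape, advection_field.std())
```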

  19. Relation of the runaway avalanche threshold to momentum space topology

    NASA Astrophysics Data System (ADS)

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian-Zhu

    2018-02-01

    The underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  20. Space station electrical power system availability study

    NASA Technical Reports Server (NTRS)

    Turnquist, Scott R.; Twombly, Mark A.

    1988-01-01

    ARINC Research Corporation performed a preliminary reliability, availability, and maintainability (RAM) analysis of the NASA space station Electrical Power System (EPS). The analysis was performed using the ARINC Research developed UNIRAM RAM assessment methodology and software program. The analysis was performed in two phases: EPS modeling and EPS RAM assessment. The EPS was modeled in four parts: the insolar power generation system, the eclipse power generation system, the power management and distribution system (both ring and radial power distribution control unit (PDCU) architectures), and the power distribution to the inner keel PDCUs. The EPS RAM assessment was conducted in five steps: the use of UNIRAM to perform baseline EPS model analyses and to determine the orbital replacement unit (ORU) criticalities; the determination of EPS sensitivity to on-orbit sparing of ORUs and the provision of an indication of which ORUs may need to be spared on-orbit; the determination of EPS sensitivity to changes in ORU reliability; the determination of the expected annual number of ORU failures; and the integration of the power generation system model results with the distribution system model results to assess the full EPS. Conclusions were drawn and recommendations were made.

  1. Space charge effects in ultrafast electron diffraction and imaging

    NASA Astrophysics Data System (ADS)

    Tao, Zhensheng; Zhang, He; Duxbury, P. M.; Berz, Martin; Ruan, Chong-Yu

    2012-02-01

    Understanding space charge effects is central for the development of high-brightness ultrafast electron diffraction and microscopy techniques for imaging material transformation with atomic scale detail at the fs to ps timescales. We present methods and results for direct ultrafast photoelectron beam characterization employing a shadow projection imaging technique to investigate the generation of ultrafast, non-uniform, intense photoelectron pulses in a dc photo-gun geometry. Combined with N-particle simulations and an analytical Gaussian model, we elucidate three essential space-charge-led features: the pulse lengthening following a power-law scaling, the broadening of the initial energy distribution, and the virtual cathode threshold. The impacts of these space charge effects on the performance of the next generation high-brightness ultrafast electron diffraction and imaging systems are evaluated.

  2. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  3. Modeling hydrogen-cyanide absorption in fires

    NASA Technical Reports Server (NTRS)

    Cagliostro, D. E.; Islas, A.

    1981-01-01

    A mathematical model is developed for predicting blood concentrations of cyanide as functions of exposure time to constant levels of cyanide in the atmosphere. A toxic gas (which may form as a result of decomposition of combustion materials used in transportation vehicles) is breathed into the alveolar space and transferred from the alveolar space to the blood by a first-order process, dependent on the concentration of the toxicant in the alveolar space. The model predicts that blood cyanide levels are more sensitive to the breathing cycle than to blood circulation. A model estimate of the relative effects of CO and HCN atmospheres, generated in an experimental chamber with an epoxy polymer, shows that toxic effects of cyanide occur long before those of carbon monoxide.
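
    A minimal sketch of the first-order uptake process described in this record is given below; the rate constant and alveolar concentration are hypothetical placeholders, not values from the report, and the constant-exposure closed-form solution is an assumed simplification.

```python
import numpy as np

# Hypothetical parameters: k_uptake is the first-order transfer coefficient (1/min),
# c_alv the (constant) alveolar HCN concentration; both are illustrative only.
k_uptake = 0.05      # 1/min
c_alv = 200.0        # alveolar concentration in arbitrary units

def blood_concentration(t_minutes):
    """First-order approach to equilibrium: dc/dt = k_uptake * (c_alv - c), c(0) = 0."""
    return c_alv * (1.0 - np.exp(-k_uptake * t_minutes))

for t in (1, 5, 15, 30):
    print(t, "min:", round(blood_concentration(t), 1))
```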

  4. GTM-Based QSAR Models and Their Applicability Domains.

    PubMed

    Gaspar, H A; Baskin, I I; Marcou, G; Horvath, D; Varnek, A

    2015-06-01

    In this paper we demonstrate that Generative Topographic Mapping (GTM), a machine learning method traditionally used for data visualisation, can be efficiently applied to QSAR modelling using probability distribution functions (PDF) computed in the latent 2-dimensional space. Several different scenarios of activity assessment were considered: (i) the "activity landscape" approach based on direct use of PDF, (ii) QSAR models built on GTM-generated descriptors derived from PDF, and (iii) the k-Nearest Neighbours approach in the 2D latent space. Benchmarking calculations were performed on five different datasets: stability constants of the complexes of the metal cations Ca(2+), Gd(3+) and Lu(3+) with organic ligands in water, aqueous solubility, and the activity of thrombin inhibitors. It has been shown that the performance of GTM-based regression models is similar to that obtained with some popular machine-learning methods (random forest, k-NN, M5P regression tree and PLS) and ISIDA fragment descriptors. By comparing GTM activity landscapes built both on predicted and experimental activities, we may visually assess the model's performance and identify the areas in chemical space corresponding to reliable predictions. The applicability domain used in this work is based on data likelihood. Its application has significantly improved the model performance for 4 out of 5 datasets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Highly coherent vacuum ultraviolet radiation at the 15th harmonic with echo-enabled harmonic generation technique

    NASA Astrophysics Data System (ADS)

    Hemsing, E.; Dunning, M.; Hast, C.; Raubenheimer, T. O.; Weathersby, S.; Xiang, D.

    2014-07-01

    X-ray free-electron lasers are enabling access to new science by producing ultrafast and intense x rays that give researchers unparalleled power and precision in examining the fundamental nature of matter. In the quest for fully coherent x rays, the echo-enabled harmonic generation technique is one of the most promising methods. In this technique, coherent radiation at the high harmonic frequencies of two seed lasers is generated from the recoherence of electron beam phase space memory. Here we report on the generation of highly coherent and stable vacuum ultraviolet radiation at the 15th harmonic of an infrared seed laser with this technique. The experiment demonstrates two distinct advantages that are intrinsic to the highly nonlinear phase space gymnastics of echo-enabled harmonic generation in a new regime, i.e., high frequency up-conversion efficiency and insensitivity to electron beam phase space imperfections. Our results allow comparison and confirmation of predictive models and scaling laws, and mark a significant step towards fully coherent x-ray free-electron lasers that will open new scientific research.

  6. Visitors Center activities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Astronaut Katherine Hire and LEGO Master Model Builders assisted children from Mississippi and Louisiana in the building of a 12-foot tall Space Shuttle made entirely from tiny LEGO bricks at the John C. Stennis Space Center Visitors Center in South Mississippi. The shuttle was part of an exhibit titled 'Travel in Space' World Show, which depicts the history of flight and space travel from the Wright brothers to future generations of space vehicles. For more information concerning hours of operation or Visitors Center educational programs, call 1-800-237-1821 in Mississippi and Louisiana or (601) 688-2370.

  7. Design of a 10 GHz, 10 MW Gyrotron.

    DTIC Science & Technology

    1985-11-27

    beam, which can be located close to the cavity wall, reducing space charge effects. In addition, high current density beams can be generated (6) with the...calculates electron trajectories within potential boundaries, including the effects of beam space charge, and is fully relativistic. Modeling the... space charge would cause the bottom electrons to have too little perpendicular energy, and vice versa, as illustrated in Figures 11 and 12. The

  8. Modeling from Local to Subsystem Level Effects in Analog and Digital Circuits Due to Space Induced Single Event Transients

    NASA Technical Reports Server (NTRS)

    Perez, Reinaldo J.

    2011-01-01

    Single Event Transients induced in analog and digital electronics by highly energetic nuclear particles in space can disrupt, either temporarily or sometimes permanently, the functionality and performance of electronics in space vehicles. This work first provides some insights into the modeling of SET in electronic circuits that can be used in SPICE-like simulators. The work then presents methodologies, one of which was developed by this author, for the assessment of SET at different levels of integration in electronics, from the circuit level to the subsystem level.

  9. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  10. Exploring theory space with Monte Carlo reweighting

    DOE PAGES

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.; ...

    2014-10-13

    Theories of new physics often involve a large number of unknown parameters which need to be scanned. Additionally, a putative signal in a particular channel may be due to a variety of distinct models of new physics. This makes experimental attempts to constrain the parameter space of motivated new physics models with a high degree of generality quite challenging. We describe how the reweighting of events may allow this challenge to be met, as fully simulated Monte Carlo samples generated for arbitrary benchmark models can be effectively re-used. Specifically, we suggest procedures that allow more efficient collaboration between theorists and experimentalists in exploring large theory parameter spaces in a rigorous way at the LHC.
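
    The following sketch illustrates the basic reweighting idea (reusing a sample generated under one benchmark model to estimate observables under another by weighting each event with the density ratio); the one-dimensional toy densities are assumptions for illustration, not the LHC use case described in the record.

```python
import numpy as np

rng = np.random.default_rng(42)

# Events generated once under a benchmark model A (here: a unit-rate exponential spectrum).
events = rng.exponential(scale=1.0, size=100_000)

def density_a(x):
    return np.exp(-x)                      # benchmark model A

def density_b(x, slope=1.5):
    return slope * np.exp(-slope * x)      # alternative model B in the theory scan

# Reweight: w_i = p_B(x_i) / p_A(x_i); any observable under B becomes a weighted mean.
weights = density_b(events) / density_a(events)
mean_under_b = np.average(events, weights=weights)
print(mean_under_b, 1 / 1.5)               # reweighted estimate vs. exact mean of model B
```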

  11. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  12. Towards semantically sensitive text clustering: a feature space modeling technology based on dimension extension.

    PubMed

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach.
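
    A minimal sketch of the similarity-combination step described above is given below, assuming TF-IDF vectors and a hypothetical concept-expansion stand-in for the extension space (scikit-learn is assumed to be available); the mixing weight and toy documents are illustrative.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["the spacecraft entered orbit",
        "the probe reached orbit around mars",
        "stock markets fell sharply today"]

# Hypothetical "extension" of each document with related concepts (a stand-in for the
# dimension-extension step described in the record).
extended = [d + " satellite mission" if "orbit" in d else d + " finance economy"
            for d in docs]

alpha = 0.6  # weight of the traditional feature space vs. the extension space
sim_base = cosine_similarity(TfidfVectorizer().fit_transform(docs))
sim_ext = cosine_similarity(TfidfVectorizer().fit_transform(extended))
combined = alpha * sim_base + (1 - alpha) * sim_ext

print(np.round(combined, 2))   # documents 0 and 1 end up far more similar than either is to 2
```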

  13. Towards Semantically Sensitive Text Clustering: A Feature Space Modeling Technology Based on Dimension Extension

    PubMed Central

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach. PMID:25794172

  14. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  15. Analysis of screeching in a cold flow jet experiment

    NASA Technical Reports Server (NTRS)

    Wang, M. E.; Slone, R. M., Jr.; Robertson, J. E.; Keefe, L.

    1975-01-01

    The screech phenomenon observed in a one-sixtieth scale model space shuttle test of the solid rocket booster exhaust flow noise has been investigated. A critical review is given of the cold flow test data representative of Space Shuttle launch configurations to define those parameters which contribute to screech generation. An acoustic feedback mechanism is found to be responsible for the generation of screech. A simple equation which permits prediction of the screech frequency in terms of basic test parameters, such as the jet exhaust Mach number and the separation distance from the nozzle exit to the surface of the model launch pad, is presented and found to be in good agreement with the test data. Finally, techniques are recommended to eliminate or reduce the screech.

  16. Results of tests of advanced flexible insulation vortex and flow environments in the North American Aerodynamics Laboratory lowspeed wind tunnel using 0.0405-scale Space Shuttle Orbiter model 16-0 (test OA-309)

    NASA Technical Reports Server (NTRS)

    Marshall, B. A.; Nichols, M. E.

    1984-01-01

    An experimental investigation (Test OA-309) was conducted using 0.0405-scale Space Shuttle Orbiter Model 16-0 in the North American Aerodynamics Laboratory 7.75 x 11.00-foot Lowspeed Wind Tunnel. The primary purpose was to locate and study any flow conditions or vortices that might have caused damage to the Advanced Flexible Reusable Surface Insulation (AFRSI) during the Space Transportation System STS-6 mission. A secondary objective was to evaluate vortex generators to be used for Wind Tunnel Test OS-314. Flowfield visualization was obtained by means of smoke, tufts, and oil flow. The test was conducted at Mach numbers between 0.07 and 0.23 and at dynamic pressures between 7 and 35 pounds per square foot. The angle-of-attack range of the model was -5 degrees through 35 degrees at 0 or 2 degrees of sideslip, while roll angle was held constant at zero degrees. The vortex generators were studied at angles of 0, 5, 10, and 15 degrees.

  17. Dynamic Modeling of Solar Dynamic Components and Systems

    NASA Technical Reports Server (NTRS)

    Hochstein, John I.; Korakianitis, T.

    1992-01-01

    The purpose of this grant was to support NASA in modeling efforts to predict the transient dynamic and thermodynamic response of the space station solar dynamic power generation system. In order to meet the initial schedule requirement of providing results in time to support installation of the system as part of the initial phase of the space station, early efforts were executed with alacrity and often in parallel. Initially, methods to predict the transient response of a Rankine as well as a Brayton cycle were developed. Review of preliminary design concepts led NASA to select a regenerative gas-turbine cycle using a helium-xenon mixture as the working fluid and, from that point forward, the modeling effort focused exclusively on that system. Although initial project planning called for a three-year period of performance, revised NASA schedules moved system installation to later and later phases of station deployment. Eventually, NASA elected to halt development of the solar dynamic power generation system for the space station and to reduce support for this project to two-thirds of the original level.

  18. Quanta of Geometry and Unification

    NASA Astrophysics Data System (ADS)

    Chamseddine, Ali H.

    This is a tribute to Abdus Salam's memory whose insight and creative thinking set for me a role model to follow. In this contribution I show that the simple requirement of volume quantization in space-time (with Euclidean signature) uniquely determines the geometry to be that of a noncommutative space whose finite part is based on an algebra that leads to Pati-Salam grand unified models. The Standard Model corresponds to a special case where a mathematical constraint (order one condition) is satisfied. This provides evidence that Salam was a visionary who was generations ahead of his time.

  19. Trajectory generation for an on-road autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Horst, John; Barbera, Anthony

    2006-05-01

    We describe an algorithm that generates a smooth trajectory (position, velocity, and acceleration at uniformly sampled instants of time) for a car-like vehicle autonomously navigating within the constraints of lanes in a road. The technique models both vehicle paths and lane segments as straight line segments and circular arcs for mathematical simplicity and elegance, which we contrast with cubic spline approaches. We develop the path in an idealized space, warp the path into real space and compute path length, generate a one-dimensional trajectory along the path length that achieves target speeds and positions, and finally, warp, translate, and rotate the one-dimensional trajectory points onto the path in real space. The algorithm moves a vehicle in lane safely and efficiently within speed and acceleration maximums. The algorithm functions in the context of other autonomous driving functions within a carefully designed vehicle control hierarchy.
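
    A minimal sketch of the one-dimensional trajectory step described above is given below: a trapezoidal speed profile sampled at uniform time instants along the path length, mapped onto a path made of a straight segment followed by a circular arc. The speeds, acceleration limit and geometry are illustrative, not the authors' parameters.

```python
import numpy as np

def trapezoidal_profile(path_length, v_max, a_max, dt=0.05):
    """Distance travelled along the path at uniform time steps (trapezoidal speed profile)."""
    t_acc = v_max / a_max
    d_acc = 0.5 * a_max * t_acc ** 2
    if 2 * d_acc > path_length:                      # triangular profile: v_max never reached
        t_acc = np.sqrt(path_length / a_max)
        v_max = a_max * t_acc
        d_acc = 0.5 * a_max * t_acc ** 2
    t_cruise = (path_length - 2 * d_acc) / v_max
    t_total = 2 * t_acc + t_cruise
    t = np.minimum(np.arange(0.0, t_total + dt, dt), t_total)
    s = np.where(t < t_acc, 0.5 * a_max * t ** 2,
        np.where(t < t_acc + t_cruise, d_acc + v_max * (t - t_acc),
                 path_length - 0.5 * a_max * (t_total - t) ** 2))
    return t, np.clip(s, 0.0, path_length)

# Path: 20 m straight segment followed by a quarter circle of radius 10 m.
straight_len, radius = 20.0, 10.0
t, s = trapezoidal_profile(straight_len + np.pi * radius / 2, v_max=5.0, a_max=1.5)

def point_on_path(s_i):
    if s_i <= straight_len:                          # straight segment along +x
        return np.array([s_i, 0.0])
    theta = (s_i - straight_len) / radius            # arc angle traversed
    return np.array([straight_len + radius * np.sin(theta), radius * (1 - np.cos(theta))])

xy = np.array([point_on_path(si) for si in s])
print(xy[0], xy[len(xy) // 2], xy[-1])
```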

  20. Investigation into Text Classification With Kernel Based Schemes

    DTIC Science & Technology

    2010-03-01

    Document Matrix; TDMs Term-Document Matrices; TMG Text to Matrix Generator; TN True Negative; TP True Positive; VSM Vector Space Model...are represented as a term-document matrix, common evaluation metrics, and the software package Text to Matrix Generator (TMG). The classifier...AND METRICS This chapter introduces the indexing capabilities of the Text to Matrix Generator (TMG) Toolbox. Specific attention is placed on the

  1. On the tensionless limit of gauged WZW models

    NASA Astrophysics Data System (ADS)

    Bakas, I.; Sourdis, C.

    2004-06-01

    The tensionless limit of gauged WZW models arises when the level of the underlying Kac-Moody algebra assumes its critical value, equal to the dual Coxeter number, in which case the central charge of the Virasoro algebra becomes infinite. We examine this limit from the world-sheet and target space viewpoint and show that gravity decouples naturally from the spectrum. Using the two-dimensional black-hole coset SL(2,ℝ)_k/U(1) as illustrative example, we find for k = 2 that the world-sheet symmetry is described by a truncated version of W∞ generated by chiral fields with integer spin s ≥ 3, whereas the Virasoro algebra becomes abelian and it can be consistently factored out. The geometry of target space looks like an infinitely curved hyperboloid, which invalidates the effective field theory description, and conformal invariance can no longer be used to yield a reliable space-time interpretation. We also compare our results with the null gauging of WZW models, which corresponds to an infinite boost in target space and describes the Liouville mode that decouples in the tensionless limit. A formal BRST analysis of the world-sheet symmetry suggests that the central charge of all higher spin generators should be fixed to a critical value, which is not seen by the contracted Virasoro symmetry. Generalizations to higher dimensional coset models are also briefly discussed in the tensionless limit, where similar observations are made.

  2. Quantifying the VNIR Effects of Nanophase Iron Generated through the Space Weathering of Silicates: Reconciling Modeled Data with Laboratory Observations

    NASA Astrophysics Data System (ADS)

    Legett, C., IV; Glotch, T. D.; Lucey, P. G.

    2015-12-01

    Space weathering is a diverse set of processes that occur on the surfaces of airless bodies due to exposure to the space environment. One of the effects of space weathering is the generation of nanophase iron particles in glassy rims on mineral grains due to sputtering of iron-bearing minerals. These particles have a size-dependent effect on visible and near infrared (VNIR) reflectance spectra with smaller diameter particles (< 50 nm) causing both reddening and darkening of the spectra with respect to unweathered material (Britt-Pieters particle behavior), while larger particles (> 300 nm) darken without reddening. Between these two sizes, a gradual shift between these two behaviors occurs. In this work, we present results from the Multiple Sphere T-Matrix (MSTM) scattering model in combination with Hapke theory to explore the particle size and iron content parameter spaces with respect to VNIR (700-1700 nm) spectral slope. Previous work has shown that the MSTM-Hapke hybrid model offers improvements over Mie-Hapke models. Virtual particles are constructed out of an arbitrary number of spheres, and each sphere is assigned a refractive index and extinction coefficient for each wavelength of interest. The model then directly solves Maxwell's Equations at every wave-particle interface to predict the scattering, extinction and absorption efficiencies. These are then put into a simplified Hapke bidirectional reflectance model that yields a predicted reflectance. Preliminary results show an area of maximum slopes for iron particle diameters < 80 nm and iron concentrations of ~1-10wt% in an amorphous silica matrix. Further model runs are planned to better refine the extent of this region. Companion laboratory work using mixtures of powdered aerogel and nanophase iron particles provides a point of comparison to modeling efforts. The effects on reflectance and emissivity values due to particle size in a nearly ideal scatterer (aerogel) are also observed with comparisons to model data.

  3. Prediction and generation of binary Markov processes: Can a finite-state fox catch a Markov mouse?

    NASA Astrophysics Data System (ADS)

    Ruebeck, Joshua B.; James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2018-01-01

    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independently, identically distributed.
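
    As a small companion to this record, the sketch below samples a binary Markov process from assumed transition probabilities and re-estimates them from the sample; it illustrates the process class only, not the minimal-generator construction of the paper, and the probabilities are illustrative.

```python
import numpy as np

def generate_binary_markov(p01, p10, n, seed=0):
    """Sample a length-n binary Markov chain with P(1|0) = p01 and P(0|1) = p10."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(0, 2)
    for i in range(1, n):
        flip = p01 if x[i - 1] == 0 else p10
        x[i] = 1 - x[i - 1] if rng.random() < flip else x[i - 1]
    return x

x = generate_binary_markov(p01=0.2, p10=0.4, n=100_000)
# Re-estimate the transition probabilities from the generated sample.
est_p01 = np.mean(x[1:][x[:-1] == 0])
est_p10 = np.mean(1 - x[1:][x[:-1] == 1])
print(round(est_p01, 3), round(est_p10, 3))
```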

  4. High-resolution stochastic generation of extreme rainfall intensity for urban drainage modelling applications

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Blumensaat, Frank; Molnar, Peter; Fatichi, Simone; Burlando, Paolo

    2016-04-01

    Urban drainage response is highly dependent on the spatial and temporal structure of rainfall. Therefore, measuring and simulating rainfall at a high spatial and temporal resolution is a fundamental step to fully assess urban drainage system reliability and related uncertainties. This is even more relevant when considering extreme rainfall events. However, current space-time rainfall models have limitations in capturing extreme rainfall intensity statistics for short durations. Here, we use the STREAP (Space-Time Realizations of Areal Precipitation) model, which is a novel stochastic rainfall generator for simulating high-resolution rainfall fields that preserve the spatio-temporal structure of rainfall and its statistical characteristics. The model enables the generation of rain fields at 10² m and minute scales in a fast and computer-efficient way, matching the requirements for hydrological analysis of urban drainage systems. The STREAP model was applied successfully in the past to generate high-resolution extreme rainfall intensities over a small domain. A sub-catchment in the city of Luzern (Switzerland) was chosen as a case study to: (i) evaluate the ability of STREAP to disaggregate extreme rainfall intensities for urban drainage applications; (ii) assess the role of stochastic climate variability of rainfall in flow response; and (iii) evaluate the degree of non-linearity between extreme rainfall intensity and system response (i.e. flow) for a small urban catchment. The channel flow at the catchment outlet is simulated by means of a calibrated hydrodynamic sewer model.

  5. Relation of the runaway avalanche threshold to momentum space topology

    DOE PAGES

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian -Zhu

    2018-01-05

    Here, the underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  6. Relation of the runaway avalanche threshold to momentum space topology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian -Zhu

    Here, the underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  7. The Development of a Stochastic Model of the Atmosphere Between 30 and 90 Km to Be Used in Determining the Effect of Atmospheric Variability on Space Shuttle Entry Parameters. Ph.D. Thesis - Virginia Polytechnic Inst. and State Univ.

    NASA Technical Reports Server (NTRS)

    Campbell, J. W.

    1973-01-01

    A stochastic model of the atmosphere between 30 and 90 km was developed for use in Monte Carlo space shuttle entry studies. The model is actually a family of models, one for each latitude-season category as defined in the 1966 U.S. Standard Atmosphere Supplements. Each latitude-season model generates a pseudo-random temperature profile whose mean is the appropriate temperature profile from the Standard Atmosphere Supplements. The standard deviation of temperature at each altitude for a given latitude-season model was estimated from sounding-rocket data. Departures from the mean temperature at each altitude were produced by assuming a linear regression of temperature on the solar heating rate of ozone. A profile of random ozone concentrations was first generated using an auxiliary stochastic ozone model, also developed as part of this study, and then solar heating rates were computed for the random ozone concentrations.
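
    A minimal sketch of the general idea (a mean profile plus regression-driven random departures) is shown below; the profiles, standard deviations, correlation structure and regression coefficient are illustrative placeholders, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)
alt = np.arange(30, 91, 5)                         # altitude grid, km
t_mean = 250.0 - 2.0 * (alt - 60)**2 / 100.0       # illustrative mean temperature profile, K
sigma_t = 3.0 + 0.1 * (alt - 30)                   # illustrative temperature std. dev., K

# Random ozone-heating anomaly (standardized), made altitude-correlated via an AR(1) walk.
heating_anom = np.empty(alt.size)
heating_anom[0] = rng.normal()
for i in range(1, alt.size):
    heating_anom[i] = 0.8 * heating_anom[i - 1] + np.sqrt(1 - 0.8**2) * rng.normal()

beta = 0.7                                         # regression of temperature on heating anomaly
t_random = t_mean + sigma_t * (beta * heating_anom
                               + np.sqrt(1 - beta**2) * rng.normal(size=alt.size))
print(np.column_stack([alt, np.round(t_random, 1)]))
```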

  8. A model and simulation of fast space charge pulses in polymers

    NASA Astrophysics Data System (ADS)

    Lv, Zepeng; Rowland, Simon M.; Wu, Kai

    2017-11-01

    The transport of space charge packets across polyethylene and epoxy resin in high electric fields has been characterized as fast or slow depending on packet mobility. Several explanations for the formation and transport of slow space charge packets have been proposed, but the origins of fast space charge pulses, with mobilities above 10-11 m2 V-1 s-1, are unclear. In one suggested model, it is assumed that the formation of fast charge pulses is due to discontinuous electromechanical compression and charge injection at the electrode-insulation interface, and their transport is related to corresponding relaxation processes. In that model, charges travel as a pulse because of group polarization. This paper provides an alternative model based on the reduction of charge carrier activation energy due to charge density triggered polymer chain movement and subsequent chain relaxation times. The generation and transport of fast charge pulses are readily simulated by a bipolar charge transport model with three additional parameters: reduced activation energy, charge density threshold, and chain relaxation time. Such a model is shown to reproduce key features of fast space charge pulses including speed, duration, repetition rate and pulse size. This model provides the basis for a deep understanding of the physical origins of fast space charge pulses in polymers.

  9. A realistic intersecting D6-brane model after the first LHC run

    NASA Astrophysics Data System (ADS)

    Li, Tianjun; Nanopoulos, D. V.; Raza, Shabbar; Wang, Xiao-Chuan

    2014-08-01

    With the Higgs boson mass around 125 GeV and the LHC supersymmetry search constraints, we revisit a three-family Pati-Salam model from intersecting D6-branes in Type IIA string theory on the T⁶/(ℤ2 × ℤ2) orientifold which has a realistic phenomenology. We systematically scan the parameter space for μ < 0 and μ > 0, and find that the gravitino mass is generically heavier than about 2 TeV in both cases due to the Higgs mass lower bound of 123 GeV. In particular, we identify a region of parameter space with electroweak fine-tuning as small as Δ_EW ~ 24-32 (3-4%). In the viable parameter space consistent with all current constraints, the mass ranges for the gluino, the first two-generation squarks, and the sleptons are respectively [3, 18] TeV, [3, 16] TeV, and [2, 7] TeV. For the third-generation sfermions, the light stop satisfying the 5σ WMAP bounds via neutralino-stop coannihilation has mass from 0.5 to 1.2 TeV, and the light stau can be as light as 800 GeV. We also show various coannihilation and resonance scenarios through which the observed dark matter relic density is achieved. Interestingly, certain portions of the parameter space exhibit excellent t-b-τ and b-τ Yukawa coupling unification. Three regions of parameter space are highlighted as well where the dominant component of the lightest neutralino is a bino, wino or higgsino. We discuss various scenarios in which such solutions may avoid recent astrophysical bounds in case they satisfy or exceed the observed relic density bounds. Prospects for finding a higgsino-like neutralino in direct and indirect searches are also studied. We display six tables of benchmark points depicting various interesting features of our model. Noting that the lightest neutralino can be as heavy as 2.8 TeV and that there exists a natural region of parameter space, in the low-energy fine-tuning sense, with heavy gluinos and heavy first two-generation squarks/sleptons, we point out that 33 TeV and 100 TeV proton-proton colliders are indeed needed to probe our D-brane model.

  10. An investigation of a movable mass-attitude stabilization system for artificial-G space

    NASA Technical Reports Server (NTRS)

    Childs, D. W.

    1972-01-01

    The application of a single movable mass to generate control torques for the attitude control of space vehicles is discussed. Movable-mass control is proposed for stabilizing a cable-connected, artificial-gravity configuration, and its feasibility is assessed. A dynamic model for cable-connected configurations is developed to account for the aggregate motion of the space station and the relative torsional motion between the crew quarters and the counterweight.

  11. Accelerating Science with Generative Adversarial Networks: An Application to 3D Particle Showers in Multilayer Calorimeters

    NASA Astrophysics Data System (ADS)

    Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin

    2018-01-01

    Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline. We therefore introduce a deep neural network-based generative model to enable high-fidelity, fast, electromagnetic calorimeter simulation. There are still challenges for achieving precision across the entire phase space, but our current solution can reproduce a variety of particle shower properties while achieving speedup factors of up to 100 000 × . This opens the door to a new era of fast simulation that could save significant computing time and disk space, while extending the reach of physics searches and precision measurements at the LHC and beyond.
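
    The sketch below is a minimal generative-adversarial training loop for toy "shower" vectors, assuming PyTorch is available; it illustrates the GAN idea only and is not the architecture or GEANT-based training data used in the cited work. The toy data generator and layer sizes are hypothetical.

```python
import torch
import torch.nn as nn

# Toy stand-in for calorimeter images: flattened 3 x 8 x 8 energy depositions.
SHOWER_DIM, LATENT_DIM = 3 * 8 * 8, 32

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 128), nn.ReLU(),
    nn.Linear(128, SHOWER_DIM), nn.ReLU(),       # energies are non-negative
)
discriminator = nn.Sequential(
    nn.Linear(SHOWER_DIM, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def real_showers(batch):
    """Placeholder for detector-simulation training data (random exponential deposits)."""
    return torch.distributions.Exponential(1.0).sample((batch, SHOWER_DIM))

for step in range(200):                           # tiny demo loop
    real = real_showers(64)
    fake = generator(torch.randn(64, LATENT_DIM))

    # Discriminator update: real -> 1, fake -> 0.
    loss_d = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: try to fool the discriminator.
    loss_g = bce(discriminator(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(generator(torch.randn(1, LATENT_DIM)).shape)
```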

  12. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software development personnel to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability MBT (model-based testing) tools developed by our company, a single modeling pass can automatically generate test case documents, which is efficient and accurate. Describing a process accurately with a UML model depends on the paths that can be reached; however, existing path generation algorithms are either too simple, unable to combine branch paths and loops into complete paths, or too cumbersome, producing complicated path arrangements that are meaningless and superfluous for aerospace software testing. Drawing on our aerospace project experience, we have tailored a path generation algorithm for UML graphic models of aerospace test software.
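
    A minimal sketch of a loop-aware path enumeration of the kind this record argues for is given below; the graph, node names and revisit bound are hypothetical and not the authors' algorithm.

```python
from collections import defaultdict

def generate_paths(edges, start, end, max_visits=2):
    """Enumerate paths from start to end, allowing each node to be revisited a bounded
    number of times so that loop branches are covered without path explosion."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)

    paths, visits = [], defaultdict(int)

    def dfs(node, path):
        visits[node] += 1
        path.append(node)
        if node == end:
            paths.append(list(path))
        else:
            for nxt in graph[node]:
                if visits[nxt] < max_visits:
                    dfs(nxt, path)
        path.pop()
        visits[node] -= 1

    dfs(start, [])
    return paths

# Hypothetical activity diagram: a branch (B -> C or B -> D) and a loop (D -> B).
edges = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "B"), ("C", "E"), ("D", "E")]
for p in generate_paths(edges, "A", "E"):
    print(" -> ".join(p))
```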

  13. Heat transfer measurements for Stirling machine cylinders

    NASA Technical Reports Server (NTRS)

    Kornhauser, Alan A.; Kafka, B. C.; Finkbeiner, D. L.; Cantelmi, F. C.

    1994-01-01

    The primary purpose of this study was to measure the effects of inflow-produced turbulence on heat transfer in Stirling machine cylinders. A secondary purpose was to provide new experimental information on heat transfer in gas springs without inflow. The apparatus for the experiment consisted of a varying-volume piston-cylinder space connected to a fixed-volume space by an orifice. The orifice size could be varied to adjust the level of inflow-produced turbulence, or the orifice plate could be removed completely so as to merge the two spaces into a single gas spring space. Speed, cycle mean pressure, overall volume ratio, and varying-volume space clearance ratio could also be adjusted. Volume, pressure in both spaces, and local heat flux at two locations were measured. The pressure and volume measurements were used to calculate area-averaged heat flux, heat transfer hysteresis loss, and other heat transfer-related effects. Experiments in the one-space arrangement extended the range of previous gas spring tests to lower volume ratio and higher nondimensional speed. The tests corroborated previous results and showed that analytic models for heat transfer and loss based on volume ratio approaching 1 were valid for volume ratios ranging from 1 to 2, a range covering most gas springs in Stirling machines. Data from experiments in the two-space arrangement were first analyzed by lumping the two spaces together and examining total loss and averaged heat transfer as functions of overall nondimensional parameters. Heat transfer and loss were found to be significantly increased by inflow-produced turbulence. These increases could be modeled by appropriate adjustment of empirical coefficients in an existing semi-analytic model. An attempt was made to use an inverse, parameter-optimization procedure to find the heat transfer in each of the two spaces. This procedure was successful in retrieving this information from simulated pressure-volume data with artificially generated noise, but it failed with the actual experimental data. This is evidence that the models used in the parameter-optimization procedure (and to generate the simulated data) were not correct. Data from the surface heat flux sensors indicated that the primary shortcoming of these models was that they assumed turbulence levels to be constant over the cycle. Sensor data in the varying-volume space showed a large increase in heat flux, probably due to turbulence, during the expansion stroke.

  14. Evaluation of indoor air composition time variation in air-tight occupied spaces during night periods

    NASA Astrophysics Data System (ADS)

    Markov, Detelin

    2012-11-01

    This paper presents an easy-to-understand procedure for predicting the time variation of indoor air composition in air-tight occupied spaces during night periods. The mathematical model is based on the assumptions of homogeneity and perfect mixing of the indoor air, the ideal gas model for non-reacting gas mixtures, mass conservation equations for the entire system and for each species, a model for prediction of the basal metabolic rate of humans, and a model for prediction of the O2 consumption rate and the CO2 and H2O generation rates due to breathing. The time variation of indoor air composition is predicted at constant indoor air temperature for three scenarios, based on the analytical solution of the mathematical model. The results reveal both the most probable scenario for indoor air composition variation in air-tight occupied spaces and the cause of morning tiredness after sleeping in a modern, energy-efficient space.
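
    A minimal sketch of a well-mixed CO2 mass balance of the type described is given below; the room volume, per-person generation rate, residual air change rate and initial condition are illustrative assumptions, not the paper's values.

```python
import numpy as np

# Illustrative well-mixed mass balance for CO2 in a nearly air-tight bedroom at night.
V = 30.0            # room air volume, m^3
n_people = 2
gen_co2 = 0.004     # CO2 generation per person, m^3/h (rough sleeping-metabolism value)
ach = 0.1           # residual air change rate of the "air-tight" space, 1/h
c_out = 400e-6      # outdoor CO2 volume fraction

def co2_ppm(t_hours, c0=400e-6):
    """Analytical solution of V*dc/dt = n_people*gen_co2 + ach*V*(c_out - c)."""
    c_inf = c_out + n_people * gen_co2 / (ach * V)
    return 1e6 * (c_inf + (c0 - c_inf) * np.exp(-ach * t_hours))

for t in (0, 2, 4, 8):
    print(t, "h:", round(co2_ppm(t)), "ppm")
```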

  15. Mechanism of the free charge carrier generation in the dielectric breakdown

    NASA Astrophysics Data System (ADS)

    Rahim, N. A. A.; Ranom, R.; Zainuddin, H.

    2017-12-01

    Many studies have been conducted to investigate the effects of environmental, mechanical and electrical stresses on insulators. However, studies of the physical processes of the discharge phenomenon leading to breakdown of the insulator surface are scarce and difficult to comprehend. Therefore, this paper analyses the charge generation mechanisms that can produce free charge carriers and lead toward surface discharge development. In addition, it develops a model of surface discharge on the outdoor insulator based on the charge generation mechanism. Nernst-Planck theory was used to model the behaviour of the charge carriers, while Poisson's equation was used to determine the distribution of the electric field on the insulator surface. In the modelling of surface discharge on the outdoor insulator, electric-field-dependent molecular ionization was used as the charge generation mechanism. The mathematical model of the surface discharge was solved using the method of lines (MOL) technique. The results from the mathematical model showed that the behaviour of the net space charge density was correlated with the electric field distribution.
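
    The sketch below shows a method-of-lines discretization of a single-carrier drift-diffusion (Nernst-Planck) equation coupled to Poisson's equation, in the spirit of this record; all material constants, boundary conditions and the single-species simplification are assumptions for illustration, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# 1-D method-of-lines sketch: one positive carrier species n(x,t) obeying a
# drift-diffusion (Nernst-Planck) equation coupled to Poisson's equation.
N, L = 101, 1e-3                       # grid points, gap length (m)
x = np.linspace(0.0, L, N); dx = x[1] - x[0]
D, mu = 1e-9, 4e-8                     # diffusivity (m^2/s), mobility (m^2/Vs), illustrative
q_eps = 1.6e-19 / 8.85e-12             # charge / permittivity
V_applied = 1e3                        # applied voltage across the gap (V)

# Discrete Laplacian with Dirichlet boundaries for the interior potential nodes.
A = (np.diag(-2 * np.ones(N - 2)) + np.diag(np.ones(N - 3), 1)
     + np.diag(np.ones(N - 3), -1)) / dx**2

def rhs(t, n):
    # Poisson: d2phi/dx2 = -q*n/eps, with phi(0) = V_applied and phi(L) = 0.
    b = -q_eps * n[1:-1]
    b[0] -= V_applied / dx**2
    phi = np.concatenate(([V_applied], np.linalg.solve(A, b), [0.0]))
    E = -np.gradient(phi, dx)
    flux = -D * np.gradient(n, dx) + mu * n * E      # diffusion + drift
    dndt = -np.gradient(flux, dx)
    dndt[0] = dndt[-1] = 0.0                         # fixed-boundary simplification
    return dndt

n0 = 1e16 * np.exp(-((x - 0.2 * L) / (0.05 * L))**2)  # initial charge packet
sol = solve_ivp(rhs, (0.0, 2e-3), n0, method="BDF", max_step=1e-4)
print(sol.y[:, -1].max(), "peak density after", sol.t[-1], "s")
```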

  16. Fermion masses through four-fermion condensates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyar, Venkitesh; Chandrasekharan, Shailesh

    Fermion masses can be generated through four-fermion condensates when symmetries prevent fermion bilinear condensates from forming. This less explored mechanism of fermion mass generation is responsible for making four reduced staggered lattice fermions massive at strong couplings in a lattice model with a local four-fermion coupling. The model has a massless fermion phase at weak couplings and a massive fermion phase at strong couplings. In particular there is no spontaneous symmetry breaking of any lattice symmetries in both these phases. Recently it was discovered that in three space-time dimensions there is a direct second order phase transition between the two phases. Here we study the same model in four space-time dimensions and find results consistent with the existence of a narrow intermediate phase with fermion bilinear condensates, that separates the two asymptotic phases by continuous phase transitions.

  17. Generative Topographic Mapping (GTM): Universal Tool for Data Visualization, Structure-Activity Modeling and Dataset Comparison.

    PubMed

    Kireeva, N; Baskin, I I; Gaspar, H A; Horvath, D; Marcou, G; Varnek, A

    2012-04-01

    Here, the utility of Generative Topographic Maps (GTM) for data visualization, structure-activity modeling and database comparison is evaluated using subsets of the Database of Useful Decoys (DUD). Unlike other popular dimensionality reduction approaches like Principal Component Analysis, Sammon Mapping or Self-Organizing Maps, the great advantage of GTMs is providing data probability distribution functions (PDF), both in the high-dimensional space defined by molecular descriptors and in the 2D latent space. PDFs for the molecules of different activity classes were successfully used to build classification models in the framework of the Bayesian approach. Because PDFs are represented by a mixture of Gaussian functions, the Bhattacharyya kernel has been proposed as a measure of the overlap of datasets, which leads to an elegant method for global comparison of chemical libraries. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
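
    A minimal sketch of the dataset-overlap idea is given below: the Bhattacharyya coefficient of two discretized distributions on a shared 2-D latent grid, with simple Gaussian point clouds standing in for GTM probability maps; the grid, libraries and cloud parameters are illustrative.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Overlap of two discretized probability distributions on the same 2-D latent grid."""
    p = p / p.sum()
    q = q / q.sum()
    return np.sum(np.sqrt(p * q))

def latent_histogram(points, bins=20):
    """Stand-in for a GTM probability map: a 2-D histogram in latent space."""
    h, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                             bins=bins, range=[[-1, 1], [-1, 1]])
    return h

rng = np.random.default_rng(3)
lib_a = rng.normal(loc=(-0.3, 0.0), scale=0.25, size=(5000, 2))
lib_b = rng.normal(loc=(0.3, 0.1), scale=0.25, size=(5000, 2))
print(bhattacharyya_coefficient(latent_histogram(lib_a), latent_histogram(lib_b)))
```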

  18. Investigation of the possible effects of comet Encke's meteoroid stream on the Ca exosphere of Mercury

    NASA Astrophysics Data System (ADS)

    Plainaki, Christina; Mura, Alessandro; Milillo, Anna; Orsini, Stefano; Livi, Stefano; Mangano, Valeria; Massetti, Stefano; Rispoli, Rosanna; De Angelis, Elisabetta

    2017-06-01

    The MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) observations of the seasonal variability of Mercury's Ca exosphere are consistent with the general idea that the Ca atoms originate from the bombardment of the surface by particles from comet 2P/Encke. The generating mechanism is believed to be a combination of different processes including the release of atomic and molecular surface particles and the photodissociation of exospheric molecules. Considering different generation and loss mechanisms, we perform simulations with a 3-D Monte Carlo model based on the exosphere generation model by Mura et al. (2009). We present for the first time the 3-D spatial distribution of the CaO and Ca exospheres generated through the process of micrometeoroid impact vaporization, and we show that the morphology of the latter is consistent with the available MESSENGER/Mercury Atmospheric and Surface Composition Spectrometer observations. The results presented in this paper can be useful in the exosphere observations planning for BepiColombo, the upcoming European Space Agency-Japanese Aerospace Exploration Agency mission to Mercury.

  19. Model-data synthesis for the next generation of forest free-air CO2 enrichment (FACE) experiments.

    PubMed

    Norby, Richard J; De Kauwe, Martin G; Domingues, Tomas F; Duursma, Remko A; Ellsworth, David S; Goll, Daniel S; Lapola, David M; Luus, Kristina A; MacKenzie, A Rob; Medlyn, Belinda E; Pavlick, Ryan; Rammig, Anja; Smith, Benjamin; Thomas, Rick; Thonicke, Kirsten; Walker, Anthony P; Yang, Xiaojuan; Zaehle, Sönke

    2016-01-01

    The first generation of forest free-air CO2 enrichment (FACE) experiments has successfully provided deeper understanding about how forests respond to an increasing CO2 concentration in the atmosphere. Located in aggrading stands in the temperate zone, they have provided a strong foundation for testing critical assumptions in terrestrial biosphere models that are being used to project future interactions between forest productivity and the atmosphere, despite the limited inference space of these experiments with regards to the range of global ecosystems. Now, a new generation of FACE experiments in mature forests in different biomes and over a wide range of climate space and biodiversity will significantly expand the inference space. These new experiments are: EucFACE in a mature Eucalyptus stand on highly weathered soil in subtropical Australia; AmazonFACE in a highly diverse, primary rainforest in Brazil; BIFoR-FACE in a 150-yr-old deciduous woodland stand in central England; and SwedFACE proposed in a hemiboreal, Pinus sylvestris stand in Sweden. We now have a unique opportunity to initiate a model-data interaction as an integral part of experimental design and to address a set of cross-site science questions on topics including responses of mature forests; interactions with temperature, water stress, and phosphorus limitation; and the influence of biodiversity. © UT-Battelle, LLC New Phytologist © 2015 New Phytologist Trust.

  20. Modeling a space-based quantum link that includes an adaptive optics system

    NASA Astrophysics Data System (ADS)

    Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.

    2017-10-01

    Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses are comprised of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, then this method of key generation and encryption is resistant to future advances in quantum computing which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments, and finally, ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground and the effects of turbulence on the transmitted optical pulse is described. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the completed low-earth orbit to ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system as well as the future work necessary to show this impact is described.

  1. CCMC: Serving research and space weather communities with unique space weather services, innovative tools and resources

    NASA Astrophysics Data System (ADS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti; Maddox, Marlo

    2015-04-01

    With the addition of Space Weather Research Center (a sub-team within CCMC) in 2010 to address NASA’s own space weather needs, CCMC has become a unique entity that not only facilitates research through providing access to the state-of-the-art space science and space weather models, but also plays a critical role in providing unique space weather services to NASA robotic missions, developing innovative tools and transitioning research to operations via user feedback. With scientists, forecasters and software developers working together within one team, through close and direct connection with space weather customers and trusted relationship with model developers, CCMC is flexible, nimble and effective to meet customer needs. In this presentation, we highlight a few unique aspects of CCMC/SWRC’s space weather services, such as addressing space weather throughout the solar system, pushing the frontier of space weather forecasting via the ensemble approach, providing direct personnel and tool support for spacecraft anomaly resolution, prompting development of multi-purpose tools and knowledge bases, and educating and engaging the next generation of space weather scientists.

  2. A hybrid phase-space and histogram source model for GPU-based Monte Carlo radiotherapy dose calculation

    NASA Astrophysics Data System (ADS)

    Townson, Reid W.; Zavgorodni, Sergei

    2014-12-01

    In GPU-based Monte Carlo simulations for radiotherapy dose calculation, source modelling from a phase-space source can be an efficiency bottleneck. Previously, this has been addressed using phase-space-let (PSL) sources, which provided significant efficiency enhancement. We propose that additional speed-up can be achieved through the use of a hybrid primary photon point source model combined with a secondary PSL source. A novel phase-space derived and histogram-based implementation of this model has been integrated into gDPM v3.0. Additionally, a simple method for approximately deriving target photon source characteristics from a phase-space that does not contain inheritable particle history variables (LATCH) has been demonstrated to succeed in selecting over 99% of the true target photons with only ~0.3% contamination (for a Varian 21EX 18 MV machine). The hybrid source model was tested using an array of open fields for various Varian 21EX and TrueBeam energies, and all cases achieved greater than 97% chi-test agreement (the mean was 99%) above the 2% isodose with 1% / 1 mm criteria. The root mean square deviations (RMSDs) were less than 1%, with a mean of 0.5%, and the source generation time was 4-5 times faster. A seven-field intensity modulated radiation therapy patient treatment achieved 95% chi-test agreement above the 10% isodose with 1% / 1 mm criteria, 99.8% for 2% / 2 mm, a RMSD of 0.8%, and source generation speed-up factor of 2.5. Presented as part of the International Workshop on Monte Carlo Techniques in Medical Physics
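
    To make the idea of a histogram-based source concrete, the sketch below does generic inverse-CDF sampling of photon energies from a binned spectrum; it is not the gDPM v3.0 implementation, and the bin edges and spectrum shape are invented for demonstration.

        import numpy as np

        # Generic inverse-CDF sampling of photon energies from a binned spectrum
        # (illustrative sketch of a histogram-based source; spectrum is invented).
        rng = np.random.default_rng(1)

        edges = np.linspace(0.25, 18.0, 72)            # MeV bin edges (assumed)
        centers = 0.5 * (edges[:-1] + edges[1:])
        weights = centers * np.exp(-centers / 3.0)     # made-up bremsstrahlung-like shape
        cdf = np.cumsum(weights) / np.sum(weights)

        def sample_energies(n):
            """Draw n energies: pick a bin via the CDF, then sample uniformly inside it."""
            u = rng.random(n)
            idx = np.searchsorted(cdf, u)
            return rng.uniform(edges[idx], edges[idx + 1])

        e = sample_energies(1_000_000)
        print(f"mean sampled energy: {e.mean():.2f} MeV")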

  3. Langley's CSI evolutionary model: Phase 2

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Elliott, Kenny B.; Belvin, W. Keith; Teter, John E.

    1995-01-01

    The Phase 2 testbed is part of a sequence of laboratory models, developed at NASA Langley Research Center, to enhance our understanding of how to model, control, and design structures for space applications. A key problem with structures that must perform in space is the appearance of unwanted vibrations during operations. Instruments, designed independently by different scientists, must share the same vehicle, causing them to interact with each other. Once in space, these problems are difficult to correct; therefore, prediction via analysis, design, and experiments is very important. The Phase 2 laboratory model and its predecessors are designed to fill a gap between theory and practice and to aid in understanding important aspects in modeling, sensor and actuator technology, ground testing techniques, and control design issues. This document provides detailed information on the truss structure and its main components, control computer architecture, and structural models generated along with corresponding experimental results.

  4. Tuning iteration space slicing based tiled multi-core code implementing Nussinov's RNA folding.

    PubMed

    Palkowski, Marek; Bielecki, Wlodzimierz

    2018-01-15

    RNA folding is an ongoing compute-intensive task of bioinformatics. Parallelization and improving code locality for this kind of algorithm are among the most relevant areas in computational biology. Fortunately, RNA secondary structure approaches, such as Nussinov's recurrence, involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. This allows us to apply powerful polyhedral compilation techniques based on the transitive closure of dependence graphs to generate parallel tiled code implementing Nussinov's RNA folding. Such techniques fall within the iteration space slicing framework: the transitive dependences are applied to the statement instances of interest to produce valid tiles. The main problem in generating parallel tiled code is defining a proper tile size and tile dimension, which impact the degree of parallelism and code locality. To choose the best tile size and tile dimension, we first construct parallel parametric tiled code (the parameters are variables defining tile size). For this purpose, we first generate two non-parametric tiled codes with different fixed tile sizes but the same code structure, and then derive a general affine model that describes all integer factors available in the expressions of those codes. Using this model and the known integer factors present in those expressions (they define the left-hand side of the model), we find the unknown integers of the model for each integer factor appearing at the same position in the fixed tiled code, and we replace expressions including integer factors in this code with expressions including parameters. We then use this parallel parametric tiled code to implement the well-known tile size selection (TSS) technique, which allows us to discover, in a given search space, the best tile size and tile dimension maximizing target code performance. For a given search space, the presented approach allows us to choose the best tile size and tile dimension in parallel tiled code implementing Nussinov's RNA folding. Experimental results obtained on modern Intel multi-core processors demonstrate that this code outperforms known closely related implementations when the length of the RNA strands is greater than 2500.
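
    For reference, the Nussinov recurrence whose triangular iteration space the paper tiles can be written directly as a dynamic program. The minimal, untiled Python sketch below only shows the dependence pattern (N[i][j] depends on N[i+1][j-1] and on all split points k); it is not the generated polyhedral code, and it omits any minimum hairpin-loop constraint.

        def pairs(a, b):
            """1 if the two bases can pair (Watson-Crick or G-U wobble), else 0."""
            return 1 if (a, b) in {("A", "U"), ("U", "A"), ("G", "C"),
                                   ("C", "G"), ("G", "U"), ("U", "G")} else 0

        def nussinov(seq):
            """Maximum number of base pairs for the sequence (plain O(n^3) recurrence)."""
            n = len(seq)
            N = [[0] * n for _ in range(n)]
            for span in range(1, n):            # increasing subsequence length
                for i in range(0, n - span):
                    j = i + span
                    best = N[i + 1][j - 1] + pairs(seq[i], seq[j])   # pair i with j
                    for k in range(i, j):                            # or split at k
                        best = max(best, N[i][k] + N[k + 1][j])
                    N[i][j] = best
            return N[0][n - 1]

        print(nussinov("GGGAAAUCC"))   # small toy sequence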

  5. System Mass Variation and Entropy Generation in 100-kWe Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Reid, Bryan M.

    2004-01-01

    State-of-the-art closed-Brayton-cycle (CBC) space power systems were modeled to study performance trends in a trade space characteristic of interplanetary orbiters. For working-fluid molar masses of 48.6, 39.9, and 11.9 kg/kmol, peak system pressures of 1.38 and 3.0 MPa and compressor pressure ratios ranging from 1.6 to 2.4, total system masses were estimated. System mass increased as peak operating pressure increased for all compressor pressure ratios and molar mass values examined. Minimum mass point comparison between 72 percent He at 1.38 MPa peak and 94 percent He at 3.0 MPa peak showed an increase in system mass of 14 percent. Converter flow loop entropy generation rates were calculated for 1.38 and 3.0 MPa peak pressure cases. Physical system behavior was approximated using a pedigreed NASA Glenn modeling code, Closed Cycle Engine Program (CCEP), which included realistic performance prediction for heat exchangers, radiators and turbomachinery.

  6. System Mass Variation and Entropy Generation in 100-kWe Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Reid, Bryan M.

    2004-01-01

    State-of-the-art closed-Brayton-cycle (CBC) space power systems were modeled to study performance trends in a trade space characteristic of interplanetary orbiters. For working-fluid molar masses of 48.6, 39.9, and 11.9 kg/kmol, peak system pressures of 1.38 and 3.0 MPa and compressor pressure ratios ranging from 1.6 to 2.4, total system masses were estimated. System mass increased as peak operating pressure increased for all compressor pressure ratios and molar mass values examined. Minimum mass point comparison between 72 percent He at 1.38 MPa peak and 94 percent He at 3.0 MPa peak showed an increase in system mass of 14 percent. Converter flow loop entropy generation rates were calculated for 1.38 and 3.0 MPa peak pressure cases. Physical system behavior was approximated using a pedigreed NASA Glenn modeling code, Closed Cycle Engine Program (CCEP), which included realistic performance prediction for heat exchangers, radiators and turbomachinery.

  7. VARiD: a variation detection framework for color-space and letter-space platforms.

    PubMed

    Dalca, Adrian V; Rumble, Stephen M; Levy, Samuel; Brudno, Michael

    2010-06-15

    High-throughput sequencing (HTS) technologies are transforming the study of genomic variation. The various HTS technologies have different sequencing biases and error rates, and while most HTS technologies sequence the residues of the genome directly, generating base calls for each position, the Applied Biosystem's SOLiD platform generates dibase-coded (color space) sequences. While combining data from the various platforms should increase the accuracy of variation detection, to date there are only a few tools that can identify variants from color space data, and none that can analyze color space and regular (letter space) data together. We present VARiD--a probabilistic method for variation detection from both letter- and color-space reads simultaneously. VARiD is based on a hidden Markov model and uses the forward-backward algorithm to accurately identify heterozygous, homozygous and tri-allelic SNPs, as well as micro-indels. Our analysis shows that VARiD performs better than the AB SOLiD toolset at detecting variants from color-space data alone, and improves the calls dramatically when letter- and color-space reads are combined. The toolset is freely available at http://compbio.cs.utoronto.ca/varid.
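
    The sketch below is a generic forward-backward pass over a toy two-state HMM, included only to illustrate the posterior-decoding machinery VARiD builds on; the states, transition and emission probabilities, and observation sequence are invented and do not represent VARiD's color/letter-space model.

        import numpy as np

        # Generic forward-backward posterior computation for a toy 2-state HMM
        # (illustrative only; parameters and states are invented, not VARiD's model).
        A = np.array([[0.99, 0.01],        # transitions: state 0 = "reference", 1 = "variant"
                      [0.10, 0.90]])
        pi = np.array([0.95, 0.05])        # initial state distribution
        B = np.array([[0.95, 0.05],        # emissions: P(observation | state)
                      [0.30, 0.70]])
        obs = np.array([0, 0, 1, 1, 0, 1, 0, 0])   # toy observation sequence

        T, S = len(obs), len(pi)
        alpha = np.zeros((T, S))
        beta = np.zeros((T, S))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):                       # forward recursion
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):              # backward recursion
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)   # posterior state probabilities
        print(np.round(gamma[:, 1], 3))             # posterior probability of "variant"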

  8. Formal methods for test case generation

    NASA Technical Reports Server (NTRS)

    Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)

    2011-01-01

    The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.

  9. Space Shuttle Main Engine structural analysis and data reduction/evaluation. Volume 3A: High pressure oxidizer turbo-pump preburner pump housing stress analysis report

    NASA Technical Reports Server (NTRS)

    Shannon, Robert V., Jr.

    1989-01-01

    The model generation and structural analysis performed for the High Pressure Oxidizer Turbopump (HPOTP) preburner pump volute housing, located on the main pump end of the HPOTP in the Space Shuttle Main Engine, are summarized. An ANSYS finite element model of the volute housing was built and executed. A static structural analysis was performed on the Engineering Analysis and Data System (EADS) Cray-XMP supercomputer.

  10. Exact calculation of the time convolutionless master equation generator: Application to the nonequilibrium resonant level model

    NASA Astrophysics Data System (ADS)

    Kidon, Lyran; Wilner, Eli Y.; Rabani, Eran

    2015-12-01

    The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on Nakajima-Zwanzig-Mori time-convolution (TC) and the other on the Tokuyama-Mori time-convolutionless (TCL) formulations provide a starting point to describe the time-evolution of the reduced density matrix. A key in both approaches is to obtain the so called "memory kernel" or "generator," going beyond second or fourth order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform and thus, nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverse in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green's function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.
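
    In the notation commonly used for time-local master equations (our paraphrase, not the authors' exact expressions), the central point can be summarized as follows: if the reduced propagator U(t), defined by sigma(t) = U(t) sigma(0), can be reconstructed from system observables, then the TCL generator follows from it with all super-operator inversions carried out in the reduced space only,

        \dot{\sigma}(t) = \mathcal{G}(t)\,\sigma(t), \qquad
        \sigma(t) = \mathcal{U}(t)\,\sigma(0)
        \quad\Longrightarrow\quad
        \mathcal{G}(t) = \dot{\mathcal{U}}(t)\,\mathcal{U}^{-1}(t).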

  11. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

    The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on concepts related to airports with Super Dense Operations where new airport runway configurations (e.g. parallel runways), sequencing, merging, and spacing are some of the concepts considered. We focus on modeling and simulating a complementary airborne and ground system for AM&S to increase efficiency and capacity of these high density terminal areas. From a ground systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation Systems (ACES) software to model and simulate AM&S concepts and algorithms.

  12. Printing Space: Using 3D Printing of Digital Terrain Models in Geosciences Education and Research

    ERIC Educational Resources Information Center

    Horowitz, Seth S.; Schultz, Peter H.

    2014-01-01

    Data visualization is a core component of every scientific project; however, generation of physical models previously depended on expensive or labor-intensive molding, sculpting, or laser sintering techniques. Physical models have the advantage of providing not only visual but also tactile modes of inspection, thereby allowing easier visual…

  13. Examining Factor Score Distributions to Determine the Nature of Latent Spaces

    ERIC Educational Resources Information Center

    Steinley, Douglas; McDonald, Roderick P.

    2007-01-01

    Similarities between latent class models with K classes and linear factor models with K-1 factors are investigated. Specifically, the mathematical equivalence between the covariance structure of the two models is discussed, and a Monte Carlo simulation is performed using generated data that represents both latent factors and latent classes with…

  14. Continuous Sub-daily Rainfall Simulation for Regional Flood Risk Assessment - Modelling of Spatio-temporal Correlation Structure of Extreme Precipitation in the Austrian Alps

    NASA Astrophysics Data System (ADS)

    Salinas, J. L.; Nester, T.; Komma, J.; Bloeschl, G.

    2017-12-01

    Generation of realistic synthetic spatial rainfall is of pivotal importance for assessing regional hydroclimatic hazard as the input for long-term rainfall-runoff simulations. The correct reproduction of observed rainfall characteristics, such as regional intensity-duration-frequency curves and spatial and temporal correlations, is necessary to adequately model the magnitude and frequency of flood peaks, by reproducing antecedent soil moisture conditions before extreme rainfall events and the joint probability of flood waves at confluences. In this work, we present a modification of the model by Bardossy and Platte (1992), in which precipitation is first modeled on a station basis as a multivariate autoregressive (mAr) process in a Normal space. The spatial and temporal correlation structures are imposed in the Normal space, allowing for a different temporal autocorrelation parameter for each station while simultaneously ensuring the positive-definiteness of the correlation matrix of the mAr errors. The Normal rainfall is then transformed to a Gamma-distributed space, with parameters varying monthly according to a sinusoidal function, in order to adapt to the observed rainfall seasonality. One of the main differences from the original model is the simulation time step, reduced from 24 h to 6 h. Due to the larger availability of daily rainfall data, as opposed to sub-daily (e.g. hourly) data, the parameters of the Gamma distributions are calibrated to simultaneously reproduce a series of daily rainfall characteristics (mean daily rainfall, standard deviation of daily rainfall, and 24 h intensity-duration-frequency [IDF] curves), as well as other aggregated rainfall measures (mean annual and monthly rainfall). The calibration of the spatial and temporal correlation parameters is performed such that the catchment-averaged IDF curves aggregated at different temporal scales fit the measured ones. The rainfall model is used to generate 10,000 years of synthetic precipitation, fed into a rainfall-runoff model to derive the flood frequency in the Tirolean Alps in Austria. Given the number of generated events, the simulation framework is able to generate a large variety of rainfall patterns, as well as reproduce the variograms of relevant extreme rainfall events in the region of interest.
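
    The sketch below illustrates the core idea only (temporally autoregressive, spatially correlated Gaussian noise mapped to Gamma-distributed rainfall by quantile mapping) for two hypothetical stations; the autocorrelation coefficients, spatial correlation, Gamma parameters, and constant (rather than monthly sinusoidal) parameterization are assumptions, and rainfall intermittency (dry steps) is not modeled.

        import numpy as np
        from scipy.stats import norm, gamma

        # Minimal Normal-to-Gamma rainfall generator sketch (illustrative parameters;
        # not the calibrated station network, seasonality, or intermittency of the paper).
        rng = np.random.default_rng(42)

        n_steps = 4 * 365 * 4            # four years of 6-hourly steps
        phi = np.array([0.6, 0.7])       # per-station temporal AR(1) coefficients (assumed)
        C = np.array([[1.0, 0.8],        # spatial correlation of the innovations (assumed)
                      [0.8, 1.0]])
        L = np.linalg.cholesky(C)

        x = np.zeros(2)
        xs = np.empty((n_steps, 2))
        for t in range(n_steps):         # correlated AR(1) process with unit marginal variance
            eps = L @ rng.standard_normal(2)
            x = phi * x + np.sqrt(1.0 - phi**2) * eps
            xs[t] = x

        u = norm.cdf(xs)                              # map to uniform margins
        rain = gamma.ppf(u, a=0.4, scale=3.0)         # quantile-map to Gamma rainfall (assumed mm/6 h)
        print("mean 6 h rainfall per station:", rain.mean(axis=0).round(2))
        print(f"lag-1 autocorrelation, station 0: {np.corrcoef(rain[:-1, 0], rain[1:, 0])[0, 1]:.2f}")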

  15. Anderson localization in sigma models

    NASA Astrophysics Data System (ADS)

    Bruckmann, Falk; Wellnhofer, Jacob

    2018-03-01

    In QCD above the chiral restoration temperature there exists an Anderson transition in the fermion spectrum from localized to delocalized modes. We investigate whether the same holds for nonlinear sigma models which share properties like dynamical mass generation and asymptotic freedom with QCD. In particular we study the spectra of fermions coupled to (quenched) CP(N-1) configurations at high temperatures. We compare results in two and three space-time dimensions: in two dimensions the Anderson transition is absent, since all fermion modes are localized, while in three dimensions it is present. Our measurements include a more recent observable characterizing level spacings: the distribution of ratios of consecutive level spacings.
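
    The observable mentioned at the end, the ratio of consecutive level spacings, can be illustrated with a toy numpy comparison of an uncorrelated (Poisson-like, localized) spectrum against a random-matrix (GUE-like, delocalized) spectrum; this is a generic demonstration of the statistic, not the CP(N-1) fermion spectra computed in the paper.

        import numpy as np

        # Ratio of consecutive level spacings r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1})
        # for Poisson (localized-like) vs. GUE (delocalized-like) spectra (toy example).
        rng = np.random.default_rng(7)

        def mean_ratio(levels):
            s = np.diff(np.sort(levels))
            r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
            return r.mean()

        poisson_levels = rng.random(20_000)              # independent, uniformly placed levels

        n = 1000                                         # GUE: random Hermitian matrix spectrum
        a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        H = (a + a.conj().T) / 2.0
        ev = np.linalg.eigvalsh(H)[n // 4: 3 * n // 4]   # keep the central part of the spectrum

        print(f"<r> Poisson ~ {mean_ratio(poisson_levels):.3f}  (expected ~0.39)")
        print(f"<r> GUE     ~ {mean_ratio(ev):.3f}  (expected ~0.60)")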

  16. The Importance of Distance to Resources in the Spatial Modelling of Bat Foraging Habitat

    PubMed Central

    Rainho, Ana; Palmeirim, Jorge M.

    2011-01-01

    Many bats are threatened by habitat loss, but opportunities to manage their habitats are now increasing. Success of management depends greatly on the capacity to determine where and how interventions should take place, so models predicting how animals use landscapes are important to plan them. Bats are quite distinctive in the way they use space for foraging because (i) most are colonial central-place foragers and (ii) exploit scattered and distant resources, although this increases flying costs. To evaluate how important distances to resources are in modelling foraging bat habitat suitability, we radio-tracked two cave-dwelling species of conservation concern (Rhinolophus mehelyi and Miniopterus schreibersii) in a Mediterranean landscape. Habitat and distance variables were evaluated using logistic regression modelling. Distance variables greatly increased the performance of models, and distance to roost and to drinking water could alone explain 86 and 73% of the use of space by M. schreibersii and R. mehelyi, respectively. Land-cover and soil productivity also provided a significant contribution to the final models. Habitat suitability maps generated by models with and without distance variables differed substantially, confirming the shortcomings of maps generated without distance variables. Indeed, areas shown as highly suitable in maps generated without distance variables proved poorly suitable when distance variables were also considered. We concluded that distances to resources are determinant in the way bats forage across the landscape, and that using distance variables substantially improves the accuracy of suitability maps generated with spatially explicit models. Consequently, modelling with these variables is important to guide habitat management in bats and similarly mobile animals, particularly if they are central-place foragers or depend on spatially scarce resources. PMID:21547076
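
    The effect described here, that adding distance covariates improves a habitat-use logistic regression, can be sketched with generic scikit-learn code on synthetic data; the covariates, coefficients, and the steep decline of use with distance are invented assumptions, not the radio-tracking dataset or fitted model of the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic habitat-suitability example with and without distance covariates
        # (invented data, not the M. schreibersii / R. mehelyi telemetry data).
        rng = np.random.default_rng(3)
        n = 5000
        habitat = rng.normal(size=(n, 2))        # e.g. land-cover / productivity scores
        dist_roost = rng.exponential(5.0, n)     # km to roost (assumed)
        dist_water = rng.exponential(2.0, n)     # km to drinking water (assumed)

        # Assumed "true" use probability declining with both distances
        logit = 0.5 * habitat[:, 0] + 0.3 * habitat[:, 1] - 0.4 * dist_roost - 0.6 * dist_water + 2.0
        used = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        for name, X in [("habitat only", habitat),
                        ("habitat + distances", np.column_stack([habitat, dist_roost, dist_water]))]:
            Xtr, Xte, ytr, yte = train_test_split(X, used, test_size=0.3, random_state=0)
            model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
            auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
            print(f"{name:22s} AUC = {auc:.3f}")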

  17. Application of Generative Topographic Mapping to Gear Failures Monitoring

    NASA Astrophysics Data System (ADS)

    Liao, Guanglan; Li, Weihua; Shi, Tielin; Rao, Raj B. K. N.

    2002-07-01

    The Generative Topographic Mapping (GTM) model was introduced as a probabilistic reformulation of the self-organizing map and has already been used in a variety of applications. This paper presents a study of the GTM in industrial gear failure monitoring. Vibration signals are analyzed using the GTM model, and the results show that gear feature data sets can be projected into a two-dimensional space and clustered in different areas according to their conditions, which makes it possible to clearly classify and identify a gear working condition with a cracked or broken tooth relative to the normal condition. By tracing the image points in the two-dimensional space, the variation of gear working conditions can be observed visually; therefore, the occurrence and evolving trend of gear failures can be monitored in time.

  18. HVI-Test Setup for Debris Detector Verification

    NASA Astrophysics Data System (ADS)

    Bauer, Waldemar; Romberg, Oliver; Wiedemann, Carsten; Putzar, Robin; Drolshagen, Gerhard; Vorsmann, Peter

    2013-08-01

    Risk assessment of space debris or micrometeoroids impacting spacecraft or payloads can be performed using environmental models such as MASTER (ESA) or ORDEM (NASA). The validation of such models is performed by comparison of simulated results with measured data. Such data can be obtained from ground-based or space-based radars or telescopes, or by analysis of space hardware (e.g. Hubble Space Telescope, Space Shuttle windows) retrieved from orbit. An additional data source is in-situ impact detectors, which are designed to collect space debris and micrometeoroid impact data. In comparison to the impact data gained by analysis of the retrieved surfaces, the detected data contains additional information regarding impact time and orbit. In the past, many such in-situ detectors have been developed, with different measurement methods for the identification and classification of impacting objects. However, existing detectors have a drawback in terms of data acquisition. Generally the detection area is small, limiting the collected data, as the number of recorded impacts depends linearly on the exposed area. An innovative impact detector concept is currently under development at the German Aerospace Centre (DLR) in Bremen, in order to increase the surface area while preserving the advantages offered by dedicated in-situ impact detectors. The Solar Generator based Impact Detector (SOLID) is not an add-on component on the spacecraft, making it different from all previous impact detectors. SOLID utilises existing subsystems of the spacecraft and adapts them for impact detection purposes. Solar generators require large panel surfaces in order to provide the spacecraft with sufficient energy. Therefore, the spacecraft solar panels provide a perfect opportunity for application as impact detectors. Employment of the SOLID method on several spacecraft in various orbits would serve to significantly increase the spatial coverage concerning space debris and micrometeoroids. In this way, the SOLID method will allow the generation of a large amount of impact data for environmental model validation. The ground verification of the SOLID method was performed at Fraunhofer EMI. For this purpose, a test model was developed. This paper focuses on the test methodology and development of the Hypervelocity Impact (HVI) test setup, including pretesting at the German Aerospace Centre (DLR), Bremen. Foreseen hardware and software for the automatic damage assessment of the detector after the impact are also presented.

  19. The economics of bootstrapping space industries - Development of an analytic computer model

    NASA Technical Reports Server (NTRS)

    Goldberg, A. H.; Criswell, D. R.

    1982-01-01

    A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.

  20. Preliminary Results from a Model-Driven Architecture Methodology for Development of an Event-Driven Space Communications Service Concept

    NASA Technical Reports Server (NTRS)

    Roberts, Christopher J.; Morgenstern, Robert M.; Israel, David J.; Borky, John M.; Bradley, Thomas H.

    2017-01-01

    NASA's next generation space communications network will involve dynamic and autonomous services analogous to services provided by current terrestrial wireless networks. This architecture concept, known as the Space Mobile Network (SMN), is enabled by several technologies now in development. A pillar of the SMN architecture is the establishment and utilization of a continuous bidirectional control plane space link channel and a new User Initiated Service (UIS) protocol to enable more dynamic and autonomous mission operations concepts, reduced user space communications planning burden, and more efficient and effective provider network resource utilization. This paper provides preliminary results from the application of model driven architecture methodology to develop UIS. Such an approach is necessary to ensure systematic investigation of several open questions concerning the efficiency, robustness, interoperability, scalability and security of the control plane space link and UIS protocol.

  1. Geostatistical analysis of fault and joint measurements in Austin Chalk, Superconducting Super Collider Site, Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, R.E.; Nance, H.S.; Laubach, S.E.

    1995-06-01

    Faults and joints are conduits for ground-water flow and targets for horizontal drilling in the petroleum industry. Spacing and size distribution are rarely predicted accurately by current structural models or documented adequately by conventional borehole or outcrop samples. Tunnel excavations present opportunities to measure fracture attributes in continuous subsurface exposures. These fracture measurements can be used to improve structural models, guide interpretation of conventional borehole and outcrop data, and geostatistically quantify spatial and spacing characteristics for comparison to outcrop data or for generating distributions of fractures for numerical flow and transport modeling. Structure maps of over 9 mi of nearly continuous tunnel excavations in Austin Chalk at the Superconducting Super Collider (SSC) site in Ellis County, Texas, provide a unique database of fault and joint populations for geostatistical analysis. Observationally, small faults (<10 ft. throw) occur in clusters or swarms that have as many as 24 faults, fault swarms are as much as 2,000 ft. wide and appear to be on average 1,000 ft. apart, and joints are in swarms spaced 500 to more than 21,000 ft. apart. Semi-variograms show varying degrees of spatial correlation. These variograms have structured sills that correlate directly to highs and lows in fracture frequency observed in the tunnel. Semi-variograms generated with respect to fracture spacing and number also have structured sills, but tend not to show any near-field correlation. The distribution of fault spacing can be described with a negative exponential, which suggests a random distribution. However, there is clearly some structure and clustering in the spacing data, as shown by the running average and variograms, which implies that a number of different methods should be utilized to characterize fracture spacing.
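
    For readers unfamiliar with the statistic, the sketch below computes a minimal empirical semivariogram of the kind used for fracture-frequency data along a tunnel transect; the transect, fracture counts, lag bins, and tolerance are synthetic assumptions, not the SSC tunnel measurements.

        import numpy as np

        # Empirical semivariogram  gamma(h) = 1/(2 N(h)) * sum (z(x_i) - z(x_j))^2 over pairs
        # with |x_i - x_j| ~ h, for a 1-D transect (synthetic data, not the SSC tunnel maps).
        rng = np.random.default_rng(11)

        x = np.arange(0.0, 9000.0, 50.0)                     # station positions along tunnel, ft
        z = 3.0 + 2.0 * np.sin(2 * np.pi * x / 1000.0) \
            + rng.normal(0.0, 0.5, x.size)                   # fractures per 50 ft interval (invented)

        def semivariogram(x, z, lags, tol):
            d = np.abs(x[:, None] - x[None, :])              # pairwise separations
            dz2 = (z[:, None] - z[None, :]) ** 2             # pairwise squared differences
            return np.array([0.5 * dz2[np.abs(d - h) <= tol].mean() for h in lags])

        lags = np.arange(50.0, 2000.0, 100.0)
        gam = semivariogram(x, z, lags, tol=25.0)
        for h, g in zip(lags[:5], gam[:5]):
            print(f"lag {h:6.0f} ft   gamma = {g:.2f}")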

  2. An efficient algorithm for generating diverse microstructure sets and delineating properties closures

    DOE PAGES

    Johnson, Oliver K.; Kurniawan, Christian

    2018-02-03

    Properties closures delineate the theoretical objective space for materials design problems, allowing designers to make informed trade-offs between competing constraints and target properties. In this paper, we present a new algorithm called hierarchical simplex sampling (HSS) that approximates properties closures more efficiently and faithfully than traditional optimization based approaches. By construction, HSS generates samples of microstructure statistics that span the corresponding microstructure hull. As a result, we also find that HSS can be coupled with synthetic polycrystal generation software to generate diverse sets of microstructures for subsequent mesoscale simulations. Finally, by more broadly sampling the space of possible microstructures, it is anticipated that such diverse microstructure sets will expand our understanding of the influence of microstructure on macroscale effective properties and inform the construction of higher-fidelity mesoscale structure-property models.

  3. An efficient algorithm for generating diverse microstructure sets and delineating properties closures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Oliver K.; Kurniawan, Christian

    Properties closures delineate the theoretical objective space for materials design problems, allowing designers to make informed trade-offs between competing constraints and target properties. In this paper, we present a new algorithm called hierarchical simplex sampling (HSS) that approximates properties closures more efficiently and faithfully than traditional optimization based approaches. By construction, HSS generates samples of microstructure statistics that span the corresponding microstructure hull. As a result, we also find that HSS can be coupled with synthetic polycrystal generation software to generate diverse sets of microstructures for subsequent mesoscale simulations. Finally, by more broadly sampling the space of possible microstructures, it is anticipated that such diverse microstructure sets will expand our understanding of the influence of microstructure on macroscale effective properties and inform the construction of higher-fidelity mesoscale structure-property models.

  4. NASA USRP Internship Final Report

    NASA Technical Reports Server (NTRS)

    Black, Jesse A.

    2010-01-01

    The purpose of this report is to describe the body of work I produced as a NASA USRP intern in the spring of 2010. My mentor during this time was Richard Birr, and I assisted him with many tasks in the advanced systems group in the engineering design lab at NASA's Kennedy Space Center. The main priority was scenario modeling for the FAA's next-generation air traffic control system and developing next-generation range systems for implementation at Kennedy Space Center. Also of importance was the development of wiring diagrams for the portable communications terminal for the Desert RATS program.

  5. IHY Modeling Support at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Chulaki, A.; Hesse, Michael; Kuznetsova, Masha; MacNeice, P.; Rastaetter, L.

    2005-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. In particular, the CCMC provides to the research community the execution of "runs-on-request" for specific events of interest to space science researchers. Through this activity and the concurrent development of advanced visualization tools, CCMC provides, to the general science community, unprecedented access to a large number of state-of-the-art research models. CCMC houses models that cover the entire domain from the Sun to the Earth. In this presentation, we will provide an overview of CCMC modeling services that are available to support activities during the International Heliospheric Year. In order to tailor CCMC activities to IHY needs, we will also invite community input into our IHY planning activities.

  6. Relative Panoramic Camera Position Estimation for Image-Based Virtual Reality Networks in Indoor Environments

    NASA Astrophysics Data System (ADS)

    Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.

    2017-09-01

    Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  7. Hydraulic Properties of Closely Spaced Dipping Open Fractures Intersecting a Fluid-Filled Borehole Derived From Tube Wave Generation and Scattering

    NASA Astrophysics Data System (ADS)

    Minato, Shohei; Ghose, Ranajit; Tsuji, Takeshi; Ikeda, Michiharu; Onishi, Kozo

    2017-10-01

    Fluid-filled fractures and fissures often determine the pathways and volume of fluid movement. They are critically important in crustal seismology and in the exploration of geothermal and hydrocarbon reservoirs. We introduce a model for tube wave scattering and generation at dipping, parallel-wall fractures intersecting a fluid-filled borehole. A new equation reveals the interaction of the tube wavefield with multiple, closely spaced fractures, showing that the fracture dip significantly affects the tube waves. Numerical modeling demonstrates the possibility of imaging these fractures using a focusing analysis. The focused traces correspond well with the known fracture density, aperture, and dip angles. Testing the method on a VSP data set obtained at a fault-damaged zone in the Median Tectonic Line, Japan, presents evidence of tube waves being generated and scattered at open fractures and thin cataclasite layers. This finding leads to a new possibility for imaging, characterizing, and monitoring in situ hydraulic properties of dipping fractures using the tube wavefield.

  8. JSC engineers visit area schools for National Engineers Week

    NASA Image and Video Library

    1996-02-28

    Johnson Space Center (JSC) engineers visit Houston area schools for National Engineers Week. Students examine a machine that generates static electricity (4296-7). Students examine model rockets (4298).

  9. Application of reaction-diffusion models to cell patterning in Xenopus retina. Initiation of patterns and their biological stability.

    PubMed

    Shoaf, S A; Conway, K; Hunt, R K

    1984-08-07

    We have examined the behavior of two reaction-diffusion models, originally proposed by Gierer & Meinhardt (1972) and by Kauffman, Shymko & Trabert (1978), for biological pattern formation. Calculations are presented for pattern formation on a disc (approximating the geometry of a number of embryonic anlagen including the frog eye rudiment), emphasizing the sensitivity of patterns to changes in initial conditions and to perturbations in the geometry of the morphogen-producing space. Analysis of the linearized equations from the models enabled us to select appropriate parameters and disc size for pattern growth. A computer-implemented finite element method was used to solve the non-linear model equations reiteratively. For the Gierer-Meinhardt model, initial activation (varying in size over two orders of magnitude) of one point on the disc's edge was sufficient to generate the primary gradient. Various parts of the disc were removed (remaining only as diffusible space) from the morphogen-producing cycle to investigate the effects of cells dropping out of the cycle due to cell death or malfunction (single point removed) or differentiation (center removed), as occur in the Xenopus eye rudiment. The resulting patterns had the same general shape and amplitude as normal gradients. Nor did a two-fold increase in disc size affect the pattern-generating ability of the model. Disc fragments bearing their primary gradient patterns were fused (with gradients in opposite directions, but each parallel to the fusion line). The resulting patterns generated by the model showed many similarities to results of "compound eye" experiments in Xenopus. Similar patterns were obtained with the model of Kauffman's group (1978), but we found less stability of the pattern subject to simulations of central differentiation. However, removal of a single point from the morphogen cycle (cell death) did not result in any change. The sensitivity of the Kauffman et al. model to shape perturbations is not surprising since the model was originally designed to use shape and increasing size during growth to generate a sequence of transient patterns. However, the Gierer-Meinhardt model is remarkably stable even when subjected to a wide range of perturbations in the diffusible space, thus allowing it to cope with normal biological variability, and offering an exciting range of possibilities for reaction-diffusion models as mechanisms underlying the spatial patterns of tissue structures.
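
    To make the structure of the Gierer-Meinhardt model concrete, the sketch below integrates a saturating activator-inhibitor pair in 1-D with explicit finite differences and periodic boundaries; the parameters, saturation term, domain, and boundary conditions are illustrative assumptions and are not the disc-geometry finite-element calculations of the paper, so whether a stable pattern emerges depends on the chosen values.

        import numpy as np

        # Minimal 1-D Gierer-Meinhardt activator (a) / inhibitor (h) integration
        # (explicit Euler, periodic boundaries; illustrative parameters only).
        rng = np.random.default_rng(5)
        n, dx, dt, steps = 200, 1.0, 0.02, 100_000
        Da, Dh = 0.02, 1.0                      # inhibitor diffuses much faster than activator
        rho, mu_a, mu_h, rho0, kappa = 1.0, 1.0, 1.2, 0.01, 0.1

        a = 1.0 + 0.01 * rng.standard_normal(n)  # small random perturbation of a uniform state
        h = np.ones(n)

        def lap(u):                              # periodic 1-D Laplacian
            return (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2

        for _ in range(steps):
            da = Da * lap(a) + rho * a * a / ((h + 1e-9) * (1.0 + kappa * a * a)) - mu_a * a + rho0
            dh = Dh * lap(h) + rho * a * a - mu_h * h
            a += dt * da
            h += dt * dh

        print(f"activator min/max after integration: {a.min():.2f} / {a.max():.2f}")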

  10. Control of free-flying space robot manipulator systems

    NASA Technical Reports Server (NTRS)

    Cannon, Robert H., Jr.

    1990-01-01

    New control techniques for self-contained, autonomous free-flying space robots were developed and tested experimentally. Free-flying robots are envisioned as a key element of any successful long-term presence in space. These robots must be capable of performing the assembly, maintenance, inspection, and repair tasks that currently require human extravehicular activity (EVA). A set of research projects was developed and carried out using lab models of satellite robots and a flexible manipulator. The second-generation space robot models use air cushion vehicle (ACV) technology to simulate in 2-D the drag-free, zero-g conditions of space. The current work is divided into 5 major projects: Global Navigation and Control of a Free Floating Robot, Cooperative Manipulation from a Free Flying Robot, Multiple Robot Cooperation, Thrusterless Robotic Locomotion, and Dynamic Payload Manipulation. These projects are examined in detail.

  11. Community Coordinated Modeling Center: A Powerful Resource in Space Science and Space Weather Education

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) is a NASA-affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enables this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classroom settings, student projects, and scientific labs and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities to students majoring in computer science and computer engineering fields to intern with the software engineers at the CCMC while also learning about space weather from NASA scientists.

  12. Sensitivity of Attitude Determination on the Model Assumed for ISAR Radar Mappings

    NASA Astrophysics Data System (ADS)

    Lemmens, S.; Krag, H.

    2013-09-01

    Inverse synthetic aperture radars (ISAR) are valuable instruments for assessing the state of a large object in low Earth orbit. The images generated by these radars can reach a sufficient quality to be used during launch support or contingency operations, e.g. for confirming the deployment of structures, determining the structural integrity, or analysing the dynamic behaviour of an object. However, the direct interpretation of ISAR images can be a demanding task due to the nature of the range-Doppler space in which these images are produced. Recently, a tool has been developed by the European Space Agency's Space Debris Office to generate radar mappings of a target in orbit. Such mappings are a 3D-model-based simulation of how an ideal ISAR image would be generated by a ground-based radar under given processing conditions. These radar mappings can be used to support the data interpretation process. For example, by processing predefined attitude scenarios during an observation sequence and comparing them with actual observations, one can detect non-nominal behaviour. Vice versa, one can also estimate the attitude states of the target by fitting the radar mappings to the observations. It has been demonstrated for the latter use case that a coarse approximation of the target through a 3D model is already sufficient to derive the attitude information from the generated mappings. The level of detail required for the 3D model is determined by the process of generating ISAR images, which is based on the theory of scattering bodies. Therefore, a complex surface can return an intrinsically noisy ISAR image. For example, when many instruments on a satellite are visible to the observer, the ISAR image can suffer from multipath reflections. In this paper, we will further analyse the sensitivity of the attitude fitting algorithms to variations in the dimensions and the level of detail of the underlying 3D model. Moreover, we investigate the ability to estimate the orientations of different spacecraft components with respect to each other from the fitting procedure.

  13. Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center

    NASA Astrophysics Data System (ADS)

    Maddox, Marlo M.; Mullinix, Richard; Mays, M. Leila; Kuznetsova, Maria; Zheng, Yihua; Pulkkinen, Antti; Rastaetter, Lutz

    2013-03-01

    Access to near real-time and real-time space weather data is essential to accurately specifying and forecasting the space environment. The Space Weather Research Center at NASA Goddard Space Flight Center's Space Weather Laboratory provides vital space weather forecasting services primarily to NASA robotic mission operators, as well as external space weather stakeholders including the Air Force Weather Agency. A key component in this activity is the iNtegrated Space Weather Analysis (iSWA) System, which is a joint development project at NASA GSFC between the Space Weather Laboratory, Community Coordinated Modeling Center, Applied Engineering & Technology Directorate, and NASA HQ Office of Chief Engineer. The iSWA system was developed to address technical challenges in acquiring and disseminating space weather environment information. A key design driver for the iSWA system was to generate and present vast amounts of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. Having access to near real-time and real-time data is essential not only to ensuring that relevant observational data are available for analysis, but also to ensuring that models can be driven with the requisite input parameters at proper and efficient temporal and spatial resolutions. The iSWA system currently manages over 300 unique near real-time and real-time data feeds from various sources consisting of both observational and simulation data. A comprehensive suite of actionable space weather analysis tools and products is generated and provided utilizing a mixture of the ingested data - enabling new capabilities in quickly assessing past, present, and expected space weather effects. This paper will highlight current and future iSWA system capabilities including the utilization of data from the Solar Dynamics Observatory mission. http://iswa.gsfc.nasa.gov/

  14. Next-Generation NASA Earth-Orbiting Relay Satellites: Fusing Optical and Microwave Communications

    NASA Technical Reports Server (NTRS)

    Israel, David J.; Shaw, Harry

    2018-01-01

    NASA is currently considering architectures and concepts for the generation of relay satellites that will replace the Tracking and Data Relay Satellite (TDRS) constellation, which has been flying since 1983. TDRS-M, the last of the second TDRS generation, launched in August 2017, extending the life of the TDRS constellation beyond 2030. However, opportunities exist to re-engineer the concepts of geosynchronous Earth relay satellites. The needs of the relay satellite customers have changed dramatically over the last 34 years since the first TDRS launch. There is a demand for greater bandwidth as the availability of the traditional RF spectrum for space communications diminishes and the demand for ground station access grows. The next generation of NASA relay satellites will provide for operations that have factored in these new constraints. In this paper, we describe a heterogeneous constellation of geosynchronous relay satellites employing optical and RF communications. The new constellation will enable new optical communications services formed by user-to-space relay, space relay-to-space relay and space relay-to-ground links. It will build upon the experience from the Lunar Laser Communications Demonstration from 2013 and the Laser Communications Relay Demonstration to be launched in 2019. Simultaneous to establishment of the optical communications space segment, spacecraft in the TDRS constellation will be replaced with RF relay satellites with targeted subsets of the TDRS capabilities. This disaggregation of the TDRS service model will allow for flexibility in replenishing the needs of legacy users as well as addition of new capabilities for future users. It will also permit the U.S. government access to launch capabilities such as rideshare and to hosted payloads that were not previously available. In this paper, we also explore how the next generation of Earth relay satellites provides a significant boost in the opportunities for commercial providers to the communications space segment. For optical communications, the backbone of this effort is adoption of commercial technologies from the terrestrial high-bandwidth telecommunications industry into optical payloads. For RF communications, the explosion of software-defined radio, high-speed digital signal processing technologies and networking from areas such as 5G multicarrier will be important. Future commercial providers will not be limited to a small set of large aerospace companies. Ultimately, entirely government-owned and -operated satellite communications will phase out and make way for commercial business models that satisfy NASA's satellite communications requirements. The competition being provided by new entrants in the space communications business may result in a future in which all NASA communications needs can be satisfied commercially.

  15. Next-Generation NASA Earth-Orbiting Relay Satellites: Fusing Microwave and Optical Communications

    NASA Technical Reports Server (NTRS)

    Israel, David J.

    2018-01-01

    NASA is currently considering architectures and concepts for the generation of relay satellites that will replace the Tracking and Data Relay Satellite (TDRS) constellation, which has been flying since 1983. TDRS-M, the last of the second TDRS generation, launched in August 2017, extending the life of the TDRS constellation beyond 2030. However, opportunities exist to re-engineer the concepts of geosynchronous Earth relay satellites. The needs of the relay satellite customers have changed dramatically over the last 34 years since the first TDRS launch. There is a demand for greater bandwidth as the availability of the traditional RF spectrum for space communications diminishes and the demand for ground station access grows. The next generation of NASA relay satellites will provide for operations that have factored in these new constraints. In this paper, we describe a heterogeneous constellation of geosynchronous relay satellites employing optical and RF communications. The new constellation will enable new optical communications services formed by user-to-space relay, space relay-to-space relay and space relay-to-ground links. It will build upon the experience from the Lunar Laser Communications Demonstration from 2013 and the Laser Communications Relay Demonstration to be launched in 2019. Simultaneous to establishment of the optical communications space segment, spacecraft in the TDRS constellation will be replaced with RF relay satellites with targeted subsets of the TDRS capabilities. This disaggregation of the TDRS service model will allow for flexibility in replenishing the needs of legacy users as well as addition of new capabilities for future users. It will also permit the U.S. government access to launch capabilities such as rideshare and to hosted payloads that were not previously available. In this paper, we also explore how the next generation of Earth relay satellites provides a significant boost in the opportunities for commercial providers to the communications space segment. For optical communications, the backbone of this effort is adoption of commercial technologies from the terrestrial high-bandwidth telecommunications industry into optical payloads. For RF communications, the explosion of software-defined radio, high-speed digital signal processing technologies and networking from areas such as 5G multicarrier will be important. Future commercial providers will not be limited to a small set of large aerospace companies. Ultimately, entirely government-owned and -operated satellite communications will phase out and make way for commercial business models that satisfy NASA's satellite communications requirements. The competition being provided by new entrants in the space communications business may result in a future in which all NASA communications needs can be satisfied commercially.

  16. Effects of Free Molecular Heating on the Space Shuttle Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    McCloud, Peter L.; Wobick, Craig A.

    2007-01-01

    During Space Transportation System (STS) flight 121, higher than predicted radiator outlet temperatures were experienced from post-insertion up until nominal correction (NC) burn two. Effects from the higher than predicted heat loads on the radiator panels led to an additional 50 lbm of supply water being consumed by the Flash Evaporator System (FES). Post-flight analysis and research revealed that the additional heat loads were due to Free Molecular Heating (FMH) on the radiator panels, which previously had not been considered a significant environmental factor for the Space Shuttle radiators. The current Orbiter radiator heat flux models were adapted to incorporate the effects of FMH in addition to solar, Earth infrared, and albedo sources. Previous STS flights were also examined to find additional flight data on the FMH environment. Results of the model were compared with flight data and checked against results generated by the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) Aero-sciences group to verify the accuracy of the model.

  17. Analysis and Ground Testing for Validation of the Inflatable Sunshield in Space (ISIS) Experiment

    NASA Technical Reports Server (NTRS)

    Lienard, Sebastien; Johnston, John; Adams, Mike; Stanley, Diane; Alfano, Jean-Pierre; Romanacci, Paolo

    2000-01-01

    The Next Generation Space Telescope (NGST) design requires a large sunshield to protect the large aperture mirror and instrument module from constant solar exposure at its L2 orbit. The structural dynamics of the sunshield must be modeled in order to predict disturbances to the observatory attitude control system and to gauge effects on line-of-sight jitter. Models of large, non-linear membrane systems are not well understood and have not been successfully demonstrated. To answer questions about sunshield dynamic behavior and demonstrate controlled deployment, the NGST project is flying a Pathfinder experiment, the Inflatable Sunshield in Space (ISIS). This paper discusses in detail the modeling and ground-testing efforts performed at the Goddard Space Flight Center to: validate analytical tools for characterizing the dynamic behavior of the deployed sunshield, qualify the experiment for the Space Shuttle, and verify the functionality of the system. Included in the discussion will be test parameters, test setups, problems encountered, and test results.

  18. Extended spin symmetry and the standard model

    NASA Astrophysics Data System (ADS)

    Besprosvany, J.; Romero, R.

    2010-12-01

    We review unification ideas and explain the spin-extended model in this context. Its consideration is also motivated by the standard-model puzzles. With the aim of constructing a common description of discrete degrees of freedom, such as spin and gauge quantum numbers, the model departs from q-bits and generalized Hilbert spaces. Physical requirements reduce the space to one that is represented by matrices. The classification of the representations is performed through Clifford algebras, with their generators associated with Lorentz and scalar symmetries. We study a reduced space with up to two spinor elements within a matrix direct product. At a given dimension, the demand that Lorentz symmetry be maintained determines the scalar symmetries, which connect to vector- and chiral-gauge-interacting fields; we review the standard-model information in each dimension. We obtain fermions and bosons, with matter fields in the fundamental representation, radiation fields in the adjoint, and scalar particles with the Higgs quantum numbers. We relate the fields' representation in such spaces to the quantum-field-theory one, and to the Lagrangian. The model provides a coupling-constant definition.

  19. Discrete mathematical physics and particle modeling

    NASA Astrophysics Data System (ADS)

    Greenspan, D.

    The theory and application of the arithmetic approach to the foundations of both Newtonian and special relativistic mechanics are explored. Using only arithmetic, a reformulation of the Newtonian approach is given for: gravity; particle modeling of solids, liquids, and gases; conservative modeling of laminar and turbulent fluid flow, heat conduction, and elastic vibration; and nonconservative modeling of heat convection, shock-wave generation, the liquid drop problem, porous flow, the interface motion of a melting solid, soap films, string vibrations, and solitons. An arithmetic reformulation of special relativistic mechanics is given for theory in one space dimension, relativistic harmonic oscillation, and theory in three space dimensions. A speculative quantum mechanical model of vibrations in the water molecule is also discussed.

  20. Lagrange multiplier and Wess-Zumino variable as extra dimensions in the torus universe

    NASA Astrophysics Data System (ADS)

    Nejad, Salman Abarghouei; Dehghani, Mehdi; Monemzadeh, Majid

    2018-01-01

    We study the effect of the simplest geometry that is imposed via the topology of the universe by gauging a non-relativistic particle model on the torus and 3-torus with the help of the symplectic formalism of constrained systems. We also obtain the generators of gauge transformations for the gauged models. Extracting the corresponding Poisson structure of the existing constraints, we show the effect of the shape of the universe on the canonical structure of the phase-spaces of the models and suggest some phenomenology to probe the topology of the universe and a probable non-commutative structure of space. In addition, we show that the number of extra dimensions in the phase-spaces of the gauged embedded models is exactly two. Moreover, at the classical level, we discuss the modification of Newton's second law in order to study the origin of the terms that appear in the gauged theory.

  1. Integrated Control Modeling for Propulsion Systems Using NPSS

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper will discuss the development and integration of these tools into NPSS. In addition, it will show a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It will also show the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
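
    A linear model generator of this kind produces linear, time-invariant state-space models, dx/dt = Ax + Bu and y = Cx + Du, about an operating point of the nonlinear engine model. As a minimal sketch of that idea only (the single-spool functions f and g below are hypothetical stand-ins, not the NPSS engine model or its interfaces), the matrices can be estimated by finite differences about a trim point:

      import numpy as np

      def linearize(f, g, x0, u0, eps=1e-6):
          """Finite-difference linearization of dx/dt = f(x, u), y = g(x, u)
          about the operating point (x0, u0).  Returns (A, B, C, D)."""
          n, m, p = len(x0), len(u0), len(g(x0, u0))
          A, B = np.zeros((n, n)), np.zeros((n, m))
          C, D = np.zeros((p, n)), np.zeros((p, m))
          f0, g0 = f(x0, u0), g(x0, u0)
          for i in range(n):                        # perturb each state
              dx = np.zeros(n); dx[i] = eps
              A[:, i] = (f(x0 + dx, u0) - f0) / eps
              C[:, i] = (g(x0 + dx, u0) - g0) / eps
          for j in range(m):                        # perturb each input
              du = np.zeros(m); du[j] = eps
              B[:, j] = (f(x0, u0 + du) - f0) / eps
              D[:, j] = (g(x0, u0 + du) - g0) / eps
          return A, B, C, D

      # Hypothetical single-spool stand-in: state = shaft speed, input = fuel flow.
      f = lambda x, u: np.array([-0.8 * x[0] + 25.0 * u[0]])
      g = lambda x, u: np.array([x[0]])
      A, B, C, D = linearize(f, g, np.array([100.0]), np.array([3.2]))
      print(A, B, C, D)

    The resulting (A, B, C, D) matrices are the kind of object a control design environment such as MATLAB or EASY5 would consume through a generic controller interface.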

  2. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S; Cook, K; Fasenfest, B

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  3. A Model for Space Shuttle Orbiter Tire Side Forces Based on NASA Landing Systems Research Aircraft Test Results

    NASA Technical Reports Server (NTRS)

    Carter, John F.; Nagy, Christopher J.; Barnicki, Joseph S.

    1997-01-01

    Forces generated by the Space Shuttle orbiter tire under varying vertical load, slip angle, speed, and surface conditions were measured using the Landing System Research Aircraft (LSRA). Resulting data were used to calculate a mathematical model for predicting tire forces in orbiter simulations. Tire side and drag forces experienced by an orbiter tire are cataloged as a function of vertical load and slip angle. The mathematical model is compared to existing tire force models for the Space Shuttle orbiter. This report describes the LSRA and a typical test sequence. Testing methods, data reduction, and error analysis are presented. The LSRA testing was conducted on concrete and lakebed runways at the Edwards Air Force Flight Test Center and on concrete runways at the Kennedy Space Center (KSC). Wet runway tire force tests were performed on test strips made at the KSC using different surfacing techniques. Data were corrected for ply steer forces and conicity.
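
    The report catalogs side force as a function of vertical load and slip angle. Purely as an illustration of that kind of model (the functional form and coefficients below are hypothetical placeholders, not the LSRA-derived values), a saturating cornering-stiffness fit might look like:

      import numpy as np

      def side_force(vertical_load_lbf, slip_angle_deg,
                     stiffness_per_lbf=0.18, sat_angle_deg=8.0):
          """Hypothetical tire side-force model: linear in slip angle at small
          angles, saturating smoothly at large angles.  The coefficients are
          placeholders, not values identified from LSRA test data."""
          c_alpha = stiffness_per_lbf * vertical_load_lbf      # cornering stiffness, lbf/deg
          alpha = np.asarray(slip_angle_deg, dtype=float)
          return c_alpha * sat_angle_deg * np.tanh(alpha / sat_angle_deg)

      # Example: sweep slip angle at a 40,000 lbf vertical load
      for a in (1.0, 4.0, 8.0, 12.0):
          print(a, round(float(side_force(40000.0, a)), 1))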

  4. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  5. Design and Development of a Model to Simulate 0-G Treadmill Running Using the European Space Agency's Subject Loading System

    NASA Technical Reports Server (NTRS)

    Caldwell, E. C.; Cowley, M. S.; Scott-Pandorf, M. M.

    2010-01-01

    Develop a model that simulates a human running in 0 G using the European Space Agency's (ESA) Subject Loading System (SLS). The model provides ground reaction forces (GRF) based on speed and pull-down forces (PDF). DESIGN The Running Model is based on a simple spring-mass model. The dynamic properties of the spring-mass model express theoretical vertical GRF (GRFv) and shear GRF in the posterior-anterior direction (GRFsh) during running gait. ADAMS View software was used to build the model, which has a pelvis, thigh segment, shank segment, and a spring foot (see Figure 1). The model's movement simulates the joint kinematics of a human running at Earth gravity with the aim of generating GRF data. DEVELOPMENT & VERIFICATION ESA provided parabolic flight data of subjects running while using the SLS, for further characterization of the model's GRF. Peak GRF data were fit to a linear regression line dependent on PDF and speed. Interpolation and extrapolation of the regression equation provided a theoretical data matrix, which is used to drive the model's motion equations. Verification of the model was conducted by running the model at 4 different speeds, with each speed accounting for 3 different PDF. The model's GRF data fell within a 1-standard-deviation boundary derived from the empirical ESA data. CONCLUSION The Running Model aids in conducting various simulations (potential scenarios include a fatigued runner or a powerful runner generating high loads at a fast cadence) to determine limitations for the T2 vibration isolation system (VIS) aboard the International Space Station. This model can predict how running with the ESA SLS affects the T2 VIS and may be used for other exercise analyses in the future.
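
    The spring-mass basis of such a running model can be sketched in a few lines. The parameters below (body mass, leg stiffness, pull-down force, touchdown velocity) are illustrative placeholders rather than the ESA SLS regression values; the constant pull-down force stands in for the gravity replacement provided by the harness:

      def peak_vertical_grf(mass=70.0, leg_stiffness=20e3, leg_length=1.0,
                            pulldown_force=560.0, v_touchdown=-0.8, dt=1e-4):
          """Integrate one stance phase of a vertical spring-mass runner and
          return the peak vertical ground reaction force (N).  All parameter
          values are illustrative, not those of the SLS-based Running Model."""
          y, v, peak = leg_length, v_touchdown, 0.0
          while True:
              compression = max(leg_length - y, 0.0)
              grf = leg_stiffness * compression          # spring force = vertical GRF
              peak = max(peak, grf)
              v += (grf - pulldown_force) / mass * dt    # semi-implicit Euler step
              y += v * dt
              if y >= leg_length and v > 0.0:            # leg back to rest length: toe-off
                  return peak

      print("peak vertical GRF [N]:", round(peak_vertical_grf(), 1))

    Sweeping the pull-down force and touchdown velocity in a model of this kind is one way to explore the fatigued-runner and high-load scenarios mentioned above.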

  6. A mesostate-space model for EEG and MEG.

    PubMed

    Daunizeau, Jean; Friston, Karl J

    2007-10-15

    We present a multi-scale generative model for EEG that entails a minimum number of assumptions about evoked brain responses, namely: (1) bioelectric activity is generated by a set of distributed sources, (2) the dynamics of these sources can be modelled as random fluctuations about a small number of mesostates, (3) mesostates evolve in a temporally structured way and are functionally connected (i.e. influence each other), and (4) the number of mesostates engaged by a cognitive task is small (e.g. between one and a few). A Variational Bayesian learning scheme is described that furnishes the posterior density on the model's parameters and its evidence. Since the number of meso-sources specifies the model, the model evidence can be used to compare models and find the optimum number of meso-sources. In addition to estimating the dynamics at each cortical dipole, the mesostate-space model and its inversion provide a description of brain activity at the level of the mesostates (i.e. in terms of the dynamics of meso-sources that are distributed over dipoles). The inclusion of a mesostate level allows one to compute posterior probability maps of each dipole being active (i.e. belonging to an active mesostate). Critically, this model accommodates constraints on the number of meso-sources, while retaining the flexibility of distributed source models in explaining data. In short, it bridges the gap between standard distributed and equivalent current dipole models. Furthermore, because it is explicitly spatiotemporal, the model can embed any stochastic dynamical causal model (e.g. a neural mass model) as a Markov process prior on the mesostate dynamics. The approach is evaluated and compared to standard inverse EEG techniques, using synthetic data and real data. The results demonstrate the added value of the mesostate-space model and its variational inversion.

  7. Model-based high-throughput design of ion exchange protein chromatography.

    PubMed

    Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo

    2016-08-12

    This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. NASA Missions Enabled by Space Nuclear Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Schmidt, George R.

    2009-01-01

    This viewgraph presentation reviews NASA Space Missions that are enabled by Space Nuclear Systems. The topics include: 1) Space Nuclear System Applications; 2) Trade Space for Electric Power Systems; 3) Power Generation Specific Energy Trade Space; 4) Radioisotope Power Generation; 5) Radioisotope Missions; 6) Fission Power Generation; 7) Solar Powered Lunar Outpost; 8) Fission Powered Lunar Outpost; 9) Fission Electric Power Generation; and 10) Fission Nuclear Thermal Propulsion.

  9. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  10. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Youngsoo; Carlberg, Kevin Thomas

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
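
    As a rough sketch of the space-time projection idea (not the authors' implementation; a trivially small 1D heat equation stands in for the full-order model), the time-discrete residual can be stacked over every step and minimized in a single least-squares solve over the coefficients of a space-time basis built from snapshot data:

      import numpy as np

      # Tiny full-order model: 1D heat equation, backward Euler in time
      N, Nt, dt, dx = 40, 60, 1e-3, 1.0 / 41
      x = np.linspace(dx, 1 - dx, N)
      u0 = np.sin(np.pi * x)

      def solve_fom(kappa):
          L = kappa / dx**2 * (np.diag(-2.0 * np.ones(N)) +
                               np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1))
          M = np.eye(N) - dt * L
          U, u = [], u0.copy()
          for _ in range(Nt):
              u = np.linalg.solve(M, u)
              U.append(u.copy())
          return M, np.concatenate(U)          # space-time state vector, length N*Nt

      # Space-time trial basis from a few training trajectories (SVD of snapshots)
      train = np.column_stack([solve_fom(k)[1] for k in (0.8, 1.0, 1.2)])
      Phi = np.linalg.svd(train, full_matrices=False)[0][:, :2]

      # ST-LSPG-style solve at a new parameter: minimize the stacked residual ||B Phi y - c||
      M_test, u_ref = solve_fom(0.9)
      B = np.kron(np.eye(Nt), M_test) - np.kron(np.eye(Nt, k=-1), np.eye(N))
      c = np.zeros(N * Nt); c[:N] = u0
      y = np.linalg.lstsq(B @ Phi, c, rcond=None)[0]
      print("relative error:", np.linalg.norm(Phi @ y - u_ref) / np.linalg.norm(u_ref))

    Because this toy problem is linear, the stacked residual is linear in the reduced coordinates and one least-squares solve suffices; for the nonlinear systems targeted by ST-LSPG the same minimization would be carried out iteratively, with hyper-reduction applied in space and time.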

  11. AF-GEOSpace Version 2.0: Space Environment Software Products for 2002

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Tautz, M.

    2002-05-01

    AF-GEOSpace Version 2.0 (release 2002 on WindowsNT/2000/XP) is a graphics-intensive software program developed by AFRL with space environment models and applications. It has grown steadily to become a development tool for automated space weather visualization products and helps with a variety of tasks: orbit specification for radiation hazard avoidance; satellite design assessment and post-event analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; and physics research and education. The object-oriented C++ code is divided into five module classes. Science Modules control science models to give output data on user-specified grids. Application Modules manipulate these data and provide orbit generation and magnetic field line tracing capabilities. Data Modules read and assist with the analysis of user-generated data sets. Graphics Modules enable the display of features such as plane slices, magnetic field lines, line plots, axes, the Earth, stars, and satellites. Worksheet Modules provide commonly requested coordinate transformations and calendar conversion tools. Common input data archive sets, application modules, and 1-, 2-, and 3-D visualization tools are provided to all models. The code documentation includes detailed examples with click-by-click instructions for investigating phenomena that have well-known effects on communications and spacecraft systems. AF-GEOSpace Version 2.0 builds on the success of its predecessors. The first release (Version 1.21, 1996/IRIX on SGI) contained radiation belt particle flux and dose models derived from CRRES satellite data, an aurora model, an ionosphere model, and ionospheric HF ray tracing capabilities. Next (Version 1.4, 1999/IRIX on SGI) science modules were added related to cosmic rays and solar protons, low-Earth orbit radiation dosages, single event effects probability maps, ionospheric scintillation, and shock propagation models. New application modules for estimating linear energy transfer (LET) and single event upset (SEU) rates in solid-state devices, and graphic modules for visualizing radar fans, communication domes, and satellite detector cones and links were added. Automated FTP scripts permitted users to update their global input parameter set directly from NOAA/SEC. What's New? Version 2.0 includes the first true dynamic run capabilities and offers new and enhanced graphical and data visualization tools such as 3-D volume rendering and eclipse umbra and penumbra determination. Animations of all model results can now be displayed together in all dimensions. There is a new realistic day-to-day ionospheric scintillation simulation generator (IONSCINT), an upgrade to the WBMOD scintillation code, a simplified HF ionospheric ray tracing module, and applications built on the NASA AE-8 and AP-8 radiation belt models. User-generated satellite data sets can now be visualized along with their orbital ephemeris. A prototype tool for visualizing MHD model results stored in structured grids provides a hint of where future space weather model development efforts are headed. A new graphical user interface (GUI) with improved module tracking and renaming features greatly simplifies software operation. AF-GEOSpace is distributed by the Space Weather Center of Excellence in the Space Vehicles Directorate of AFRL. The software was recently released for WindowsNT/2000/XP; versions for UNIX and LINUX operating systems will follow shortly. To obtain AF-GEOSpace Version 2.0, please send an e-mail request to the first author.

  12. Modeling and dynamic simulation of astronaut's upper limb motions considering counter torques generated by the space suit.

    PubMed

    Li, Jingwen; Ye, Qing; Ding, Li; Liao, Qianfang

    2017-07-01

    Extravehicular activity (EVA) is an inevitable task for astronauts to maintain proper functions of both the spacecraft and the space station. Both experimental research in a microgravity simulator (e.g. neutral buoyancy tank, zero-g aircraft or a drop tower/tube) and mathematical modeling were used to study EVA to provide guidance for training on Earth and task design in space. Modeling has become more and more promising because of its efficiency. Based on the task analysis, almost 90% of EVA activity is accomplished through upper limb motions. Therefore, focusing on upper limb models of the body and space suit is valuable to this effort. In previous modeling studies, some multi-rigid-body systems were developed to simplify the human musculoskeletal system, and the space suit was mostly considered as a part of the astronaut body. With the aim of improving the fidelity of the models, we developed an astronaut upper limb model, including a torque model and a muscle-force model, with the counter torques from the space suit being considered as a boundary condition. Inverse kinematics and Maggi-Kane's method were applied to calculate the joint angles, joint torques and muscle force, given that the terminal trajectory of upper limb motion was known. Also, we validated the muscle-force model using electromyogram (EMG) data collected in a validation experiment. Muscle force calculated from our model showed a similar trend to the EMG data, supporting the effectiveness and feasibility of the muscle-force model we established, and partially validating the kinematic aspects of the joint model.
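
    The inverse-kinematics step, recovering joint angles from a known hand trajectory, can be illustrated with a planar two-link arm. The segment lengths and the straight-line trajectory below are generic placeholders, not the astronaut or suit parameters of the study:

      import numpy as np

      def two_link_ik(px, py, l_upper=0.31, l_fore=0.27):
          """Closed-form inverse kinematics of a planar 2-link arm (elbow-down).
          Returns (shoulder_angle, elbow_angle) in radians; link lengths in metres
          are illustrative, not the anthropometric values used in the paper."""
          r2 = px**2 + py**2
          cos_elbow = (r2 - l_upper**2 - l_fore**2) / (2.0 * l_upper * l_fore)
          cos_elbow = np.clip(cos_elbow, -1.0, 1.0)          # guard against round-off
          elbow = np.arccos(cos_elbow)
          shoulder = np.arctan2(py, px) - np.arctan2(l_fore * np.sin(elbow),
                                                     l_upper + l_fore * np.cos(elbow))
          return shoulder, elbow

      # Recover joint angles along a straight-line hand trajectory
      for t in np.linspace(0.0, 1.0, 5):
          px, py = 0.30 + 0.20 * t, 0.10 + 0.10 * t
          print([round(float(a), 3) for a in two_link_ik(px, py)])

    In the full model the joint-angle history would then feed Maggi-Kane's equations to obtain joint torques, with the suit counter torques applied as a boundary condition.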

  13. Second generation spectrograph for the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Woodgate, B. E.; Boggess, A.; Gull, T. R.; Heap, S. R.; Krueger, V. L.; Maran, S. P.; Melcher, R. W.; Rebar, F. J.; Vitagliano, H. D.; Green, R. F.; Wolff, S. C.; Hutchings, J. B.; Jenkins, E. B.; Linsky, J. L.; Moos, H. W.; Roesler, F.; Shine, R. A.; Timothy, J. G.; Weistrop, D. E.; Bottema, M.; Meyer, W.

    1986-01-01

    The preliminary design for the Space Telescope Imaging Spectrograph (STIS), which has been selected by NASA for definition study for future flight as a second-generation instrument on the Hubble Space Telescope (HST), is presented. STIS is a two-dimensional spectrograph that will operate from 1050 A to 11,000 A at the limiting HST resolution of 0.05 arcsec FWHM, with spectral resolutions of 100, 1200, 20,000, and 100,000 and a maximum field-of-view of 50 x 50 arcsec. Its basic operating modes include echelle mode, long slit mode, slitless spectrograph mode, coronagraphic spectroscopy, photon time-tagging, and direct imaging. Research objectives are active galactic nuclei, the intergalactic medium, global properties of galaxies, the origin of stellar systems, stellar spectral variability, and spectrographic mapping of solar system processes.

  14. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  15. Generic magnetohydrodynamic model at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Honkonen, I. J.; Rastaetter, L.; Glocer, A.

    2016-12-01

    The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center is a multi-agency partnership to enable, support and perform research and development for next-generation space science and space weather models. CCMC currently hosts nearly 100 numerical models and a cornerstone of this activity is the Runs on Request (RoR) system which allows anyone to request a model run and analyze/visualize the results via a web browser. CCMC is also active in the education community by organizing student research contests, heliophysics summer schools, and space weather forecaster training for students, government and industry representatives. Recently a generic magnetohydrodynamic (MHD) model was added to the CCMC RoR system which allows the study of a variety of fluid and plasma phenomena in one, two and three dimensions using a dynamic point-and-click web interface. For example, students can experiment with the physics of fundamental wave modes of hydrodynamic and MHD theory, the behavior of discontinuities and shocks, as well as instabilities such as Kelvin-Helmholtz. Students can also use the model to experiment with the numerical effects of models, i.e., how the process of discretizing a system of equations and solving them on a computer changes the solution. This can provide valuable background understanding, e.g., for space weather forecasters on the effects of model resolution, numerical resistivity, etc. on the prediction.
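
    The "numerical effects" mentioned above are easy to reproduce outside the CCMC system as well. A self-contained sketch (independent of the hosted model) advects a square pulse with a first-order upwind scheme and compares it with the exact solution, making the artificial smearing introduced by the discretization visible:

      import numpy as np

      # 1D linear advection u_t + a u_x = 0 on a periodic domain
      nx, a, cfl = 200, 1.0, 0.5
      dx = 1.0 / nx
      dt = cfl * dx / a
      x = (np.arange(nx) + 0.5) * dx
      u = np.where(np.abs(x - 0.25) < 0.1, 1.0, 0.0)        # square pulse

      t_final = 0.5
      for _ in range(int(round(t_final / dt))):
          # first-order upwind update (a > 0); diffusive truncation error smears the pulse
          u = u - cfl * (u - np.roll(u, 1))

      center = (0.25 + a * t_final) % 1.0                   # exact solution: shifted pulse
      d = np.abs(x - center)
      exact = np.where(np.minimum(d, 1.0 - d) < 0.1, 1.0, 0.0)
      print("numerical peak:", round(float(u.max()), 3), "(exact peak: 1.0)")
      print("L1 error from numerical diffusion:", round(float(np.sum(np.abs(u - exact)) * dx), 4))

    Refining the grid or switching to a higher-order scheme reduces the error, which is precisely the kind of resolution sensitivity a forecaster needs to keep in mind when interpreting model output.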

  16. Channel analysis for single photon underwater free space quantum key distribution.

    PubMed

    Shi, Peng; Zhao, Shi-Cheng; Gu, Yong-Jian; Li, Wen-Dong

    2015-03-01

    We investigate the optical absorption and scattering properties of underwater media pertinent to our underwater free space quantum key distribution (QKD) channel model. With the vector radiative transfer theory and Monte Carlo method, we obtain the attenuation of photons, the fidelity of the scattered photons, the quantum bit error rate, and the sifted key generation rate of underwater quantum communication. It can be observed from our simulations that the most secure single photon underwater free space QKD is feasible in the clearest ocean water.
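
    The Monte Carlo part of such a study can be caricatured in a few lines. The sketch below uses made-up absorption and scattering coefficients (not measured ocean-water optical properties) and ignores direction changes on scattering, so it only illustrates the sampling logic, not the vector radiative transfer treatment of the paper:

      import numpy as np

      rng = np.random.default_rng(1)

      def surviving_fraction(path_length_m, absorption=0.05, scattering=0.02,
                             n_photons=50_000):
          """Crude Monte Carlo estimate of the fraction of photons that reach a
          receiver plane.  Free paths are sampled from the total attenuation
          coefficient; each interaction is absorption or scattering in proportion
          to the (placeholder) coefficients; scattered photons stay on-axis."""
          c = absorption + scattering                     # total attenuation, 1/m
          alive = 0
          for _ in range(n_photons):
              travelled = 0.0
              while True:
                  travelled += rng.exponential(1.0 / c)   # distance to next interaction
                  if travelled >= path_length_m:
                      alive += 1                          # photon reaches the receiver
                      break
                  if rng.random() < absorption / c:       # absorbed: photon lost
                      break
                  # otherwise scattered; direction change ignored in this sketch
          return alive / n_photons

      print(surviving_fraction(30.0))

    Feeding the surviving (and, in a full treatment, angularly redistributed) photon statistics into the QKD link budget is what yields the quantum bit error rate and sifted key rate quoted above.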

  17. Bone Research and Animal Support of Human Space Exploration: Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Morey-Holton, Emily R.

    2004-01-01

    NASA exploration goals include returning humans to the moon by 2015-2020 as a prelude to human exploration of Mars and beyond. The number of human flight subjects available during this very short time period is insufficient to solve high-risk problems without data from animals. This presentation will focus on three questions: What do we know? What do we need to know? Where do we go from here?: roles for animals in the exploration era. Answers to these questions are based on flight and ground-based models using humans and animals. First, what do we know? Adult humans have spent less than 1% of their lifespan in space while juvenile rats have spent almost 2%. This information suggests that our data are rather meager for projecting to a 30-month mission to Mars. The space platforms for humans have included Skylab, STS/MIR, and STS/ISS and for animals have included the unmanned Bion series and shuttle. The ground-based models include head-down bedrest in humans (BR) and hindlimb unloading in rodents (HU). We know that as gravity decreases, the impact forces generated by the body during locomotion decrease. For example, on Earth, your legs support approximately 1 body weight (BW) when standing, 1.33 BW when walking, and 3 BW when jogging. On Mars, the same activity would generate 0.38 BW standing, 0.5 BW walking, and 1 BW when jogging. In space, no impact load is generated, as gravity is minimal.

  18. Microwave ablation with multiple simultaneously powered small-gauge triaxial antennas: results from an in vivo swine liver model.

    PubMed

    Brace, Christopher L; Laeseke, Paul F; Sampson, Lisa A; Frey, Tina M; van der Weide, Daniel W; Lee, Fred T

    2007-07-01

    To prospectively investigate the ability of a single generator to power multiple small-diameter antennas and create large zones of ablation in an in vivo swine liver model. Thirteen female domestic swine (mean weight, 70 kg) were used for the study as approved by the animal care and use committee. A single generator was used to simultaneously power three triaxial antennas at 55 W per antenna for 10 minutes in three groups: a control group where antennas were spaced to eliminate ablation zone overlap (n=6; 18 individual zones of ablation) and experimental groups where antennas were spaced 2.5 cm (n=7) or 3.0 cm (n=5) apart. Animals were euthanized after ablation, and ablation zones were sectioned and measured. A mixed linear model was used to test for differences in size and circularity among groups. Mean (+/-standard deviation) cross-sectional areas of multiple-antenna zones of ablation at 2.5- and 3.0-cm spacing (26.6 cm(2) +/- 9.7 and 32.2 cm(2) +/- 8.1, respectively) were significantly larger than individual ablation zones created with single antennas (6.76 cm(2) +/- 2.8, P<.001) and were 31% (2.5-cm spacing group: multiple antenna mean area, 26.6 cm(2); 3 x single antenna mean area, 20.28 cm(2)) to 59% (3.0-cm spacing group: multiple antenna mean area, 32.2 cm(2); 3 x single antenna mean area, 20.28 cm(2)) larger than 3 times the mean area of the single-antenna zones. Zones of ablation were found to be very circular, and vessels as large as 1.1 cm were completely coagulated with multiple antennas. A single generator may effectively deliver microwave power to multiple antennas. Large volumes of tissue may be ablated and large vessels coagulated with multiple-antenna ablation in the same time as single-antenna ablation. (c) RSNA, 2007.

  19. Space Partitioning for Privacy Enabled 3D City Models

    NASA Astrophysics Data System (ADS)

    Filippovska, Y.; Wichmann, A.; Kada, M.

    2016-10-01

    Due to recent technological progress, data capturing and processing of highly detailed (3D) data has become extensive. And despite all prospects of potential uses, data that includes personal living spaces and public buildings can also be considered as a serious intrusion into people's privacy and a threat to security. It becomes especially critical if data is visible by the general public. Thus, a compromise is needed between open access to data and privacy requirements which can be very different for each application. As privacy is a complex and versatile topic, the focus of this work particularly lies on the visualization of 3D urban data sets. For the purpose of privacy enabled visualizations of 3D city models, we propose to partition the (living) spaces into privacy regions, each featuring its own level of anonymity. Within each region, the depicted 2D and 3D geometry and imagery is anonymized with cartographic generalization techniques. The underlying spatial partitioning is realized as a 2D map generated as a straight skeleton of the open space between buildings. The resulting privacy cells are then merged according to the privacy requirements associated with each building to form larger regions, their borderlines smoothed, and transition zones established between privacy regions to have a harmonious visual appearance. It is exemplarily demonstrated how the proposed method generates privacy enabled 3D city models.

  20. Transient Pressure Measurements in the Vaneless Space of a Francis Turbine during Load Acceptances from Minimum Load

    NASA Astrophysics Data System (ADS)

    Goyal, R.; Gandhi, B. K.; Cervantes, M. J.

    2018-06-01

    Increased penetration of solar and wind power compels the designers of hydroelectric power generation units to provide more operational flexibility for the stability of the grid. The power generating unit includes a turbine which needs to sustain sudden changes in its operating conditions. Thus, the hydraulic turbine experiences more transients per day, which result in chronic problems such as fatigue of the runner, instrument malfunction, vibration, wear and tear, etc. This paper describes experiments performed on a high-head model (1.5:1) Francis turbine for load acceptances from the minimum load. The experiments presented in the paper are part of the Francis-99 workshop, which aims to determine the performance of numerical models in simulations of a model Francis turbine under steady and transient operating conditions. The aim of the paper is to present the transient pressure variation in the vaneless space of a Francis turbine, where high-frequency pulsations are normally expected. For this, two pressure sensors, VL1 and VL2, are mounted in the vaneless space, one near the beginning of the spiral casing and the other before the end of the spiral casing. Both are used to capture the unsteady pressure field developed in the space between the guide vanes and the runner inlet. The time-resolved pressure signals are analyzed and presented during the transient to observe the pressure variation and the dominant frequencies of pulsations.
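
    The dominant pulsation frequencies are typically extracted from such time-resolved pressure signals by spectral analysis. A generic sketch of that step is shown below; the synthetic signal only mimics a blade-passing tone plus broadband noise, and the sampling rate, runner speed and blade count are made-up values, not Francis-99 quantities:

      import numpy as np

      fs = 5000.0                                    # sampling rate, Hz (illustrative)
      t = np.arange(0.0, 4.0, 1.0 / fs)
      runner_speed_hz = 5.5                          # made-up runner rotation frequency
      n_blades = 30                                  # made-up runner blade count
      blade_passing = n_blades * runner_speed_hz

      # synthetic vaneless-space pressure: mean + blade-passing tone + noise
      p = 120.0 + 2.0 * np.sin(2.0 * np.pi * blade_passing * t) \
                + 0.5 * np.random.default_rng(0).normal(size=t.size)

      spectrum = np.abs(np.fft.rfft(p - p.mean()))   # remove the mean before the FFT
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
      dominant = freqs[np.argmax(spectrum)]
      print(f"dominant pulsation frequency: {dominant:.1f} Hz "
            f"(blade passing set to {blade_passing:.1f} Hz)")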

  1. Examining, Documenting, and Modeling the Problem Space of a Variable Domain

    DTIC Science & Technology

    2002-06-14

    Methods of domain engineering referenced in the development of the proposed process include Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], and a family-oriented approach, with configuration knowledge captured using generators [2]; FODA itself is a domain analysis method.

  2. Space-Time Analysis of the Air Quality Model Evaluation International Initiative (AQMEII) Phase 1 Air Quality Simulations

    EPA Science Inventory

    This study presents an evaluation of summertime daily maximum ozone concentrations over North America (NA) and Europe (EU) using the database generated during Phase 1 of the Air Quality Model Evaluation International Initiative (AQMEII). The analysis focuses on identifying tempor...

  3. Simulator of Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Jennings, Esther; Gao, Jay; Segui, John; Kwong, Winston

    2005-01-01

    Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) is a suite of software tools that simulates the behaviors of communication networks to be used in space exploration and predicts the performance of established and emerging space communication protocols and services. MACHETE consists of four general software systems: (1) a system for kinematic modeling of planetary and spacecraft motions; (2) a system for characterizing the engineering impact on the bandwidth and reliability of deep-space and in-situ communication links; (3) a system for generating traffic loads and modeling of protocol behaviors and state machines; and (4) a user-interface system for performance-metric visualization. The kinematic-modeling system makes it possible to characterize space link connectivity effects, including occultations and signal losses arising from dynamic slant-range changes and antenna radiation patterns. The link-engineering system also accounts for antenna radiation patterns and other phenomena, including modulations, data rates, coding, noise, and multipath fading. The protocol system utilizes information from the kinematic-modeling and link-engineering systems to simulate operational scenarios of space missions and evaluate overall network performance. In addition, a Communications Effect Server (CES) interface for MACHETE has been developed to facilitate hybrid simulation of space communication networks with actual flight/ground software/hardware embedded in the overall system.

  4. NASA Office of Aeronautics and Space Technology Summer Workshop. Volume 4: Power technology panel

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Technology requirements in the areas of energy sources and conversion, power processing, distribution, conversion, and transmission, and energy storage are identified for space shuttle payloads. It is concluded that the power system technology currently available is adequate to accomplish all missions in the 1973 Mission Model, but that further development is needed to support space opportunities of the future as identified by users. Space experiments are proposed in the following areas: power generation in space, advanced photovoltaic energy converters, solar and nuclear thermoelectric technology, nickel-cadmium batteries, flywheels (mechanical storage), satellite-to-ground transmission and reconversion systems, and regenerative fuel cells.

  5. Simulation of Earthquake-Generated Sea-Surface Deformation

    NASA Astrophysics Data System (ADS)

    Vogl, Chris; Leveque, Randy

    2016-11-01

    Earthquake-generated tsunamis can carry with them a powerful, destructive force. One of the most well-known recent examples is the tsunami generated by the Tohoku earthquake, which was responsible for the nuclear disaster in Fukushima. Tsunami simulation and forecasting, a necessary element of emergency procedure planning and execution, is typically done using the shallow-water equations. A typical initial condition is the one given by the Okada solution for a homogeneous, elastic half-space. This work focuses on simulating earthquake-generated sea-surface deformations that are more true to the physics of the materials involved. In particular, a water layer is added on top of the half-space that models the seabed. Sea-surface deformations are then simulated using the Clawpack hyperbolic PDE package. Results from considering the water layer both as linearly elastic and as "nearly incompressible" are compared to those of the Okada solution.

  6. Automatic mathematical modeling for real time simulation system

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1988-01-01

    A methodology for automatic mathematical modeling and generating simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their model and also automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp machine. The program provides a very friendly and well-organized environment for engineers to build a knowledge base for base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. A future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.

  7. Use of open space box: supporting tele-medicine in space through efficient data transmission

    NASA Astrophysics Data System (ADS)

    Mohammad, Atif F.; Straub, Jeremy

    2015-05-01

    This paper presents a framework for a denormalized data ingestion and egress method which can be used among several types of in-space devices. Open Space Box is a novel model of communication that supports the data processing required to transform this data into products for utilization by the requesting stakeholders. One such set of data is the data that could be generated from a space-based 3D scanner. We provide an overview of 3D scanning technologies and discuss the storage/transmission needs and types of data generated by an optical 3D scanner developed and deployed at the University of North Dakota. Prospective usage patterns are discussed, as might be applicable to its use for regularly assessing astronauts' health and performing scientific experiments. Communication of this sort of data may be critical to the health of astronauts and to scientific mission goals and ongoing operations. Some of this data may be processed even before it is transmitted to the attending healthcare provider or requesting scientist. Given that data communications to and from a spacecraft or space station happen at a limited rate, and that 3D scanning can generate massive amounts of data, the receiving or egress application that prepares the data needs to be designed to transmit and receive data through a flexible protocol base.

  8. SubductionGenerator: A program to build three-dimensional plate configurations

    NASA Astrophysics Data System (ADS)

    Jadamec, M. A.; Kreylos, O.; Billen, M. I.; Turcotte, D. L.; Knepley, M.

    2016-12-01

    Geologic, geochemical, and geophysical data from subduction zones indicate that a two-dimensional paradigm for plate tectonic boundaries is no longer adequate to explain the observations. Many open source software packages exist to simulate the viscous flow of the Earth, such as the dynamics of subduction. However, there are few open source programs that generate the three-dimensional model input. We present an open source software program, SubductionGenerator, that constructs the three-dimensional initial thermal structure and plate boundary structure. A 3D model mesh and tectonic configuration are constructed based on a user-specified model domain, slab surface, seafloor age grid file, and shear zone surface. The initial 3D thermal structure for the plates and mantle within the model domain is then constructed using a series of libraries within the code that use a half-space cooling model, plate cooling model, and smoothing functions. The code maps the initial 3D thermal structure and the 3D plate interface onto the mesh nodes using a series of libraries including a k-d tree to increase efficiency. In this way, complicated geometries and multiple plates with variable thickness can be built onto a multi-resolution finite element mesh with a 3D thermal structure and 3D isotropic shear zones oriented at any angle with respect to the grid. SubductionGenerator is aimed at model set-ups more representative of the Earth, which can be particularly challenging to construct. Examples include subduction zones where the physical attributes vary in space, such as slab dip and temperature, and overriding plate temperature and thickness. Thus, the program can be used to construct initial tectonic configurations for triple junctions and plate boundary corners.
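
    One of the ingredients named above, the half-space cooling model, maps seafloor age to an initial plate thermal structure. A minimal sketch of that mapping follows; the surface and mantle temperatures and the thermal diffusivity are common textbook values assumed for illustration, not necessarily those used in SubductionGenerator:

      import numpy as np
      from math import erf

      def halfspace_temperature(depth_m, age_myr, t_surface=273.0,
                                t_mantle=1623.0, kappa=1.0e-6):
          """Half-space cooling model:
          T(z, t) = Ts + (Tm - Ts) * erf(z / (2*sqrt(kappa*t))).
          Parameter values are typical textbook choices, used for illustration."""
          age_s = age_myr * 3.156e13                  # Myr -> seconds
          return t_surface + (t_mantle - t_surface) * erf(
              depth_m / (2.0 * np.sqrt(kappa * age_s)))

      # Temperature profile for 50 Myr old oceanic lithosphere
      for z in (0.0, 10e3, 30e3, 60e3, 100e3):
          print(f"{z / 1e3:5.0f} km: {halfspace_temperature(z, 50.0):7.1f} K")

    Evaluating such a function at every mesh node beneath the seafloor-age grid, and blending it with a plate cooling model and shear-zone smoothing, is, in outline, the mapping step the program performs with the help of a k-d tree.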

  9. Lumped Model Generation and Evaluation: Sensitivity and Lie Algebraic Techniques with Applications to Combustion

    DTIC Science & Technology

    1989-03-03

    The work addresses global parameter-space mapping issues for first-order differential equations. Rigorous criteria for the existence of exact lumping by linear projective transformations were also established.

  10. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  11. Electromagnetic sinc Schell-model beams and their statistical properties.

    PubMed

    Mei, Zhangrong; Mao, Yonghua

    2014-09-22

    A class of electromagnetic sources with sinc Schell-model correlations is introduced. The conditions on source parameters guaranteeing that the source generates a physical beam are derived. The evolution behaviors of the statistical properties of the electromagnetic stochastic beams generated by this new source, propagating in free space and in atmospheric turbulence, are investigated with the help of the weighted superposition method and by numerical simulations. It is demonstrated that the intensity distributions of such beams exhibit unique features on propagation in free space and produce a double-layer flat-top profile that is shape-invariant in the far field. This feature makes this new beam particularly suitable for some special laser processing applications. The influences of atmospheric turbulence with a non-Kolmogorov power spectrum on the statistical properties of the new beams are analyzed in detail.

  12. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    NASA Astrophysics Data System (ADS)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

    The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements, and new applications that are not currently possible or likely even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics to equivalent continuous daily discharge time series statistics intended to support hydrologic applications using low-flow and annual flow duration statistics.
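
    A simplified version of the sampling question can be posed with synthetic data. The daily discharge series and the 21-day revisit pattern below are made up for illustration (they are not the SWOT orbit or Ohio River Basin data); the comparison shows how statistics from an irregularly sampled series can differ from those of the full daily record:

      import numpy as np

      rng = np.random.default_rng(7)

      # three years of synthetic daily discharge: seasonal cycle + lognormal storm noise
      days = np.arange(3 * 365)
      q_daily = 400.0 + 300.0 * np.sin(2.0 * np.pi * days / 365.0) \
                + rng.lognormal(4.0, 1.0, days.size)

      # hypothetical irregular revisit pattern: a few observation days per 21-day cycle
      cycle, obs_offsets = 21, np.array([2, 5, 13, 19])
      q_sampled = q_daily[np.isin(days % cycle, obs_offsets)]

      def stats(q):
          return {"mean": q.mean(),
                  "Q90 (low flow)": np.percentile(q, 10),   # flow exceeded 90% of the time
                  "Q10 (high flow)": np.percentile(q, 90)}

      print("daily  :", {k: round(v, 1) for k, v in stats(q_daily).items()})
      print("sampled:", {k: round(v, 1) for k, v in stats(q_sampled).items()})

    Relationships like those described above can then be fit between the subsampled and the continuous statistics, so that satellite-derived values can be translated into equivalents of the daily-record statistics.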

  13. Transitional flow in thin tubes for space station freedom radiator

    NASA Technical Reports Server (NTRS)

    Loney, Patrick; Ibrahim, Mounir

    1995-01-01

    A two dimensional finite volume method is used to predict the film coefficients in the transitional flow region (laminar or turbulent) for the radiator panel tubes. The code used to perform this analysis is CAST (Computer Aided Simulation of Turbulent Flows). The information gathered from this code is then used to augment a Sinda85 model that predicts overall performance of the radiator. A final comparison is drawn between the results generated with a Sinda85 model using the Sinda85 provided transition region heat transfer correlations and the Sinda85 model using the CAST generated data.

  15. Multistage Monte Carlo simulation of jet modification in a static medium

    DOE PAGES

    Cao, S.; Park, C.; Barbieri, R. A.; ...

    2017-08-22

    In this work, the modification of hard jets in an extended static medium held at a fixed temperature is studied using three different Monte Carlo event generators: linear Boltzmann transport (LBT), modular all twist transverse-scattering elastic-drag and radiation (MATTER), and modular algorithm for relativistic treatment of heavy-ion interactions (MARTINI). Each event generator contains a different set of assumptions regarding the energy and virtuality of the partons within a jet versus the energy scale of the medium and, hence, applies to a different epoch in the space-time history of the jet evolution. Here modeling is developed where a jet may sequentially transition from one generator to the next, on a parton-by-parton level, providing a detailed simulation of the space-time evolution of medium modified jets over a much broader dynamic range than has been attempted previously in a single calculation. Comparisons are carried out for different observables sensitive to jet quenching, including the parton fragmentation function and the azimuthal distribution of jet energy around the jet axis. The effect of varying the boundary between different generators is studied and a theoretically motivated criterion for the location of this boundary is proposed. Lastly, the importance of such an approach with coupled generators to the modeling of jet quenching is discussed.

  16. Estimating long-term behavior of periodically driven flows without trajectory integration

    NASA Astrophysics Data System (ADS)

    Froyland, Gary; Koltai, Péter

    2017-05-01

    Periodically driven flows are fundamental models of chaotic behavior and the study of their transport properties is an active area of research. A well-known analytic construction is the augmentation of phase space with an additional time dimension; in this augmented space, the flow becomes autonomous or time-independent. We prove several results concerning the connections between the original time-periodic representation and the time-extended representation, focusing on transport properties. In the deterministic setting, these include single-period outflows and time-asymptotic escape rates from time-parameterized families of sets. We also consider stochastic differential equations with time-periodic advection term. In this stochastic setting one has a time-periodic generator (the differential operator given by the right-hand-side of the corresponding time-periodic Fokker-Planck equation). We define in a natural way an autonomous generator corresponding to the flow on time-extended phase space. We prove relationships between these two generator representations and use these to quantify decay rates of observables and to determine time-periodic families of sets with slow escape rate. Finally, we use the generator on the time-extended phase space to create efficient numerical schemes to implement the various theoretical constructions. These ideas build on the work of Froyland et al (2013 SIAM J. Numer. Anal. 51 223-47), and no expensive time integration is required. We introduce an efficient new hybrid approach, which treats the space and time dimensions separately.
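
    The augmentation described above is easy to state concretely: a periodically forced system dx/dt = f(x, t) with period T becomes autonomous on the extended phase space (x, θ) by adding the trivial equation dθ/dt = 1 with θ taken modulo T. The sketch below uses a generic damped, forced oscillator rather than any example from the paper:

      import numpy as np

      T = 2.0 * np.pi                                  # forcing period

      def f(x, t):
          """Time-periodic vector field: damped, periodically forced oscillator."""
          return np.array([x[1], -x[0] - 0.1 * x[1] + 0.5 * np.cos(2.0 * np.pi * t / T)])

      def f_extended(z):
          """Autonomous vector field on the time-extended phase space z = (x, theta)."""
          x, theta = z[:-1], z[-1]
          return np.append(f(x, theta), 1.0)           # d(theta)/dt = 1

      def step_rk4(z, dt):
          k1 = f_extended(z)
          k2 = f_extended(z + 0.5 * dt * k1)
          k3 = f_extended(z + 0.5 * dt * k2)
          k4 = f_extended(z + dt * k3)
          z = z + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
          z[-1] %= T                                   # wrap the time coordinate onto [0, T)
          return z

      z = np.array([1.0, 0.0, 0.0])
      for _ in range(5000):
          z = step_rk4(z, 0.01)
      print("state on the extended phase space:", z)

    Transfer-operator or generator calculations can then be carried out for this autonomous system, which is the time-extended representation whose relationship to the original time-periodic one the paper makes precise.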

  17. Exact calculation of the time convolutionless master equation generator: Application to the nonequilibrium resonant level model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kidon, Lyran; The Sackler Center for Computational Molecular and Materials Science, Tel Aviv University, Tel Aviv 69978; Wilner, Eli Y.

    2015-12-21

    The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on the Nakajima–Zwanzig–Mori time-convolution (TC) and the other on the Tokuyama–Mori time-convolutionless (TCL) formulation, provide a starting point to describe the time-evolution of the reduced density matrix. A key in both approaches is to obtain the so-called “memory kernel” or “generator,” going beyond second or fourth order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform and thus, nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverse in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green’s function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.

  18. Modal Survey of ETM-3, A 5-Segment Derivative of the Space Shuttle Solid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Nielsen, D.; Townsend, J.; Kappus, K.; Driskill, T.; Torres, I.; Parks, R.

    2005-01-01

    The complex interactions between internal motor generated pressure oscillations and motor structural vibration modes associated with the static test configuration of a Reusable Solid Rocket Motor have potential to generate significant dynamic thrust loads in the 5-segment configuration (Engineering Test Motor 3). Finite element model load predictions for worst-case conditions were generated based on extrapolation of a previously correlated 4-segment motor model. A modal survey was performed on the largest rocket motor to date, Engineering Test Motor #3 (ETM-3), to provide data for finite element model correlation and validation of model generated design loads. The modal survey preparation included pretest analyses to determine an efficient analysis set selection using the Effective Independence Method and test simulations to assure critical test stand component loads did not exceed design limits. Historical Reusable Solid Rocket Motor modal testing, ETM-3 test analysis model development and pre-test loads analyses, as well as test execution, and a comparison of results to pre-test predictions are discussed.
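
    The pre-test analysis mentions selecting the analysis (sensor) set with the Effective Independence method; the sketch below shows a generic Effective-Independence ranking loop of the kind commonly used for that task, with a random mode-shape matrix standing in for the ETM-3 model, so it is only a conceptual illustration.

```python
import numpy as np

# Generic Effective-Independence (EfI) ranking sketch: iteratively discard the
# candidate sensor contributing least to the linear independence of the target
# mode shapes. The mode-shape matrix is random stand-in data, not ETM-3.

rng = np.random.default_rng(0)
n_candidates, n_modes, n_keep = 60, 6, 12
Phi = rng.standard_normal((n_candidates, n_modes))   # candidate DOFs x modes

keep = list(range(n_candidates))
while len(keep) > n_keep:
    P = Phi[keep]
    # EfI value of each remaining sensor: diagonal of P (P^T P)^-1 P^T
    efi = np.einsum('ij,jk,ik->i', P, np.linalg.inv(P.T @ P), P)
    keep.pop(int(np.argmin(efi)))                    # drop least informative DOF

print("selected sensor DOFs:", sorted(keep))
```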

  19. Optimized decoy state QKD for underwater free space communication

    NASA Astrophysics Data System (ADS)

    Lopes, Minal; Sarwade, Nisha

    Quantum cryptography (QC) is envisioned as a solution for global key distribution through fiber optic, free space and underwater optical communication due to its unconditional security. In view of this, this paper investigates an underwater free space quantum key distribution (QKD) model for enhanced transmission distance, secret key rates and security. It is reported that secure underwater free space QKD is feasible in the clearest ocean water with sifted key rates up to 207 kbps. This paper extends this work by testing the performance of an optimized decoy-state QKD protocol with the underwater free space communication model. The attenuation of photons, the quantum bit error rate and the sifted key generation rate of underwater quantum communication are obtained with vector radiative transfer theory and the Monte Carlo method. It is observed from the simulations that optimized decoy-state QKD evidently enhances the underwater secret key transmission distance as well as the secret key rates.

  20. Ultra-low current beams in UMER to model space-charge effects in high-energy proton and ion machines

    NASA Astrophysics Data System (ADS)

    Bernal, S.; Beaudoin, B.; Baumgartner, H.; Ehrenstein, S.; Haber, I.; Koeth, T.; Montgomery, E.; Ruisard, K.; Sutter, D.; Yun, D.; Kishek, R. A.

    2017-03-01

    The University of Maryland Electron Ring (UMER) has operated traditionally in the regime of strong space-charge dominated beam transport, but small-current beams are desirable to significantly reduce the direct (incoherent) space-charge tune shift as well as the tune depression. This regime is of interest to model space-charge effects in large proton and ion rings similar to those used in nuclear physics and spallation neutron sources, and also for nonlinear dynamics studies of lattices inspired by the Integrable Optics Test Accelerator (IOTA). We review the definitions of beam vs. space-charge intensities and discuss three methods for producing very small beam currents in UMER. We aim at generating 60 µA to 1.0 mA, 100 ns, 10 keV beams with normalized rms emittances of the order of 0.1-1.0 µm.

  1. Complexity and diversity.

    PubMed

    Doebeli, Michael; Ispolatov, Iaroslav

    2010-04-23

    The mechanisms for the origin and maintenance of biological diversity are not fully understood. It is known that frequency-dependent selection, generating advantages for rare types, can maintain genetic variation and lead to speciation, but in models with simple phenotypes (that is, low-dimensional phenotype spaces), frequency dependence needs to be strong to generate diversity. However, we show that if the ecological properties of an organism are determined by multiple traits with complex interactions, the conditions needed for frequency-dependent selection to generate diversity are relaxed to the point where they are easily satisfied in high-dimensional phenotype spaces. Mathematically, this phenomenon is reflected in properties of eigenvalues of quadratic forms. Because all living organisms have at least hundreds of phenotypes, this casts the potential importance of frequency dependence for the origin and maintenance of diversity in a new light.
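
    A loose numerical illustration of the eigenvalue argument (not the paper's exact model): draw random symmetric matrices standing in for the quadratic form that governs selection near a singular point, and ask how often at least one eigenvalue is positive, i.e. how often some phenotypic direction experiences diversifying selection. The stabilizing bias and matrix ensemble are arbitrary choices for the demonstration.

```python
import numpy as np

# Probability that a randomly drawn quadratic form has at least one positive
# eigenvalue, as a function of phenotype dimension. A negative shift on the
# diagonal biases low dimensions toward stability.

rng = np.random.default_rng(1)

def frac_with_positive_eigenvalue(dim, shift=-0.5, trials=2000):
    count = 0
    for _ in range(trials):
        a = rng.standard_normal((dim, dim))
        q = (a + a.T) / 2 + shift * np.eye(dim)
        if np.linalg.eigvalsh(q).max() > 0:
            count += 1
    return count / trials

for dim in (1, 2, 5, 10, 20):
    print(f"phenotype dimension {dim:3d}: "
          f"P(at least one positive eigenvalue) ~ {frac_with_positive_eigenvalue(dim):.2f}")
```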

  2. Thermal Analysis and Testing of Fastrac Gas Generator Design

    NASA Technical Reports Server (NTRS)

    Nguyen, H.

    1998-01-01

    The Fastrac Engine is being developed by the Marshall Space Flight Center (MSFC) to help meet the goal of substantially reducing the cost of access to space. This engine relies on a simple gas-generator cycle, which burns a small amount of RP-1 and oxygen to provide gas to drive the turbine and then exhausts the spent fuel. The Fastrac program envisions a combination of analysis, design and hot-fire evaluation testing. This paper provides the supporting thermal analysis of the gas generator design. In order to ensure that the design objectives were met, the evaluation tests have started on a component level, and a total of 15 tests of different durations have been completed to date at MSFC. The correlated thermal model results will also be compared against the hot-fire thermocouple data gathered.

  3. Aeroelastic modeling of the active flexible wing wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Heeg, Jennifer; Bennett, Robert M.

    1991-01-01

    The primary issues involved in the generation of linear, state-space equations of motion of a flexible wind tunnel model, the Active Flexible Wing (AFW), are discussed. The codes that were used and their inherent assumptions and limitations are also briefly discussed. The application of the CAP-TSD code to the AFW for determination of the model's transonic flutter boundary is included as well.

  4. Visualization of groundwater withdrawals

    USGS Publications Warehouse

    Winston, Richard B.; Goode, Daniel J.

    2017-12-21

    Generating an informative display of groundwater withdrawals can sometimes be difficult because the symbols for closely spaced wells can overlap. An alternative method for displaying groundwater withdrawals is to generate a “footprint” of the withdrawals. WellFootprint version 1.0 implements the Footprint algorithm with two optional variations that can speed up the footprint calculation. ModelMuse has been modified in order to generate the input for WellFootprint and to read and graphically display the output from WellFootprint.

  5. Human Modeling for Ground Processing Human Factors Engineering Analysis

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the plans for human modeling for future spacecraft designs.

  6. Reducing a Knowledge-Base Search Space When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
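
    A quick illustration of the combinatorial growth described above, together with the kind of priority-based pruning a scenario generator can apply. The rule names and priorities below are made up purely for the example; they are not the software's actual heuristics.

```python
from itertools import combinations

n_rules = 16
print("unpruned scenario count:", 1 + 2 ** n_rules)      # 65,537 for 16 rules

# Hypothetical pruning: build scenarios only from the k highest-priority rules
# whose inputs are missing, instead of from all of them.
missing = {"rule_%d" % i: p for i, p in
           enumerate([0.9, 0.1, 0.7, 0.4, 0.8, 0.2, 0.6, 0.3])}
k = 3
top = sorted(missing, key=missing.get, reverse=True)[:k]
scenarios = [()] + [c for r in range(1, k + 1) for c in combinations(top, r)]
print("pruned scenario count:", len(scenarios))
print("scenarios:", scenarios)
```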

  7. Community Coordinated Modeling Center (CCMC): Using innovative tools and services to support worldwide space weather scientific communities and networks

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M.; Bakshi, S.; Berrios, D.; Chulaki, A.; Evans, R. M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Maddox, M. M.; Mays, M. L.; Mullinix, R. E.; Ngwira, C. M.; Patel, K.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.

    2012-12-01

    Community Coordinated Modeling Center (CCMC) was established to enhance basic solar terrestrial research and to aid in the development of models for specifying and forecasting conditions in the space environment. In achieving this goal, CCMC has developed and provides a set of innovative tools ranging from the Integrated Space Weather Analysis (iSWA) web-based dissemination system for space weather information, the Runs-On-Request System providing access to a unique collection of state-of-the-art solar and space physics models (unmatched anywhere in the world), Advanced Online Visualization and Analysis tools for more accurate interpretation of model results, and standard data formats for simulation data downloads, to, recently, mobile apps (iPhone/Android) that let the scientific community view space weather data anywhere. The number of runs requested and the number of resulting scientific publications and presentations from the research community have not only been an indication of the broad scientific usage of the CCMC and effective participation by space scientists and researchers, but also reflect active collaboration and coordination amongst the space weather research community. Arising from the course of CCMC activities, CCMC also supports community-wide model validation challenges and research focus group projects for a broad range of programs such as the multi-agency National Space Weather Program, NSF's CEDAR (Coupling, Energetics and Dynamics of Atmospheric Regions), GEM (Geospace Environment Modeling) and SHINE (Solar, Heliospheric, and INterplanetary Environment) programs. In addition to performing research and model development, CCMC also supports space science education by hosting summer students through local universities; through the provision of simulations in support of classroom programs such as the Heliophysics Summer School (with student research contest) and CCMC Workshops; by training the next generation of junior scientists in space weather forecasting; and by educating the general public about the importance and impacts of space weather effects. Although CCMC is organizationally composed of United States federal agencies, CCMC services are open to members of the international science community, and CCMC encourages interagency and international collaboration. In this poster, we provide an overview of using Community Coordinated Modeling Center (CCMC) tools and services to support worldwide space weather scientific communities and networks.

  8. Ames Culture Chamber System: Enabling Model Organism Research Aboard the international Space Station

    NASA Technical Reports Server (NTRS)

    Steele, Marianne

    2014-01-01

    Understanding the genetic, physiological, and behavioral effects of spaceflight on living organisms and elucidating the molecular mechanisms that underlie these effects are high priorities for NASA. Certain organisms, known as model organisms, are widely studied to help researchers better understand how all biological systems function. Small model organisms such as nematodes, slime mold, bacteria, green algae, yeast, and moss can be used to study the effects of micro- and reduced gravity at both the cellular and systems level over multiple generations. Many model organisms have sequenced genomes and published data sets on their transcriptomes and proteomes that enable scientific investigations of the molecular mechanisms underlying the adaptations of these organisms to space flight.

  9. Modeling and testing of a tube-in-tube separation mechanism of bodies in space

    NASA Astrophysics Data System (ADS)

    Michaels, Dan; Gany, Alon

    2016-12-01

    A tube-in-tube concept for separation of bodies in space was investigated theoretically and experimentally. The separation system is based on generation of high pressure gas by combustion of solid propellant and restricting the expansion of the gas only by ejecting the two bodies in opposite directions, in a fashion that maximizes the generated impulse. An interior ballistics model was developed in order to investigate the potential benefits of the separation system for a large range of space body masses and for different design parameters such as geometry and propellant. The model takes into account solid propellant combustion, heat losses, and gas phase chemical reactions. The model shows that for large bodies (above 100 kg) and typical separation velocities of 5 m/s, the proposed separation mechanism may be characterized by a specific impulse of 25,000 s, two orders of magnitude larger than that of conventional solid rockets. It means that the proposed separation system requires only 1% of the propellant mass that would be needed for a conventional rocket for the same mission. Since many existing launch vehicles obtain such separation velocities by using conventional solid rocket motors (retro-rockets), the implementation of the new separation system design can dramatically reduce the mass of the separation system and increase safety. A dedicated experimental setup was built in order to demonstrate the concept and validate the model. The experimental results revealed specific impulse values of up to 27,000 s and showed good correspondence with the model.
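
    A back-of-envelope check of the quoted figures, using the usual definition of effective specific impulse, Isp = impulse / (propellant mass × g0). The body mass and separation velocity below simply follow the example in the text; the comparison Isp of 250 s for a conventional solid motor is a typical round number, not a value from the paper.

```python
# Effective specific impulse check: Isp = I / (m_prop * g0).

g0 = 9.80665                 # m/s^2
m_body = 100.0               # kg, separated body (example from the text)
dv = 5.0                     # m/s, separation velocity
impulse = m_body * dv        # N*s delivered to one body

isp_eff = 25000.0            # s, effective Isp quoted for the tube-in-tube system
m_prop = impulse / (isp_eff * g0)
print(f"impulse = {impulse:.0f} N*s, tube-in-tube propellant ~ {m_prop*1000:.1f} g")

# A conventional solid retro-rocket at Isp ~ 250 s needs ~100x more propellant:
print(f"conventional rocket propellant ~ {impulse / (250.0 * g0) * 1000:.0f} g")
```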

  10. Modeling Charge Collection in Detector Arrays

    NASA Technical Reports Server (NTRS)

    Hardage, Donna (Technical Monitor); Pickel, J. C.

    2003-01-01

    A detector array charge collection model has been developed for use as an engineering tool to aid in the design of optical sensor missions for operation in the space radiation environment. This model is an enhancement of the prototype array charge collection model that was developed for the Next Generation Space Telescope (NGST) program. The primary enhancements were accounting for drift-assisted diffusion by Monte Carlo modeling techniques and implementing the modeling approaches in a Windows-based code. The modeling is concerned with integrated charge collection within discrete pixels in the focal plane array (FPA), with high fidelity spatial resolution. It is applicable to all detector geometries including monolithic charge coupled devices (CCDs), Active Pixel Sensors (APS) and hybrid FPA geometries based on a detector array bump-bonded to a readout integrated circuit (ROIC).
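
    A toy Monte Carlo sketch of the kind of pixel-level charge collection the abstract describes: carriers generated at an ion-strike location spread laterally while being collected and are binned into discrete pixels. The geometry, diffusion width, and carrier count are illustrative values, not the NGST/FPA model parameters.

```python
import numpy as np

# Toy charge-collection Monte Carlo: Gaussian lateral spread of carriers from a
# single strike, binned into a pixel grid.

rng = np.random.default_rng(42)
pitch = 18.0                  # pixel pitch, microns (illustrative)
n_pix = 7                     # 7x7 neighborhood around the strike
n_carriers = 20000
strike_xy = np.array([3.2, -5.1])       # strike position, microns from center
sigma_diff = 6.0                        # lateral diffusion sigma, microns

xy = strike_xy + sigma_diff * rng.standard_normal((n_carriers, 2))
edges = (np.arange(n_pix + 1) - n_pix / 2) * pitch
hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[edges, edges])

print("fraction of charge per pixel (center 3x3):")
print(np.round(hist[2:5, 2:5] / n_carriers, 3))
```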

  11. Modeling aspects of the surface reconstruction problem

    NASA Astrophysics Data System (ADS)

    Toth, Charles K.; Melykuti, Gabor

    1994-08-01

    The ultimate goal of digital photogrammetry is to automatically produce digital maps which may in turn form the basis of GIS. Virtually all work in surface reconstruction deals with various kinds of approximations and constraints that are applied. In this paper we extend these concepts in various ways. For one, matching is performed in object space. Thus, matching and densification (modeling) are performed in the same reference system. Another extension concerns the solution of the second sub-problem. Rather than simply densifying (interpolating) the surface, we propose to model it. This combined top-down and bottom-up approach is performed in scale space, whereby the model is refined until compatibility between the data and expectations is reached. The paper focuses on the modeling aspects of the surface reconstruction problem. Obviously, the top-down and bottom-up model descriptions ought to be in a form which allows the generation and verification of hypotheses. Another crucial question is the degree of a priori scene knowledge necessary to constrain the solution space.

  12. Natural Atmospheric Environment Model Development for the National Aeronautics and Space Administration's Second Generation Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Roberts, Barry C.; Leahy, Frank; Overbey, Glenn; Batts, Glen W.; Parker, Nelson (Technical Monitor)

    2002-01-01

    The National Aeronautics and Space Administration (NASA) recently began development of a new reusable launch vehicle. The program office is located at Marshall Space Flight Center (MSFC) and is called the Second Generation Reusable Launch Vehicle (2GRLV). The purpose of the program is to improve upon the safety and reliability of the first generation reusable launch vehicle, the Space Shuttle. Specifically, the goals are to reduce the risk of crew loss to less than 1-in-10,000 missions and to decrease costs by a factor of 10 to approximately $1,000 per pound of payload launched to low Earth orbit. The program is currently in the very early stages of development and many two-stage vehicle concepts will be evaluated. Risk reduction activities are also taking place. These activities include developing new technologies and advancing current technologies to be used by the vehicle. The Environments Group at MSFC is tasked by the 2GRLV Program to develop and maintain an extensive series of analytical tools and environmental databases which enable it to provide detailed atmospheric studies in support of structural, guidance, navigation and control, and operation of the 2GRLV.

  13. Techniques for modeling the reliability of fault-tolerant systems with the Markov state-space approach

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1995-01-01

    This paper presents a step-by-step tutorial of the methods and the tools that were used for the reliability analysis of fault-tolerant systems. The approach used in this paper is the Markov (or semi-Markov) state-space method. The paper is intended for design engineers with a basic understanding of computer architecture and fault tolerance, but little knowledge of reliability modeling. The representation of architectural features in mathematical models is emphasized. This paper does not present details of the mathematical solution of complex reliability models. Instead, it describes the use of several recently developed computer programs (SURE, ASSIST, STEM, and PAWS) that automate the generation and the solution of these models.
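
    A minimal sketch of the kind of Markov reliability model such tools solve (a generic textbook-style example, not the SURE/ASSIST input format): a dual-redundant unit with a per-unit failure rate and imperfect reconfiguration coverage, solved by matrix exponentiation. The rates and coverage value are illustrative.

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = both units good, 1 = one unit good, 2 = system failed (absorbing).
lam = 1e-4        # failures per hour, per unit (illustrative)
c = 0.99          # probability a first failure is detected and reconfigured

# Generator matrix Q (row convention: dP/dt = P @ Q, rows sum to zero).
Q = np.array([
    [-2 * lam,  2 * lam * c,  2 * lam * (1 - c)],
    [0.0,      -lam,          lam              ],
    [0.0,       0.0,          0.0              ],
])

P0 = np.array([1.0, 0.0, 0.0])
for t in (1.0, 10.0, 100.0):                 # mission times, hours
    P = P0 @ expm(Q * t)
    print(f"t = {t:6.1f} h  P(system failure) = {P[2]:.3e}")
```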

  14. A Data Management System for International Space Station Simulation Tools

    NASA Technical Reports Server (NTRS)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  15. Space shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1980-01-01

    The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.
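
    A sketch of one common nonrecursive route to synthetic gusts of the kind described: shape complex white noise in the frequency domain by the square root of a target spectrum and inverse-transform. The spectral form used below is the standard von Karman longitudinal form; the intensity, length scale, airspeed, and sampling are illustrative values rather than the SSTT settings, and the series is simply rescaled to the target intensity instead of carrying the exact PSD normalization.

```python
import numpy as np

rng = np.random.default_rng(7)
n, dt = 2 ** 14, 0.05                  # samples, time step (s)
V = 100.0                              # airspeed (m/s), maps time to distance
sigma, L = 1.5, 533.0                  # gust intensity (m/s), length scale (m)

f = np.fft.rfftfreq(n, dt)             # temporal frequency (Hz)
omega = 2 * np.pi * f / V              # spatial frequency (rad/m)
# von Karman longitudinal spectral shape
phi = sigma**2 * (2 * L / np.pi) / (1 + (1.339 * L * omega) ** 2) ** (5 / 6)

amp = np.sqrt(phi)
amp[0] = 0.0                           # remove the mean component
phases = np.exp(1j * rng.uniform(0.0, 2 * np.pi, amp.size))
gust = np.fft.irfft(amp * phases, n)
gust *= sigma / gust.std()             # rescale to the target gust intensity

print("gust std (m/s):", round(float(gust.std()), 3))
print("first samples:", np.round(gust[:5], 2))
```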

  16. Models for Multimegawatt Space Power Systems

    DTIC Science & Technology

    1990-06-01

    devices such as batteries, flywheels, and large, cryogenic inductors. Turbines with generators, thermionics, thermoelectrics, alkali metal...

  17. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.

  18. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  19. Trade-Space Analysis Tool for Constellations (TAT-C)

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori Science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: Handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships, Explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics, and Optimize cost and performance across multiple instruments and platforms vs. one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as Ad-Hoc constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.

  20. Trade-space Analysis for Constellations

    NASA Astrophysics Data System (ADS)

    Le Moigne, J.; Dabney, P.; de Weck, O. L.; Foreman, V.; Grogan, P.; Holland, M. P.; Hughes, S. P.; Nag, S.

    2016-12-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori Science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: "How many spacecraft should be included in the constellation? Which design has the best cost/risk value?" The main goals of TAT-C are to: Handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships, Explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics, and Optimize cost and performance across multiple instruments and platforms vs. one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as Ad-Hoc constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.

  1. Use of animal models for space flight physiology studies, with special focus on the immune system

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, Gerald

    2005-01-01

    Animal models have been used to study the effects of space flight on physiological systems. The animal models have been used because of the limited availability of human subjects for studies to be carried out in space as well as because of the need to carry out experiments requiring samples and experimental conditions that cannot be performed using humans. Experiments have been carried out in space using a variety of species, and included developmental biology studies. These species included rats, mice, non-human primates, fish, invertebrates, amphibians and insects. The species were chosen because they best fit the experimental conditions required for the experiments. Experiments with animals have also been carried out utilizing ground-based models that simulate some of the effects of exposure to space flight conditions. Most of the animal studies have generated results that parallel the effects of space flight on human physiological systems. Systems studied have included the neurovestibular system, the musculoskeletal system, the immune system, the neurological system, the hematological system, and the cardiovascular system. Hindlimb unloading, a ground-based model of some of the effects of space flight on the immune system, has been used to study the effects of space flight conditions on physiological parameters. For the immune system, exposure to hindlimb unloading has been shown to result in alterations of the immune system similar to those observed after space flight. This has permitted the development of experiments that demonstrated compromised resistance to infection in rodents maintained in the hindlimb unloading model as well as the beginning of studies to develop countermeasures to ameliorate or prevent such occurrences. Although there are limitations to the use of animal models for the effects of space flight on physiological systems, the animal models should prove very valuable in designing countermeasures for exploration class missions of the future.

  2. Infrared radiation scene generation of stars and planets in celestial background

    NASA Astrophysics Data System (ADS)

    Guo, Feng; Hong, Yaohui; Xu, Xiaojian

    2014-10-01

    An infrared (IR) radiation generation model of stars and planets in celestial background is proposed in this paper. Cohen's spectral template [1] is modified for high spectral resolution and accuracy. Based on the improved spectral template for stars and the blackbody assumption for planets, an IR radiation model is developed which is able to generate the celestial IR background for stars and planets appearing in the sensor's field of view (FOV) for specified observing date and time, location, viewpoint and spectral band over 1.2 μm to 35 μm. In the current model, the initial locations of stars are calculated based on the Midcourse Space Experiment (MSX) IR astronomical catalogue (MSX-IRAC) [2], while the initial locations of planets are calculated using the secular variations of the planetary orbits (VSOP) theory. Simulation results show that the new IR radiation model has higher resolution and accuracy than common models.
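
    Since the planet component of the model rests on the blackbody assumption, a small sketch of the corresponding calculation is shown below: integrating Planck's spectral radiance over a sensor band to obtain in-band radiance. The effective temperature and band edges are illustrative values, not taken from the paper.

```python
import numpy as np

h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23   # SI constants

def planck_radiance(wavelength_m, T):
    """Spectral radiance B_lambda in W m^-2 sr^-1 m^-1 (Planck's law)."""
    x = h * c / (wavelength_m * k * T)
    return 2 * h * c**2 / wavelength_m**5 / np.expm1(x)

T_planet = 210.0                            # K, illustrative effective temperature
lam = np.linspace(8e-6, 12e-6, 2000)        # hypothetical 8-12 micron sensor band
radiance = np.trapz(planck_radiance(lam, T_planet), lam)
print(f"in-band radiance (8-12 um, {T_planet:.0f} K): {radiance:.2f} W m^-2 sr^-1")
```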

  3. Combustion Stability Analyses for J-2X Gas Generator Development

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.; Protz, C. S.; Casiano, M. J.; Kenny, R. J.

    2010-01-01

    The National Aeronautics and Space Administration (NASA) is developing a liquid oxygen/liquid hydrogen rocket engine for upper stage and trans-lunar applications of the Ares vehicles for the Constellation program. This engine, designated the J-2X, is a higher pressure, higher thrust variant of the Apollo-era J-2 engine. Development was contracted to Pratt & Whitney Rocketdyne in 2006. Over the past several years, development of the gas generator for the J-2X engine has progressed through a variety of workhorse injector, chamber, and feed system configurations. Several of these configurations have resulted in injection-coupled combustion instability of the gas generator assembly at the first longitudinal mode of the combustion chamber. In this paper, the longitudinal mode combustion instabilities observed on the workhorse test stand are discussed in detail. Aspects of this combustion instability have been modeled at the NASA Marshall Space Flight Center with several codes, including the Rocket Combustor Interaction Design and Analysis (ROCCID) code and a new lumped-parameter MatLab model. To accurately predict the instability characteristics of all the chamber and injector geometries and test conditions, several features of the submodels in the ROCCID suite of calculations required modification. Finite-element analyses were conducted of several complicated combustion chamber geometries to determine how to model and anchor the chamber response in ROCCID. A large suite of sensitivity calculations were conducted to determine how to model and anchor the injector response in ROCCID. These modifications and their ramification for future stability analyses of this type are discussed in detail. The lumped-parameter MatLab model of the gas generator assembly was created as an alternative calculation to the ROCCID methodology. This paper also describes this model and the stability calculations.

  4. Slow Dynamics Model of Compressed Air Energy Storage and Battery Storage Technologies for Automatic Generation Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Venkat; Das, Trishna

    Increasing variable generation penetration and the consequent increase in short-term variability make energy storage technologies look attractive, especially in the ancillary market for providing frequency regulation services. This paper presents a slow dynamics model for compressed air energy storage and battery storage technologies that can be used in automatic generation control studies to assess the system frequency response and quantify the benefits from storage technologies in providing regulation service. The paper also expresses the slow dynamics model of the power system integrated with storage technologies in a complete state-space form. The storage technologies have been integrated into the single-area IEEE 24-bus system, and a comparative study of various solution strategies including transmission enhancement and combustion turbine has been performed in terms of generation cycling and frequency response performance metrics.
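
    A toy single-area frequency-response model in state-space form with a fast storage device providing regulation, in the spirit of the study described above. This is a generic textbook-style sketch with illustrative constants; it is not the paper's CAES or battery models or the IEEE 24-bus system.

```python
import numpy as np

# States: x = [delta_f, delta_Pm, delta_Ps] (per unit)
#   M  d(delta_f)/dt  = delta_Pm + delta_Ps - delta_PL - D*delta_f
#   Tg d(delta_Pm)/dt = -delta_Pm - delta_f/R
#   Ts d(delta_Ps)/dt = -delta_Ps - Ks*delta_f

M, D, R, Tg = 10.0, 1.0, 0.05, 0.5       # inertia, damping, droop, governor lag
Ts, Ks = 0.1, 5.0                         # storage response lag and gain
A = np.array([[-D / M,         1 / M,   1 / M],
              [-1 / (R * Tg), -1 / Tg,  0.0  ],
              [-Ks / Ts,       0.0,    -1 / Ts]])
B = np.array([-1 / M, 0.0, 0.0])          # load-disturbance input

dt, steps = 0.01, 3000
x = np.zeros(3)
dPL = 0.1                                  # 0.1 pu step load increase
nadir = 0.0
for _ in range(steps):                     # simple forward-Euler integration
    x = x + dt * (A @ x + B * dPL)
    nadir = min(nadir, x[0])
print(f"frequency nadir: {nadir:.4f} pu with storage gain Ks = {Ks}")
```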

  5. An analysis of the massless planet approximation in transit light curve models

    NASA Astrophysics Data System (ADS)

    Millholland, Sarah; Ruch, Gerry

    2015-08-01

    Many extrasolar planet transit light curve models use the approximation of a massless planet. They approximate the planet as orbiting elliptically with the host star at the orbit’s focus instead of depicting the planet and star as both orbiting around a common center of mass. This approximation should generally be very good because the transit is a small fraction of the full-phase curve and the planet to stellar mass ratio is typically very small. However, to fully examine the legitimacy of this approximation, it is useful to perform a robust, all-parameter space-encompassing statistical comparison between the massless planet model and the more accurate model.Towards this goal, we establish two questions: (1) In what parameter domain is the approximation invalid? (2) If characterizing an exoplanetary system in this domain, what is the error of the parameter estimates when using the simplified model? We first address question (1). Given each parameter vector in a finite space, we can generate the simplified and more complete model curves. Associated with these model curves is a measure of the deviation between them, such as the root mean square (RMS). We use Gibbs sampling to generate a sample that is distributed according to the RMS surface. The high-density regions in the sample correspond to a large deviation between the models. To determine the domains of these high-density areas, we first employ the Ordering Points to Identify the Clustering Structure (OPTICS) algorithm. We then characterize the subclusters by performing the Patient Rule Induction Method (PRIM) on the transformed Principal Component spaces of each cluster. This process yields descriptors of the parameter domains with large discrepancies between the models.To consider question (2), we start by generating synthetic transit curve observations in the domains specified by the above analysis. We then derive the best-fit parameters of these synthetic light curves according to each model and examine the quality of agreement between the estimated parameters. Taken as a whole, these steps allow for a thorough analysis of the validity of the massless planet approximation.
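
    A quick physical indicator related to the question above (not the paper's Gibbs/OPTICS/PRIM machinery): in the two-body problem the star itself orbits the barycenter with semi-major axis a·m_p/(M_*+m_p), and when that displacement becomes a noticeable fraction of the stellar radius the fixed-star approximation deserves more scrutiny. The example masses and orbital distance are illustrative.

```python
# Stellar barycentric displacement compared to the stellar radius.

M_sun, M_jup = 1.989e30, 1.898e27        # kg
R_sun, AU = 6.957e8, 1.496e11            # m

for name, m_planet, a_au in [("hot Jupiter", 1 * M_jup, 0.05),
                             ("brown dwarf", 40 * M_jup, 0.05)]:
    a_star = a_au * AU * m_planet / (M_sun + m_planet)
    print(f"{name:12s}: stellar wobble = {a_star / R_sun:.4f} R_sun")
```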

  6. Generation of various partially coherent beams and their propagation properties in turbulent atmosphere: a review

    NASA Astrophysics Data System (ADS)

    Cai, Yangjian

    2011-03-01

    Partially coherent beams, such as the Gaussian Schell-model beam, the partially coherent dark hollow beam, the partially coherent flat-topped beam and the electromagnetic Gaussian Schell-model beam, have important applications in free space optical communications, optical imaging, optical trapping, inertial confinement fusion and nonlinear optics. In this paper, the experimental generation of various partially coherent beams is introduced. Furthermore, with the help of a tensor method, analytical formulae for such beams propagating in turbulent atmosphere are derived, and the propagation properties of such beams in turbulent atmosphere are reviewed.
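
    A small sketch of the defining structure of a one-dimensional Gaussian Schell-model source, the simplest of the beam classes listed above: a cross-spectral density equal to the geometric mean of the intensity profile times a degree of coherence that depends only on the point separation. The beam width and coherence width below are illustrative values.

```python
import numpy as np

# W(x1, x2) = sqrt(S(x1) S(x2)) * mu(x1 - x2), Gaussian S and Gaussian mu.
sigma_I = 1.0e-3      # rms intensity width, m (illustrative)
sigma_mu = 0.2e-3     # transverse coherence width, m (illustrative)

def csd(x1, x2):
    s = np.exp(-(x1**2 + x2**2) / (4 * sigma_I**2))    # sqrt(S(x1) S(x2))
    mu = np.exp(-(x1 - x2)**2 / (2 * sigma_mu**2))     # degree of coherence
    return s * mu

x = np.linspace(-3e-3, 3e-3, 201)
W = csd(x[:, None], x[None, :])
print("cross-spectral density matrix shape:", W.shape)

d = sigma_mu
mu_d = csd(d / 2, -d / 2) / np.sqrt(csd(d / 2, d / 2) * csd(-d / 2, -d / 2))
print("degree of coherence at separation sigma_mu:", round(float(mu_d), 3))
```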

  7. Cyber threat impact assessment and analysis for space vehicle architectures

    NASA Astrophysics Data System (ADS)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner to cause physical damage to the space platforms. Similar attacks on Earth-borne cyber physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits. These have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.
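
    A minimal SimPy sketch in the spirit of the bus-level impact modeling described above: an injected attack intermittently corrupts attitude-control commands, and a monitor accumulates time spent outside a pointing budget. Everything here (process structure, rates, thresholds) is hypothetical and is not the authors' model or the STK/SOLIS interface.

```python
import random
import simpy

POINTING_BUDGET_DEG = 0.5

class Spacecraft:
    def __init__(self):
        self.pointing_error = 0.05       # deg
        self.time_out_of_spec = 0.0      # s

def attitude_control(env, sc, compromised):
    while True:
        yield env.timeout(1.0)           # control cycle, s
        if compromised[0] and random.random() < 0.3:
            sc.pointing_error += random.uniform(0.1, 0.4)   # corrupted command
        else:
            sc.pointing_error = max(0.05, sc.pointing_error * 0.7)  # nominal damping

def attack(env, compromised):
    yield env.timeout(100.0)             # attack starts at t = 100 s
    compromised[0] = True

def monitor(env, sc):
    while True:
        yield env.timeout(1.0)
        if sc.pointing_error > POINTING_BUDGET_DEG:
            sc.time_out_of_spec += 1.0

env = simpy.Environment()
sc = Spacecraft()
compromised = [False]
env.process(attitude_control(env, sc, compromised))
env.process(attack(env, compromised))
env.process(monitor(env, sc))
env.run(until=600.0)
print(f"time outside pointing budget: {sc.time_out_of_spec:.0f} s of 600 s")
```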

  8. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along their own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications where the thermal modeling is "in the loop" with sensitivity analysis, optimization, and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST), are presented.
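
    A small sketch of a steady-state linear conduction network solve of the kind described above: assemble nodal conductances from element conductances, apply heat loads, fix boundary temperatures, and solve G T = Q. The rod geometry, material property, and load are a toy example, not an IMOS model.

```python
import numpy as np

n_nodes = 6
k, area, dx = 200.0, 1e-4, 0.05        # W/m-K (aluminum-like), m^2, m
g_elem = k * area / dx                  # conductance of each rod element, W/K

# Assemble the nodal conductance matrix for a chain of conduction elements.
G = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes - 1):
    G[i, i] += g_elem
    G[i + 1, i + 1] += g_elem
    G[i, i + 1] -= g_elem
    G[i + 1, i] -= g_elem

Q = np.zeros(n_nodes)
Q[3] = 2.0                              # 2 W dissipated at node 3

fixed = {0: 290.0, n_nodes - 1: 300.0}  # boundary temperatures, K
free = [i for i in range(n_nodes) if i not in fixed]
T = np.zeros(n_nodes)
for i, temp in fixed.items():
    T[i] = temp

# Eliminate fixed nodes: G_ff T_f = Q_f - G_fb T_b, then solve for free nodes.
G_ff = G[np.ix_(free, free)]
rhs = Q[free] - G[np.ix_(free, list(fixed))] @ np.array(list(fixed.values()))
T[free] = np.linalg.solve(G_ff, rhs)
print("nodal temperatures (K):", np.round(T, 2))
```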

  9. Dynamic wave field synthesis: enabling the generation of field distributions with a large space-bandwidth product.

    PubMed

    Kamau, Edwin N; Heine, Julian; Falldorf, Claas; Bergmann, Ralf B

    2015-11-02

    We present a novel approach for the design and fabrication of multiplexed computer generated volume holograms (CGVH) which allow for a dynamic synthesis of arbitrary wave field distributions. To achieve this goal, we developed a hybrid system that consists of a CGVH as a static element and an electronically addressed spatial light modulator as the dynamic element. We thereby derived a new model for describing the scattering process within the inhomogeneous dielectric material of the hologram. This model is based on the linearization of the scattering process within the Rytov approximation and incorporates physical constraints that account for voxel based laser-lithography using micro-fabrication of the holograms in a nonlinear optical material. In this article we demonstrate that this system basically facilitates a high angular Bragg selectivity on the order of 1°. Additionally, it allows for dynamic synthesis of predefined wave fields with qualitatively low cross-talk and a much larger space-bandwidth product (SBWP ≥ 8.7 × 10^6) compared to the current state of the art in computer-generated holography.

  10. X-Ray Emission from the Terrestrial Magnetosheath

    NASA Astrophysics Data System (ADS)

    Robertson, I. P.; Collier, M. R.; Cravens, T. E.; Fok, M.

    2004-12-01

    X-rays are generated throughout the terrestrial magnetosheath as a consequence of charge transfer collisions between heavy solar wind ions and geocoronal neutrals. The solar wind ions resulting from these collisions are left in highly excited states and emit extreme ultraviolet or soft X-ray photons. A model has been created to simulate this X-ray radiation. Previously simulated images were created as seen from an observation point outside the geocorona. The locations of the bow shock and magnetopause were evident in these images. The cusps, however, were not taken into account in the model. We have now used dynamic three-dimensional simulations of the solar wind, magnetosheath and magnetosphere that were performed by the CCMC at Goddard Space Flight Center for the March 31, 2001 geomagnetic storm. We have generated a sky map of the expected X-ray emissions as would have been seen by an observer at the IMAGE spacecraft location at that time. We have also generated images as seen from an observation point well outside the geocorona. In both cases the presence of the cusps can clearly be observed.
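
    A sketch of the basic proportionality behind such images: the charge-exchange volume emission scales as the product of solar wind heavy-ion density, neutral density, relative speed, cross-section, and photon yield, integrated along the line of sight. The geocoronal profile, solar wind parameters, cross-section, and yield below are illustrative round numbers, not values from the CCMC simulation.

```python
import numpy as np

R_E = 6.371e6                              # m
n_sw = 5e6                                 # solar wind proton density, m^-3
heavy_frac = 1e-3                          # heavy-ion fraction (illustrative)
v_rel = 4e5                                # relative speed, m/s
sigma = 3e-19                              # charge-exchange cross-section, m^2
yield_ph = 0.5                             # photons per exchange (illustrative)

def n_hydrogen(r_m):
    """Toy geocoronal H density: ~25 cm^-3 at 10 R_E, falling as r^-3."""
    return 25e6 * (10 * R_E / r_m) ** 3

# Integrate along a line of sight from 8 R_E to 60 R_E through the sheath.
s = np.linspace(8 * R_E, 60 * R_E, 2000)
emission = n_sw * heavy_frac * n_hydrogen(s) * v_rel * sigma * yield_ph
column = np.trapz(emission, s)
print(f"line-of-sight photon production ~ {column:.2e} photons m^-2 s^-1")
```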

  11. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  12. Recovering area-to-mass ratio of resident space objects through data mining

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Bai, Xiaoli

    2018-01-01

    The area-to-mass ratio (AMR) of a resident space object (RSO) is an important parameter for improved space situation awareness capability due to its effect on the non-conservative forces including the atmospheric drag force and the solar radiation pressure force. However, information about AMR is often not provided in most space catalogs. The present paper investigates recovering the AMR information from the consistency error, which refers to the difference between the orbit predicted from an earlier estimate and the orbit estimated at the current epoch. A data mining technique, particularly the random forest (RF) method, is used to discover the relationship between the consistency error and the AMR. Using a simulation-based space catalog environment as the testbed, this paper demonstrates that the classification RF model can determine the RSO's AMR category and the regression RF model can generate continuous AMR values, both with good accuracy. Furthermore, the paper reveals that by recording additional information besides the consistency error, the RF model can estimate the AMR with even higher accuracy.
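
    A sketch of the regression version of this approach using scikit-learn on synthetic data: features summarizing the orbit-prediction consistency error, with AMR as the target. The synthetic feature construction below is made up purely to exercise the pipeline; it is not the paper's feature set or catalog environment.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
amr = 10 ** rng.uniform(-2.5, 0.5, n)                 # m^2/kg, wide range

# Hypothetical consistency-error summaries that grow with AMR (plus noise).
along_track = amr * (1 + 0.2 * rng.standard_normal(n))
radial = 0.1 * amr * (1 + 0.3 * rng.standard_normal(n))
period_drift = np.sqrt(amr) * (1 + 0.2 * rng.standard_normal(n))
X = np.column_stack([along_track, radial, period_drift])

X_tr, X_te, y_tr, y_te = train_test_split(X, np.log10(amr), random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out synthetic data:", round(model.score(X_te, y_te), 3))
```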

  13. Low-cost blast wave generator for studies of hearing loss and brain injury: blast wave effects in closed spaces.

    PubMed

    Newman, Andrew J; Hayes, Sarah H; Rao, Abhiram S; Allman, Brian L; Manohar, Senthilvelan; Ding, Dalian; Stolzberg, Daniel; Lobarinas, Edward; Mollendorf, Joseph C; Salvi, Richard

    2015-03-15

    Military personnel and civilians living in areas of armed conflict have increased risk of exposure to blast overpressures that can cause significant hearing loss and/or brain injury. The equipment used to simulate comparable blast overpressures in animal models within laboratory settings is typically very large and prohibitively expensive. To overcome the fiscal and space limitations introduced by previously reported blast wave generators, we developed a compact, low-cost blast wave generator to investigate the effects of blast exposures on the auditory system and brain. The blast wave generator was constructed largely from off-the-shelf components, and reliably produced blasts with peak sound pressures of up to 198 dB SPL (159.3 kPa) that were qualitatively similar to those produced from muzzle blasts or explosions. Exposure of adult rats to 3 blasts of 188 dB peak SPL (50.4 kPa) resulted in significant loss of cochlear hair cells, reduced outer hair cell function and a decrease in neurogenesis in the hippocampus. Existing blast wave generators are typically large, expensive, and are not commercially available. The blast wave generator reported here provides a low-cost method of generating blast waves in a typical laboratory setting. This compact blast wave generator provides scientists with a low-cost device for investigating the biological mechanisms involved in blast wave injury to the rodent cochlea and brain that may model many of the damaging effects sustained by military personnel and civilians exposed to intense blasts. Copyright © 2015 Elsevier B.V. All rights reserved.
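
    A quick consistency check of the quoted peak pressures: sound pressure level in dB SPL relates to pressure by p = p_ref × 10^(SPL/20) with the usual reference p_ref = 20 µPa, which reproduces the stated pressures to within rounding.

```python
p_ref = 20e-6   # Pa, standard reference pressure
for spl in (198.0, 188.0):
    p = p_ref * 10 ** (spl / 20)
    print(f"{spl:.0f} dB SPL -> {p / 1000:.1f} kPa")
```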

  14. Low-Cost Blast Wave Generator for Studies of Hearing Loss and Brain Injury: Blast Wave Effects in Closed Spaces

    PubMed Central

    Newman, Andrew J.; Hayes, Sarah H.; Rao, Abhiram S.; Allman, Brian L.; Manohar, Senthilvelan; Ding, Dalian; Stolzberg, Daniel; Lobarinas, Edward; Mollendorf, Joseph C.; Salvi, Richard

    2015-01-01

    Background: Military personnel and civilians living in areas of armed conflict have increased risk of exposure to blast overpressures that can cause significant hearing loss and/or brain injury. The equipment used to simulate comparable blast overpressures in animal models within laboratory settings is typically very large and prohibitively expensive. New Method: To overcome the fiscal and space limitations introduced by previously reported blast wave generators, we developed a compact, low-cost blast wave generator to investigate the effects of blast exposures on the auditory system and brain. Results: The blast wave generator was constructed largely from off-the-shelf components, and reliably produced blasts with peak sound pressures of up to 198 dB SPL (159.3 kPa) that were qualitatively similar to those produced from muzzle blasts or explosions. Exposure of adult rats to 3 blasts of 188 dB peak SPL (50.4 kPa) resulted in significant loss of cochlear hair cells, reduced outer hair cell function and a decrease in neurogenesis in the hippocampus. Comparison to existing methods: Existing blast wave generators are typically large, expensive, and are not commercially available. The blast wave generator reported here provides a low-cost method of generating blast waves in a typical laboratory setting. Conclusions: This compact blast wave generator provides scientists with a low-cost device for investigating the biological mechanisms involved in blast wave injury to the rodent cochlea and brain that may model many of the damaging effects sustained by military personnel and civilians exposed to intense blasts. PMID:25597910

  15. Private space exploration: A new way for starting a spacefaring society?

    NASA Astrophysics Data System (ADS)

    Genta, Giancarlo

    2014-11-01

    Although space was at the beginning an exclusive domain of public organizations, the role of private actors is becoming more and more important, and not only in commercial activities. However, the main international treaties dealing with this subject are still based on the assumption that space activities are mostly reserved to states. In the last decade the idea that the role of private actors could include the management of space infrastructures and launch vehicles gained support, and now private launch services are a reality. An even wider role for private actors is now advocated, and private exploration and exploitation missions are discussed. This requires that space activity in general can generate an attractive return and that suitable business models be identified.

  16. Topological Defects and Structures in the Early Universe

    NASA Astrophysics Data System (ADS)

    Zhu, Yong

    1997-08-01

    This thesis discusses the topological defects generated in the early universe and their contributions to cosmic structure formation. First, we investigate non-Gaussian isocurvature perturbations generated by the evolution of Goldstone modes during inflation. If a global symmetry is broken before inflation, the resulting Goldstone modes are disordered during inflation in a precise and predictable way. After inflation these Goldstone modes order themselves in a self-similar way, much as Goldstone modes do in field-ordering scenarios based on the Kibble mechanism. For (H_i^2/M_pl^2) ~ 10^-6, through their gravitational interaction these Goldstone modes generate density perturbations of approximately the right magnitude to explain the cosmic microwave background (CMB) anisotropy and seed the structure seen in the universe today. In such a model non-Gaussian perturbations result because to lowest order density perturbations are sourced by products of Gaussian fields. We explore the issue of phase dispersion and conclude that this non-Gaussian model predicts Doppler peaks in the CMB anisotropy. Topological defects generated from quantum fluctuations during inflation are studied in chapter four. We present a calculation of the power spectrum generated in a classically symmetry-breaking O(N) scalar field through inflationary quantum fluctuations, using the large-N limit. The effective potential of the theory in de Sitter space is obtained from a gap equation which is exact at large N. Quantum fluctuations restore the O(N) symmetry in de Sitter space, but for the finite values of N of interest, there is symmetry breaking and phase ordering after inflation, described by the classical nonlinear sigma model. The scalar field power spectrum is obtained as a function of the scalar field self-coupling. In the second part of the thesis, we investigate non-Abelian topological wormholes, obtained when a winding-number-one texture field is coupled to Einstein gravity with a conserved global charge. This topological wormhole has the same Euclidean action as axion wormholes and charged scalar wormholes. We find that free topological wormholes are spontaneously generated in the Euclidean space-time with finite density. It is then shown that wormholes with finite density might destroy any long range order in the global fields.

  17. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    USGS Publications Warehouse

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of generating calibration-constrained parameter fields approximately doubled. Predictive uncertainty variance computed through the NSMC method was compared with that computed through linear analysis. The results were in good agreement, with the NSMC method estimate showing a slightly smaller range of prediction uncertainty than was calculated by the linear method. Copyright 2011 by the American Geophysical Union.
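
    The core of the NSMC approach can be sketched in a few lines: after calibration, random parameter fields are projected onto the null space of the model's sensitivity (Jacobian) matrix, so they differ from the calibrated field only in directions the observations cannot constrain. In the hedged sketch below, the Jacobian, calibrated field, and solution-space dimensionality are synthetic stand-ins rather than quantities from the Henry-problem model, and real NSMC implementations (e.g., as driven by PEST) follow the projection with a re-calibration step.

    ```python
    # Minimal null-space Monte Carlo (NSMC) sketch with synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    n_obs, n_par, k = 40, 100, 10                    # observations, parameters, solution-space dim
    J = rng.normal(size=(n_obs, k)) @ rng.normal(size=(k, n_par))  # stand-in rank-k Jacobian
    p_cal = rng.normal(size=n_par)                   # stand-in calibrated parameter field

    # The first k right singular vectors span the solution space; the remaining
    # ones span the (numerical) null space, where perturbations leave the
    # linearized fit to the observations unchanged.
    _, _, Vt = np.linalg.svd(J, full_matrices=True)
    V_null = Vt[k:].T                                # shape (n_par, n_par - k)

    def nsmc_field(p_random):
        """Project a random field's departure from the calibrated field onto the
        null space, yielding a calibration-constrained parameter set."""
        return p_cal + V_null @ (V_null.T @ (p_random - p_cal))

    fields = [nsmc_field(rng.normal(size=n_par)) for _ in range(1000)]
    # Each field honors the calibration (to first order) yet differs in null-space
    # directions, so predictions made with the ensemble map out uncertainty.
    worst = max(np.linalg.norm(J @ (f - p_cal)) for f in fields)
    print(f"largest change in linearized observations: {worst:.2e}")  # ~round-off
    ```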

  18. Explanation Constraint Programming for Model-based Diagnosis of Engineered Systems

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Brownston, Lee; Burrows, Daniel

    2004-01-01

    We can expect to see an increase in the deployment of unmanned air and land vehicles for autonomous exploration of space. In order to maintain autonomous control of such systems, it is essential to track the current state of the system. When the system includes safety-critical components, failures or faults in the system must be diagnosed as quickly as possible, and their effects compensated for so that control and safety are maintained under a variety of fault conditions. The Livingstone fault diagnosis and recovery kernel and its temporal extension L2 are examples of model-based reasoning engines for health management. Livingstone has been shown to be effective, it is in demand, and it is being further developed. It was part of the successful Remote Agent demonstration on Deep Space One in 1999. It has been and is being utilized by several projects involving groups from various NASA centers, including the In Situ Propellant Production (ISPP) simulation at Kennedy Space Center, the X-34 and X-37 experimental reusable launch vehicle missions, Techsat-21, and advanced life support projects. Model-based and consistency-based diagnostic systems like Livingstone work only with discrete and finite domain models. When quantitative and continuous behaviors are involved, these are abstracted to discrete form using some mapping. This mapping from the quantitative domain to the qualitative domain is sometimes very involved and requires the design of highly sophisticated and complex monitors. We propose a diagnostic methodology that deals directly with quantitative models and behaviors, thereby mitigating the need for these sophisticated mappings. Our work brings together ideas from model-based diagnosis systems like Livingstone and concurrent constraint programming concepts. The system uses explanations derived from the propagation of quantitative constraints to generate conflicts. Fast conflict generation algorithms are used to generate and maintain multiple candidates whose consistency can be tracked across multiple time steps.
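
    A toy illustration of the conflict-generation idea: values are propagated through simple quantitative component models under explicit mode assumptions, and any disagreement with an observation is returned as a conflict, i.e., the set of assumptions that cannot all hold. This is only a hand-written sketch of the concept with hypothetical components; it is not code from Livingstone, L2, or the system described in this record.

    ```python
    # Conflict generation from quantitative propagation (toy, hypothetical components).
    def propagate(u, modes):
        """Cascade of two hypothetical components: an amplifier (doubles its input)
        and a bias stage (adds 1.0). Each contributes only if assumed 'ok', and the
        mode assumptions used in the derivation are recorded as the support set."""
        support, v = [], u
        if modes["amp"] == "ok":
            v, support = 2.0 * v, support + ["amp=ok"]
        if modes["bias"] == "ok":
            v, support = v + 1.0, support + ["bias=ok"]
        return v, support

    def diagnose(u, observation, tol=1e-6):
        """Predict the output under nominal mode assumptions and return a conflict
        (the supporting assumption set) if the observation disagrees."""
        predicted, support = propagate(u, {"amp": "ok", "bias": "ok"})
        return support if abs(predicted - observation) > tol else None

    print(diagnose(3.0, observation=7.0))   # None: 2*3 + 1 = 7 is consistent
    print(diagnose(3.0, observation=6.1))   # ['amp=ok', 'bias=ok'] cannot all hold
    ```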

  19. Air-Traffic Controllers Evaluate The Descent Advisor

    NASA Technical Reports Server (NTRS)

    Tobias, Leonard; Volckers, Uwe; Erzberger, Heinz

    1992-01-01

    Report describes study of Descent Advisor algorithm: software automation aid intended to assist air-traffic controllers in spacing traffic and meeting specified times of arrival. Based partly on mathematical models of weather conditions and performances of aircraft, it generates suggested clearances, including top-of-descent points and speed-profile data to attain objectives. Study focused on operational characteristics with specific attention to how it can be used for prediction, spacing, and metering.

  20. Performance Analysis for Lateral-Line-Inspired Sensor Arrays

    DTIC Science & Technology

    2011-06-01

    found to affect numerous aspects of behavior including maneuvering in complex fluid environments, schooling, prey tracking, and environment mapping... Maps of the cost function for a reflected vortex model with an increasing array length but constant sensor spacing; the x in each image denotes the true location of the vortex, and the black lines correspond to level sets generated by the...

  1. Space Particle Hazard Measurement and Modeling

    DTIC Science & Technology

    2007-11-30

    the spacecraft and perturbations of the environment generated by the spacecraft. Koons et al. (1999) compiled and studied all spacecraft anomalies... unrealistic for D12 than for Dα0p). However, unlike the stability problems associated with the original cross diffusion terms, they are quite manageable... E), to mono-energetic beams of charged particles of known energies, which enables one, in principle, to unfold the space environment spectrum, j(E

  2. A modelling study of hyporheic exchange pattern and the sequence, size, and spacing of stream bedforms in mountain stream networks, Oregon, USA.

    Treesearch

    Michael N. Gooseff; Justin K. Anderson; Steven M. Wondzell; Justin LaNier; Roy Haggerty

    2005-01-01

    Studies of hyporheic exchange flows have identified physical features of channels that control exchange flow at the channel unit scale, namely slope breaks in the longitudinal profile of streams that generate subsurface head distributions. We recently completed a field study that suggested channel unit spacing in stream longitudinal profiles can be used to predict the...

  3. Human Activity Behavior and Gesture Generation in Virtual Worlds for Long- Duration Space Missions. Chapter 8

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Damer, Bruce; Brodsky, Boris; vanHoff, Ron

    2007-01-01

    A virtual-worlds presentation technique with embodied, intelligent agents is being developed as an instructional medium suitable for presenting in situ training on long-duration space flight. The system combines a behavioral element based on finite state automata, a behavior-based reactive architecture (also described as a subsumption architecture), and a belief-desire-intention agent structure. These three features are being integrated to describe a Brahms virtual environment model of extravehicular crew activity which could become a basis for procedure training during extended space flight.

  4. Model Comparisons For Space Solar Cell End-Of-Life Calculations

    NASA Astrophysics Data System (ADS)

    Messenger, Scott; Jackson, Eric; Warner, Jeffrey; Walters, Robert; Evans, Hugh; Heynderickx, Daniel

    2011-10-01

    Space solar cell end-of-life (EOL) calculations are performed over a wide range of space radiation environments for GaAs-based single and multijunction solar cell technologies. Two general semi-empirical approaches are used to generate these EOL calculation results: 1) the JPL equivalent fluence (EQFLUX) and 2) the NRL displacement damage dose (SCREAM). This paper also includes the first results using the Monte Carlo-based version of SCREAM, called MC-SCREAM, which is now freely available online as part of the SPENVIS suite of programs.
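
    As a rough illustration of the displacement damage dose style of EOL estimate named above, the commonly quoted semi-empirical degradation characteristic expresses the remaining fraction of a cell parameter as RF = 1 - C * log10(1 + Dd/Dx), evaluated at the mission-accumulated displacement damage dose Dd. The coefficients C and Dx and the dose values below are illustrative placeholders, not SCREAM/MC-SCREAM outputs or qualified cell data.

    ```python
    # Hedged sketch of a displacement-damage-dose degradation curve (placeholder values).
    import numpy as np

    def remaining_factor(dd, c=0.1, dx=1.0e9):
        """Fraction of the beginning-of-life parameter (e.g., Pmax) remaining after
        a displacement damage dose dd [MeV/g]; c and dx are fit coefficients."""
        return 1.0 - c * np.log10(1.0 + dd / dx)

    mission_dose = 3.0e10                     # accumulated mission dose [MeV/g] (placeholder)
    print(f"EOL power fraction ~ {remaining_factor(mission_dose):.3f}")

    # Sweeping the dose traces the degradation curve used to compare environments
    # or cell technologies.
    for dd in (1e8, 1e9, 1e10, 1e11):
        print(f"Dd = {dd:.0e} MeV/g -> RF = {remaining_factor(dd):.3f}")
    ```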

  5. Using Runtime Analysis to Guide Model Checking of Java Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition of course to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
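
    For flavor, a lockset-style race check of the general kind referred to above can be sketched over a recorded execution trace: each shared variable keeps a candidate set of protecting locks that is intersected with the locks held at every access, and an empty candidate set flags a potential data race. This is a generic sketch in Python rather than Java and is not the algorithm as implemented in PathFinder.

    ```python
    # Toy lockset-style race detection over a recorded trace of events.
    # Each event is (thread, op, target) with op in {"acquire", "release", "read", "write"}.
    from collections import defaultdict

    def lockset_analysis(trace):
        held = defaultdict(set)          # thread -> locks currently held
        candidates = {}                  # shared variable -> candidate lockset
        warnings = []
        for thread, op, target in trace:
            if op == "acquire":
                held[thread].add(target)
            elif op == "release":
                held[thread].discard(target)
            else:                        # read or write of a shared variable
                if target not in candidates:
                    candidates[target] = set(held[thread])
                else:
                    candidates[target] &= held[thread]
                if not candidates[target]:
                    warnings.append((target, thread, op))
        return warnings

    trace = [
        ("T1", "acquire", "L"), ("T1", "write", "x"), ("T1", "release", "L"),
        ("T2", "write", "x"),    # access without holding L -> candidate set empties
    ]
    print(lockset_analysis(trace))       # [('x', 'T2', 'write')]
    ```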

  6. Morpheus Lander Roll Control System and Wind Modeling

    NASA Technical Reports Server (NTRS)

    Gambone, Elisabeth A.

    2014-01-01

    The Morpheus prototype lander is a testbed capable of vertical takeoff and landing developed by NASA Johnson Space Center to assess advanced space technologies. Morpheus completed a series of flight tests at Kennedy Space Center to demonstrate autonomous landing and hazard avoidance for future exploration missions. As a prototype vehicle being tested in Earth's atmosphere, Morpheus requires a robust roll control system to counteract aerodynamic forces. This paper describes the control algorithm designed that commands jet firing and delay times based on roll orientation. Design, analysis, and testing are supported using a high fidelity, 6 degree-of-freedom simulation of vehicle dynamics. This paper also details the wind profiles generated using historical wind data, which are necessary to validate the roll control system in the simulation environment. In preparation for Morpheus testing, the wind model was expanded to create day-of-flight wind profiles based on data delivered by Kennedy Space Center. After the test campaign, a comparison of flight and simulation performance was completed to provide additional model validation.

  7. The Space Environmental Impact System

    NASA Astrophysics Data System (ADS)

    Kihn, E. A.

    2009-12-01

    The Space Environmental Impact System (SEIS) is an operational tool for incorporating environmental data sets into DoD Modeling and Simulation (M&S) which allows for enhanced decision making regarding acquisitions, testing, operations and planning. The SEIS system creates, from the environmental archives and developed rule-base, a tool for describing the effects of the space environment on particular military systems, both historically and in real-time. The system uses data available over the web, and in particular data provided by NASA’s virtual observatory network, as well as modeled data generated specifically for this purpose. The rule base system developed to support SEIS is an open XML based model which can be extended to events from any environmental domain. This presentation will show how the SEIS tool allows users to easily and accurately evaluate the effect of space weather in terms that are meaningful to them as well as discuss the relevant standards used in its construction and go over lessons learned from fielding an operational environmental decision tool.

  8. QSAR modeling and chemical space analysis of antimalarial compounds

    NASA Astrophysics Data System (ADS)

    Sidorov, Pavel; Viira, Birgit; Davioud-Charvet, Elisabeth; Maran, Uko; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2017-05-01

    Generative topographic mapping (GTM) has been used to visualize and analyze the chemical space of antimalarial compounds as well as to build predictive models linking structure of molecules with their antimalarial activity. For this, a database, including 3000 molecules tested in one or several of 17 anti-Plasmodium activity assessment protocols, has been compiled by assembling experimental data from in-house and ChEMBL databases. GTM classification models built on subsets corresponding to individual bioassays perform similarly to the earlier reported SVM models. Zones preferentially populated by active and inactive molecules, respectively, clearly emerge in the class landscapes supported by the GTM model. Their analysis resulted in identification of privileged structural motifs of potential antimalarial compounds. Projection of marketed antimalarial drugs on this map allowed us to delineate several areas in the chemical space corresponding to different mechanisms of antimalarial activity. This helped us to make a suggestion about the mode of action of the molecules populating these zones.

  9. QSAR modeling and chemical space analysis of antimalarial compounds.

    PubMed

    Sidorov, Pavel; Viira, Birgit; Davioud-Charvet, Elisabeth; Maran, Uko; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2017-05-01

    Generative topographic mapping (GTM) has been used to visualize and analyze the chemical space of antimalarial compounds as well as to build predictive models linking structure of molecules with their antimalarial activity. For this, a database, including ~3000 molecules tested in one or several of 17 anti-Plasmodium activity assessment protocols, has been compiled by assembling experimental data from in-house and ChEMBL databases. GTM classification models built on subsets corresponding to individual bioassays perform similarly to the earlier reported SVM models. Zones preferentially populated by active and inactive molecules, respectively, clearly emerge in the class landscapes supported by the GTM model. Their analysis resulted in identification of privileged structural motifs of potential antimalarial compounds. Projection of marketed antimalarial drugs on this map allowed us to delineate several areas in the chemical space corresponding to different mechanisms of antimalarial activity. This helped us to make a suggestion about the mode of action of the molecules populating these zones.

  10. An experimental determination in Calspan Ludwieg tube of the base environment of the integrated space shuttle vehicle at simulated Mach 4.5 flight conditions (test IH5 of model 19-OTS)

    NASA Technical Reports Server (NTRS)

    Drzewiecki, R. F.; Foust, J. W.

    1976-01-01

    A model test program was conducted to determine heat transfer and pressure distributions in the base region of the space shuttle vehicle during simulated launch trajectory conditions of Mach 4.5 and pressure altitudes between 90,000 and 210,000 feet. Model configurations with and without the solid propellant booster rockets were examined to duplicate pre- and post-staging vehicle geometries. Using short duration flow techniques, a tube wind tunnel provided supersonic flow over the model. Simultaneously, combustion generated exhaust products reproduced the gasdynamic and thermochemical structure of the main vehicle engine plumes. Heat transfer and pressure measurements were made at numerous locations on the base surfaces of the 19-OTS space shuttle model with high response instrumentation. In addition, measurements of base recovery temperature were made indirectly by using dual fine wire and resistance thermometers and by extrapolating heat transfer measurements.

  11. Space power system scheduling using an expert system

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Biefeld, E.; Costello, L.; Klein, J. W.

    1986-01-01

    A most pressing problem in space exploration is timely spacecraft power system sequence generation, which requires the scheduling of a set of loads given a set of resource constraints. This is particularly important after an anomaly or failure. This paper discusses the power scheduling problem and how the software program, Plan-It, can be used as a consultant for scheduling power system activities. Modeling of power activities, human interface, and two of the many strategies used by Plan-It are discussed. Preliminary results showing the development of a conflict-free sequence from an initial sequence with conflicts are presented. These results show that a 4-day schedule can be generated in a matter of a few minutes, which provides sufficient time in many cases to aid the crew in the replanning of loads and generation use following a failure or anomaly.

  12. Engineering Risk Assessment of Space Thruster Challenge Problem

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
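
    A minimal sketch of the sampling-and-convergence workflow described above: draw component failure times, score each trial as loss of mission (LOM) or not, and watch the running probability estimate settle as the number of Monte Carlo trials grows. The failure rates, redundancy level, and mission duration are invented placeholders, not values from the PSAM challenge problem or the GoldSim/ART models.

    ```python
    # Monte Carlo loss-of-mission estimate with a convergence check (placeholder data).
    import numpy as np

    rng = np.random.default_rng(1)

    mttf_hours = np.array([8.0e4, 8.0e4, 8.0e4, 8.0e4])   # hypothetical thruster strings
    mission_hours = 2.0e4
    required_strings = 3                  # mission is lost if fewer than 3 survive

    def one_trial():
        failed = rng.exponential(mttf_hours) < mission_hours
        return (len(mttf_hours) - failed.sum()) < required_strings

    n_trials = 200_000
    lom = np.fromiter((one_trial() for _ in range(n_trials)), dtype=bool)
    running = np.cumsum(lom) / np.arange(1, n_trials + 1)

    # Sensitivity of the estimate to the number of trials.
    for n in (10_000, 50_000, 200_000):
        print(f"P(LOM) after {n:>7} trials: {running[n - 1]:.4f}")
    ```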

  13. Interferometric constraints on quantum geometrical shear noise correlations

    DOE PAGES

    Chou, Aaron; Glass, Henry; Richard Gustafson, H.; ...

    2017-07-20

    Final measurements and analysis are reported from the first-generation Holometer, the first instrument capable of measuring correlated variations in space-time position at strain noise power spectral densities smaller than a Planck time. The apparatus consists of two co-located, but independent and isolated, 40 m power-recycled Michelson interferometers, whose outputs are cross-correlated to 25 MHz. The data are sensitive to correlations of differential position across the apparatus over a broad band of frequencies up to and exceeding the inverse light crossing time, 7.6 MHz. By measuring with Planck precision the correlation of position variations at spacelike separations, the Holometer searches for faint, irreducible correlated position noise backgrounds predicted by some models of quantum space-time geometry. The first-generation optical layout is sensitive to quantum geometrical noise correlations with shear symmetry---those that can be interpreted as a fundamental noncommutativity of space-time position in orthogonal directions. General experimental constraints are placed on parameters of a set of models of spatial shear noise correlations, with a sensitivity that exceeds the Planck-scale holographic information bound on position states by a large factor. This result significantly extends the upper limits placed on models of directional noncommutativity by currently operating gravitational wave observatories.

  14. Interferometric constraints on quantum geometrical shear noise correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chou, Aaron; Glass, Henry; Richard Gustafson, H.

    Final measurements and analysis are reported from the first-generation Holometer, the first instrument capable of measuring correlated variations in space-time position at strain noise power spectral densities smaller than a Planck time. The apparatus consists of two co-located, but independent and isolated, 40 m power-recycled Michelson interferometers, whose outputs are cross-correlated to 25 MHz. The data are sensitive to correlations of differential position across the apparatus over a broad band of frequencies up to and exceeding the inverse light crossing time, 7.6 MHz. By measuring with Planck precision the correlation of position variations at spacelike separations, the Holometer searches for faint, irreducible correlated position noise backgrounds predicted by some models of quantum space-time geometry. The first-generation optical layout is sensitive to quantum geometrical noise correlations with shear symmetry---those that can be interpreted as a fundamental noncommutativity of space-time position in orthogonal directions. General experimental constraints are placed on parameters of a set of models of spatial shear noise correlations, with a sensitivity that exceeds the Planck-scale holographic information bound on position states by a large factor. This result significantly extends the upper limits placed on models of directional noncommutativity by currently operating gravitational wave observatories.
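
    The cross-correlation strategy described in the two records above can be illustrated with a generic cross-spectral-density estimate: two simultaneously sampled channels share a weak common component that survives averaging of the cross spectrum while their independent noise does not. The sampling rate, noise levels, and the injected correlated test tone are arbitrary stand-ins, not Holometer parameters or data.

    ```python
    # Cross-spectral density of two noisy channels sharing a weak common tone.
    import numpy as np
    from scipy.signal import csd

    rng = np.random.default_rng(2)
    fs = 50e6                                      # sample rate [Hz] (placeholder)
    n = 2**20
    t = np.arange(n) / fs

    common = 0.2 * np.sin(2 * np.pi * 1.0e6 * t)   # correlated test tone at 1 MHz
    a = rng.normal(size=n) + common                # channel A: independent noise + common part
    b = rng.normal(size=n) + common                # channel B

    # Averaged cross spectrum: uncorrelated noise averages down, the shared
    # component remains and is easily located.
    f, p_ab = csd(a, b, fs=fs, nperseg=2**14)
    peak = f[np.argmax(np.abs(p_ab))]
    print(f"correlated feature recovered near {peak / 1e6:.2f} MHz")
    ```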

  15. Hybrid generative-discriminative human action recognition by combining spatiotemporal words with supervised topic models

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Wang, Cheng; Wang, Boliang

    2011-02-01

    We present a hybrid generative-discriminative learning method for human action recognition from video sequences. Our model combines a bag-of-words component with supervised latent topic models. A video sequence is represented as a collection of spatiotemporal words by extracting space-time interest points and describing these points using both shape and motion cues. The supervised latent Dirichlet allocation (sLDA) topic model, which employs discriminative learning using labeled data under a generative framework, is introduced to discover the latent topic structure that is most relevant to action categorization. The proposed algorithm retains most of the desirable properties of generative learning while increasing the classification performance through a discriminative setting. It has also been extended to exploit both labeled data and unlabeled data to learn human actions under a unified framework. We test our algorithm on three challenging data sets: the KTH human motion data set, the Weizmann human action data set, and a ballet data set. Our results are either comparable to or significantly better than previously published results on these data sets and reflect the promise of hybrid generative-discriminative learning approaches.
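
    The hybrid idea above (a generative topic model over a bag of quantized spatiotemporal words, combined with discriminative learning) can be roughly approximated with off-the-shelf pieces: fit a plain LDA model to per-clip word counts and train a classifier on the resulting topic proportions. This is a simplified stand-in for the supervised LDA (sLDA) model the paper actually uses, and the word counts and labels below are random placeholders rather than KTH/Weizmann features.

    ```python
    # Unsupervised LDA + discriminative classifier as a stand-in for sLDA (toy data).
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n_clips, vocab, n_topics = 200, 300, 8

    counts = rng.poisson(1.0, size=(n_clips, vocab))   # bag-of-spatiotemporal-words counts
    labels = rng.integers(0, 3, size=n_clips)          # action class per clip

    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    theta = lda.fit_transform(counts)                  # per-clip topic proportions (generative part)

    clf = LogisticRegression(max_iter=1000).fit(theta, labels)   # discriminative part
    print("training accuracy on toy data:", clf.score(theta, labels))
    ```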

  16. Classification framework for partially observed dynamical systems

    NASA Astrophysics Data System (ADS)

    Shen, Yuan; Tino, Peter; Tsaneva-Atanasova, Krasimira

    2017-04-01

    We present a general framework for classifying partially observed dynamical systems based on the idea of learning in the model space. In contrast to the existing approaches using point estimates of model parameters to represent individual data items, we employ posterior distributions over model parameters, thus taking into account in a principled manner the uncertainty due to both the generative (observational and/or dynamic noise) and observation (sampling in time) processes. We evaluate the framework on two test beds: a biological pathway model and a stochastic double-well system. Crucially, we show that the classification performance is not impaired when the model structure used for inferring posterior distributions is much more simple than the observation-generating model structure, provided the reduced-complexity inferential model structure captures the essential characteristics needed for the given classification task.

  17. High-resolution spatiotemporal mapping of PM2.5 concentrations at Mainland China using a combined BME-GWR technique

    NASA Astrophysics Data System (ADS)

    Xiao, Lu; Lang, Yichao; Christakos, George

    2018-01-01

    With rapid economic development, industrialization and urbanization, the ambient air PM2.5 has become a major pollutant linked to respiratory, heart and lung diseases. In China, PM2.5 pollution constitutes an extreme environmental and social problem of widespread public concern. In this work we estimate ground-level PM2.5 from satellite-derived aerosol optical depth (AOD), topography data, meteorological data, and pollutant emission using an integrative technique. In particular, Geographically Weighted Regression (GWR) analysis was combined with Bayesian Maximum Entropy (BME) theory to assess the spatiotemporal characteristics of PM2.5 exposure in a large region of China and generate informative PM2.5 space-time predictions (estimates). It was found that, due to its integrative character, the combined BME-GWR method offers certain improvements in the space-time prediction of PM2.5 concentrations over China compared to previous techniques. The combined BME-GWR technique generated realistic maps of space-time PM2.5 distribution, and its performance was superior to that of seven previous studies of satellite-derived PM2.5 concentrations in China in terms of prediction accuracy. The purely spatial GWR model can only be used at a fixed time, whereas the integrative BME-GWR approach accounts for cross space-time dependencies and can predict PM2.5 concentrations in the composite space-time domain. The 10-fold results of BME-GWR modeling (R2 = 0.883, RMSE = 11.39 μg/m3) demonstrated a high level of space-time PM2.5 prediction (estimation) accuracy over China, revealing a definite trend of severe PM2.5 levels from the northern coast toward inland China (Nov 2015-Feb 2016). Future work should focus on the addition of higher resolution AOD data, developing better satellite-based prediction models, and related air pollutants for space-time PM2.5 prediction purposes.
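
    The GWR ingredient of the combined method can be sketched as a locally weighted least-squares fit: at each prediction site, nearby monitoring stations are weighted by a spatial kernel and a local regression of PM2.5 on the covariates is solved. The coordinates, bandwidth, and single AOD-like covariate below are synthetic placeholders, and the BME space-time component of the combined technique is not reproduced here.

    ```python
    # Geographically weighted regression (GWR) sketch on synthetic station data.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 500
    xy = rng.uniform(0, 100, size=(n, 2))               # station coordinates [km]
    aod = rng.uniform(0.1, 1.5, size=n)                 # AOD-like covariate
    beta_true = 20 + 0.3 * xy[:, 0]                     # spatially varying slope
    pm25 = 5 + beta_true * aod + rng.normal(0, 3, n)    # synthetic PM2.5 [ug/m3]

    X = np.column_stack([np.ones(n), aod])
    bandwidth = 15.0                                    # Gaussian kernel bandwidth [km]

    def gwr_predict(site, site_aod):
        d = np.linalg.norm(xy - site, axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)         # spatial weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ pm25)   # local weighted least squares
        return beta[0] + beta[1] * site_aod

    print(f"local PM2.5 estimate at (50, 50) for AOD=0.8: "
          f"{gwr_predict(np.array([50.0, 50.0]), 0.8):.1f} ug/m3")
    ```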

  18. Modeling and simulation of RF photoinjectors for coherent light sources

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Krasilnikov, M.; Stephan, F.; Gjonaj, E.; Weiland, T.; Dohlus, M.

    2018-05-01

    We propose a three-dimensional fully electromagnetic numerical approach for the simulation of RF photoinjectors for coherent light sources. The basic idea consists in incorporating a self-consistent photoemission model within a particle tracking code. The generation of electron beams in the injector is determined by the quantum efficiency (QE) of the cathode, the intensity profile of the driving laser as well as by the accelerating field and magnetic focusing conditions in the gun. The total charge emitted during an emission cycle can be limited by the space charge field at the cathode. Furthermore, the time and space dependent electromagnetic field at the cathode may induce a transient modulation of the QE due to surface barrier reduction of the emitting layer. In our modeling approach, all these effects are taken into account. The beam particles are generated dynamically according to the local QE of the cathode and the time dependent laser intensity profile. For the beam dynamics, a tracking code based on the Lienard-Wiechert retarded field formalism is employed. This code provides the single particle trajectories as well as the transient space charge field distribution at the cathode. As an application, the PITZ injector is considered. Extensive electron bunch emission simulations are carried out for different operation conditions of the injector, in the source limited as well as in the space charge limited emission regime. In both cases, fairly good agreement between measurements and simulations is obtained.

  19. Physics-based Space Weather Forecasting in the Project for Solar-Terrestrial Environment Prediction (PSTEP) in Japan

    NASA Astrophysics Data System (ADS)

    Kusano, K.

    2016-12-01

    Project for Solar-Terrestrial Environment Prediction (PSTEP) is a Japanese nation-wide research collaboration, which was recently launched. PSTEP aims to develop a synergistic interaction between predictive and scientific studies of the solar-terrestrial environment and to establish the basis for next-generation space weather forecasting using the state-of-the-art observation systems and the physics-based models. For this project, we coordinate the four research groups, which develop (1) the integration of space weather forecast system, (2) the physics-based solar storm prediction, (3) the predictive models of magnetosphere and ionosphere dynamics, and (4) the model of solar cycle activity and its impact on climate, respectively. In this project, we will build the coordinated physics-based model to answer the fundamental questions concerning the onset of solar eruptions and the mechanism for radiation belt dynamics in the Earth's magnetosphere. In this paper, we will show the strategy of PSTEP, and discuss about the role and prospect of the physics-based space weather forecasting system being developed by PSTEP.

  20. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
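
    In the same spirit as the continuous model described above, local backbone geometry can be treated generatively by sampling dihedral-angle pairs from directional distributions; the sketch below draws (phi, psi) pairs from a small, hand-picked mixture of von Mises components standing in for helix- and sheet-like regions. The mixture parameters are illustrative guesses, not the paper's model, which additionally couples the angular distributions to amino-acid sequence.

    ```python
    # Toy generative sampling of (phi, psi) dihedral angles from a von Mises mixture.
    import numpy as np
    from scipy.stats import vonmises

    rng = np.random.default_rng(5)

    # (weight, mean phi, mean psi, concentration) -- hypothetical components
    components = [
        (0.6, np.deg2rad(-63.0), np.deg2rad(-43.0), 8.0),    # helix-like
        (0.4, np.deg2rad(-120.0), np.deg2rad(135.0), 4.0),   # sheet-like
    ]

    def sample_dihedrals(n):
        weights = np.array([c[0] for c in components])
        choice = rng.choice(len(components), size=n, p=weights)
        phi, psi = np.empty(n), np.empty(n)
        for i, (_, mu_phi, mu_psi, kappa) in enumerate(components):
            idx = choice == i
            phi[idx] = vonmises.rvs(kappa, loc=mu_phi, size=idx.sum(), random_state=10 + i)
            psi[idx] = vonmises.rvs(kappa, loc=mu_psi, size=idx.sum(), random_state=20 + i)
        return np.degrees(phi), np.degrees(psi)

    phi, psi = sample_dihedrals(1000)
    print(f"sample mean phi/psi: {phi.mean():.1f} / {psi.mean():.1f} degrees")
    ```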

  1. Outer-Planet Mission Analysis Using Solar-Electric Ion Propulsion

    NASA Technical Reports Server (NTRS)

    Woo, Byoungsam; Coverstone, Victoria L.; Hartmann, John W.; Cupples, Michael

    2003-01-01

    Outer-planet mission analysis was performed using three next-generation solar-electric ion thruster models. Optimal trajectories are presented that maximize the delivered mass to the designated outer planet. Trajectories to Saturn and Neptune with a single Venus gravity assist are investigated. For each thruster model, the delivered mass versus flight time curve was generated to obtain thruster model performance. The effects of power to the thrusters and resonance ratio of Venusian orbital periods to spacecraft period were also studied. Multiple locally optimal trajectories to Saturn and Neptune have been discovered in different regions of the parameter search space. The characteristics of each trajectory are noted.

  2. Design of relative motion and attitude profiles for three-dimensional resident space object imaging with a laser rangefinder

    NASA Astrophysics Data System (ADS)

    Nayak, M.; Beck, J.; Udrea, B.

    This paper focuses on the aerospace application of a single beam laser rangefinder (LRF) for 3D imaging, shape detection, and reconstruction in the context of a space-based space situational awareness (SSA) mission scenario. The primary limitation to 3D imaging from LRF point clouds is the one-dimensional nature of the single beam measurements. A method that combines relative orbital motion and scanning attitude motion to generate point clouds has been developed and the design and characterization of multiple relative motion and attitude maneuver profiles are presented. The target resident space object (RSO) has the shape of a generic telecommunications satellite. The shape and attitude of the RSO are unknown to the chaser satellite; however, it is assumed that the RSO is un-cooperative and has fixed inertial pointing. All sensors in the metrology chain are assumed ideal. A previous study by the authors used pure Keplerian motion to perform a similar 3D imaging mission at an asteroid. A new baseline for proximity operations maneuvers for LRF scanning, based on a waypoint adaptation of the Hill-Clohessy-Wiltshire (HCW) equations, is examined. Propellant expenditure for each waypoint profile is discussed and combinations of relative motion and attitude maneuvers that minimize the propellant used to achieve a minimum required point cloud density are studied. Both LRF strike-point coverage and point cloud density are maximized; the capability for 3D shape registration and reconstruction from point clouds generated with a single beam LRF without catalog comparison is proven. Next, a method of using edge detection algorithms to process a point cloud into a 3D modeled image containing reconstructed shapes is presented. Weighted accuracy of edge reconstruction with respect to the true model is used to calculate a qualitative “metric” that evaluates effectiveness of coverage. Both edge recognition algorithms and the metric are independent of point cloud density; therefore, they are utilized to compare the quality of point clouds generated by various attitude and waypoint command profiles. The RSO model incorporates diverse irregular protruding shapes, such as open sensor covers, instrument pods and solar arrays, to test the limits of the algorithms. This analysis is used to mathematically prove that point clouds generated by a single-beam LRF can achieve sufficient edge recognition accuracy for SSA applications, with meaningful shape information extractable even from sparse point clouds. For all command profiles, reconstruction of RSO shapes from the point clouds generated with the proposed method is compared to the truth model and conclusions are drawn regarding their fidelity.
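
    The waypoint profiles mentioned above build on the closed-form solution of the Hill-Clohessy-Wiltshire (HCW) equations for relative motion about a circular chief orbit; a minimal propagation sketch follows. The chief orbit period, initial relative state, and sampling times are placeholder values, and none of the attitude-scan or propellant logic of the study is included.

    ```python
    # Closed-form HCW relative-motion propagation (x radial, y along-track, z cross-track).
    import numpy as np

    def hcw_stm(n, t):
        """State transition matrix of the HCW equations for elapsed time t."""
        s, c = np.sin(n * t), np.cos(n * t)
        return np.array([
            [4 - 3 * c,        0, 0,      s / n,            2 * (1 - c) / n,         0],
            [6 * (s - n * t),  1, 0,     -2 * (1 - c) / n,  (4 * s - 3 * n * t) / n, 0],
            [0,                0, c,      0,                0,                       s / n],
            [3 * n * s,        0, 0,      c,                2 * s,                   0],
            [-6 * n * (1 - c), 0, 0,     -2 * s,            4 * c - 3,               0],
            [0,                0, -n * s, 0,                0,                       c],
        ])

    n = 2 * np.pi / 5400.0                 # mean motion of a ~90-minute chief orbit [rad/s]
    state0 = np.array([100.0, 0.0, 20.0, 0.0, -2 * n * 100.0, 0.0])  # closed relative ellipse

    # Relative positions over one orbit; these are what an LRF line of sight and
    # scan attitude profile would be designed against.
    for t in np.linspace(0.0, 5400.0, 7):
        r = hcw_stm(n, t) @ state0
        print(f"t={t:6.0f} s   x={r[0]:8.1f}   y={r[1]:9.1f}   z={r[2]:7.1f} m")
    ```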

  3. Model-data synthesis for the next generation of forest free-air CO2 enrichment (FACE) experiments

    DOE PAGES

    Norby, Richard J.; De Kauwe, Martin G.; Domingues, Tomas F.; ...

    2015-08-06

    The first generation of forest free-air CO2 enrichment (FACE) experiments has successfully provided deeper understanding about how forests respond to an increasing CO2 concentration in the atmosphere. Located in aggrading stands in the temperate zone, they have provided a strong foundation for testing critical assumptions in terrestrial biosphere models that are being used to project future interactions between forest productivity and the atmosphere, despite the limited inference space of these experiments with regards to the range of global ecosystems. Now, a new generation of FACE experiments in mature forests in different biomes and over a wide range of climate space and biodiversity will significantly expand the inference space. These new experiments are: EucFACE in a mature Eucalyptus stand on highly weathered soil in subtropical Australia; AmazonFACE in a highly diverse, primary rainforest in Brazil; BIFoR-FACE in a 150-yr-old deciduous woodland stand in central England; and SwedFACE proposed in a hemiboreal, Pinus sylvestris stand in Sweden. We now have a unique opportunity to initiate a model–data interaction as an integral part of experimental design and to address a set of cross-site science questions on topics including responses of mature forests; interactions with temperature, water stress, and phosphorus limitation; and the influence of biodiversity.

  4. Model-data synthesis for the next generation of forest free-air CO2 enrichment (FACE) experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norby, Richard J.; De Kauwe, Martin G.; Domingues, Tomas F.

    The first generation of forest free-air CO2 enrichment (FACE) experiments has successfully provided deeper understanding about how forests respond to an increasing CO2 concentration in the atmosphere. Located in aggrading stands in the temperate zone, they have provided a strong foundation for testing critical assumptions in terrestrial biosphere models that are being used to project future interactions between forest productivity and the atmosphere, despite the limited inference space of these experiments with regards to the range of global ecosystems. Now, a new generation of FACE experiments in mature forests in different biomes and over a wide range of climate space and biodiversity will significantly expand the inference space. These new experiments are: EucFACE in a mature Eucalyptus stand on highly weathered soil in subtropical Australia; AmazonFACE in a highly diverse, primary rainforest in Brazil; BIFoR-FACE in a 150-yr-old deciduous woodland stand in central England; and SwedFACE proposed in a hemiboreal, Pinus sylvestris stand in Sweden. We now have a unique opportunity to initiate a model–data interaction as an integral part of experimental design and to address a set of cross-site science questions on topics including responses of mature forests; interactions with temperature, water stress, and phosphorus limitation; and the influence of biodiversity.

  5. Essential energy space random walk via energy space metadynamics method to accelerate molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Li, Hongzhi; Min, Donghong; Liu, Yusong; Yang, Wei

    2007-09-01

    To overcome the possible pseudoergodicity problem, molecular dynamics simulations can be accelerated via the realization of an energy space random walk. To achieve this, a biased free energy function (BFEF) needs to be obtained a priori. Although the quality of the BFEF is essential for sampling efficiency, its generation is usually tedious and nontrivial. In this work, we present an energy space metadynamics algorithm to efficiently and robustly obtain BFEFs. Moreover, in order to deal with the associated diffusion sampling problem caused by the random walk in the total energy space, the idea in the original umbrella sampling method is generalized to be the random walk in the essential energy space, which only includes the energy terms determining the conformation of a region of interest. This essential energy space generalization allows the realization of efficient localized enhanced sampling and also offers the possibility of further sampling efficiency improvement when high-frequency energy terms irrelevant to the target events are free of activation. The energy space metadynamics method and its generalization in the essential energy space for molecular dynamics acceleration are demonstrated in the simulation of a pentane-like system, the blocked alanine dipeptide model, and the leucine model.
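
    A toy version of metadynamics performed in energy space conveys the idea: a Metropolis walker on a double-well potential is biased by small Gaussians deposited on the potential-energy values it visits, which progressively flattens the effective landscape and promotes a random walk in energy. The potential, hill parameters, and temperature are illustrative only; this is a sketch of the energy-space biasing concept, not the BFEF construction of the paper.

    ```python
    # Toy energy-space metadynamics on a 1-D double-well potential.
    import numpy as np

    rng = np.random.default_rng(6)

    def U(x):                          # double-well potential with barrier height 1
        return (x**2 - 1.0)**2

    hills = []                         # deposited (center_energy, height, width) Gaussians

    def bias(e):
        return sum(h * np.exp(-0.5 * ((e - c) / w)**2) for c, h, w in hills)

    x, beta = -1.0, 5.0
    height, width, stride = 0.05, 0.2, 50
    steps, right_well = 20_000, 0

    for step in range(steps):
        x_new = x + rng.normal(0.0, 0.2)
        d_eff = (U(x_new) + bias(U(x_new))) - (U(x) + bias(U(x)))
        if d_eff <= 0 or rng.random() < np.exp(-beta * d_eff):
            x = x_new
        if step % stride == 0:         # deposit a hill on the currently visited energy
            hills.append((U(x), height, width))
        right_well += x > 0.5

    print(f"fraction of steps spent in the right-hand well: {right_well / steps:.2f}")
    ```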

  6. A methodology for the generation of the 2-D map from unknown navigation environment by traveling a short distance

    NASA Technical Reports Server (NTRS)

    Bourbakis, N.; Sarkar, D.

    1994-01-01

    A technique for generation of a 2-D space map by traveling a short distance is described. The space to be mapped can be classified as: (1) space without obstacles, (2) space with stationary obstacles, and (3) space with moving obstacles. This paper presents the methodology used to generate a 2-D map of an unknown navigation space. The ability to minimize the redundancy during traveling and maximize the confidence function for generation of the map are advantages of this technique.
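
    A simple occupancy-grid accumulation conveys the flavor of building a 2-D map from range readings gathered while traveling a short distance: cells along each beam collect free-space evidence and the cell at a returned range collects obstacle evidence. This generic sketch does not reproduce the paper's confidence function or redundancy-minimization strategy; the grid size, evidence increments, and simulated readings are arbitrary.

    ```python
    # Generic occupancy-grid accumulation from simulated range readings.
    import numpy as np

    GRID = 50                                  # 50 x 50 cell map
    occupancy = np.zeros((GRID, GRID))         # >0: obstacle evidence, <0: free-space evidence

    def integrate_reading(robot_xy, angle, measured_range, max_range=20.0):
        """Mark cells along the beam as free and the end cell (if a hit) as occupied."""
        for r in np.arange(0.0, min(measured_range, max_range), 0.5):
            cx = int(robot_xy[0] + r * np.cos(angle))
            cy = int(robot_xy[1] + r * np.sin(angle))
            if 0 <= cx < GRID and 0 <= cy < GRID:
                occupancy[cy, cx] -= 0.2       # free-space evidence along the beam
        if measured_range < max_range:         # beam terminated on an obstacle
            cx = int(robot_xy[0] + measured_range * np.cos(angle))
            cy = int(robot_xy[1] + measured_range * np.sin(angle))
            if 0 <= cx < GRID and 0 <= cy < GRID:
                occupancy[cy, cx] += 1.0

    # Short traverse: three poses, each with a sweep of simulated range readings.
    for pose_x in (5.0, 10.0, 15.0):
        for ang in np.linspace(-np.pi / 3, np.pi / 3, 25):
            integrate_reading((pose_x, 25.0), ang, measured_range=12.0)

    print("cells with net obstacle evidence:", int((occupancy > 0.5).sum()))
    ```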

  7. Wigner functions for evanescent waves.

    PubMed

    Petruccelli, Jonathan C; Tian, Lei; Oh, Se Baek; Barbastathis, George

    2012-09-01

    We propose phase space distributions, based on an extension of the Wigner distribution function, to describe fields of any state of coherence that contain evanescent components emitted into a half-space. The evanescent components of the field are described in an optical phase space of spatial position and complex-valued angle. Behavior of these distributions upon propagation is also considered, where the rapid decay of the evanescent components is associated with the exponential decay of the associated phase space distributions. To demonstrate the structure and behavior of these distributions, we consider the fields generated from total internal reflection of a Gaussian Schell-model beam at a planar interface.
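
    For reference, the ordinary Wigner distribution function that the above work extends is W(x, k) = integral of E(x + s/2) E*(x - s/2) exp(-i k s) ds; the sketch below evaluates it numerically for a coherent Gaussian test beam with a phase tilt. The evanescent-wave extension to complex-valued angles proposed in the paper is not reproduced, and the beam parameters are arbitrary.

    ```python
    # Numerical Wigner distribution function of a 1-D coherent field.
    import numpy as np

    N = 256
    x = np.linspace(-10.0, 10.0, N)
    dx = x[1] - x[0]
    E = np.exp(-x**2 / 4.0) * np.exp(1j * 0.8 * x)   # Gaussian beam with phase tilt k0 = 0.8

    W = np.zeros((N, N))
    half = N // 2
    for i in range(N):
        corr = np.zeros(N, dtype=complex)
        for m in range(-half, half):                 # shift index: s = 2*m*dx
            ip, im = i + m, i - m
            if 0 <= ip < N and 0 <= im < N:
                corr[m + half] = E[ip] * np.conj(E[im])
        # Fourier transform over the shift coordinate gives the k dependence.
        W[i] = np.real(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(corr)))) * 2 * dx

    k = np.fft.fftshift(np.fft.fftfreq(N, d=2 * dx)) * 2 * np.pi
    # The distribution should peak near the beam center (x = 0) and its mean
    # spatial frequency (k = 0.8).
    i0, j0 = np.unravel_index(np.argmax(W), W.shape)
    print(f"WDF peak at x = {x[i0]:.2f}, k = {k[j0]:.2f}")
    ```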

  8. Evaluation of Fast-Time Wake Vortex Models using Wake Encounter Flight Test Data

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat N.; VanValkenburg, Randal L.; Bowles, Roland L.; Limon Duparcmeur, Fanny M.; Gloudesman, Thijs; van Lochem, Sander; Ras, Eelco

    2014-01-01

    This paper describes a methodology for the integration and evaluation of fast-time wake models with flight data. The National Aeronautics and Space Administration conducted detailed flight tests in 1995 and 1997 under the Aircraft Vortex Spacing System Program to characterize wake vortex decay and wake encounter dynamics. In this study, data collected during Flight 705 were used to evaluate NASA's fast-time wake transport and decay models. Deterministic and Monte-Carlo simulations were conducted to define wake hazard bounds behind the wake generator. The methodology described in this paper can be used for further validation of fast-time wake models using en-route flight data, and for determining wake turbulence constraints in the design of air traffic management concepts.

  9. Infrared horizon sensor modeling for attitude determination and control: Analysis and mission experience

    NASA Technical Reports Server (NTRS)

    Phenneger, M. C.; Singhal, S. P.; Lee, T. H.; Stengle, T. H.

    1985-01-01

    The work performed by the Attitude Determination and Control Section at the National Aeronautics and Space Administration/Goddard Space Flight Center in analyzing and evaluating the performance of infrared horizon sensors is presented. The results of studies performed during the 1960s are reviewed; several models for generating the Earth's infrared radiance profiles are presented; and the Horizon Radiance Modeling Utility, the software used to model the horizon sensor optics and electronics processing to compute radiance-dependent attitude errors, is briefly discussed. Also provided is mission experience from 12 spaceflight missions spanning the period from 1973 to 1984 and using a variety of horizon sensing hardware. Recommendations are presented for future directions for the infrared horizon sensing technology.

  10. Development of the CELSS emulator at NASA. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Cullingford, Hatice S.

    1990-01-01

    The Closed Ecological Life Support System (CELSS) Emulator is under development. It will be used to investigate computer simulations of integrated CELSS operations involving humans, plants, and process machinery. Described here is Version 1.0 of the CELSS Emulator that was initiated in 1988 on the Johnson Space Center (JSC) Multi Purpose Applications Console Test Bed as the simulation framework. The run model of the simulation system now contains a CELSS model called BLSS. The CELSS simulator empowers us to generate model data sets, store libraries of results for further analysis, and also display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.

  11. Effects of spatial constraints on channel network topology: Implications for geomorphological inference

    NASA Astrophysics Data System (ADS)

    Cabral, Mariza Castanheira De Moura Da Costa

    In the fifty-two years since Robert Horton's 1945 pioneering quantitative description of channel network planform (or plan view morphology), no conclusive findings have been presented that permit inference of geomorphological processes from any measures of network planform. All measures of network planform studied exhibit limited geographic variability across different environments. Horton (1945), Langbein et al. (1947), Schumm (1956), Hack (1957), Melton (1958), and Gray (1961) established various "laws" of network planform, that is, statistical relationships between different variables which have limited variability. A wide variety of models which have been proposed to simulate the growth of channel networks in time over a landsurface are generally also in agreement with the above planform laws. An explanation is proposed for the generality of the channel network planform laws. Channel networks must be space filling, that is, they must extend over the landscape to drain every hillslope, leaving no large undrained areas, and with no crossing of channels, often achieving a roughly uniform drainage density in a given environment. It is shown that the space-filling constraint can reduce the sensitivity of planform variables to different network growth models, and it is proposed that this constraint may determine the planform laws. The "Q model" of network growth of Van Pelt and Verwer (1985) is used to generate samples of networks. Sensitivity to the model parameter Q is markedly reduced when the networks generated are required to be space filling. For a wide variety of Q values, the space-filling networks are in approximate agreement with the various channel network planform laws. Additional constraints, including of energy efficiency, were not studied but may further reduce the variability of planform laws. Inference of model parameter Q from network topology is successful only in networks not subject to spatial constraints. In space-filling networks, for a wide range of Q values, the maximal-likelihood Q parameter value is generally in the vicinity of 1/2, which yields topological randomness. It is proposed that space filling originates the appearance of randomness in channel network topology, and may cause difficulties to geomorphological inference from network planform.

  12. The generation and use of numerical shape models for irregular Solar System objects

    NASA Technical Reports Server (NTRS)

    Simonelli, Damon P.; Thomas, Peter C.; Carcich, Brian T.; Veverka, Joseph

    1993-01-01

    We describe a procedure that allows the efficient generation of numerical shape models for irregular Solar System objects, where a numerical model is simply a table of evenly spaced body-centered latitudes and longitudes and their associated radii. This modeling technique uses a combination of data from limbs, terminators, and control points, and produces shape models that have some important advantages over analytical shape models. Accurate numerical shape models make it feasible to study irregular objects with a wide range of standard scientific analysis techniques. These applications include the determination of moments of inertia and surface gravity, the mapping of surface locations and structural orientations, photometric measurement and analysis, the reprojection and mosaicking of digital images, and the generation of albedo maps. The capabilities of our modeling procedure are illustrated through the development of an accurate numerical shape model for Phobos and the production of a global, high-resolution, high-pass-filtered digital image mosaic of this Martian moon. Other irregular objects that have been modeled, or are being modeled, include the asteroid Gaspra and the satellites Deimos, Amalthea, Epimetheus, Janus, Hyperion, and Proteus.
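
    To illustrate how such a table of evenly spaced latitudes, longitudes, and radii can feed standard analyses, the sketch below evaluates the volume (and volume-equivalent radius) of a smooth test shape by direct integration over the radius grid, using V = (1/3) * sum of r(lat, lon)^3 * cos(lat) * dlat * dlon. The radius function is a made-up stand-in for a real model such as the Phobos table; reading an actual model file and computing moments of inertia or surface gravity would follow the same pattern.

    ```python
    # Volume of an irregular body from a (lat, lon, radius) numerical shape model.
    import numpy as np

    n_lat, n_lon = 90, 180
    lat = np.radians(np.linspace(-89.0, 89.0, n_lat))
    lon = np.radians(np.linspace(0.0, 358.0, n_lon))
    LAT, LON = np.meshgrid(lat, lon, indexing="ij")

    # Stand-in radius table r(lat, lon) [km]; a real model would be read from a file.
    R = 11.0 + 1.5 * np.cos(2 * LON) * np.cos(LAT) - 1.0 * np.sin(LAT) ** 2

    dlat = lat[1] - lat[0]
    dlon = lon[1] - lon[0]
    # V = (1/3) * integral of r^3 cos(lat) over all body-centered directions.
    volume = np.sum(R**3 * np.cos(LAT)) * dlat * dlon / 3.0
    mean_radius = (3.0 * volume / (4.0 * np.pi)) ** (1.0 / 3.0)
    print(f"volume ~ {volume:.0f} km^3, volume-equivalent radius ~ {mean_radius:.2f} km")
    ```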

  13. Materials property definition and generation for carbon-carbon and carbon phenolic materials

    NASA Technical Reports Server (NTRS)

    Canfield, A. R.; Mathis, J. R.; Starrett, H. S.; Koenig, J. R.

    1987-01-01

    A data base program to generate statistically significant material-property data for carbon-carbon and carbon phenolic materials to be used in designs of Space Shuttle is described. The program, which will provide data necessary for thermal and stress modeling of Shuttle nozzle and exit cone structures, includes evaluation of tension, compression, shear strength, shear modulus, thermal expansion, thermal conductivity, permeability, and emittance for both materials; the testing of carbon phenolic materials also includes CTE, off-gassing, pyrolysis, and RTG. Materials to be tested will be excised from Space Shuttle inlet, throat, and exit cone billets and modified involute carbon-carbon exit cones; coprocessed blocks, panels, and cylinders will also be tested.

  14. Toward Rigorous Parameterization of Underconstrained Neural Network Models Through Interactive Visualization and Steering of Connectivity Generation

    PubMed Central

    Nowke, Christian; Diaz-Pier, Sandra; Weyers, Benjamin; Hentschel, Bernd; Morrison, Abigail; Kuhlen, Torsten W.; Peyser, Alexander

    2018-01-01

    Simulation models in many scientific fields can have non-unique solutions or unique solutions which can be difficult to find. Moreover, in evolving systems, unique final state solutions can be reached by multiple different trajectories. Neuroscience is no exception. Often, neural network models are subject to parameter fitting to obtain desirable output comparable to experimental data. Parameter fitting without sufficient constraints and a systematic exploration of the possible solution space can lead to conclusions valid only around local minima or around non-minima. To address this issue, we have developed an interactive tool for visualizing and steering parameters in neural network simulation models. In this work, we focus particularly on connectivity generation, since finding suitable connectivity configurations for neural network models constitutes a complex parameter search scenario. The development of the tool has been guided by several use cases—the tool allows researchers to steer the parameters of the connectivity generation during the simulation, thus quickly growing networks composed of multiple populations with a targeted mean activity. The flexibility of the software allows scientists to explore other connectivity and neuron variables apart from the ones presented as use cases. With this tool, we enable an interactive exploration of parameter spaces and a better understanding of neural network models and grapple with the crucial problem of non-unique network solutions and trajectories. In addition, we observe a reduction in turn around times for the assessment of these models, due to interactive visualization while the simulation is computed. PMID:29937723

  15. Adjudicating between face-coding models with individual-face fMRI responses

    PubMed Central

    Kriegeskorte, Nikolaus

    2017-01-01

    The perceptual representation of individual faces is often explained with reference to a norm-based face space. In such spaces, individuals are encoded as vectors where identity is primarily conveyed by direction and distinctiveness by eccentricity. Here we measured human fMRI responses and psychophysical similarity judgments of individual face exemplars, which were generated as realistic 3D animations using a computer-graphics model. We developed and evaluated multiple neurobiologically plausible computational models, each of which predicts a representational distance matrix and a regional-mean activation profile for 24 face stimuli. In the fusiform face area, a face-space coding model with sigmoidal ramp tuning provided a better account of the data than one based on exemplar tuning. However, an image-processing model with weighted banks of Gabor filters performed similarly. Accounting for the data required the inclusion of a measurement-level population averaging mechanism that approximates how fMRI voxels locally average distinct neuronal tunings. Our study demonstrates the importance of comparing multiple models and of modeling the measurement process in computational neuroimaging. PMID:28746335
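
    The kind of face-space model compared above can be sketched generically: model units with preferred directions in a face space respond through a sigmoidal ramp to the projection of each face vector onto their preferred direction, and the pairwise dissimilarities of the resulting response patterns form a representational distance matrix (RDM) that can be compared against measured data. The dimensionality, unit count, and sigmoid parameters below are arbitrary choices, not the fitted model from the paper.

    ```python
    # Sigmoidal-ramp face-space encoding model and its representational distance matrix.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(7)
    n_faces, n_dims, n_units = 24, 10, 200

    faces = rng.normal(size=(n_faces, n_dims))             # face exemplars as vectors
    directions = rng.normal(size=(n_units, n_dims))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)

    def sigmoid(z, gain=1.5, offset=0.0):
        return 1.0 / (1.0 + np.exp(-gain * (z - offset)))

    # Ramp tuning: a unit's response grows with the face's coordinate along its
    # preferred direction (signed distance from the norm along that axis).
    responses = sigmoid(faces @ directions.T)              # shape (n_faces, n_units)

    # Representational distance matrix from pairwise correlation distances.
    rdm = squareform(pdist(responses, metric="correlation"))
    print("RDM shape:", rdm.shape, " mean distance:", round(float(rdm.mean()), 3))
    ```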

  16. Etude de la Generation des Ultrasons Par Laser dans un Materiau Composite

    NASA Astrophysics Data System (ADS)

    Dubois, Marc

    Laser generation of ultrasound is not a new subject. Many authors have proposed mathematical models of the thermoelastic process of generation of acoustic waves. However, none of those models, up to now, could simultaneously take into account the effects of thermal conduction, optical penetration, the anisotropy of the material, and arbitrary time and surface profiles of the laser excitation. The model presented in this work takes all these parameters into consideration in the case of an infinite orthotropic plate. The mathematical approach used allows one to obtain an analytical solution of the mechanical displacement field in the Laplace and two-dimensional (2-D) Fourier spaces. Numerical inverse Laplace and 2-D Fourier transformations bring the mechanical displacement field back into the normal spaces. The use of direct numerical transformations makes it possible to consider almost any time and spatial distributions of the generation laser beam. The acoustic displacements calculated by this model have been compared to experimental displacements measured with a wide-band optical detection system. The features of this system allow the quantitative measurement of the displacements parallel and normal to the surface of the sample. Hence, the calculated normal and parallel displacements have been compared to those experimentally measured at various locations on aluminum, glass and polymer samples. In all cases, the agreement between the calculated and experimentally measured displacements was good. The semi-analytical model having proved its validity, it has been used, in addition to a completely analytical one-dimensional model, to study the effects of the optical penetration and the laser pulse duration on the longitudinal acoustic wave generated. This study has established that a sufficiently short laser pulse and an irradiation that is large with respect to the sample thickness make it possible to determine quantitatively, from the full width at half maximum of the acoustic pulse, the optical penetration depth at the wavelength of the generation laser inside the material. This semi-analytical model has also been used to analyze the effects of the optical penetration on the directivity patterns of the longitudinal and shear waves generated by a thermoelastic source. This study has clearly shown that the optical penetration modifies significantly the longitudinal wave directivity pattern, but has only weak effects on that of the shear wave. (Abstract shortened by UMI.).

  17. NASREN: Standard reference model for telerobot control

    NASA Technical Reports Server (NTRS)

    Albus, J. S.; Lumia, R.; Mccain, H.

    1987-01-01

    A hierarchical architecture is described which supports space station telerobots in a variety of modes. The system is divided into three hierarchies: task decomposition, world model, and sensory processing. Goals at each level of the task decomposition hierarchy are divided both spatially and temporally into simpler commands for the next lower level. This decomposition is repeated until, at the lowest level, the drive signals to the robot actuators are generated. To accomplish its goals, task decomposition modules must often use information stored in the world model. The purpose of the sensory system is to update the world model as rapidly as possible to keep the model in registration with the physical world. The architecture of the entire control system hierarchy is described, along with how it can be applied to space telerobot applications.

  18. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, the capability to download simulation output in a portable standard format with comprehensive metadata, and a user-friendly model output analysis library of routines that can be called from any C-supporting language. The CCMC is also developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  19. Analyses of Longitudinal Mode Combustion Instability in J-2X Gas Generator Development

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.; Protz, C. S.; Casiano, M. J.; Kenny, R. J.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) and Pratt & Whitney Rocketdyne are developing a liquid oxygen/liquid hydrogen rocket engine for future upper stage and trans-lunar applications. This engine, designated the J-2X, is a higher pressure, higher thrust variant of the Apollo-era J-2 engine. The contract for development was let to Pratt & Whitney Rocketdyne in 2006. Over the past several years, development of the gas generator for the J-2X engine has progressed through a variety of workhorse injector, chamber, and feed system configurations on the component test stand at the NASA Marshall Space Flight Center (MSFC). Several of the initial configurations resulted in combustion instability of the workhorse gas generator assembly at a frequency near the first longitudinal mode of the combustion chamber. In this paper, several aspects of these combustion instabilities are discussed, including injector, combustion chamber, feed system, and nozzle influences. To ensure elimination of the instabilities at the engine level, and to understand the stability margin, the gas generator system has been modeled at the NASA MSFC with two techniques, the Rocket Combustor Interaction Design and Analysis (ROCCID) code and a lumped-parameter MATLAB™ model created as an alternative calculation to the ROCCID methodology. To correctly predict the instability characteristics of all the chamber and injector geometries and test conditions as a whole, several inputs to the submodels in ROCCID and the MATLAB™ model were modified. Extensive sensitivity calculations were conducted to determine how to model and anchor a lumped-parameter injector response, and finite-element and acoustic analyses were conducted on several complicated combustion chamber geometries to determine how to model and anchor the chamber response. These modifications and their ramifications for future stability analyses of this type are discussed.
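
    For a rough sense of the frequency range involved, the first longitudinal ("1L") acoustic mode of a closed-closed chamber lies near f = c / (2L), where c is the sound speed of the hot gas and L the effective acoustic length. The short check below uses assumed illustrative values only; it is not drawn from the J-2X gas generator data or from the ROCCID and MATLAB models discussed above.

        # Hypothetical order-of-magnitude estimate of a first longitudinal (1L) mode frequency.
        c = 1200.0   # m/s, assumed sound speed of the hot combustion gas
        L = 0.5      # m, assumed effective acoustic length of the chamber
        f_1L = c / (2.0 * L)
        print(f"Estimated 1L frequency: {f_1L:.0f} Hz")   # about 1200 Hz for these assumed values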

  20. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.

  1. Space Shuttle Main Engine (SSME) LOX turbopump pump-end bearing analysis

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A simulation of the shaft/bearing system of the Space Shuttle Main Engine Liquid Oxygen turbopump was developed. The simulation model allows the thermal and mechanical characteristics to interact as a realistic simulation of the bearing operating characteristics. The model accounts for single and two phase coolant conditions, and includes the heat generation from bearing friction and fluid stirring. Using the simulation model, parametric analyses were performed on the 45 mm pump-end bearings to investigate the sensitivity of bearing characteristics to contact friction, axial preload, coolant flow rate, coolant inlet temperature and quality, heat transfer coefficients, outer race clearance and misalignment, and the effects of thermally isolating the outer race from the isolator.

  2. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of the Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  3. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large Photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.

  4. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.

  5. Testing collapse models by a thermometer

    NASA Astrophysics Data System (ADS)

    Bahrami, M.

    2018-05-01

    Collapse models postulate that space is filled with a collapse noise field, inducing quantum Brownian motions, which are dominant during the measurement, thus causing collapse of the wave function. An important manifestation of the collapse noise field, if any, is thermal energy generation, thus disturbing the temperature profile of a system. The experimental investigation of a collapse-driven heating effect has provided, so far, the most promising test of collapse models against standard quantum theory. In this paper, we calculate the collapse-driven heat generation for a three-dimensional multi-atomic Bravais lattice by solving stochastic Heisenberg equations. We perform our calculation for the mass-proportional continuous spontaneous localization collapse model with nonwhite noise. We obtain the temperature distribution of a sphere under stationary-state and insulated surface conditions. However, the exact quantification of the collapse-driven heat-generation effect highly depends on the actual value of cutoff in the collapse noise spectrum.

  6. LG-ANALYST: linguistic geometry for master air attack planning

    NASA Astrophysics Data System (ADS)

    Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg

    2003-09-01

    We investigate the technical feasibility of implementing LG-ANALYST, a new software tool based on the Linguistic Geometry (LG) approach. The tool will be capable of modeling and providing solutions to Air Force related battlefield problems and of conducting multiple experiments to verify the quality of the solutions it generates. LG-ANALYST will support generation of the Fast Master Air Attack Plan (MAAP) with subsequent conversion into Air Tasking Order (ATO). An Air Force mission is modeled employing abstract board games (ABG). Such a mission may include, for example, an aircraft strike package moving to a target area with the opposing side having ground-to-air missiles, anti-aircraft batteries, fighter wings, and radars. The corresponding abstract board captures 3D air space, terrain, the aircraft trajectories, positions of the batteries, strategic features of the terrain, such as bridges, and their status, radars and illuminated space, etc. Various animated views are provided by LG-ANALYST including a 3D view for realistic representation of the battlespace and a 2D view for ease of analysis and control. LG-ANALYST will allow a user to model full scale intelligent enemy, plan in advance, re-plan and control in real time Blue and Red forces by generating optimal (or near-optimal) strategies for all sides of a conflict.

  7. Design of Test Support Hardware for Advanced Space Suits

    NASA Technical Reports Server (NTRS)

    Watters, Jeffrey A.; Rhodes, Richard

    2013-01-01

    As a member of the Space Suit Assembly Development Engineering Team, I designed and built test equipment systems to support the development of the next generation of advanced space suits. During space suit testing it is critical to supply the subject with two functions: (1) cooling to remove metabolic heat, and (2) breathing air to pressurize the space suit. The objective of my first project was to design, build, and certify an improved Space Suit Cooling System for manned testing in a 1-G environment. This design had to be portable and supply a minimum cooling rate of 2500 BTU/hr. The Space Suit Cooling System is a robust, portable system that supports very high metabolic rates. It has a highly adjustable cooling rate and is equipped with digital instrumentation to monitor the flowrate and critical temperatures. It can supply a variable water temperature down to 34 deg., and it can generate a maximum water flowrate of 2.5 LPM. My next project was to design and build a Breathing Air System that was capable of supplying facility air to subjects wearing the Z-2 space suit. The system intakes 150 PSIG breathing air and regulates it to two operating pressures: 4.3 and 8.3 PSIG. It can also provide structural capabilities at 1.5x operating pressure: 6.6 and 13.2 PSIG, respectively. It has instrumentation to monitor flowrate, as well as inlet and outlet pressures. The system has a series of relief valves to fully protect itself in case of regulator failure. Both projects followed a similar design methodology. The first task was to perform research on existing concepts to develop sufficient background knowledge. Then mathematical models were developed to size components and simulate system performance. Next, mechanical and electrical schematics were generated and presented at Design Reviews. After the systems were approved by the suit team, all the hardware components were specified and procured. The systems were then packaged, fabricated, and thoroughly tested. The next step was to certify the equipment for manned use, which included generating a Hazard Analysis and giving a presentation to the Test Readiness Review Board. Both of these test support systems will perform critical roles in the development of next-generation space suits. They will be used on a regular basis to test NASA's new Z-2 Space Suit. The Space Suit Cooling System is now the primary cooling system for all advanced suit tests.

  8. Computing the modal mass from the state space model in combined experimental-operational modal analysis

    NASA Astrophysics Data System (ADS)

    Cara, Javier

    2016-05-01

    Modal parameters comprise natural frequencies, damping ratios, modal vectors and modal masses. In a theoretic framework, these parameters are the basis for the solution of vibration problems using the theory of modal superposition. In practice, they can be computed from input-output vibration data: the usual procedure is to estimate a mathematical model from the data and then to compute the modal parameters from the estimated model. The most popular models for input-output data are based on the frequency response function, but in recent years the state space model in the time domain has become popular among researchers and practitioners of modal analysis with experimental data. In this work, the equations to compute the modal parameters from the state space model when input and output data are available (like in combined experimental-operational modal analysis) are derived in detail using invariants of the state space model: the equations needed to compute natural frequencies, damping ratios and modal vectors are well known in the operational modal analysis framework, but the equation needed to compute the modal masses has not generated much interest in technical literature. These equations are applied to both a numerical simulation and an experimental study in the last part of the work.
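
    A minimal sketch of the standard part of this computation is given below: natural frequencies, damping ratios, and mode shapes obtained from the eigendecomposition of an identified discrete-time state-space model x[k+1] = A x[k] + B u[k], y[k] = C x[k] + D u[k] with sampling interval dt. The modal-mass equation derived in the paper additionally involves the input matrices and is not reproduced here; the function name is illustrative.

        # Minimal sketch: modal parameters from an identified discrete-time state-space model.
        import numpy as np

        def modal_parameters(A, C, dt):
            lam, phi = np.linalg.eig(A)        # discrete-time poles and eigenvectors
            mu = np.log(lam) / dt              # map to continuous-time poles
            wn = np.abs(mu)                    # natural angular frequencies (rad/s)
            zeta = -np.real(mu) / wn           # damping ratios
            modes = C @ phi                    # mode shapes observed at the outputs
            return wn / (2.0 * np.pi), zeta, modes   # frequencies in Hz, ratios, shapes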

  9. Computer Models of Proteins

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  10. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  11. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  12. Worlddem - a Novel Global Foundation Layer

    NASA Astrophysics Data System (ADS)

    Riegler, G.; Hennig, S. D.; Weber, M.

    2015-03-01

    Airbus Defence and Space's WorldDEM™ provides a global Digital Elevation Model of unprecedented quality, accuracy, and coverage. The product will feature a vertical accuracy of 2m (relative) and better than 6m (absolute) in a 12m x 12m raster. The accuracy will surpass that of any global satellite-based elevation model available. WorldDEM is a game-changing disruptive technology and will define a new standard in global elevation models. The German radar satellites TerraSAR-X and TanDEM-X form a high-precision radar interferometer in space and acquire the data basis for the WorldDEM. This mission is performed jointly with the German Aerospace Center (DLR). Airbus DS refines the Digital Surface Model (e.g. editing of acquisition and processing artefacts and of water surfaces) or generates a Digital Terrain Model. Three product levels are offered: WorldDEMcore (output of the processing, no editing applied), WorldDEM™ (guarantees a void-free terrain description and hydrological consistency) and WorldDEM DTM (represents bare-Earth elevation). Precise elevation data is the initial foundation of any accurate geospatial product, particularly when the integration of multi-source imagery and data is performed based upon it. Fused data provides improved reliability, increased confidence and reduced ambiguity. This paper will present the current status of product development activities, including the methodologies and tools used to generate these products, such as terrain and water-body editing and DTM generation. In addition, studies on verification and validation of the WorldDEM products will be presented.

  13. Using Model-Based Systems Engineering to Provide Artifacts for NASA Project Life-cycle and Technical Reviews

    NASA Technical Reports Server (NTRS)

    Parrott, Edith L.; Weiland, Karen J.

    2017-01-01

    This paper is for the AIAA Space Conference. The ability of systems engineers to use model-based systems engineering (MBSE) to generate self-consistent, up-to-date systems engineering products for project life-cycle and technical reviews is an important aspect for the continued and accelerated acceptance of MBSE. Currently, many review products are generated using labor-intensive, error-prone approaches based on documents, spreadsheets, and chart sets; a promised benefit of MBSE is that users will experience reductions in inconsistencies and errors. This work examines features of SysML that can be used to generate systems engineering products. Model elements, relationships, tables, and diagrams are identified for a large number of the typical systems engineering artifacts. A SysML system model can contain and generate most systems engineering products to a significant extent and this paper provides a guide on how to use MBSE to generate products for project life-cycle and technical reviews. The use of MBSE can reduce the schedule impact usually experienced for review preparation, as in many cases the review products can be auto-generated directly from the system model. These approaches are useful to systems engineers, project managers, review board members, and other key project stakeholders.

  14. Necessary conditions for superior thermoelectric power of Si/Au artificial superlattice thin-film

    NASA Astrophysics Data System (ADS)

    Okamoto, Yoichi; Watanabe, Shin; Miyazaki, Hisashi; Morimoto, Jun

    2018-03-01

    The Si-Ge-Au ternary artificial superlattice thin-films showed superior thermoelectric power with low reproducibility. Superior thermoelectric power was generated only when nanocrystals existed. Therefore, the origin of the superior thermoelectric power was considered to be the quantum size effect of the nanocrystals. However, even with the presence of nanocrystals, superior thermoelectric power was often not generated. In order to investigate the generation conditions of superior thermoelectric power in more detail, the samples were simplified to Si-Au binary artificial superlattice samples. Furthermore, annealing treatments were carried out under conditions where nanocrystals were likely to be formed. From the results of Raman scattering spectroscopy and X-ray diffraction (XRD) analysis, the diameter of the nanocrystals and the spacing between nanocrystals were calculated with an isotropic three-dimensional mosaic model. It was found that superior thermoelectric power was generated only when the diameter of the nanocrystals was 11 nm or less and the spacing between nanocrystals was 3 nm or less.

  15. Development of an external ceramic insulation for the space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Tanzilli, R. A. (Editor)

    1972-01-01

    The development and evaluation of a family of reusable external insulation systems for use on the space shuttle orbiter is discussed. The material development and evaluation activities are described. Additional information is provided on the development of an analytical micromechanical model of the reusable insulation and the development of techniques for reducing the heat transfer. Design data on reusable insulation systems and test techniques used for design data generation are included.

  16. Multiple-body simulation with emphasis on integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1993-01-01

    The program to obtain intergrid communications - Pegasus - was enhanced to make better use of computing resources. Periodic block tridiagonal and penta-diagonal routines in OVERFLOW were modified to use a better algorithm to speed up the calculation for grids with periodic boundary conditions. Several programs were added to the collar grid tools, and a user-friendly shell script was developed to help users generate collar grids. The user interface for HYPGEN was modified to cope with changes in HYPGEN. ET/SRB attach hardware grids were added to the computational model for the space shuttle and are currently incorporated into the refined shuttle model jointly developed at Johnson Space Center and Ames Research Center. Flow simulation for the integrated space shuttle vehicle at flight Reynolds number was carried out and compared with flight data as well as with the earlier simulation at wind tunnel Reynolds number.

  17. Functional identification of spike-processing neural circuits.

    PubMed

    Lazar, Aurel A; Slutskiy, Yevgeniy B

    2014-02-01

    We introduce a novel approach for a complete functional identification of biophysical spike-processing neural circuits. The circuits considered accept multidimensional spike trains as their input and comprise a multitude of temporal receptive fields and conductance-based models of action potential generation. Each temporal receptive field describes the spatiotemporal contribution of all synapses between any two neurons and incorporates the (passive) processing carried out by the dendritic tree. The aggregate dendritic current produced by a multitude of temporal receptive fields is encoded into a sequence of action potentials by a spike generator modeled as a nonlinear dynamical system. Our approach builds on the observation that during any experiment, an entire neural circuit, including its receptive fields and biophysical spike generators, is projected onto the space of stimuli used to identify the circuit. Employing the reproducing kernel Hilbert space (RKHS) of trigonometric polynomials to describe input stimuli, we quantitatively describe the relationship between underlying circuit parameters and their projections. We also derive experimental conditions under which these projections converge to the true parameters. In doing so, we achieve the mathematical tractability needed to characterize the biophysical spike generator and identify the multitude of receptive fields. The algorithms obviate the need to repeat experiments in order to compute the neurons' rate of response, rendering our methodology of interest to both experimental and theoretical neuroscientists.

  18. Three-dimensional elliptic grid generation for an F-16

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1988-01-01

    A case history depicting the effort to generate a computational grid for the simulation of transonic flow about an F-16 aircraft at realistic flight conditions is presented. The flow solver for which this grid is designed is a zonal one, using the Reynolds averaged Navier-Stokes equations near the surface of the aircraft, and the Euler equations in regions removed from the aircraft. A body conforming global grid, suitable for the Euler equations, is first generated using 3-D Poisson equations having inhomogeneous terms modeled after the 2-D GRAPE code. Regions of the global grid are then designated for zonal refinement as appropriate to accurately model the flow physics. Grid spacing suitable for solution of the Navier-Stokes equations is generated in the refinement zones by simple subdivision of the given coarse grid intervals. That grid generation project is described, with particular emphasis on the global coarse grid.

  19. Modeling and analysis of the space shuttle nose-gear tire with semianalytic finite elements

    NASA Technical Reports Server (NTRS)

    Kim, Kyun O.; Noor, Ahmed K.; Tanner, John A.

    1990-01-01

    A computational procedure is presented for the geometrically nonlinear analysis of aircraft tires. The Space Shuttle Orbiter nose gear tire was modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The four key elements of the procedure are: (1) semianalytic finite elements in which the shell variables are represented by Fourier series in the circumferential direction and piecewise polynomials in the meridional direction; (2) a mixed formulation with the fundamental unknowns consisting of strain parameters, stress-resultant parameters, and generalized displacements; (3) multilevel operator splitting to effect successive simplifications, and to uncouple the equations associated with different Fourier harmonics; and (4) multilevel iterative procedures and reduction techniques to generate the response of the shell. Numerical results of the Space Shuttle Orbiter nose gear tire model are compared with experimental measurements of the tire subjected to inflation loading.

  20. Development of a computational model for astronaut reorientation.

    PubMed

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration.

  1. Controlling effect of geometrically defined local structural changes on chaotic Hamiltonian systems.

    PubMed

    Ben Zion, Yossi; Horwitz, Lawrence

    2010-04-01

    An effective characterization of chaotic conservative Hamiltonian systems in terms of the curvature associated with a Riemannian metric tensor derived from the structure of the Hamiltonian has been extended to a wide class of potential models of standard form through definition of a conformal metric. The geodesic equations reproduce the Hamilton equations of the original potential model through an inverse map in the tangent space. The second covariant derivative of the geodesic deviation in this space generates a dynamical curvature, resulting in (energy-dependent) criteria for unstable behavior different from the usual Lyapunov criteria. We show here that this criterion can be constructively used to modify locally the potential of a chaotic Hamiltonian model in such a way that stable motion is achieved. Since our criterion for instability is local in coordinate space, these results provide a minimal method for achieving control of a chaotic system.

  2. Parametric modeling of the intervertebral disc space in 3D: application to CT images of the lumbar spine.

    PubMed

    Korez, Robert; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2014-10-01

    Gradual degeneration of intervertebral discs of the lumbar spine is one of the most common causes of low back pain. Although conservative treatment for low back pain may provide relief to most individuals, surgical intervention may be required for individuals with significant continuing symptoms, which is usually performed by replacing the degenerated intervertebral disc with an artificial implant. For designing implants with good bone contact and continuous force distribution, the morphology of the intervertebral disc space and vertebral body endplates is of considerable importance. In this study, we propose a method for parametric modeling of the intervertebral disc space in three dimensions (3D) and show its application to computed tomography (CT) images of the lumbar spine. The initial 3D model of the intervertebral disc space is generated according to the superquadric approach and therefore represented by a truncated elliptical cone, which is initialized by parameters obtained from 3D models of the adjacent vertebral bodies. In an optimization procedure, the 3D model of the intervertebral disc space is incrementally deformed by adding parameters that provide a more detailed morphometric description of the observed shape, and aligned to the observed intervertebral disc space in the 3D image. By applying the proposed method to CT images of 20 lumbar spines, the shape and pose of each of the 100 intervertebral disc spaces were represented by a 3D parametric model. The resulting mean (± standard deviation) accuracy of modeling was 1.06 ± 0.98 mm in terms of radial Euclidean distance against manually defined ground truth points, with a corresponding success rate of 93% (i.e. 93 out of 100 intervertebral disc spaces were modeled successfully). As the resulting 3D models provide a description of the shape of intervertebral disc spaces in a complete parametric form, morphometric analysis was straightforward and allowed computation of the corresponding heights, widths and volumes, as well as of other geometric features that describe in detail the shape of the intervertebral disc spaces.

  3. Search-based model identification of smart-structure damage

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Macalou, A.

    1991-01-01

    This paper describes the use of a combined model and parameter identification approach, based on modal analysis and artificial intelligence (AI) techniques, for identifying damage or flaws in a rotating truss structure incorporating embedded piezoceramic sensors. This smart structure example is representative of a class of structures commonly found in aerospace systems and next generation space structures. Artificial intelligence techniques of classification, heuristic search, and an object-oriented knowledge base are used in an AI-based model identification approach. A finite model space is classified into a search tree, over which a variant of best-first search is used to identify the model whose stored response most closely matches that of the input. Newly-encountered models can be incorporated into the model space. This adaptiveness demonstrates the potential for learning control. Following this output-error model identification, numerical parameter identification is used to further refine the identified model. Given the rotating truss example in this paper, noisy data corresponding to various damage configurations are input to both this approach and a conventional parameter identification method. The combination of the AI-based model identification with parameter identification is shown to lead to smaller parameter corrections than required by the use of parameter identification alone.

  4. Asymmetric injection and distribution of space charges in propylene carbonate under impulse voltage

    NASA Astrophysics Data System (ADS)

    Sima, Wenxia; Chen, Qiulin; Sun, Potao; Yang, Ming; Guo, Hongda; Ye, Lian

    2018-05-01

    Space charge can distort the electric field in high voltage stressed liquid dielectrics and lead to breakdown. Observing the evolution of space charge in real time and determining the influencing factors are of considerable significance. The spatio-temporal evolution of space charge in propylene carbonate, which is very complex under impulse voltage, was measured in this study through the time-continuous Kerr electro-optic field mapping measurement. We found that the injection charge from a brass electrode displayed an asymmetric effect; that is, the negative charge injection near the cathode lags behind the positive charge injection near the anode. Physical mechanisms, including charge generation and drift, are analyzed, and a voltage-dependent saturated drift rectification model was established to explain the interesting phenomena. Mutual validation of models and our measurement data indicated that a barrier layer, which is similar to metal-semiconductor contact, was formed in the contact interface between the electrode and propylene carbonate and played an important role in the space charge injection.

  5. High-order space charge effects using automatic differentiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reusch, Michael F.; Bruhwiler, David L.; Computer Accelerator Physics Conference Williamsburg, Virginia 1996

    1997-02-01

    The Northrop Grumman TOPKARK code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used to either track an array of particles or construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past and we present a new method for modeling space charge forces to high order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series times a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows for accurate representation of a wide range of realistic distributions, including any asymmetries, and allows for rapid calculation of the space charge fields with free space boundary conditions. An example problem is presented to illustrate our approach.

  6. ACCELERATED FITTING OF STELLAR SPECTRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ting, Yuan-Sen; Conroy, Charlie; Rix, Hans-Walter

    2016-07-20

    Stellar spectra are often modeled and fitted by interpolating within a rectilinear grid of synthetic spectra to derive the stars’ labels: stellar parameters and elemental abundances. However, the number of synthetic spectra needed for a rectilinear grid grows exponentially with the label space dimensions, precluding the simultaneous and self-consistent fitting of more than a few elemental abundances. Shortcuts such as fitting subsets of labels separately can introduce unknown systematics and do not produce correct error covariances in the derived labels. In this paper we present a new approach—Convex Hull Adaptive Tessellation (chat)—which includes several new ideas for inexpensively generating a sufficient stellar synthetic library, using linear algebra and the concept of an adaptive, data-driven grid. A convex hull approximates the region where the data lie in the label space. A variety of tests with mock data sets demonstrate that chat can reduce the number of required synthetic model calculations by three orders of magnitude in an eight-dimensional label space. The reduction will be even larger for higher dimensional label spaces. In chat the computational effort increases only linearly with the number of labels that are fit simultaneously. Around each of these grid points in the label space an approximate synthetic spectrum can be generated through linear expansion using a set of “gradient spectra” that represent flux derivatives at every wavelength point with respect to all labels. These techniques provide new opportunities to fit the full stellar spectra from large surveys with 15–30 labels simultaneously.
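
    The "gradient spectra" described above amount to a first-order Taylor expansion of the flux around a grid point in label space. The sketch below illustrates that expansion only, with assumed array shapes; it is not the chat code itself.

        # Hypothetical sketch: approximate a synthetic spectrum near a label-space grid point
        # by linear expansion using precomputed gradient spectra.
        import numpy as np

        def approximate_spectrum(flux0, gradient_spectra, labels, labels0):
            # flux0: reference spectrum at labels0, shape (n_wavelength,)
            # gradient_spectra: d(flux)/d(label) at labels0, shape (n_labels, n_wavelength)
            dlabel = np.asarray(labels) - np.asarray(labels0)
            return flux0 + dlabel @ gradient_spectra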

  7. Bridging the Worlds of Entertainment and Space - One Element of the Space Generation Foundation

    NASA Astrophysics Data System (ADS)

    Hildago, L.

    2002-01-01

    The Space Generation Foundation, founder of ISU, is the current home for Space Rocks!, Yuri's Night, and other space projects focused on education, outreach, and sustainable development worldwide. One particular area of success in 2001/2002 has been the involvement of the entertainment community in space events. Yuri's Night brought together musicians, DJs, artists, and the public to celebrate space. Space Rocks! will do the same on a much larger scale, employing film, theatre, poetry, music, art, advertising firms, and other unconventional media to communicate space to the public. We will present the aims and future plans of the Foundation. The Space Generation Advisory Council, in support of the United Nations Programme on Space Applications, has as its main focus space education and outreach. Since the Space Generation Forum in 1999, successful global education and outreach projects have been implemented by young people around the world. These and new ideas are being further developed at the Space Generation Summit (SGS), an event at World Space Congress (WSC) that will unite international students and young professionals to develop a youth vision and strategy for the peaceful uses of space. SGS, endorsed by the United Nations, will take place from October 11-13th, during which the 200 delegates will discuss ongoing youth space activities, particularly those stemming from the UNISPACE-III/SGF and taken forward by the Space Generation Advisory Council. Delegates will address a variety of topics with the goal of devising new recommendations according to the theme, 'Accelerating Our Pace in Space'. The material presented here and in other technical sessions throughout WSC includes the results of these discussions.

  8. Validation Test Report for the BioCast Optical Forecast Model Version 1.0

    DTIC Science & Technology

    2015-04-09

    ... can generate, such as: total absorption (a), backscattering (bb), chlorophyll (chl), sea surface temperature (SST), diver visibility, etc. ... BSP - Battle Space Profiler; CHARTS - Compact Hydrographic Airborne Rapid Total Survey; Chl - Chlorophyll; EO ...

  9. Finite-element 3D simulation tools for high-current relativistic electron beams

    NASA Astrophysics Data System (ADS)

    Humphries, Stanley; Ekdahl, Carl

    2002-08-01

    The DARHT second-axis injector is a challenge for computer simulations. Electrons are subject to strong beam-generated forces. The fields are fully three-dimensional and accurate calculations at surfaces are critical. We describe methods applied in OmniTrak, a 3D finite-element code suite that can address DARHT and the full range of charged-particle devices. The system handles mesh generation, electrostatics, magnetostatics and self-consistent particle orbits. The MetaMesh program generates meshes of conformal hexahedrons to fit any user geometry. The code has the unique ability to create structured conformal meshes with cubic logic. Organized meshes offer advantages in speed and memory utilization in the orbit and field solutions. OmniTrak is a versatile charged-particle code that handles 3D electric and magnetic field solutions on independent meshes. The program can update both 3D field solutions from the calculated beam space-charge and current-density. We shall describe numerical methods for orbit tracking on a hexahedron mesh. Topics include: 1) identification of elements along the particle trajectory, 2) fast searches and adaptive field calculations, 3) interpolation methods to terminate orbits on material surfaces, 4) automatic particle generation on multiple emission surfaces to model space-charge-limited emission and field emission, 5) flexible Child law algorithms, 6) implementation of the dual potential model for 3D magnetostatics, and 7) assignment of charge and current from model particle orbits for self-consistent fields.
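
    As background for the space-charge-limited emission algorithms mentioned above, the planar Child (Child-Langmuir) law gives the limiting current density J = (4 epsilon_0 / 9) * sqrt(2e/m) * V^(3/2) / d^2 for a planar gap d at voltage V. The snippet below evaluates this textbook relation with assumed numbers; it is not taken from OmniTrak or from the DARHT injector design.

        # Planar Child-Langmuir law for space-charge-limited emission (illustrative values only).
        import math

        EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
        E_CHARGE = 1.602176634e-19   # electron charge, C
        M_E = 9.1093837015e-31       # electron mass, kg

        def child_law_current_density(voltage, gap):
            # Limiting current density (A/m^2) for a planar diode with gap (m) at voltage (V).
            return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) * voltage**1.5 / gap**2

        print(f"{child_law_current_density(2.0e6, 0.2):.2e} A/m^2")  # assumed 2 MV across a 20 cm gap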

  10. Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach

    NASA Astrophysics Data System (ADS)

    Kumral, Mustafa; Ozer, Umit

    2013-03-01

    Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance, reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. In the context of multiple simulations, the dispersion variances of blocks can be thought to account for technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration in such a way as to minimize grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a minmax problem, which focuses on finding the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of the blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) approach to solve the problem. The technique has two spaces: feasible drill hole configurations with minimization of interpolation variance, and drill hole simulations with maximization of interpolation variance. The two spaces interact to find a minmax solution iteratively. A case study was conducted to demonstrate the performance of the approach. The findings showed that the approach could be used to plan a new drilling campaign.
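
    The min-max structure of the search can be sketched as follows. This is a deliberately simplified random-search stand-in, not the two-space genetic algorithm itself, and the callables for candidate drill configurations, simulations, and the interpolation-variance evaluator are all hypothetical.

        # Hypothetical skeleton: pick the drilling configuration whose worst-case (maximum)
        # interpolation variance over the simulations is smallest.
        import random

        def minmax_drill_search(candidate_configs, simulations, interp_variance, n_iter=1000):
            best_config, best_worst = None, float("inf")
            for _ in range(n_iter):
                config = random.choice(candidate_configs)    # "outer" space: a drill plan
                worst = max(interp_variance(config, sim)     # "inner" space: adversarial
                            for sim in simulations)          # simulation for that plan
                if worst < best_worst:
                    best_config, best_worst = config, worst
            return best_config, best_worst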

  11. Preliminary design, analysis, and costing of a dynamic scale model of the NASA space station

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Pinson, E. D.; Voqui, H. L.; Crawley, E. F.; Everman, M. R.

    1987-01-01

    The difficulty of testing the next generation of large flexible space structures on the ground places an emphasis on other means for validating predicted on-orbit dynamic behavior. Scale model technology represents one way of verifying analytical predictions with ground test data. This study investigates the preliminary design, scaling and cost trades for a Space Station dynamic scale model. The scaling of nonlinear joint behavior is studied from theoretical and practical points of view. Suspension system interaction trades are conducted for the ISS Dual Keel Configuration and Build-Up Stages suspended in the proposed NASA/LaRC Large Spacecraft Laboratory. Key issues addressed are scaling laws, replication vs. simulation of components, manufacturing, suspension interactions, joint behavior, damping, articulation capability, and cost. These issues are the subject of parametric trades versus the scale model factor. The results of these detailed analyses are used to recommend scale factors for four different scale model options, each with varying degrees of replication. Potential problems in constructing and testing the scale model are identified, and recommendations for further study are outlined.

  12. Microgravity

    NASA Image and Video Library

    2000-04-19

    Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  13. The influence of solid rocket motor retro-burns on the space debris environment

    NASA Astrophysics Data System (ADS)

    Stabroth, Sebastian; Homeister, Maren; Oswald, Michael; Wiedemann, Carsten; Klinkrad, Heiner; Vörsmann, Peter

    The ESA space debris population model MASTER (Meteoroid and Space Debris Terrestrial Environment Reference) considers firings of solid rocket motors (SRM) as a debris source with the associated generation of slag and dust particles. The resulting slag and dust population is a major contribution to the sub-millimetre size debris environment in Earth orbit. The current model version, MASTER-2005, is based on the simulation of 1076 orbital SRM firings which contributed to the long-term debris environment. A comparison of the modelled flux with impact data from returned surfaces shows that the shape and quantity of the modelled SRM dust distribution matches that of recent Hubble Space Telescope (HST) solar array measurements very well. However, the absolute flux level for dust is under-predicted for some of the analysed Long Duration Exposure Facility (LDEF) surfaces. This points into the direction of some past SRM firings not included in the current event database. The most suitable candidates for these firings are the large number of SRM retro-burns of return capsules. Objects released by those firings have highly eccentric orbits with perigees in the lower regions of the atmosphere. Thus, they produce no long-term effect on the debris environment. However, a large number of those firings during the on-orbit time frame of LDEF might lead to an increase of the dust population for some of the LDEF surfaces. In this paper, the influence of SRM retro-burns on the short- and long-term debris environment is analysed. The existing firing database is updated with gathered information of some 800 Russian retro-firings. Each firing is simulated with the MASTER population generation module. The resulting population is compared against the existing background population of SRM slag and dust particles in terms of spatial density and flux predictions.

  14. The effect of anatomical modeling on space radiation dose estimates: a comparison of doses for NASA phantoms and the 5th, 50th, and 95th percentile male and female astronauts.

    PubMed

    Bahadori, Amir A; Van Baalen, Mary; Shavers, Mark R; Dodge, Charles; Semones, Edward J; Bolch, Wesley E

    2011-03-21

    The National Aeronautics and Space Administration (NASA) performs organ dosimetry and risk assessment for astronauts using model-normalized measurements of the radiation fields encountered in space. To determine the radiation fields in an organ or tissue of interest, particle transport calculations are performed using self-shielding distributions generated with the computer program CAMERA to represent the human body. CAMERA mathematically traces linear rays (or path lengths) through the computerized anatomical man (CAM) phantom, a computational stylized model developed in the early 1970s with organ and body profiles modeled using solid shapes and scaled to represent the body morphometry of the 1950 50th percentile (PCTL) Air Force male. With the increasing use of voxel phantoms in medical and health physics, a conversion from a mathematical-based to a voxel-based ray-tracing algorithm is warranted. In this study, the voxel-based ray tracer (VoBRaT) is introduced to ray trace voxel phantoms using a modified version of the algorithm first proposed by Siddon (1985 Med. Phys. 12 252-5). After validation, VoBRaT is used to evaluate variations in body self-shielding distributions for NASA phantoms and six University of Florida (UF) hybrid phantoms, scaled to represent the 5th, 50th, and 95th PCTL male and female astronaut body morphometries, which have changed considerably since the inception of CAM. These body self-shielding distributions are used to generate organ dose equivalents and effective doses for five commonly evaluated space radiation environments. It is found that dosimetric differences among the phantoms are greatest for soft radiation spectra and light vehicular shielding.
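
    For intuition only, the sketch below approximates the quantity a voxel ray tracer accumulates along a single ray through a phantom. It uses dense sampling rather than the exact Siddon traversal implemented in VoBRaT, and the array shapes, units, and function name are assumptions.

        # Hypothetical illustration (dense sampling, not the exact Siddon traversal):
        # accumulate areal density (g/cm^2) along a ray through a voxelized phantom.
        import numpy as np

        def ray_areal_density(density, voxel_size, p0, p1, n_samples=10000):
            # density: 3-D array of voxel densities (g/cm^3); voxel_size: (dx, dy, dz) in cm;
            # p0, p1: ray end points in cm.
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            ts = (np.arange(n_samples) + 0.5) / n_samples
            points = p0 + ts[:, None] * (p1 - p0)           # sample points along the ray
            idx = np.floor(points / np.asarray(voxel_size)).astype(int)
            seg = np.linalg.norm(p1 - p0) / n_samples       # length of each sample segment (cm)
            inside = np.all((idx >= 0) & (idx < density.shape), axis=1)
            i, j, k = idx[inside].T
            return float(np.sum(density[i, j, k]) * seg)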

  15. Systems engineering studies of on-orbit assembly operation

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1991-01-01

    While the practice of construction has a long history, the underlying theory of construction is relatively young. Very little has been documented as to techniques of logistic support, construction planning, construction scheduling, construction testing, and inspection. The lack of 'systems approaches' to construction processes is certainly one of the most serious roadblocks to the construction of space structures. System engineering research efforts at CSC are aimed at developing concepts and tools which contribute to a systems theory of space construction. The research is also aimed at providing means for trade-offs of design parameters for other research areas in CSC. Systems engineering activity at CSC has divided space construction into the areas of orbital assembly, lunar base construction, interplanetary transport vehicle construction, and Mars base construction. A brief summary of recent results is given. Several models for 'launch-on-time' were developed. Launch-on-time is a critical concept to the assembly of such Earth-orbiting structures as the Space Station Freedom, and to planetary orbiters such as the Mars transfer vehicle. CSC has developed a launch vehicle selection model which uses linear programming to find optimal combinations of launch vehicles of various sizes (Atlas, Titan, Shuttles, HLLV's) to support SEI missions. Recently, the Center developed a cost trade-off model for studying on orbit assembly logistics. With this model it was determined that the most effective size of the HLLV would be in the range of 120 to 200 metric tons to LEO, which is consistent with the choices of General Stafford's Synthesis Group Report. A second-generation Dynamic Construction Activities Model ('DYCAM') process model has been under development, based on our past results in interruptability and our initial DYCAM model. This second-generation model is built on the paradigm of knowledge-based expert systems. It is aimed at providing answers to two questions: (1) what are some necessary or sufficient conditions for judging conceptual designs of spacecraft?, and (2) can a methodology be formulated such that these conditions may be used to provide computer-aided tools for evaluating conceptual designs and planning for space assembly sequences? Early simulation results indicate that the DYCAM model has a clear ability to emulate and simulate human orbital construction processes.

  16. Analytical Investigation of a Reflux Boiler

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Young, Fred M.; Chambers, Terrence L.

    1996-01-01

    A thermal model of a single Ultralight Fabric Reflux Tube (UFRT) was constructed and tested against data for an array of such tubes tested in the NASA-JSC facility. Modifications to the single fin model were necessary to accommodate the change in radiation shape factors due to adjacent tubes. There was good agreement between the test data and data generated for the same cases by the thermal model. The thermal model was also used to generate single and linear array data for the lunar environment (the primary difference between the test and lunar data was due to lunar gravity). The model was also used to optimize the linear spacing of the reflux tubes in an array. The optimal spacing of the tubes was recommended to be about 5 tube diameters based on maximizing the heat transfer per unit mass. The model also showed that the thermal conductivity of the Nextel fabric was the major limitation to the heat transfer. This led to a suggestion that the feasibility of jacketing the Nextel fiber bundles with copper strands be investigated. This jacketing arrangement was estimated to be able to double the thermal conductivity of the fabric at a volume concentration of about 12-14%. Doubling the thermal conductivity of the fabric would double the amount of heat transferred at the same steam saturation temperature.

  17. Exploring cluster Monte Carlo updates with Boltzmann machines

    NASA Astrophysics Data System (ADS)

    Wang, Lei

    2017-11-01

    Boltzmann machines are physics informed generative models with broad applications in machine learning. They model the probability distribution of an input data set with latent variables and generate new samples accordingly. Applying the Boltzmann machines back to physics, they are ideal recommender systems to accelerate the Monte Carlo simulation of physical systems due to their flexibility and effectiveness. More intriguingly, we show that the generative sampling of the Boltzmann machines can even give different cluster Monte Carlo algorithms. The latent representation of the Boltzmann machines can be designed to mediate complex interactions and identify clusters of the physical system. We demonstrate these findings with concrete examples of the classical Ising model with and without four-spin plaquette interactions. In the future, automatic searches in the algorithm space parametrized by Boltzmann machines may discover more innovative Monte Carlo updates.

  18. Fluvial reservoir characterization using topological descriptors based on spectral analysis of graphs

    NASA Astrophysics Data System (ADS)

    Viseur, Sophie; Chiaberge, Christophe; Rhomer, Jérémy; Audigane, Pascal

    2015-04-01

    Fluvial systems generate highly heterogeneous reservoirs. These heterogeneities have a major impact on fluid flow behavior. However, the modelling of such reservoirs is mainly performed in under-constrained contexts, as they include complex features while only sparse and indirect data are available. Stochastic modeling is the common strategy to solve such problems. Multiple 3D models are generated from the available subsurface dataset. The generated models represent a sampling of plausible subsurface structure representations. From this model sampling, statistical analyses of targeted parameters (e.g., reserve estimates, flow behaviors) and a posteriori uncertainties are performed to assess risks. However, on one hand, uncertainties may be huge, which requires many models to be generated to scan the space of possibilities. On the other hand, some computations performed on the generated models are time consuming and cannot, in practice, be applied to all of them. This issue is particularly critical in: 1) geological modeling from outcrop data only, as these data types are generally sparse and mainly distributed in 2D at large scale, although they may locally include high-resolution descriptions (e.g., facies, local strata variability); and 2) CO2 storage studies, as many scales of investigation are required, from the meter scale to regional scales, to estimate storage capacities and associated risks. Recent approaches propose to define distances between models so that sophisticated multivariate statistics can be applied to the space of uncertainties, and only sub-samples, representative of the initial set, are investigated in dynamic, time-consuming studies. This work focuses on defining distances between models that characterize the topology of the reservoir rock network, i.e. its compactness or connectivity degree. The proposed strategy relies on the study of the reservoir rock skeleton. The skeleton of an object corresponds to its median feature. A skeleton is computed for each reservoir rock geobody and studied through graph spectral analysis. To achieve this, the skeleton is converted into a graph structure. The spectral analysis applied to this graph structure allows a distance to be defined between pairs of graphs. This distance is then used as the support for a clustering analysis to gather models that share the same reservoir rock topology. To show the ability of the defined distances to discriminate different types of reservoir connectivity, a synthetic data set of fluvial models with different geological settings was generated and studied using the proposed approach. The results of the clustering analysis are shown and discussed.
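
    A minimal sketch of one common way to realize such a spectral distance is given below: compare the smallest Laplacian eigenvalues of the two skeleton graphs. It assumes the skeletons have already been converted to networkx graphs and illustrates the general idea only, not the authors' exact metric.

        # Hypothetical spectral distance between two skeleton graphs.
        import numpy as np
        import networkx as nx

        def spectral_distance(g1, g2, k=20):
            def spectrum(g):
                vals = np.sort(nx.laplacian_spectrum(g))           # Laplacian eigenvalues
                vals = np.pad(vals, (0, max(0, k - vals.size)))    # pad short spectra with zeros
                return vals[:k]
            return float(np.linalg.norm(spectrum(g1) - spectrum(g2)))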

  19. DEGAS: Dynamic Exascale Global Address Space Programming Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demmel, James

    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. The Berkeley part of the project concentrated on communication-optimal code generation to optimize speed and energy efficiency by reducing data movement. Our work developed communication lower bounds and/or communication-avoiding algorithms (that either meet the lower bound or do much less communication than their conventional counterparts) for a variety of algorithms, including linear algebra, machine learning, and genomics.

  20. The Building Game: From Enumerative Combinatorics to Conformational Diffusion

    NASA Astrophysics Data System (ADS)

    Johnson-Chyzhykov, Daniel; Menon, Govind

    2016-08-01

    We study a discrete attachment model for the self-assembly of polyhedra called the building game. We investigate two distinct aspects of the model: (i) enumerative combinatorics of the intermediate states and (ii) a notion of Brownian motion for the polyhedral linkage defined by each intermediate that we term conformational diffusion. The combinatorial configuration space of the model is computed for the Platonic, Archimedean, and Catalan solids of up to 30 faces, and several novel enumerative results are generated. These represent the most exhaustive computations of this nature to date. We further extend the building game to include geometric information. The combinatorial structure of each intermediate yields a system of constraints specifying a polyhedral linkage and its moduli space. We use a random walk to simulate a reflected Brownian motion in each moduli space. Empirical statistics of the random walk may be used to define the rates of transition for a Markov process modeling the process of self-assembly.
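
    A minimal sketch of the conformational-diffusion idea, under the strong simplification that the moduli space is just a unit box: a random walk with reflecting boundaries stands in for reflected Brownian motion, and the printed statistic is only a placeholder for the empirical transition rates fed to a Markov model. None of this reproduces the paper's actual constraint geometry; the step size and dimension are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def reflected_walk(n_steps, dim, step=0.05, lo=0.0, hi=1.0):
        """Random walk in [lo, hi]^dim with reflecting boundaries,
        a crude stand-in for reflected Brownian motion on a moduli space."""
        x = np.full(dim, 0.5)
        path = [x.copy()]
        for _ in range(n_steps):
            x = x + step * rng.standard_normal(dim)
            # Reflect coordinates that stepped outside the box back inside.
            x = np.where(x < lo, 2 * lo - x, x)
            x = np.where(x > hi, 2 * hi - x, x)
            path.append(x.copy())
        return np.array(path)

    path = reflected_walk(10_000, dim=3)
    # Fraction of time spent near one face of the box, usable as a crude
    # occupancy statistic when estimating transition rates.
    print((path[:, 0] > 0.95).mean())
    ```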

  1. The Art and Science of Long-Range Space Weather Forecasting

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.; Wilson, Robert M.

    2006-01-01

    Long-range space weather forecasts are akin to seasonal forecasts of terrestrial weather. We don't expect to forecast individual events, but we do hope to forecast the underlying level of activity important for satellite operations and mission planning. Forecasting space weather conditions years or decades into the future has traditionally been based on empirical models of the solar cycle. Models for the shape of the cycle as a function of its amplitude become reliable once the amplitude is well determined - usually two to three years after minimum. Forecasting the amplitude of a cycle well before that time has been more of an art than a science - usually based on cycle statistics and trends. Recent developments in dynamo theory - the theory explaining the generation of the Sun's magnetic field and the solar activity cycle - have now produced models with predictive capabilities. Testing these models with historical sunspot cycle data indicates that these predictions may be highly reliable one, or even two, cycles into the future.

  2. Numerical Study of Three Dimensional Effects in Longitudinal Space-Charge Impedance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halavanau, A.; Piot, P.

    2015-06-01

    Longitudinal space-charge (LSC) effects are generally considered detrimental in free-electron lasers because they can seed instabilities. Such “microbunching instabilities” were recently shown to be potentially useful for supporting the generation of broadband coherent radiation pulses [1, 2]. There has therefore been increasing interest in devising accelerator beamlines capable of sustaining this LSC instability as a mechanism to produce a coherent light source. To date most of these studies have been carried out with a one-dimensional impedance model for the LSC. In this paper we use an N-body “Barnes-Hut” algorithm [3] to simulate the 3D space-charge force in the beam, combined with elegant [4], and explore the limitations of the 1D model often used.

  3. String-inspired special grand unification

    NASA Astrophysics Data System (ADS)

    Yamatsu, Naoki

    2017-10-01

    We discuss a grand unified theory (GUT) based on an SO(32) GUT gauge group broken to its subgroups including a special subgroup. In the SO(32) GUT on the six-dimensional (6D) orbifold space M^4× T^2/\\mathbb{Z}_2, one generation of the standard model fermions can be embedded into a 6D bulk Weyl fermion in the SO(32) vector representation. We show that for a three-generation model, all the 6D and 4D gauge anomalies in the bulk and on the fixed points are canceled out without exotic chiral fermions at low energies.

  4. RFI Math Model programs for predicting intermodulation interference

    NASA Technical Reports Server (NTRS)

    Stafford, J. M.

    1974-01-01

    Receivers operating on a space vehicle or an aircraft having many on-board transmitters are subject to intermodulation interference from mixing in the transmitting antenna systems, the external environment, or the receiver front-ends. This paper presents the techniques utilized in the RFI Math Model computer programs that were developed to aid in the prevention of interference by predicting problem areas prior to occurrence. Frequencies and amplitudes of possible intermodulation products generated in the external environment are calculated and compared to receiver sensitivities. Intermodulation products generated in receivers are evaluated to determine the adequacy of preselector rejection.

  5. Automated procedures for sizing aerospace vehicle structures /SAVES/

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  6. Demonstration of a High-Fidelity Predictive/Preview Display Technique for Telerobotic Servicing in Space

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Bejczy, Antal K.

    1993-01-01

    A highly effective predictive/preview display technique for telerobotic servicing in space under several seconds communication time delay has been demonstrated on a large laboratory scale in May 1993, involving the Jet Propulsion Laboratory as the simulated ground control station and, 2500 miles away, the Goddard Space Flight Center as the simulated satellite servicing set-up. The technique is based on a high-fidelity calibration procedure that enables a high-fidelity overlay of 3-D graphics robot arm and object models over given 2-D TV camera images of robot arm and objects. To generate robot arm motions, the operator can confidently interact in real time with the graphics models of the robot arm and objects overlaid on an actual camera view of the remote work site. The technique also enables the operator to generate high-fidelity synthetic TV camera views showing motion events that are hidden in a given TV camera view or for which no TV camera views are available. The positioning accuracy achieved by this technique for a zoomed-in camera setting was about +/-5 mm, well within the allowable +/-12 mm error margin at the insertion of a 45 cm long tool in the servicing task.

  7. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.

  8. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
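
    For readers unfamiliar with these measures, the sketch below computes an n-gram distribution and a Zipf-style slope for a toy amino-acid string using only NumPy and the standard library. It deliberately does not use or imitate the Quantiprot API; the example sequence and function names are arbitrary.

    ```python
    from collections import Counter
    import numpy as np

    def ngram_counts(seq, n=2):
        """Count overlapping n-grams in an amino-acid sequence."""
        return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

    def zipf_slope(counts):
        """Least-squares slope of log(frequency) vs. log(rank)."""
        freqs = np.sort(np.array(list(counts.values()), dtype=float))[::-1]
        ranks = np.arange(1, len(freqs) + 1)
        slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
        return slope

    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
    counts = ngram_counts(seq, n=2)
    print(zipf_slope(counts))
    ```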

  9. The promise of the state space approach to time series analysis for nursing research.

    PubMed

    Levy, Janet A; Elser, Heather E; Knobel, Robin B

    2012-01-01

    Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of the study was to introduce the state space approach to time series analysis and demonstrate its potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models, each describing a process that generates a variable of interest over time. Each model is presented algebraically, and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, a slowly varying level, faster varying periodic components, and an irregular component. State space models can simulate developmental processes where a phenomenon emerges and disappears before stabilizing, where the periodic component may become more regular with time, or where the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.
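
    As a concrete, hedged illustration of the decomposition described above (not the authors' exact models), the following sketch simulates one realization of a series built from a slowly varying random-walk level, a periodic component, and an irregular component; all variances, the period, and the amplitude are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_level_plus_cycle(n, sigma_level=0.05, sigma_obs=0.3,
                                  period=24, amp=1.0):
        """Simulate y_t = mu_t + c_t + eps_t, where
        mu_t  is a slowly varying random-walk level,
        c_t   is a periodic component, and
        eps_t is the irregular component."""
        mu = np.cumsum(sigma_level * rng.standard_normal(n))   # level
        t = np.arange(n)
        c = amp * np.sin(2 * np.pi * t / period)                # periodic
        eps = sigma_obs * rng.standard_normal(n)                # irregular
        return mu + c + eps

    y = simulate_level_plus_cycle(500)
    print(y[:5])
    ```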

  10. Extending existing structural identifiability analysis methods to mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
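
    The Taylor-series idea the authors build on can be sketched for an ordinary (non-mixed-effects) toy model: successive derivatives of the observation function at t = 0 form an exhaustive summary whose uniqueness in the parameters implies structural identifiability. The one-compartment model, the dose symbol D, and the use of SymPy below are illustrative assumptions, not taken from the paper.

    ```python
    import sympy as sp

    # Toy one-compartment model: dx/dt = -k*x, y = x/V, x(0) = D (known dose).
    t, k, V, D = sp.symbols("t k V D", positive=True)
    x = D * sp.exp(-k * t)        # closed-form state trajectory
    y = x / V                     # observation function

    # Exhaustive summary: successive derivatives of y evaluated at t = 0.
    summary = [sp.simplify(sp.diff(y, t, i).subs(t, 0)) for i in range(3)]
    print(summary)   # [D/V, -D*k/V, D*k**2/V]
    # Equating the summary for (k, V) and (k', V') forces k = k' and V = V',
    # so both parameters are structurally (globally) identifiable here.
    ```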

  11. Thermo-electrochemical analysis of lithium ion batteries for space applications using Thermal Desktop

    NASA Astrophysics Data System (ADS)

    Walker, W.; Ardebili, H.

    2014-12-01

    Lithium-ion batteries (LIBs) are replacing the nickel-hydrogen batteries used on the International Space Station (ISS). Knowing that LIB efficiency and survivability are greatly influenced by temperature, this study focuses on the thermo-electrochemical analysis of LIBs in space orbit. Current finite element modeling software allows for advanced simulation of the thermo-electrochemical processes; however, the heat transfer simulation capabilities of such software do not capture the extreme complexities of orbital-space environments like those experienced by the ISS. In this study, we have coupled existing thermo-electrochemical models representing heat generation in LIBs during discharge cycles with specialized orbital-thermal software, Thermal Desktop (TD). Our model's parameters were obtained from a previous thermo-electrochemical model of a 185 Amp-Hour (Ah) LIB with 1-3 C discharge rates for both forced and natural convection environments at 300 K. Our TD model successfully reproduces the temperature vs. depth-of-discharge (DOD) profiles and temperature ranges for all discharge and convection variations with minimal deviation, through FORTRAN logic representing each variable as a function of DOD. Multiple parametric cases were considered in a second and third set of runs, whose results provide vital data for advancing accurate thermal modeling of LIBs.

  12. New 2D diffraction model and its applications to terahertz parallel-plate waveguide power splitters

    PubMed Central

    Zhang, Fan; Song, Kaijun; Fan, Yong

    2017-01-01

    A two-dimensional (2D) diffraction model for the calculation of the diffraction field in 2D space and its applications to terahertz parallel-plate waveguide power splitters are proposed in this paper. Compared with the Huygens-Fresnel principle in three-dimensional (3D) space, the proposed model provides an approximate analytical expression to calculate the diffraction field in 2D space. The diffraction field is regarded as a superposition integral in 2D space. The results obtained from the proposed diffraction model agree well with those computed by the software HFSS, which is based on the finite element method (FEM). Based on the proposed 2D diffraction model, two parallel-plate waveguide power splitters are presented. The splitters consist of a transmitting horn antenna, reflectors, and a receiving antenna array. The reflector is cylindrical parabolic with a superimposed surface relief to efficiently couple the transmitted wave into the receiving antenna array; it acts as a computer-generated hologram that matches the transformed field to the receiving antenna aperture field. The power splitters were optimized by a modified real-coded genetic algorithm. The computed results of the splitters agree well with those obtained by HFSS, verifying the novel design method for power splitters and demonstrating the good application prospects of the proposed 2D diffraction model. PMID:28181514
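
    A brute-force numerical version of a 2-D superposition integral (not the paper's approximate analytical expression) can be written by summing line-source contributions weighted with the 2-D free-space Green's function (i/4) H0^(1)(kr). The slit width, sampling, and observation plane below are arbitrary, and the function name field_2d is an invented placeholder.

    ```python
    import numpy as np
    from scipy.special import hankel1

    def field_2d(aperture_pts, amplitudes, obs_pts, k):
        """Superpose 2-D line-source contributions:
        E(r_obs) = sum_j A_j * (i/4) * H0^(1)(k |r_obs - r_j|).

        aperture_pts : (N, 2) source points sampling the aperture
        amplitudes   : (N,)  complex source amplitudes
        obs_pts      : (M, 2) observation points
        """
        r = np.linalg.norm(obs_pts[:, None, :] - aperture_pts[None, :, :], axis=-1)
        return (0.25j * hankel1(0, k * r)) @ amplitudes

    # Uniformly excited slit of width 10*lambda, observed 50*lambda away.
    lam = 1.0
    k = 2 * np.pi / lam
    xs = np.linspace(-5 * lam, 5 * lam, 201)
    aperture = np.column_stack([xs, np.zeros_like(xs)])
    amps = np.ones(len(xs), dtype=complex)
    obs = np.column_stack([np.linspace(-20 * lam, 20 * lam, 101), np.full(101, 50 * lam)])
    intensity = np.abs(field_2d(aperture, amps, obs, k)) ** 2
    print(intensity.max())
    ```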

  13. Emergence of encounter networks due to human mobility.

    PubMed

    Riascos, A P; Mateos, José L

    2017-01-01

    There has been a burst of work on human mobility and on encounter networks; however, the connection between these two fields has only recently begun to be explored. It is clear that both are closely related: mobility generates encounters, and these encounters might give rise to contagion phenomena or even friendship. We model a set of random walkers that visit locations in space following a strategy akin to Lévy flights. We measure the encounters in space and time and establish a link between walkers after they coincide several times. This generates a temporal network that is characterized by global quantities. We compare this dynamics with real data for two cities: New York City and Tokyo. We use data from the location-based social network Foursquare and obtain the emergent temporal encounter network for these two cities, which we compare with our model. We find long-range (Lévy-like) distributions for traveled distances and time intervals, which characterize the emergent social network due to human mobility. Studying this connection is important for several fields, such as epidemics, social influence, voting, contagion models, behavioral adoption, and the diffusion of ideas.
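
    The model described above can be caricatured in a few lines: walkers take heavy-tailed (Pareto-distributed) jumps on a periodic square, and an encounter is logged whenever two walkers are within a fixed radius at the same step. The domain size, tail exponent, radius, and the "met at least three times" link rule are all illustrative assumptions, not the authors' parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def levy_flights(n_walkers, n_steps, alpha=1.5, size=100.0):
        """Walkers on a periodic square; step lengths drawn from a heavy-tailed
        (Pareto) distribution as a simple Lévy-flight surrogate."""
        pos = size * rng.random((n_walkers, 2))
        traj = np.empty((n_steps, n_walkers, 2))
        for t in range(n_steps):
            lengths = rng.pareto(alpha, n_walkers) + 1.0          # heavy-tailed jumps
            angles = 2 * np.pi * rng.random(n_walkers)
            pos = (pos + lengths[:, None] *
                   np.column_stack([np.cos(angles), np.sin(angles)])) % size
            traj[t] = pos
        return traj

    def count_encounters(traj, radius=1.0):
        """Count, per pair of walkers, how often they are within `radius`."""
        encounters = {}
        for snapshot in traj:
            d = np.linalg.norm(snapshot[:, None, :] - snapshot[None, :, :], axis=-1)
            i, j = np.where(np.triu(d < radius, k=1))
            for pair in zip(i, j):
                encounters[pair] = encounters.get(pair, 0) + 1
        return encounters

    traj = levy_flights(n_walkers=50, n_steps=200)
    enc = count_encounters(traj)
    # A network link could be declared once a pair has coincided several times.
    print(sum(1 for c in enc.values() if c >= 3), "pairs met at least 3 times")
    ```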

  14. Influence of droplet spacing on drag coefficient in nonevaporating, monodisperse streams

    NASA Astrophysics Data System (ADS)

    Mulholland, J. A.; Srivastava, R. K.; Wendt, J. O. L.

    1988-10-01

    Trajectory measurements on single, monodisperse, nonevaporating droplet streams whose droplet size, velocity, and spacing were varied to yield initial Re numbers in the 90-290 range are presently used to ascertain the influence of droplet spacing on the drag coefficient of individual drops injected into a quiescent environment. A trajectory model containing the local drag coefficient was fitted to the experimental data by a nonlinear regression; over 40 additional trajectories were predicted with acceptable accuracy. This formulation will aid the computation of waste-droplet drag in flames for improved combustion-generated pollutant predictions.

  15. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    PubMed

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, and estimating the dynamic changes of the temporal correlations and non-stationarity is key in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant and include temporal correlation structures in the covariance matrix estimates of the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, together with Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to multiple-tissue (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approaches could be expanded and applied to other large-scale genomic data, such as next generation sequencing (NGS) data combined with real-time and time-varying electronic health records (EHR), for more comprehensive and robust systematic and network-based analysis, in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.

  16. Bottom-quark fusion processes at the LHC for probing Z' models and B -meson decay anomalies

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohammad; Dalchenko, Mykhailo; Dutta, Bhaskar; Eusebi, Ricardo; Huang, Peisi; Kamon, Teruki; Rathjens, Denis; Thompson, Adrian

    2018-04-01

    We investigate models of a heavy neutral gauge boson Z' coupling mostly to third generation quarks and second generation leptons. In this scenario, bottom quarks arising from gluon splitting can fuse into a Z', allowing the LHC to probe it. In the generic framework presented, anomalies in B-meson decays reported by the LHCb experiment imply a flavor-violating b-s coupling of the featured Z', constraining the lowest possible production cross section. A novel approach searching for a Z'(→ μμ) in association with at least one bottom-tagged jet can probe regions of model parameter space to which existing analyses are not sensitive.

  17. Spatial effects in discrete generation population models.

    PubMed

    Carrillo, C; Fife, P

    2005-02-01

    A framework is developed for constructing a large class of discrete generation, continuous space models of evolving single species populations and finding their bifurcating patterned spatial distributions. Our models involve, in separate stages, the spatial redistribution (through movement laws) and local regulation of the population; and the fundamental properties of these events in a homogeneous environment are found. Emphasis is placed on the interaction of migrating individuals with the existing population through conspecific attraction (or repulsion), as well as on random dispersion. The nature of the competition of these two effects in a linearized scenario is clarified. The bifurcation of stationary spatially patterned population distributions is studied, with special attention given to the role played by that competition.
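
    A minimal numerical instance of a discrete-generation, continuous-space model in this spirit (local regulation followed by purely random Gaussian dispersal, with no conspecific attraction term) is sketched below; the Beverton-Holt-type growth law, the periodic 1-D domain, and the kernel width are arbitrary choices, not the authors' model.

    ```python
    import numpy as np

    def step(N, x, growth=lambda n: 2.0 * n / (1.0 + n), sigma=1.0):
        """One generation: local regulation followed by Gaussian redistribution
        (dispersal) on a 1-D periodic domain, via circular convolution."""
        dx = x[1] - x[0]
        L = x[-1] - x[0] + dx                       # domain length
        regulated = growth(N)                       # local (Beverton-Holt-type) regulation
        # Gaussian dispersal kernel indexed by periodic offset, normalized to integrate to 1.
        d = np.minimum(np.abs(x - x[0]), L - np.abs(x - x[0]))
        kernel = np.exp(-d**2 / (2 * sigma**2))
        kernel /= kernel.sum() * dx
        return np.real(np.fft.ifft(np.fft.fft(regulated) * np.fft.fft(kernel))) * dx

    x = np.linspace(0, 50, 512, endpoint=False)
    N = 0.01 * (1 + 0.1 * np.cos(2 * np.pi * x / 50))   # small perturbed initial density
    for _ in range(100):
        N = step(N, x)
    print(N.min(), N.max())
    ```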

  18. Aerospace applications of SINDA/FLUINT at the Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Ewert, Michael K.; Bellmore, Phillip E.; Andish, Kambiz K.; Keller, John R.

    1992-01-01

    SINDA/FLUINT has been found to be a versatile code for modeling aerospace systems involving single or two-phase fluid flow and all modes of heat transfer. Several applications of SINDA/FLUINT are described in this paper. SINDA/FLUINT is being used extensively to model the single phase water loops and the two-phase ammonia loops of the Space Station Freedom active thermal control system (ATCS). These models range from large integrated system models with multiple submodels to very detailed subsystem models. An integrated Space Station ATCS model has been created with ten submodels representing five water loops, three ammonia loops, a Freon loop and a thermal submodel representing the air loop. The model, which has approximately 800 FLUINT lumps and 300 thermal nodes, is used to determine the interaction between the multiple fluid loops which comprise the Space Station ATCS. Several detailed models of the flow-through radiator subsystem of the Space Station ATCS have been developed. One model, which has approximately 70 FLUINT lumps and 340 thermal nodes, provides a representation of the ATCS low temperature radiator array with two fluid loops connected only by conduction through the radiator face sheet. The detailed models are used to determine parameters such as radiator fluid return temperature, fin efficiency, flow distribution and total heat rejection for the baseline design as well as proposed alternate designs. SINDA/FLUINT has also been used as a design tool for several systems using pressurized gasses. One model examined the pressurization and depressurization of the Space Station airlock under a variety of operating conditions including convection with the side walls and internal cooling. Another model predicted the performance of a new generation of manned maneuvering units. This model included high pressure gas depressurization, internal heat transfer and supersonic thruster equations. The results of both models were used to size components, such as the heaters and gas bottles and also to point to areas where hardware testing was needed.

  19. Enrichment and Ranking of the YouTube Tag Space and Integration with the Linked Data Cloud

    NASA Astrophysics Data System (ADS)

    Choudhury, Smitashree; Breslin, John G.; Passant, Alexandre

    The increase of personal digital cameras with video functionality and video-enabled camera phones has increased the amount of user-generated video on the Web. People are spending more and more time viewing online videos as a major source of entertainment and "infotainment". Social websites allow users to assign shared free-form tags to user-generated multimedia resources, thus generating annotations for objects with a minimum amount of effort. Tagging allows communities to organise their multimedia items into browseable sets, but these tags may be poorly chosen and related tags may be omitted. Current techniques to retrieve, integrate and present this media to users are deficient and could do with improvement. In this paper, we describe a framework for semantic enrichment, ranking and integration of web video tags using Semantic Web technologies. Semantic enrichment of folksonomies can bridge the gap between the uncontrolled and flat structures typically found in user-generated content and the structures provided by the Semantic Web. The enhancement of tag spaces with semantics has been accomplished through two major tasks: (1) a tag space expansion and ranking step; and (2) concept matching and integration with the Linked Data cloud. We have explored social, temporal and spatial contexts to enrich and extend the existing tag space. The resulting semantic tag space is modelled via a local graph based on co-occurrence distances for ranking. A ranked tag list is mapped and integrated with the Linked Data cloud through the DBpedia resource repository. Multi-dimensional context filtering for tag expansion makes tag ranking much easier and provides less ambiguous tag-to-concept matching.

  20. Computer image generation: Reconfigurability as a strategy in high fidelity space applications

    NASA Technical Reports Server (NTRS)

    Bartholomew, Michael J.

    1989-01-01

    The demand for realistic, high fidelity computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is the establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy is discussed through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems. The discussion is limited to those issues directly associated with reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at the Johnson Space Center Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear: the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.

  1. A Tutorial Introduction to Bayesian Models of Cognitive Development

    DTIC Science & Technology

    2011-01-01

    typewriter with an infinite amount of paper. There is a space of documents that it is capable of producing, which includes things like The Tempest and does not include, say, a Vermeer painting or a poem written in Russian. This typewriter represents a means of generating the hypothesis space for a Bayesian learner: each possible document that can be typed on it is a hypothesis; the infinite set of documents producible by the typewriter is the latent…

  2. Refinement of the ICRF

    NASA Technical Reports Server (NTRS)

    Ma, Chopo

    2004-01-01

    Since the ICRF was generated in 1995, VLBI modeling and estimation, data quality, source position stability analysis, and supporting observational programs have improved markedly. There are developing and potential applications in the areas of space navigation, Earth orientation monitoring, and optical astrometry from space that would benefit from a refined ICRF with enhanced accuracy, stability, and spatial distribution. The convergence of analysis, focused observations, and astrometric needs should drive the production of a new realization in the next few years.

  3. Infrared monitoring of the Space Station environment

    NASA Technical Reports Server (NTRS)

    Kostiuk, Theodor; Jennings, Donald E.; Mumma, Michael J.

    1988-01-01

    The measurement and monitoring of infrared emission in the environment of the Space Station has a twofold importance - for the study of the phenomena themselves and as an aid in planning and interpreting Station-based infrared experiments. Spectral measurements of the infrared component of the spacecraft glow will, along with measurements in other spectral regions, provide data necessary to fully understand and model the physical and chemical processes producing these emissions. Monitoring the intensity of these emissions will provide background limits for Space Station based infrared experiments and permit the determination of optimum instrument placement and pointing direction. Continuous monitoring of temporal changes in the background radiation (glow) will also permit better interpretation of Station-based infrared earth sensing and astronomical observations. The primary processes producing infrared emissions in the Space Station environment are: (1) gas-phase excitations of Station-generated molecules (e.g., CO2, H2O, organics, ...) by collisions with the ambient flux of mainly O and N2; and (2) molecular excitations and generation of new species by collisions of ambient molecules with Station surfaces. A list of resulting species, transition energies, excitation cross sections, and relevant time constants is provided. The modeled spectrum of the excited species occurs primarily at wavelengths shorter than 8 micrometers. Emissions at longer wavelengths may become important during rocket firings or in the presence of dust.

  4. OpenSim Model Improvements to Support High Joint Angle Resistive Exercising

    NASA Technical Reports Server (NTRS)

    Gallo, Christopher; Thompson, William; Lewandowski, Beth; Humphreys, Brad

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass, and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike on the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed, with the Advanced Resistive Exercise Device (ARED) currently on the ISS used as a benchmark for their functional performance. Rigorous testing of these proposed devices in space flight is difficult, so computational modeling provides an estimation of the muscle forces and joint loads during exercise, giving insight into how well the devices protect the musculoskeletal health of astronauts. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project, and National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercise with these new advanced exercise device concepts.

  5. Nutritional models for a Controlled Ecological Life Support System (CELSS): Linear mathematical modeling

    NASA Technical Reports Server (NTRS)

    Wade, Rose C.

    1989-01-01

    The NASA Controlled Ecological Life Support System (CELSS) Program is involved in developing a biogenerative life support system that will supply food, air, and water to space crews on long-duration missions. An important part of this effort is in development of the knowledge and technological capability of producing and processing foods to provide optimal diets for space crews. This involves such interrelated factors as determination of the diet, based on knowledge of nutrient needs of humans and adjustments in those needs that may be required as a result of the conditions of long-duration space flight; determination of the optimal mixture of crops required to provide nutrients at levels that are sufficient but not excessive or toxic; and consideration of the critical issues of spacecraft space and power limitations, which impose a phytomass minimization requirement. The complex interactions among these factors are examined with the goal of supplying a diet that will satisfy human needs while minimizing the total phytomass requirement. The approach taken was to collect plant nutritional composition and phytomass production data, identify human nutritional needs and estimate the adjustments to the nutrient requirements likely to result from space flight, and then to generate mathematical models from these data.
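
    The phytomass-minimization idea lends itself to a small linear program. The sketch below uses entirely hypothetical per-kilogram nutrient yields and daily requirements (placeholders, not CELSS data) and minimizes total crop mass subject to meeting the requirements.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical per-kg-phytomass nutrient yields for three candidate crops
    # (columns: wheat, soybean, potato); rows: protein, carbohydrate, fat (g).
    A = np.array([[120., 350., 20.],
                  [700., 300., 170.],
                  [ 20., 180.,  1.]])
    daily_need = np.array([90., 350., 40.])    # hypothetical crew requirements (g/day)

    # Minimize total phytomass x1 + x2 + x3 subject to A @ x >= daily_need, x >= 0.
    res = linprog(c=np.ones(3), A_ub=-A, b_ub=-daily_need, bounds=[(0, None)] * 3)
    print(res.x, res.fun)   # crop masses (kg/day) and total phytomass
    ```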

  6. A technique for generating phase-space-based Monte Carlo beamlets in radiotherapy applications.

    PubMed

    Bush, K; Popescu, I A; Zavgorodni, S

    2008-09-21

    As radiotherapy treatment planning moves toward Monte Carlo (MC) based dose calculation methods, the MC beamlet is becoming an increasingly common optimization entity. At present, methods used to produce MC beamlets have utilized a particle source model (PSM) approach. In this work we outline the implementation of a phase-space-based approach to MC beamlet generation that is expected to provide greater accuracy in beamlet dose distributions. In this approach a standard BEAMnrc phase space is sorted and divided into beamlets with particles labeled using the inheritable particle history variable. This is achieved with the use of an efficient sorting algorithm, capable of sorting a phase space of any size into the required number of beamlets in only two passes. Sorting a phase space of five million particles can be achieved in less than 8 s on a single-core 2.2 GHz CPU. The beamlets can then be transported separately into a patient CT dataset, producing separate dose distributions (doselets). Methods for doselet normalization and conversion of dose to absolute units of Gy for use in intensity modulated radiation therapy (IMRT) plan optimization are also described.
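
    The two-pass sorting idea can be illustrated with a simplified stand-in: here particles are binned into rectangular beamlets by their (x, y) positions only, using a counting pass followed by a scatter pass. Real BEAMnrc phase-space records carry more variables (energy, direction, history tags), so this is a sketch of the bookkeeping, not of the actual file format or the authors' code.

    ```python
    import numpy as np

    def sort_into_beamlets(x, y, n_x, n_y, field=(-5.0, 5.0)):
        """Assign each particle to a rectangular beamlet on an n_x-by-n_y grid
        covering `field` at the phase-space plane, and return, per beamlet,
        the indices of its particles.

        Pass 1 counts particles per beamlet; pass 2 writes particle indices
        into preallocated slots, mirroring a two-pass sort."""
        lo, hi = field
        ix = np.clip(((x - lo) / (hi - lo) * n_x).astype(int), 0, n_x - 1)
        iy = np.clip(((y - lo) / (hi - lo) * n_y).astype(int), 0, n_y - 1)
        beamlet = ix * n_y + iy                    # beamlet label per particle

        # Pass 1: count particles per beamlet and compute output offsets.
        counts = np.bincount(beamlet, minlength=n_x * n_y)
        offsets = np.concatenate([[0], np.cumsum(counts)])

        # Pass 2: scatter particle indices into their beamlet's slot.
        order = np.empty(len(x), dtype=int)
        cursor = offsets[:-1].copy()
        for i, b in enumerate(beamlet):
            order[cursor[b]] = i
            cursor[b] += 1
        return [order[offsets[b]:offsets[b + 1]] for b in range(n_x * n_y)]

    rng = np.random.default_rng(4)
    x, y = rng.normal(0, 2, 100_000), rng.normal(0, 2, 100_000)
    beamlets = sort_into_beamlets(x, y, n_x=10, n_y=10)
    print(len(beamlets), sum(len(b) for b in beamlets))
    ```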

  7. Considering Decision Variable Diversity in Multi-Objective Optimization: Application in Hydrologic Model Calibration

    NASA Astrophysics Data System (ADS)

    Sahraei, S.; Asadzadeh, M.

    2017-12-01

    Any modern multi-objective global optimization algorithm should be able to archive a well-distributed set of solutions. While solution diversity in the objective space has been explored extensively in the literature, little attention has been given to solution diversity in the decision space. Selection metrics such as the hypervolume contribution and the crowding distance calculated in the objective space guide the search toward solutions that are well distributed across the objective space. In this study, the diversity of solutions in the decision space is used as the main selection criterion beside the dominance check in multi-objective optimization. To this end, currently archived solutions are clustered in the decision space and the ones in less crowded clusters are given more chance to be selected for generating new solutions. The proposed approach is first tested on benchmark mathematical test problems. Second, it is applied to a hydrologic model calibration problem with more than three objective functions. Results show that the chance of finding a more sparse set of high-quality solutions increases, and therefore the analyst receives a well-diversified set of options with the maximum amount of information. Pareto Archived-Dynamically Dimensioned Search, an efficient and parsimonious multi-objective optimization algorithm for model calibration, is utilized in this study.
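
    A toy version of the selection rule described above (decision-space clustering plus a preference for sparse clusters) is sketched below with a plain k-means; the number of clusters, archive size, and uniform random archive are placeholders, and this is not the PA-DDS implementation itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def kmeans(X, k, n_iter=50):
        """Plain k-means clustering in the decision space."""
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(n_iter):
            labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=-1), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return labels

    def select_parent(archive_X, k=5):
        """Pick a parent from the least crowded decision-space cluster, so that
        sparsely populated regions of the decision space get more offspring."""
        labels = kmeans(archive_X, k)
        sizes = np.bincount(labels, minlength=k)
        target = np.argmin(np.where(sizes > 0, sizes, np.inf))
        members = np.where(labels == target)[0]
        return members[rng.integers(len(members))]

    archive = rng.random((40, 8))      # 40 archived solutions, 8 decision variables
    print(select_parent(archive))
    ```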

  8. Particle Analysis Pitfalls

    NASA Technical Reports Server (NTRS)

    Hughes, David; Dazzo, Tony

    2007-01-01

    This viewgraph presentation reviews the use of particle analysis to assist in preparing for the 4th Hubble Space Telescope (HST) servicing mission, during which the Space Telescope Imaging Spectrograph (STIS) will be repaired. The particle analysis consisted of: finite element mesh creation; black-body viewfactors generated using I-DEAS TMG thermal analysis; grey-body viewfactors calculated using the Markov method; particle distribution modeled using an iterative (and time-consuming) Monte Carlo process implemented in in-house software called MASTRAM; differential analysis performed in Excel; and visualization provided by Tecplot and I-DEAS. Several tests were performed and are reviewed: a conformal coat particle study, a card extraction study, a cover fastener removal particle generation study, and an E-Graf vibration particulate study. The lessons learned during this analysis are also reviewed.

  9. Optimal design of tilt carrier frequency computer-generated holograms to measure aspherics.

    PubMed

    Peng, Jiantao; Chen, Zhe; Zhang, Xingxiang; Fu, Tianjiao; Ren, Jianyue

    2015-08-20

    Computer-generated holograms (CGHs) provide an approach to high-precision metrology of aspherics. A CGH is designed under the trade-off among size, mapping distortion, and line spacing. This paper describes an optimal design method based on the parametric model for tilt carrier frequency CGHs placed outside the interferometer focus points. Under the condition of retaining an admissible size and a tolerable mapping distortion, the optimal design method has two advantages: (1) separating the parasitic diffraction orders to improve the contrast of the interferograms and (2) achieving the largest line spacing to minimize sensitivity to fabrication errors. This optimal design method is applicable to common concave aspherical surfaces and illustrated with CGH design examples.

  10. Overcoming the Subject-Object Dichotomy in Urban Modeling: Axial Maps as Geometric Representations of Affordances in the Built Environment.

    PubMed

    Marcus, Lars

    2018-01-01

    The world is witnessing unprecedented urbanization, bringing extreme challenges to contemporary practices in urban planning and design. This calls for improved urban models that can generate new knowledge and enhance practical skill. Importantly, any urban model embodies a conception of the relation between humans and the physical environment. In urban modeling this is typically conceived of as a relation between human subjects and an environmental object, thereby reproducing a humans-environment dichotomy. Alternative modeling traditions, such as space syntax, which originates in architecture rather than geography, have tried to overcome this dichotomy. Central in this effort is the development of new representations of urban space, such as, in the case of space syntax, the axial map. This form of representation aims to integrate both human behavior and the physical environment into one and the same description. Interestingly, models based on these representations have proved to better capture pedestrian movement than regular models. Pedestrian movement, as well as other kinds of human flows in urban space, is essential for urban modeling, since such flows are increasingly understood as the driver of urban processes. Critical for a full understanding of space syntax modeling is the ontology of its representations, such as the axial map. Space syntax theory here often refers to James Gibson's "Theory of affordances", where the concept of affordances, in a manner similar to axial maps, aims to bridge the subject-object dichotomy by constituting neither physical properties of the environment nor human behavior, but rather what emerges in the meeting between the two. In extension of this, the axial map can be interpreted as a representation of how the physical form of the environment affords human accessibility and visibility in urban space. This paper presents a close examination of the forms of representation developed in space syntax methodology, in particular in the light of Gibson's "Theory of affordances". The overarching aim is to contribute to a theoretical framework for urban models based on affordances, which may support the overcoming of the subject-object dichotomy in such models, here deemed essential for a greater social-ecological sustainability of cities.

  11. Stereo imaging with spaceborne radars

    NASA Technical Reports Server (NTRS)

    Leberl, F.; Kobrick, M.

    1983-01-01

    Stereo viewing is a valuable tool in photointerpretation and is used for the quantitative reconstruction of the three dimensional shape of a topographical surface. Stereo viewing refers to a visual perception of space obtained by presenting an overlapping image pair to an observer so that a three dimensional model is formed in the brain. Some of the observer's function can be performed by machine correlation of the overlapping images - so-called automated stereo correlation. The direct perception of space with two eyes is often called natural binocular vision; techniques for generating three dimensional models of a surface from two sets of monocular image measurements are the topic of stereology.

  12. Exploring Space Physics Concepts Using Simulation Results

    NASA Astrophysics Data System (ADS)

    Gross, N. A.

    2008-05-01

    The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the National Science Foundation, has the goal of developing a suite of integrated, physics-based computer models of the space environment that can follow the evolution of a space weather event from the Sun to the Earth. In addition to its research goals, CISM is committed to training the next generation of space weather professionals who are imbued with a system view of space weather, including an understanding of both heliospheric and geospace phenomena. To this end, CISM offers a yearly Space Weather Summer School targeted at first-year graduate students, although advanced undergraduates and space weather professionals have also attended. This summer school uses a number of innovative pedagogical techniques, including devoting each afternoon to a computer lab exercise that uses results from research-quality simulations and visualization techniques, along with ground-based and satellite data, to explore concepts introduced during the morning lectures. These labs are suitable for use in a wide variety of educational settings, from formal classroom instruction to outreach programs. The goal of this poster is to outline the goals and content of the lab materials so that instructors may evaluate their potential use in the classroom or other settings.

  13. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore the representation of space through standard architectural drawings is sometimes not sufficient. Representing space as a series of slices, each with its own properties, becomes important so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful in assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.

  14. NASA's Integrated Space Transportation Plan — 3rd generation reusable launch vehicle technology update

    NASA Astrophysics Data System (ADS)

    Cook, Stephen; Hueter, Uwe

    2003-08-01

    NASA's Integrated Space Transportation Plan (ISTP) calls for investments in Space Shuttle safety upgrades, second generation Reusable Launch Vehicle (RLV) advanced development, and third generation RLV and in-space research and technology. NASA's third generation launch systems are to be fully reusable and operational by 2025. The goals for third generation launch systems are to reduce cost by a factor of 100 and improve safety by a factor of 10,000 over current systems. The Advanced Space Transportation Program Office (ASTP) at NASA's Marshall Space Flight Center in Huntsville, AL has the agency lead to develop third generation space transportation technologies. The Hypersonics Investment Area, part of ASTP, is developing the third generation launch vehicle technologies in two main areas, propulsion and airframes. The program's major investment is in hypersonic airbreathing propulsion, since it offers the greatest potential for meeting the third generation launch vehicle goals. The program will mature technologies in three key propulsion areas: scramjets, rocket-based combined cycle, and turbine-based combined cycle. Ground and flight propulsion tests are being planned for the propulsion technologies. Airframe technologies will be matured primarily through ground testing. This paper describes NASA's activities in hypersonics. Current programs, accomplishments, future plans, and technologies being pursued by the Hypersonics Investment Area under the Advanced Space Transportation Program Office are discussed.

  15. Strategic priming with multiple antigens can yield memory cell phenotypes optimized for infection with Mycobacterium tuberculosis: A computational study

    DOE PAGES

    Ziraldo, Cordelia; Gong, Chang; Kirschner, Denise E.; ...

    2016-01-06

    Lack of an effective vaccine results in 9 million new cases of tuberculosis (TB) every year and 1.8 million deaths worldwide. While many infants are vaccinated at birth with BCG (an attenuated M. bovis), this does not prevent infection or development of TB after childhood. Immune responses necessary for prevention of infection or disease are still unknown, making development of effective vaccines against TB challenging. Several new vaccines are ready for human clinical trials, but these trials are difficult and expensive; especially challenging is determining the appropriate cellular response necessary for protection. The magnitude of an immune response is likely key to generating a successful vaccine. Characteristics such as numbers of central memory (CM) and effector memory (EM) T cells responsive to a diverse set of epitopes are also correlated with protection. Promising vaccines against TB contain mycobacterial subunit antigens (Ag) present during both active and latent infection. We hypothesize that protection against different key immunodominant antigens could require a vaccine that produces different levels of EM and CM for each Ag-specific memory population. We created a computational model to explore EM and CM values, and their ratio, within what we term Memory Design Space. Our model captures events involved in T cell priming within lymph nodes and tracks their circulation through blood to peripheral tissues. We used the model to test whether multiple Ag-specific memory cell populations could be generated with distinct locations within Memory Design Space at a specific time point post vaccination. Boosting can further shift memory populations to memory cell ratios unreachable by initial priming events. By strategically varying antigen load, properties of cellular interactions within the LN, and delivery parameters (e.g., number of boosts) of multi-subunit vaccines, we can generate multiple Ag-specific memory populations that cover a wide range of Memory Design Space. As a result, given a set of desired characteristics for Ag-specific memory populations, we can use our model as a tool to predict vaccine formulations that will generate those populations.

  16. Strategic priming with multiple antigens can yield memory cell phenotypes optimized for infection with Mycobacterium tuberculosis: A computational study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziraldo, Cordelia; Gong, Chang; Kirschner, Denise E.

    Lack of an effective vaccine results in 9 million new cases of tuberculosis (TB) every year and 1.8 million deaths worldwide. While many infants are vaccinated at birth with BCG (an attenuated M. bovis), this does not prevent infection or development of TB after childhood. Immune responses necessary for prevention of infection or disease are still unknown, making development of effective vaccines against TB challenging. Several new vaccines are ready for human clinical trials, but these trials are difficult and expensive; especially challenging is determining the appropriate cellular response necessary for protection. The magnitude of an immune response is likely key to generating a successful vaccine. Characteristics such as numbers of central memory (CM) and effector memory (EM) T cells responsive to a diverse set of epitopes are also correlated with protection. Promising vaccines against TB contain mycobacterial subunit antigens (Ag) present during both active and latent infection. We hypothesize that protection against different key immunodominant antigens could require a vaccine that produces different levels of EM and CM for each Ag-specific memory population. We created a computational model to explore EM and CM values, and their ratio, within what we term Memory Design Space. Our model captures events involved in T cell priming within lymph nodes and tracks their circulation through blood to peripheral tissues. We used the model to test whether multiple Ag-specific memory cell populations could be generated with distinct locations within Memory Design Space at a specific time point post vaccination. Boosting can further shift memory populations to memory cell ratios unreachable by initial priming events. By strategically varying antigen load, properties of cellular interactions within the LN, and delivery parameters (e.g., number of boosts) of multi-subunit vaccines, we can generate multiple Ag-specific memory populations that cover a wide range of Memory Design Space. As a result, given a set of desired characteristics for Ag-specific memory populations, we can use our model as a tool to predict vaccine formulations that will generate those populations.

  17. End-to-End Trade-space Analysis for Designing Constellation Missions

    NASA Astrophysics Data System (ADS)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

    Multipoint measurement missions can provide a significant advancement in science return, and this science interest, coupled with many recent technological advances, is driving a growing trend in exploring distributed architectures for future NASA missions. Distributed Spacecraft Missions (DSMs) leverage multiple spacecraft to achieve one or more common goals. In particular, a constellation is the most general form of DSM, with two or more spacecraft placed into specific orbit(s) for the purpose of serving a common objective (e.g., CYGNSS). Because a DSM architectural trade-space includes both monolithic and distributed design variables, DSM optimization is a large and complex problem with multiple conflicting objectives. Over the last two years, our team has been developing a Trade-space Analysis Tool for Constellations (TAT-C), implemented in common programming languages for pre-Phase A constellation mission analysis. By evaluating alternative mission architectures, TAT-C seeks to minimize cost and maximize performance for pre-defined science goals. This presentation will describe the overall architecture of TAT-C, including: a User Interface (UI) at several levels of detail and user expertise; Trade-space Search Requests that are created from the science requirements gathered by the UI and validated by a Knowledge Base; a Knowledge Base to compare the current requests to prior mission concepts and potentially prune the trade-space; and a Trade-space Search Iterator which, with inputs from the Knowledge Base and in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, modeling orbits to balance accuracy and performance. The current version includes uniform and non-uniform Walker constellations as well as ad hoc and precessing constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.
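
    At its core, a pre-Phase A trade-space sweep enumerates candidate architectures and scores each on cost and performance. The sketch below does this with deliberately toy cost and revisit-time formulas over a small Walker-style grid of options; the numbers and scoring functions are invented for illustration and bear no relation to TAT-C's GMAT-based coverage or CER-based cost models.

    ```python
    import itertools

    def toy_cost(n_sats, altitude_km):
        """Arbitrary-unit placeholder cost model."""
        return 5.0 * n_sats + 0.002 * altitude_km

    def toy_mean_revisit_hr(n_sats, altitude_km):
        """Placeholder revisit metric; ignores plane geometry entirely."""
        return 24.0 / (n_sats * (1.0 + altitude_km / 2000.0))

    architectures = []
    for n_sats, n_planes, alt in itertools.product([4, 8, 12], [1, 2, 4], [500, 700, 900]):
        if n_sats % n_planes:
            continue                               # require equal satellites per plane
        architectures.append({
            "n_sats": n_sats, "n_planes": n_planes, "altitude_km": alt,
            "cost": toy_cost(n_sats, alt),
            "revisit_hr": toy_mean_revisit_hr(n_sats, alt),
        })

    # Keep the non-dominated (cheap and fast-revisit) architectures.
    pareto = [a for a in architectures
              if not any(b["cost"] <= a["cost"] and b["revisit_hr"] < a["revisit_hr"]
                         for b in architectures)]
    print(len(architectures), "enumerated,", len(pareto), "non-dominated")
    ```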

  18. End-to-End Trade-Space Analysis for Designing Constellation Missions

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Dabney, Philip; Foreman, Veronica; Grogan, Paul T.; Hache, Sigfried; Holland, Matthew; Hughes, Steven; Nag, Sreeja; Siddiqi, Afreen

    2017-01-01

    Multipoint measurement missions can provide a significant advancement in science return, and this science interest, coupled with many recent technological advances, is driving a growing trend in exploring distributed architectures for future NASA missions. Distributed Spacecraft Missions (DSMs) leverage multiple spacecraft to achieve one or more common goals. In particular, a constellation is the most general form of DSM, with two or more spacecraft placed into specific orbit(s) for the purpose of serving a common objective (e.g., CYGNSS). Because a DSM architectural trade-space includes both monolithic and distributed design variables, DSM optimization is a large and complex problem with multiple conflicting objectives. Over the last two years, our team has been developing a Trade-space Analysis Tool for Constellations (TAT-C), implemented in common programming languages for pre-Phase A constellation mission analysis. By evaluating alternative mission architectures, TAT-C seeks to minimize cost and maximize performance for pre-defined science goals. This presentation will describe the overall architecture of TAT-C, including: a User Interface (UI) at several levels of detail and user expertise; Trade-space Search Requests that are created from the science requirements gathered by the UI and validated by a Knowledge Base; a Knowledge Base to compare the current requests to prior mission concepts and potentially prune the trade-space; and a Trade-space Search Iterator which, with inputs from the Knowledge Base and in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, modeling orbits to balance accuracy and performance. The current version includes uniform and non-uniform Walker constellations as well as ad hoc and precessing constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.

  19. General solution of a cosmological model induced from higher dimensions using a kinematical constraint

    NASA Astrophysics Data System (ADS)

    Akarsu, Özgür; Dereli, Tekin; Katırcı, Nihan; Sheftel, Mikhail B.

    2015-05-01

    In a recent study Akarsu and Dereli (Gen. Relativ. Gravit. 45:1211, 2013) discussed the dynamical reduction of a higher dimensional cosmological model which is augmented by a kinematical constraint characterized by a single real parameter, correlating and controlling the expansion of both the external (physical) and internal spaces. In that paper explicit solutions were found only for the case of a three-dimensional internal space. Here we derive a general solution of the system using Lie group symmetry properties, in parametric form, for an arbitrary number of internal dimensions. We also investigate the dynamical reduction of the model as a function of cosmic time for various parameter values and generate parametric plots to discuss cosmologically relevant results.

  20. Temporal Cyber Attack Detection.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Joey Burton; Draelos, Timothy J.; Galiardi, Meghan

    Rigorous characterization of the performance and generalization ability of cyber defense systems is extremely difficult, making it hard to gauge uncertainty, and thus, confidence. This difficulty largely stems from a lack of labeled attack data that fully explores the potential adversarial space. Currently, performance of cyber defense systems is typically evaluated in a qualitative manner by manually inspecting the results of the system on live data and adjusting as needed. Additionally, machine learning has shown promise in deriving models that automatically learn indicators of compromise that are more robust than analyst-derived detectors. However, to generate these models, most algorithms require large amounts of labeled data (i.e., examples of attacks). Algorithms that do not require annotated data to derive models are similarly at a disadvantage, because labeled data is still necessary when evaluating performance. In this work, we explore the use of temporal generative models to learn cyber attack graph representations and automatically generate data for experimentation and evaluation. Training and evaluating cyber systems and machine learning models requires significant, annotated data, which is typically collected and labeled by hand for one-off experiments. Automatically generating such data helps derive/evaluate detection models and ensures reproducibility of results. Experimentally, we demonstrate the efficacy of generative sequence analysis techniques on learning the structure of attack graphs, based on a realistic example. These derived models can then be used to generate more data. Additionally, we provide a roadmap for future research efforts in this area.

  1. Topex/Poseidon: A United States/France mission. Oceanography from space: The oceans and climate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The TOPEX/POSEIDON space mission, sponsored by NASA and France's space agency, the Centre National d'Etudes Spatiales (CNES), will give new observations of the Earth from space to gain a quantitative understanding of the role of ocean currents in climate change. Rising atmospheric concentrations of carbon dioxide and other 'greenhouse gases' produced as a result of human activities could generate a global warming, followed by an associated rise in sea level. The satellite will use radar altimetry to measure sea-surface height and will be tracked by three independent systems to yield accurate topographic maps over the dimensions of entire ocean basins. The satellite data, together with the Tropical Ocean and Global Atmosphere (TOGA) program and the World Ocean Circulation Experiment (WOCE) measurements, will be analyzed by an international scientific team. By merging the satellite observations with TOGA and WOCE findings, the scientists will establish the extensive data base needed for the quantitative description and computer modeling of ocean circulation. The ocean models will eventually be coupled with atmospheric models to lay the foundation for predictions of global climate change.

  2. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    PubMed

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  3. Generation and analysis of correlated pairs of photons on board a nanosatellite

    NASA Astrophysics Data System (ADS)

    Chandrasekara, R.; Tang, Z.; Tan, Y. C.; Cheng, C.; Sha, L.; Hiang, G. C.; Oi, D.; Ling, A.

    2016-10-01

    Progress in quantum computers and their threat to conventional public key infrastructure is driving new forms of encryption. Quantum Key Distribution (QKD) using entangled photons is a promising approach. A global QKD network can be achieved using satellites equipped with optical links. Despite numerous proposals, actual experimental work demonstrating relevant entanglement technology in space is limited due to the prohibitive cost of traditional satellite development. To make progress, we have designed a photon pair source that can operate on modular spacecraft called CubeSats. We report the in-orbit operation of the photon pair source on board an orbiting CubeSat and demonstrate pair generation and polarisation correlation under space conditions. The in-orbit polarisation correlations are compatible with ground-based tests, validating our design. This successful demonstration is a major experimental milestone towards a space-based quantum network. Our approach provides a cost-effective method for proving the space-worthiness of critical components used in entangled photon technology. We expect that it will also accelerate efforts to probe the overlap between quantum and relativistic models of physics.

  4. A Contemporary Analysis of the O'Neill-Glaser Model for Space-Based Solar Power and Habitat Construction

    NASA Technical Reports Server (NTRS)

    Curreri, Peter A.; Detweiler, Michael K.

    2011-01-01

    In 1975 Gerard O'Neill published in the journal Science a model for the construction of solar power satellites. He found that the solar power satellites suggested by Peter Glaser would be too massive to launch economically from Earth, but could be financially viable if the workforce was permanently located in free space habitats and if lunar and asteroid materials were used for construction. All new worldwide electrical generating capacity could then be achieved by solar power satellites. The project would financially break even in about 20 years, after which it would generate substantial income selling power below fossil fuel prices. Two NASA/Stanford University-led studies at Ames Research Center during the summers of 1974 and 1976 found the concept technically sound and developed a detailed financial parametric model. Although the project was not undertaken when suggested in the 1970s, several contemporary issues make pursuing the O'Neill-Glaser concept more compelling today. First, our analysis suggests that if, in the first ten years of construction, small habitats (compared to the large vista habitats envisioned by O'Neill) supporting approximately 300 people were utilized, development costs of the program and the time to financial break-even could be substantially improved. Second, a contemporary consensus is developing that carbon-free energy is required to mitigate global climate change. It is estimated that 300 GW of new carbon-free energy would be necessary per year to stabilize global atmospheric carbon. This is about 4 times greater energy demand than was considered by the O'Neill-Glaser model. Our analysis suggests that after the initial investments in lunar mining and space manufacturing and transportation, the profit margin for producing space solar power is very high (even when selling power below fossil fuel prices). We have investigated the financial scaling of ground-launched versus space-derived solar power satellites. We find that for the carbon mitigation case even modernized ground-launched space solar power satellites are not financially viable. For space-derived solar power satellites, however, the increased demand makes them break even substantially sooner and yield much higher profit. Third, current awareness is increasing about the dangers of humanity remaining a single-planet species. Since the middle of the 20th century our technological power has grown large relative to the size of planet Earth, and this presents a very real potential for human self-extinction. We argue that the potential for human self-extinction is increasing with time in proportion to the exponential growth of our technological power, making self-extinction likely within this century if humanity remains a single-planet species. The O'Neill model of multiple independent free space habitats, it is argued, can protect humanity from extinction in the same way that portfolio diversification protects one's assets from total loss. We show that about 1 million people for the electricity-only case, and about 1 billion people for the carbon mitigation case, can be provided with permanent space habitats and transportation from Earth in 30 years and can be funded by the space-derived solar power satellite program.
    1.2 Scope of this Chapter: The goal of this chapter is to illustrate the power and importance of the O'Neill-Glaser concept in the context of human survival and maintaining a healthy planet Earth. We argue that at this point in human history our technological power is too dangerous to ourselves and our home planet for us not to expand into space. We show by the models presented in the chapter that the imminent dangers of global warming and human self-extinction mandate that humanity move aggressively into the solar system in this generation. We show that the production of solar power satellites using space resources and with a work force living in space provides a viable financial model to mitigate CO2, preventing the worst global warming scenarios, and safeguards humanity against self-extinction by providing hundreds of habitats and a billion people living in space within about 35 years. To accomplish this goal we need only consider the classic O'Neill-Glaser model, which was parameterized for 1970s technological projections. Only habitat size optimization for the first ten years of production is added. This is a conservative approach since the innovations of the last 30 years will make the financial projections more favorable. However, the classic O'Neill-Glaser model represented a broad technological consensus. The model is well documented in the references and our calculations can be easily reproduced. In this chapter the economics of the O'Neill-Glaser model is compared with models that rely exclusively on Earth-launched materials. Although many studies of Earth-launched solar power satellites have been made, we found that the NASA "Fresh Look Study" was the most comprehensive and well documented. It also provided one of the most optimistic Earth launch financial projections. We thus chose it for comparison purposes.

  5. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation Modeling is a useful tool in estimation of the makespan of the overall process. However, simulation requires a model to be developed, which itself is a labor and time consuming effort. With such a dynamic process, often the model would potentially be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.

  6. Analysis of helicopter flight dynamics through modeling and simulation of primary flight control actuation system

    NASA Astrophysics Data System (ADS)

    Nelson, Hunter Barton

    A simplified second-order transfer function actuator model used in most flight dynamics applications cannot easily capture the effects of different actuator parameters. The present work integrates a nonlinear actuator model into a nonlinear state space rotorcraft model to determine the effect of actuator parameters on key flight dynamics. The completed actuator model was integrated with swashplate kinematics, and step responses were generated over a range of key hydraulic parameters. The actuator-swashplate system was then introduced into a nonlinear state space rotorcraft simulation where flight dynamics quantities such as bandwidth and phase delay were analyzed. Frequency sweeps were simulated for unique actuator configurations using the coupled nonlinear actuator-rotorcraft system. The software package CIFER was used for system identification and compared directly to the linearized models. As the actuator became rate saturated, the effects on bandwidth and phase delay were apparent in the predicted handling qualities specifications.
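
    The effect of rate saturation described above can be illustrated with a minimal sketch, assuming a first-order actuator with an explicit rate limit tracking a sinusoidal swashplate command; the time constant, rate limits and command amplitude are notional assumptions and do not represent the paper's nonlinear hydraulic model.

      # Minimal rate-limited first-order actuator, integrated explicitly,
      # to show how rate saturation degrades tracking of a sinusoidal command.
      import numpy as np

      def actuator_response(cmd, dt, tau=0.02, rate_limit=0.5):
          y = np.zeros_like(cmd)
          for k in range(1, len(cmd)):
              rate = (cmd[k - 1] - y[k - 1]) / tau           # unsaturated first-order rate
              rate = np.clip(rate, -rate_limit, rate_limit)  # hydraulic rate saturation
              y[k] = y[k - 1] + rate * dt
          return y

      dt = 1e-3
      t = np.arange(0.0, 2.0, dt)
      cmd = 0.01 * np.sin(2 * np.pi * 2.0 * t)               # 2 Hz swashplate command

      nominal = actuator_response(cmd, dt, rate_limit=0.5)    # effectively unsaturated
      limited = actuator_response(cmd, dt, rate_limit=0.05)   # heavily rate limited
      print("peak tracking error (unsaturated): %.5f" % np.max(np.abs(cmd - nominal)))
      print("peak tracking error (rate limited): %.5f" % np.max(np.abs(cmd - limited)))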

  7. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    NASA Astrophysics Data System (ADS)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.
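
    A toy sketch of the grammar-restricted search idea follows: a tiny context-free grammar proposes candidate right-hand sides for dx/dt = f(x), and each candidate is scored against a numerically differentiated synthetic time series. The grammar, the scoring, and the purely random search are illustrative stand-ins for the authors' genetic algorithm and Kalman filter machinery.

      # Grammar-restricted candidate generation for a scalar ODE right-hand side.
      import random
      import numpy as np

      GRAMMAR = {
          "<expr>": [["<term>"], ["<term>", "+", "<term>"], ["<term>", "-", "<term>"]],
          "<term>": [["<coef>", "*", "<var>"], ["<coef>", "*", "<var>", "*", "<var>"]],
          "<var>": [["x"]],
          "<coef>": [["0.5"], ["1.0"], ["2.0"], ["-1.0"]],
      }

      def expand(symbol, rng):
          # Recursively expand grammar symbols; terminals are returned as-is.
          if symbol not in GRAMMAR:
              return symbol
          return "".join(expand(s, rng) for s in rng.choice(GRAMMAR[symbol]))

      rng = random.Random(0)
      t = np.linspace(0.0, 2.0, 200)
      x = np.exp(-t)                       # synthetic data generated by dx/dt = -x
      dxdt = np.gradient(x, t)

      best = None
      for _ in range(200):
          expr = expand("<expr>", rng)
          pred = eval(expr, {"x": x})      # vectorised evaluation of the candidate RHS
          err = float(np.mean((pred - dxdt) ** 2))
          if best is None or err < best[0]:
              best = (err, expr)

      print("best candidate: dx/dt =", best[1], " (MSE %.2e)" % best[0])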

  8. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.

  9. Evaluation of Cartosat-1 Multi-Scale Digital Surface Modelling Over France

    PubMed Central

    Gianinetto, Marco

    2009-01-01

    On 5 May 2005, the Indian Space Research Organization launched Cartosat-1, the eleventh satellite of its constellation, dedicated to the stereo viewing of the Earth's surface for terrain modeling and large-scale mapping, from the Satish Dhawan Space Centre (India). In early 2006, the Indian Space Research Organization started the Cartosat-1 Scientific Assessment Programme, jointly established with the International Society for Photogrammetry and Remote Sensing. Within this framework, this study evaluated the capabilities of digital surface modeling from Cartosat-1 stereo data for the French test sites of Mausanne les Alpilles and Salon de Provence. The investigation pointed out that for hilly territories it is possible to produce high-resolution digital surface models with a root mean square error less than 7.1 m and a linear error at 90% confidence level less than 9.5 m. The accuracy of the generated digital surface models also fulfilled the requirements of the French Reference 3D®, so Cartosat-1 data may be used to produce or update such kinds of products. PMID:22412311
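
    For reference, the two accuracy measures quoted above can be computed from a sample of elevation differences (DSM height minus reference height) as in the short sketch below; the error sample here is synthetic, not Cartosat-1 data.

      # RMSE and linear error at 90% confidence (LE90) from height differences.
      import numpy as np

      rng = np.random.default_rng(1)
      dz = rng.normal(loc=0.0, scale=4.0, size=10_000)   # notional height errors in metres

      rmse = float(np.sqrt(np.mean(dz ** 2)))
      le90 = float(np.percentile(np.abs(dz), 90))        # 90% of |errors| fall below this value

      print(f"RMSE = {rmse:.2f} m")
      print(f"LE90 = {le90:.2f} m")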

  10. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document is the viewgraph slides of the presentation.

  11. Calculation of the electric field resulting from human body rotation in a magnetic field

    NASA Astrophysics Data System (ADS)

    Cobos Sánchez, Clemente; Glover, Paul; Power, Henry; Bowtell, Richard

    2012-08-01

    A number of recent studies have shown that the electric field and current density induced in the human body by movement in and around magnetic resonance imaging installations can exceed regulatory levels. Although it is possible to measure the induced electric fields at the surface of the body, it is usually more convenient to use numerical models to predict likely exposure under well-defined movement conditions. Whilst the accuracy of these models is not in doubt, this paper shows that modelling of particular rotational movements should be treated with care. In particular, we show that v × B rather than -(v · ∇)A should be used as the driving term in potential-based modelling of induced fields. Although for translational motion the two driving terms are equivalent, specific examples of rotational rigid-body motion are given where incorrect results are obtained when -(v · ∇)A is employed. In addition, we show that it is important to take into account the space charge which can be generated by rotations and we also consider particular cases where neglecting the space charge generates erroneous results. Along with analytic calculations based on simple models, boundary-element-based numerical calculations are used to illustrate these findings.
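
    A minimal worked example of the recommended driving term: for a rigid rotation the velocity of a body point is v = ω × r, and the motional field entering the potential problem is v × B. The numbers below are arbitrary illustrative values, not an MRI exposure calculation.

      # Motional electric field v x B for a point on a rigidly rotating body.
      import numpy as np

      omega = np.array([0.0, 0.0, 1.0])        # rad/s, rotation about the z axis
      r = np.array([0.2, 0.0, 0.0])            # m, position of the point on the body
      B = np.array([0.0, 0.0, 3.0])            # T, static z-directed field

      v = np.cross(omega, r)                   # linear velocity of the point
      E_motional = np.cross(v, B)              # V/m, the v x B driving term

      print("v     =", v)                      # [0.  0.2 0. ]
      print("v x B =", E_motional)             # [0.6 0.  0. ]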

  12. Operational Impact of Improved Space Tracking on Collision Avoidance in the Future LEO Space Debris Environment

    NASA Astrophysics Data System (ADS)

    Sibert, D.; Borgeson, D.; Peterson, G.; Jenkin, A.; Sorge, M.

    2010-09-01

    Even if global space policy successfully curtails on-orbit explosions and ASAT demonstrations, studies indicate that the number of debris objects in Low Earth Orbit (LEO) will continue to grow solely from debris-on-debris collisions and debris generated from new launches. This study examines the threat posed by this growing space debris population over the next 30 years and how improvements in our space tracking capabilities can reduce the number of Collision Avoidance (COLA) maneuvers required to keep the risk of operational satellite loss within tolerable limits. Particular focus is given to satellites operated by the Department of Defense (DoD) and Intelligence Community (IC) in LEO. The following debris field and space tracking performance parameters were varied parametrically in the experiment to study the impact on the number of collision avoidance maneuvers required:
    - Debris Field Density (by year: 2009, 2019, 2029, and 2039)
    - Quality of Track Update (starting 1-sigma error ellipsoid)
    - Future Propagator Accuracy (error ellipsoid growth rates, Special Perturbations in 3 axes)
    - Track Update Rate for Debris (stochastic)
    - Track Update Rate for Payloads (stochastic)
    Baseline values matching present-day tracking performance for quality of track update, propagator accuracy, and track update rate were derived by analyzing updates to the unclassified Satellite Catalog (SatCat). Track update rates varied significantly for active payloads and debris, and as such we used different models for the track update rates for military payloads and debris. The analysis was conducted using the System Effectiveness Analysis Simulation (SEAS), an agent-based model developed by the United States Air Force Space Command’s Space and Missile Systems Center to evaluate the military utility of space systems. The future debris field was modeled by The Aerospace Corporation using a tool chain which models the growth of the 10 cm+ debris field using high fidelity propagation, collision, and breakup models. Our analysis uses Two Line Element (TLE) sets and surface area data generated by this model sampled at the years 2019, 2029, and 2039. Data for the 2009 debris field are taken from the unclassified SatCat. By using Monte Carlo simulation techniques and varying the epoch of the military constellation relative to the debris field we were able to remove the bias of initial conditions. Additional analysis was conducted looking at the military utility impact of temporarily losing the use of Intelligence, Surveillance and Reconnaissance (ISR) assets due to COLA maneuvers during a large classified scenario with stressful satellite tasking. This paper and presentation will focus only on unclassified results quantifying the potential reduction in the risk assumed by satellite flyers, and the potential reduction in Delta-V usage that is possible if we are able to improve our tracking performance in any of these three areas and reduce the positional uncertainty of space objects at the time of closest approach.

  13. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding.

    PubMed

    Testolin, Alberto; De Filippo De Grazia, Michele; Zorzi, Marco

    2017-01-01

    The recent "deep learning revolution" in artificial neural networks had strong impact and widespread deployment for engineering applications, but the use of deep learning for neurocomputational modeling has been so far limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, because it emphasizes the role of generative learning as opposed to discriminative (supervised) learning. As a case study, we present a series of simulations investigating the emergence of neural coding of visual space for sensorimotor transformations. We compare different network architectures commonly used as building blocks for unsupervised deep learning by systematically testing the type of receptive fields and gain modulation developed by the hidden neurons. In particular, we compare Restricted Boltzmann Machines (RBMs), which are stochastic, generative networks with bidirectional connections trained using contrastive divergence, with autoencoders, which are deterministic networks trained using error backpropagation. For both learning architectures we also explore the role of sparse coding, which has been identified as a fundamental principle of neural computation. The unsupervised models are then compared with supervised, feed-forward networks that learn an explicit mapping between different spatial reference frames. Our simulations show that both architectural and learning constraints strongly influenced the emergent coding of visual space in terms of distribution of tuning functions at the level of single neurons. Unsupervised models, and particularly RBMs, were found to more closely adhere to neurophysiological data from single-cell recordings in the primate parietal cortex. These results provide new insights into how basic properties of artificial neural networks might be relevant for modeling neural information processing in biological systems.

  14. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding

    PubMed Central

    Testolin, Alberto; De Filippo De Grazia, Michele; Zorzi, Marco

    2017-01-01

    The recent “deep learning revolution” in artificial neural networks had strong impact and widespread deployment for engineering applications, but the use of deep learning for neurocomputational modeling has been so far limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, because it emphasizes the role of generative learning as opposed to discriminative (supervised) learning. As a case study, we present a series of simulations investigating the emergence of neural coding of visual space for sensorimotor transformations. We compare different network architectures commonly used as building blocks for unsupervised deep learning by systematically testing the type of receptive fields and gain modulation developed by the hidden neurons. In particular, we compare Restricted Boltzmann Machines (RBMs), which are stochastic, generative networks with bidirectional connections trained using contrastive divergence, with autoencoders, which are deterministic networks trained using error backpropagation. For both learning architectures we also explore the role of sparse coding, which has been identified as a fundamental principle of neural computation. The unsupervised models are then compared with supervised, feed-forward networks that learn an explicit mapping between different spatial reference frames. Our simulations show that both architectural and learning constraints strongly influenced the emergent coding of visual space in terms of distribution of tuning functions at the level of single neurons. Unsupervised models, and particularly RBMs, were found to more closely adhere to neurophysiological data from single-cell recordings in the primate parietal cortex. These results provide new insights into how basic properties of artificial neural networks might be relevant for modeling neural information processing in biological systems. PMID:28377709

  15. Stargate GTM: Bridging Descriptor and Activity Spaces.

    PubMed

    Gaspar, Héléna A; Baskin, Igor I; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2015-11-23

    Predicting the activity profile of a molecule or discovering structures possessing a specific activity profile are two important goals in chemoinformatics, which could be achieved by bridging activity and molecular descriptor spaces. In this paper, we introduce the "Stargate" version of the Generative Topographic Mapping approach (S-GTM) in which two different multidimensional spaces (e.g., structural descriptor space and activity space) are linked through a common 2D latent space. In the S-GTM algorithm, the manifolds are trained simultaneously in two initial spaces using the probabilities in the 2D latent space calculated as a weighted geometric mean of probability distributions in both spaces. S-GTM has the following interesting features: (1) activities are involved during the training procedure; therefore, the method is supervised, unlike conventional GTM; (2) using molecular descriptors of a given compound as input, the model predicts a whole activity profile, and (3) using an activity profile as input, areas populated by relevant chemical structures can be detected. To assess the performance of S-GTM prediction models, a descriptor space (ISIDA descriptors) of a set of 1325 GPCR ligands was related to a B-dimensional (B = 1 or 8) activity space corresponding to pKi values for eight different targets. S-GTM outperforms conventional GTM for individual activities and performs similarly to the Lasso multitask learning algorithm, although it is still slightly less accurate than the Random Forest method.
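
    The latent-space coupling described above, with responsibilities combined as a weighted geometric mean of the probabilities computed in the two spaces, can be sketched in a few lines; the distributions below are synthetic stand-ins rather than real GTM responsibilities, and the equal weights are an assumption.

      # Weighted geometric mean of two probability distributions over latent nodes.
      import numpy as np

      def weighted_geometric_mean(p_desc, p_act, w_desc=0.5, w_act=0.5):
          combined = (p_desc ** w_desc) * (p_act ** w_act)
          return combined / combined.sum()          # renormalise over the latent nodes

      nodes = 25                                    # e.g. a 5 x 5 latent grid
      rng = np.random.default_rng(3)
      p_descriptor = rng.dirichlet(np.ones(nodes))  # responsibility from descriptor space
      p_activity = rng.dirichlet(np.ones(nodes))    # responsibility from activity space

      p_joint = weighted_geometric_mean(p_descriptor, p_activity)
      print("joint responsibilities sum to", round(float(p_joint.sum()), 6))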

  16. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  17. Exposure Space: Integrating Exposure Data and Modeling with Toxicity Information

    EPA Science Inventory

    Recent advances have been made in high-throughput (HTP) toxicity testing, e.g. from ToxCast, which will ultimately be combined with HTP predictions of exposure potential to support next-generation chemical safety assessment. Rapid exposure methods are essential in selecting chemi...

  18. Charting epilepsy by searching for intelligence in network space with the help of evolving autonomous agents.

    PubMed

    Ohayon, Elan L; Kalitzin, Stiliyan; Suffczynski, Piotr; Jin, Frank Y; Tsang, Paul W; Borrett, Donald S; Burnham, W McIntyre; Kwan, Hon C

    2004-01-01

    The problem of demarcating neural network space is formidable. A simple fully connected recurrent network of five units (binary activations, synaptic weight resolution of 10) has 3.2 × 10^26 possible initial states. The problem increases drastically with scaling. Here we consider three complementary approaches to help direct the exploration to distinguish epileptic from healthy networks. [1] First, we perform a gross mapping of the space of five-unit continuous recurrent networks using randomized weights and initial activations. The majority of weight patterns (>70%) were found to result in neural assemblies exhibiting periodic limit-cycle oscillatory behavior. [2] Next we examine the activation space of non-periodic networks demonstrating that the emergence of paroxysmal activity does not require changes in connectivity. [3] The next challenge is to focus the search of network space to identify networks with more complex dynamics. Here we rely on a major available indicator critical to clinical assessment but largely ignored by epilepsy modelers, namely: behavioral states. To this end, we connected the above network layout to an external robot in which interactive states were evolved. The first random generation showed a distribution in line with approach [1]. That is, the predominant phenotypes were fixed-point or oscillatory with seizure-like motor output. As evolution progressed the profile changed markedly. Within 20 generations the entire population was able to navigate a simple environment with all individuals exhibiting multiply-stable behaviors with no cases of default locked limit-cycle oscillatory motor behavior. The resultant population may thus afford us a view of the architectural principles demarcating healthy biological networks from the pathological. The approach has an advantage over other epilepsy modeling techniques in providing a way to clarify whether observed dynamics or suggested therapies are pointing to computational viability or dead space.
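
    The quoted state count follows directly from the stated network layout: five fully connected units give 5 × 5 = 25 synaptic weights at a resolution of 10 values each, and 2^5 = 32 binary activation patterns, so 10^25 × 32 = 3.2 × 10^26 possible initial states, as the trivial check below confirms.

      # Reproducing the quoted count of initial states for the 5-unit network.
      weight_configs = 10 ** (5 * 5)              # 25 weights, 10 resolvable values each
      activation_configs = 2 ** 5                 # 32 binary activation patterns
      print(weight_configs * activation_configs)  # 320000000000000000000000000 = 3.2e26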

  19. Real-time Space-time Integration in GIScience and Geography.

    PubMed

    Richardson, Douglas B

    2013-01-01

    Space-time integration has long been the topic of study and speculation in geography. However, in recent years an entirely new form of space-time integration has become possible in GIS and GIScience: real-time space-time integration and interaction. While real-time spatiotemporal data is now being generated almost ubiquitously, and its applications in research and commerce are widespread and rapidly accelerating, the ability to continuously create and interact with fused space-time data in geography and GIScience is a recent phenomenon, made possible by the invention and development of real-time interactive (RTI) GPS/GIS technology and functionality in the late 1980s and early 1990s. This innovation has since functioned as a core change agent in geography, cartography, GIScience and many related fields, profoundly realigning traditional relationships and structures, expanding research horizons, and transforming the ways geographic data is now collected, mapped, modeled, and used, both in geography and in science and society more broadly. Real-time space-time interactive functionality remains today the underlying process generating the current explosion of fused spatiotemporal data, new geographic research initiatives, and myriad geospatial applications in governments, businesses, and society. This essay addresses briefly the development of these real-time space-time functions and capabilities; their impact on geography, cartography, and GIScience; and some implications for how discovery and change can occur in geography and GIScience, and how we might foster continued innovation in these fields.

  20. Vacuum stability in the early universe and the backreaction of classical gravity.

    PubMed

    Markkanen, Tommi

    2018-03-06

    In the case of a metastable electroweak vacuum, the quantum corrected effective potential plays a crucial role in the potential instability of the standard model. In the early universe, in particular during inflation and reheating, this instability can be triggered leading to catastrophic vacuum decay. We discuss how the large space-time curvature of the early universe can be incorporated in the calculation and in many cases significantly modify the flat space prediction. The two key new elements are the unavoidable generation of the non-minimal coupling between the Higgs field and the scalar curvature of gravity and a curvature induced contribution to the running of the constants. For the minimal set up of the standard model and a decoupled inflation sector we show how a metastable vacuum can lead to very tight bounds for the non-minimal coupling. We also discuss a novel and very much related dark matter generation mechanism.This article is part of the Theo Murphy meeting issue 'Higgs cosmology'. © 2018 The Author(s).

  1. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing.

    PubMed

    Xu, Jason; Minin, Vladimir N

    2015-07-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
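
    For scale, the classical baseline mentioned above is easy to state for a small chain: with rate matrix Q, the transition probability matrix over time t is the matrix exponential expm(Qt). The sketch below shows that baseline on a toy three-state chain; the compressed-sensing and generating-function machinery that replaces it for large state spaces is not reproduced here, and the rates are arbitrary illustrative values.

      # Classical CTMC transition probabilities via matrix exponentiation.
      import numpy as np
      from scipy.linalg import expm

      # A three-state birth-death-style chain; rows of Q sum to zero.
      Q = np.array([[-1.0,  1.0,  0.0],
                    [ 0.5, -1.5,  1.0],
                    [ 0.0,  0.8, -0.8]])

      t = 0.7
      P = expm(Q * t)                  # P[i, j] = Pr(X(t) = j | X(0) = i)
      print(np.round(P, 4))
      print("rows sum to one:", np.allclose(P.sum(axis=1), 1.0))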

  2. Behavior Knowledge Space-Based Fusion for Copy-Move Forgery Detection.

    PubMed

    Ferreira, Anselmo; Felipussi, Siovani C; Alfaro, Carlos; Fonseca, Pablo; Vargas-Munoz, John E; Dos Santos, Jefersson A; Rocha, Anderson

    2016-07-20

    The detection of copy-move image tampering is of paramount importance nowadays, mainly due to its potential use for misleading the opinion forming process of the general public. In this paper, we go beyond traditional forgery detectors and aim at combining different properties of copy-move detection approaches by modeling the problem on a multiscale behavior knowledge space, which encodes the output combinations of different techniques as a priori probabilities considering multiple scales of the training data. Afterwards, the missing entries of the conditional probabilities are properly estimated through generative models applied on the existing training data. Finally, we propose different techniques that exploit the multi-directionality of the data to generate the final outcome detection map in a machine learning decision-making fashion. Experimental results on complex datasets, comparing the proposed techniques with a gamut of copy-move detection approaches and other fusion methodologies in the literature, show the effectiveness of the proposed method and its suitability for real-world applications.

  3. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing

    PubMed Central

    Xu, Jason; Minin, Vladimir N.

    2016-01-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377

  4. Vacuum stability in the early universe and the backreaction of classical gravity

    NASA Astrophysics Data System (ADS)

    Markkanen, Tommi

    2018-01-01

    In the case of a metastable electroweak vacuum, the quantum corrected effective potential plays a crucial role in the potential instability of the standard model. In the early universe, in particular during inflation and reheating, this instability can be triggered leading to catastrophic vacuum decay. We discuss how the large space-time curvature of the early universe can be incorporated in the calculation and in many cases significantly modify the flat space prediction. The two key new elements are the unavoidable generation of the non-minimal coupling between the Higgs field and the scalar curvature of gravity and a curvature induced contribution to the running of the constants. For the minimal set up of the standard model and a decoupled inflation sector we show how a metastable vacuum can lead to very tight bounds for the non-minimal coupling. We also discuss a novel and very much related dark matter generation mechanism. This article is part of the Theo Murphy meeting issue `Higgs cosmology'.

  5. Predicting the possibility of not yet observed situations as higher goal of space environment standards.

    NASA Astrophysics Data System (ADS)

    Nymmik, Rikho

    Space environment models are intended to describe fairly the quantitative behavior of the natural space environment. Usually, they are constructed on the basis of a generalization of some experimental data set, which is characteristic of the conditions that prevailed during the measurement period. As a result, such models often merely state and postulate realities of the past. The typical example of this point of view is the situation around extreme SEP events. For decades, models of such events have been based on the largest occurrences observed, whose features were measured by instruments with a reliability that was not always analyzed. It is obvious that this approach does not agree with reality, because any new extreme event conflicts with it. It follows that space environment models cannot be created using observed numerical data alone when such data change in time or are probabilistic in nature. The model's goal is not only to describe the average environment characteristics, but also to predict extreme ones. Such a prediction can only result from analyzing the causes that drive environment change and taking them into account in the model parameters. In this report we present an analysis of the radiation environment formed by solar-generated high-energy particles. The progress and failures of SEP event modeling attempts are also shown and analyzed.

  6. Method of generating a surface mesh

    DOEpatents

    Shepherd, Jason F [Albuquerque, NM; Benzley, Steven [Provo, UT; Grover, Benjamin T [Tracy, CA

    2008-03-04

    A method and machine-readable medium provide a technique to generate and modify a quadrilateral finite element surface mesh using dual creation and modification. After generating a dual of a surface (mesh), a predetermined algorithm may be followed to generate and modify a surface mesh of quadrilateral elements. The predetermined algorithm may include the steps of generating two-dimensional cell regions in dual space, determining existing nodes in primal space, generating new nodes in the dual space, and connecting nodes to form the quadrilateral elements (faces) for the generated and modifiable surface mesh.

  7. Model of non-stationary, inhomogeneous turbulence

    DOE PAGES

    Bragg, Andrew D.; Kurien, Susan; Clark, Timothy T.

    2016-07-08

    Here, we compare results from a spectral model for non-stationary, inhomogeneous turbulence (Besnard et al. in Theor Comp Fluid Dyn 8:1–35, 1996) with direct numerical simulation (DNS) data of a shear-free mixing layer (SFML) (Tordella et al. in Phys Rev E 77:016309, 2008). The SFML is used as a test case in which the efficacy of the model closure for the physical-space transport of the fluid velocity field can be tested in a flow with inhomogeneity, without the additional complexity of mean-flow coupling. The model is able to capture certain features of the SFML quite well for intermediate to long times, including the evolution of the mixing-layer width and turbulent kinetic energy. At short times, and for more sensitive statistics such as the generation of the velocity field anisotropy, the model is less accurate. We propose two possible causes for the discrepancies. The first is the local approximation to the pressure-transport and the second is the a priori spherical averaging used to reduce the dimensionality of the solution space of the model, from wavevector to wavenumber space. DNS data are then used to gauge the relative importance of both possible deficiencies in the model.

  8. Automating the generation of finite element dynamical cores with Firedrake

    NASA Astrophysics Data System (ADS)

    Ham, David; Mitchell, Lawrence; Homolya, Miklós; Luporini, Fabio; Gibson, Thomas; Kelly, Paul; Cotter, Colin; Lange, Michael; Kramer, Stephan; Shipton, Jemma; Yamazaki, Hiroe; Paganini, Alberto; Kärnä, Tuomas

    2017-04-01

    The development of a dynamical core is an increasingly complex software engineering undertaking. As the equations become more complete, the discretisations more sophisticated and the hardware acquires ever more fine-grained parallelism and deeper memory hierarchies, the problem of building, testing and modifying dynamical cores becomes increasingly complex. Here we present Firedrake, a code generation system for the finite element method with specialist features designed to support the creation of geoscientific models. Using Firedrake, the dynamical core developer writes the partial differential equations in weak form in a high level mathematical notation. Appropriate function spaces are chosen and time stepping loops written at the same high level. When the programme is run, Firedrake generates high performance C code for the resulting numerics which are executed in parallel. Models in Firedrake typically take a tiny fraction of the lines of code required by traditional hand-coding techniques. They support more sophisticated numerics than are easily achieved by hand, and the resulting code is frequently higher performance. Critically, debugging, modifying and extending a model written in Firedrake is vastly easier than by traditional methods due to the small, highly mathematical code base. Firedrake supports a wide range of key features for dynamical core creation:
    - A vast range of discretisations, including both continuous and discontinuous spaces and mimetic (C-grid-like) elements which optimally represent force balances in geophysical flows.
    - High aspect ratio layered meshes suitable for ocean and atmosphere domains.
    - Curved elements for high accuracy representations of the sphere.
    - Support for non-finite element operators, such as parametrisations.
    - Access to PETSc, a world-leading library of programmable linear and nonlinear solvers.
    - High performance adjoint models generated automatically by symbolically reasoning about the forward model.
    This poster will present the key features of the Firedrake system, as well as those of Gusto, an atmospheric dynamical core, and Thetis, a coastal ocean model, both of which are written in Firedrake.
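
    To give a flavour of the high-level specification described above, the sketch below states a Poisson problem in weak form in the Firedrake idiom (a standard demo-style example rather than a dynamical core); consult the Firedrake documentation for the authoritative, current interface.

      # Sketch in the spirit of the Firedrake Poisson demo; not a dynamical core.
      from firedrake import *  # the documented idiom; brings in UFL operators too

      mesh = UnitSquareMesh(32, 32)
      V = FunctionSpace(mesh, "CG", 1)          # continuous piecewise-linear elements

      u = TrialFunction(V)
      v = TestFunction(V)
      x, y = SpatialCoordinate(mesh)
      f = Function(V)
      f.interpolate(sin(pi * x) * sin(pi * y))  # manufactured right-hand side

      a = dot(grad(u), grad(v)) * dx            # weak form of -div(grad u) = f
      L = f * v * dx
      bc = DirichletBC(V, 0.0, (1, 2, 3, 4))    # homogeneous Dirichlet on all four sides

      u_h = Function(V)
      solve(a == L, u_h, bcs=bc)                # Firedrake generates and runs the kernels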

  9. Modeling Joule Heating Effect on Lunar O2 Generation via Electrolytic Reduction.

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus; Poizeau, Sophie; Sibille, Laurent

    2009-01-01

    Kennedy Space Center is leading research work on lunar O2 generation via electrolytic reduction of regolith; the metal oxides present in the regolith are dissociated into oxygen anions and metal cations, leading to the generation of gaseous oxygen at the anode and liquid metal at the cathode. The electrical resistance of molten regolith is high, leading to heating of the melt when electrical current is applied between the electrodes (Joule heating). The authors have developed a 3D model using a rigorous approach for two coupled physics domains (thermal and electric potential) to not only study the effect of Joule heating on temperature distribution throughout the molten regolith but also to evaluate and optimize the design of the electrolytic cells. This paper presents the results of the thermal analysis performed on the model and used to validate the design of the electrolytic cell.

  10. Using the MCPLXS Generator for Technology Transfer

    NASA Technical Reports Server (NTRS)

    Moore, Arlene A.; Dean, Edwin B.

    1987-01-01

    The objective of this paper is to acquaint you with some of the approaches we are taking at Langley to incorporate escalations (or de-escalations) of technology when modeling futuristic systems. Since we have a short turnaround between the time we receive enough descriptive information to start estimating the project and when the estimate is needed (the "we-want-it-yesterday syndrome"), creativity is often necessary. There is not much time available for tool development. It is expedient to use existing tools in an adaptive manner to model the situation at hand. Specifically, this paper describes the use of the RCA PRICE MCPLXS Generator to incorporate technology transfer and technology escalation in estimates for advanced space systems such as Shuttle II and NASA advanced technology vehicles. It is assumed that the reader is familiar with the RCA PRICE family of models as well as the RCA PRICE utility programs such as SCPLX, PARAM, PARASYN, and the MCPLXS Generator.

  11. A new state space model for the NASA/JPL 70-meter antenna servo controls

    NASA Technical Reports Server (NTRS)

    Hill, R. E.

    1987-01-01

    A control axis referenced model of the NASA/JPL 70-m antenna structure is combined with the dynamic equations of servo components to produce a comprehensive state variable (matrix) model of the coupled system. An interactive Fortran program for generating the linear system model and computing its salient parameters is described. Results are produced in state variable, block diagram, and factored transfer function forms to facilitate design and analysis by classical as well as modern control methods.
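
    The conversion between transfer function and state variable forms mentioned above can be illustrated with SciPy; the plant structure and gains below are notional placeholders, not the 70-m antenna model.

      # Transfer function -> state-space realisation with SciPy (illustrative only).
      import numpy as np
      from scipy import signal

      # A rate-loop-style plant K / (J s^2 + b s) with notional parameters.
      K, J, b = 50.0, 2.0, 8.0
      tf = signal.TransferFunction([K], [J, b, 0.0])

      ss = tf.to_ss()                    # state-space (A, B, C, D) realisation
      print("A =\n", ss.A)
      print("B =\n", ss.B)
      print("eigenvalues of A:", np.linalg.eigvals(ss.A))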

  12. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    NASA's IDEAS computer program is a tool for interactive preliminary design and analysis of LSS (Large Space System). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  13. Technology Development Risk Assessment for Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Godsell, Aga M.; Go, Susie

    2006-01-01

    A new approach for assessing development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach, a single rocket engine development and a three-technology space transportation portfolio.
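
    A hedged sketch of the general idea, discrete development stages whose durations are sampled from simple statistical models and accumulated by Monte Carlo into a completion-time distribution, is given below; the stage means, the gamma distributions and the deadline are all notional assumptions, not the paper's calibrated models.

      # Monte Carlo over sequential development stages -> completion-time distribution.
      import numpy as np

      rng = np.random.default_rng(7)

      def sample_completion_years(stage_means, n_samples=100_000):
          # Each stage duration drawn from a positive (gamma) distribution around its mean.
          draws = [rng.gamma(shape=4.0, scale=m / 4.0, size=n_samples) for m in stage_means]
          return np.sum(draws, axis=0)

      engine_stages = [1.0, 1.5, 2.0, 1.0]            # notional mean years per stage
      totals = sample_completion_years(engine_stages)

      deadline = 6.0
      risk = float(np.mean(totals > deadline))        # probability of missing the deadline
      print(f"median completion: {np.median(totals):.2f} yr, P(miss {deadline} yr) = {risk:.2%}")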

  14. On the mixing time in the Wang-Landau algorithm

    NASA Astrophysics Data System (ADS)

    Fadeeva, Marina; Shchur, Lev

    2018-01-01

    We present preliminary results of the investigation of the properties of the Markov random walk in the energy space generated by the Wang-Landau probability. We build the transition matrix in the energy space (TMES) using the exact density of states for the one-dimensional and two-dimensional Ising models. The spectral gap of the TMES is inversely proportional to the mixing time of the Markov chain. We estimate numerically the dependence of the mixing time on the lattice size, and extract the mixing exponent.
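
    The spectral-gap/mixing-time relation used above can be demonstrated on any row-stochastic matrix: the gap is one minus the modulus of the second-largest eigenvalue, and the mixing time scales like its inverse. The matrix in the sketch below is a generic random stand-in, not the TMES built from the exact Ising density of states.

      # Spectral gap and mixing-time scale of a row-stochastic matrix.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 20
      T = rng.random((n, n))
      T = T / T.sum(axis=1, keepdims=True)        # normalise rows -> stochastic matrix

      eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
      gap = 1.0 - eigvals[1]                      # leading eigenvalue of a stochastic matrix is 1
      print(f"second eigenvalue modulus = {eigvals[1]:.4f}")
      print(f"spectral gap = {gap:.4f}, mixing-time scale ~ {1.0 / gap:.1f} steps")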

  15. NASA Stennis Space Center Test Technology Branch Activities

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.

    2000-01-01

    This paper provides a short history of NASA Stennis Space Center's Test Technology Laboratory and briefly describes the variety of engine test technology activities and developmental project initiatives. Theoretical rocket exhaust plume modeling, acoustic monitoring and analysis, hand-held fire imaging, heat flux radiometry, thermal imaging and exhaust plume spectroscopy are all examples of current and past test activities that are briefly described. In addition, recent efforts and visions focused on accommodating second, third, and fourth generation flight vehicle engine test requirements are discussed.

  16. Combining Space-Based and In-Situ Measurements to Track Flooding in Thailand

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Doubleday, Joshua; Mclaren, David; Tran, Daniel; Tanpipat, Veerachai; Chitradon, Royal; Boonya-aaroonnet, Surajate; Thanapakpawin, Porranee; Khunboa, Chatchai; Leelapatra, Watis

    2011-01-01

    We describe efforts to integrate in-situ sensing, space-borne sensing, hydrological modeling, active control of sensing, and automatic data product generation to enhance monitoring and management of flooding. In our approach, broad coverage sensors and missions such as MODIS, TRMM, and weather satellite information and in-situ weather and river gauging information are all inputs to track flooding via river basin and sub-basin hydrological models. While these inputs can provide significant information as to the major flooding, targetable space measurements can provide better spatial resolution measurements of flooding extent. In order to leverage such assets we automatically task observations in response to automated analysis indications of major flooding. These new measurements are automatically processed and assimilated with the other flooding data. We describe our ongoing efforts to deploy this system to track major flooding events in Thailand.

  17. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    The capability to rapidly produce visual representations of large, complex, multi-dimensional space and earth sciences data sets was developed by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  18. Modeling a Wireless Network for International Space Station

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Yaprak, Ece; Lamouri, Saad

    2000-01-01

    This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computer and remote instruments with the high performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.

  19. A framework for parallelized efficient global optimization with application to vehicle crashworthiness optimization

    NASA Astrophysics Data System (ADS)

    Hamza, Karim; Shalaby, Mohamed

    2014-09-01

    This article presents a framework for simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs at regions of the search space with high expected improvement. This article addresses one of the limitations of EGO, namely that the generation of infill samples can become a difficult optimization problem in its own right, and it extends the method to generate multiple samples at a time in order to take advantage of parallel computing in the evaluation of the new samples. The proposed approach is tested on analytical functions, and then applied to the vehicle crashworthiness design of a full Geo Metro model undergoing frontal crash conditions.
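
    For readers unfamiliar with the EGO loop, the toy sketch below (not the authors' framework) fits a kriging surrogate with scikit-learn, selects infill points by expected improvement, and uses a simple "constant liar" heuristic as a stand-in for the paper's multi-point infill so that a batch of designs can be evaluated in parallel; the test function and batch size are arbitrary choices of the sketch.

      # Hedged sketch of an EGO loop with batched expected-improvement infill.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      def f(x):                                   # stand-in for an expensive simulation
          return np.sin(3 * x) + 0.5 * x**2

      def expected_improvement(mu, sd, f_best):
          sd = np.maximum(sd, 1e-12)
          z = (f_best - mu) / sd
          return (f_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

      rng = np.random.default_rng(0)
      X = rng.uniform(-2, 2, size=(5, 1))         # initial design samples
      y = f(X).ravel()
      cand = np.linspace(-2, 2, 400).reshape(-1, 1)

      for it in range(6):                         # EGO iterations
          Xl, yl = X.copy(), y.copy()
          batch = []
          for _ in range(3):                      # 3 infill designs per iteration (parallel batch)
              gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(Xl, yl)
              mu, sd = gp.predict(cand, return_std=True)
              x_new = cand[np.argmax(expected_improvement(mu, sd, yl.min()))]
              batch.append(x_new)
              Xl = np.vstack([Xl, x_new])         # "constant liar": pretend y equals current best
              yl = np.append(yl, yl.min())
          X = np.vstack([X] + batch)              # evaluate the whole batch (in parallel in practice)
          y = np.append(y, [f(x)[0] for x in batch])

      print("best design found:", X[np.argmin(y)].item(), "value:", y.min())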

  20. Numerical simulation of terahertz generation and detection based on ultrafast photoconductive antennas

    NASA Astrophysics Data System (ADS)

    Chen, Long-chao; Fan, Wen-hui

    2011-08-01

    The numerical simulation of terahertz generation and detection in the interaction between a femtosecond laser pulse and a photoconductive material is reported in this paper. A simulation model based on the Drude-Lorentz theory is used; it takes into account that photo-generated electrons and holes are separated by the external bias field, which is simultaneously screened by the space-charge field. From the numerical calculation, the terahertz time-domain waveforms and their Fourier-transformed spectra are presented under different conditions. The simulation results indicate that the terahertz generation and detection properties of photoconductive antennas are largely influenced by three major factors: the photo-carrier lifetime, the laser pulse width and the pump laser power. Finally, a simple model has been applied to simulate the terahertz pulses detected by photoconductive antennas with various photo-carrier lifetimes, and the results show that the detected terahertz spectra differ markedly from the spectra radiated by the emitter.
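
    A minimal sketch of the kind of Drude-Lorentz carrier-dynamics model the abstract describes is given below; the rate equations, the single phenomenological space-charge screening term, and all parameter values are illustrative assumptions of the sketch, not the authors' exact formulation.

      # Toy Drude-Lorentz photoconductive-antenna model (illustrative parameters only).
      import numpy as np

      e, m = 1.602e-19, 0.067 * 9.109e-31                 # charge, GaAs-like effective mass
      tau_c, tau_s, tau_r = 0.5e-12, 30e-15, 10e-12       # carrier, momentum, recombination times
      eps, eta = 12.9 * 8.854e-12, 3.0                    # permittivity, geometric screening factor
      E_bias, g0 = 2e6, 1e29                              # bias field (V/m), peak generation rate
      t = np.arange(0, 4e-12, 1e-15)                      # 1 fs time grid
      G = g0 * np.exp(-((t - 0.5e-12) / 50e-15) ** 2)     # ~100 fs pump pulse

      n = v = P = 0.0
      J = np.zeros_like(t)
      dt = t[1] - t[0]
      for i in range(len(t)):
          E_loc = E_bias - P / (eta * eps)                # bias field screened by space charge
          dn = -n / tau_c + G[i]                          # carrier generation and decay
          dv = -v / tau_s + e * E_loc / m                 # Drude momentum relaxation
          dP = -P / tau_r + e * n * v                     # space-charge polarization build-up
          n, v, P = n + dn * dt, v + dv * dt, P + dP * dt
          J[i] = e * n * v                                # photocurrent density

      E_thz = np.gradient(J, dt)                          # far-field THz transient ~ dJ/dt
      spectrum = np.abs(np.fft.rfft(E_thz))
      freqs = np.fft.rfftfreq(len(t), dt)
      print("THz spectrum peaks near %.2f THz" % (freqs[1:][np.argmax(spectrum[1:])] / 1e12))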

  1. PyLDTk: Python toolkit for calculating stellar limb darkening profiles and model-specific coefficients for arbitrary filters

    NASA Astrophysics Data System (ADS)

    Parviainen, Hannu

    2015-10-01

    PyLDTk automates the calculation of custom stellar limb darkening (LD) profiles and model-specific limb darkening coefficients (LDC) using the library of PHOENIX-generated specific intensity spectra by Husser et al. (2013). It facilitates exoplanet transit light curve modeling, especially transmission spectroscopy where the modeling is carried out for custom narrow passbands. PyLDTk constructs model-specific priors on the limb darkening coefficients prior to the transit light curve modeling. It can also be directly integrated into the log posterior computation of any pre-existing transit modeling code with minimal modifications to constrain the LD model parameter space directly by the LD profile, allowing for the marginalization over the whole parameter space that can explain the profile without the need to approximate this constraint by a prior distribution. This is useful when using a high-order limb darkening model where the coefficients are often correlated, and the priors estimated from the tabulated values usually fail to include these correlations.
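
    The sketch below is not PyLDTk's API; it is a self-contained illustration of the underlying idea, fitting quadratic limb-darkening coefficients to a tabulated intensity profile and exposing the profile likelihood that a transit code could add to its log posterior instead of a Gaussian prior.

      # Sketch: fit quadratic LD coefficients to a stand-in intensity profile I(mu).
      import numpy as np
      from scipy.optimize import curve_fit

      def quadratic_law(mu, u1, u2):
          """I(mu)/I(1) = 1 - u1*(1 - mu) - u2*(1 - mu)**2"""
          return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

      # stand-in for a passband-integrated specific-intensity profile with uncertainties
      mu = np.linspace(0.05, 1.0, 40)
      profile = quadratic_law(mu, 0.45, 0.20) + np.random.default_rng(1).normal(0, 3e-3, mu.size)
      profile_err = np.full_like(mu, 3e-3)

      (u1, u2), cov = curve_fit(quadratic_law, mu, profile, sigma=profile_err, p0=(0.4, 0.2))
      print("u1 = %.3f +/- %.3f,  u2 = %.3f +/- %.3f"
            % (u1, np.sqrt(cov[0, 0]), u2, np.sqrt(cov[1, 1])))

      # A transit code could instead add this chi^2 term to its own log posterior,
      # constraining the full (u1, u2) space by the profile rather than by a prior.
      def ld_log_likelihood(u1, u2):
          r = (profile - quadratic_law(mu, u1, u2)) / profile_err
          return -0.5 * np.sum(r * r)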

  2. 'Inverse Crime' and Model Integrity in Lightcurve Inversion applied to unresolved Space Object Identification

    NASA Astrophysics Data System (ADS)

    Henderson, Laura S.; Subbarao, Kamesh

    2017-12-01

    This work presents a case wherein the selection of models when producing synthetic light curves affects the estimation of the size of unresolved space objects. Through this case, "inverse crime" (using the same model for the generation of synthetic data and for data inversion) is illustrated. This is done by using two models to produce the synthetic light curve and later invert it. It is shown here that the choice of model indeed affects the estimation of the shape/size parameters. When a higher-fidelity model (henceforth the one that results in the smallest error residuals after the crime is committed) is used both to create and to invert the light curve, the estimates of the shape/size parameters are significantly better than those obtained when a comparatively lower-fidelity model is used for the estimation. It is therefore of utmost importance to consider the choice of models when producing synthetic data that will later be inverted, as the results might be misleadingly optimistic.

  3. Spinor Field Nonlinearity and Space-Time Geometry

    NASA Astrophysics Data System (ADS)

    Saha, Bijan

    2018-03-01

    Within the scope of Bianchi type VI, VI0, V, III, I, LRSBI and FRW cosmological models we have studied the role of a nonlinear spinor field on the evolution of the Universe and on the spinor field itself. It was found that, due to the presence of non-trivial non-diagonal components of the energy-momentum tensor of the spinor field in the anisotropic space-time, some severe restrictions occur both on the metric functions and on the components of the spinor field. In this report we have considered a polynomial nonlinearity which is a function of invariants constructed from the bilinear spinor forms. It is found that in the case of a Bianchi type-VI space-time, depending on the sign of the self-coupling constants, the model allows either late-time acceleration or an oscillatory mode of evolution. In the case of a Bianchi type-VI0 space-time, due to the specific behavior of the spinor field we have two different scenarios. In one case the invariants constructed from bilinear spinor forms become trivial, thus giving rise to a massless and linear spinor field Lagrangian; this case is equivalent to the vacuum solution of the Bianchi type-VI0 space-time. The second case allows non-vanishing massive and nonlinear terms and, depending on the sign of the coupling constants, gives rise to an accelerating mode of expansion or to one that, after reaching some maximum value, contracts and ends in a big crunch, consequently generating a space-time singularity. In the case of a Bianchi type-V model there occur two possibilities. In one case we found that the metric functions are similar to each other; the Universe then expands with acceleration if the self-coupling constant is taken to be positive, whereas a negative coupling constant gives rise to a cyclic or periodic solution. In the second case the spinor mass and the spinor field nonlinearity vanish and the Universe expands linearly in time. In the case of a Bianchi type-III model the space-time remains locally rotationally symmetric all the time, though the isotropy of space-time can be attained for a large proportionality constant. As far as the evolution is concerned, depending on the sign of the coupling constant the model allows both accelerated and oscillatory modes of expansion: a negative coupling constant leads to an oscillatory mode of expansion, whereas a positive coupling constant generates an expanding Universe with late-time acceleration. Both the deceleration parameter and the EoS parameter in this case vary with time and are in agreement with the modern picture of space-time evolution. In the case of a Bianchi type-I space-time the non-diagonal components lead to three different possibilities. For a full BI space-time we find that the spinor field nonlinearity and the massive term vanish, hence the spinor field Lagrangian becomes massless and linear. In the two other cases the space-time evolves into either an LRSBI or an FRW Universe. If we consider a locally rotationally symmetric BI (LRSBI) model, neither the mass term nor the spinor field nonlinearity vanishes; in this case, depending on the sign of the coupling constant, we have either a late-time accelerated mode of expansion or an oscillatory mode of evolution, and for an expanding Universe we have asymptotic isotropization. Finally, in the case of an FRW model neither the mass term nor the spinor field nonlinearity vanishes; as in the LRSBI case we have either late-time acceleration or a cyclic mode of evolution. These findings allow us to conclude that the spinor field is very sensitive to the gravitational field.

  4. Crossing the dividing surface of transition state theory. III. Once and only once. Selecting reactive trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorquet, J. C., E-mail: jc.lorquet@ulg.ac.be

    2015-09-14

    The purpose of the present work is to determine initial conditions that generate reacting, recrossing-free trajectories that cross the conventional dividing surface of transition state theory (i.e., the plane in configuration space passing through a saddle point of the potential energy surface and perpendicular to the reaction coordinate) without ever returning to it. Local analytical equations of motion valid in the neighborhood of this planar surface have been derived as an expansion in Poisson brackets. We show that the mere presence of a saddle point implies that reactivity criteria can be quite simply formulated in terms of elements of this series, irrespective of the shape of the potential energy function. Some of these elements are demonstrated to be equal to a sum of squares and thus to be necessarily positive, which has a profound impact on the dynamics. The method is then applied to a three-dimensional model describing an atom-diatom interaction. A particular relation between initial conditions is shown to generate a bundle of reactive trajectories that form reactive cylinders (or conduits) in phase space. This relation considerably reduces the phase space volume of initial conditions that generate recrossing-free trajectories. Loci in phase space of reactive initial conditions are presented. Reactivity is influenced by symmetry, as shown by a comparative study of collinear and bent transition states. Finally, it is argued that the rules that have been derived to generate reactive trajectories in classical mechanics are also useful to build up a reactive wave packet.

  5. The Deep Space Network. An instrument for radio navigation of deep space probes

    NASA Technical Reports Server (NTRS)

    Renzetti, N. A.; Jordan, J. F.; Berman, A. L.; Wackley, J. A.; Yunck, T. P.

    1982-01-01

    The Deep Space Network (DSN) network configurations used to generate the navigation observables and the basic process of deep space spacecraft navigation, from data generation through flight path determination and correction are described. Special emphasis is placed on the DSN Systems which generate the navigation data: the DSN Tracking and VLBI Systems. In addition, auxiliary navigational support functions are described.

  6. Three Generations of Tracking and Data Relay Satellite (TDRS) Spacecraft

    NASA Technical Reports Server (NTRS)

    Zaleski, Ron

    2016-01-01

    The current Tracking and Data Relay Satellite configuration consists of nine in-orbit satellites (four first generation, three second generation and two third generation satellites) globally distributed in geosynchronous orbit to provide near continuous data relay service to missions like the Hubble Space Telescope and the International Space Station. The 1st generation spacecraft were designed by TRW/Northrop Grumman, with the launches of the five spacecraft ranging from 1983 through 1995. The 2nd and 3rd generation spacecraft were designed by Boeing, with launches ranging from 2000 to 2002 and from 2013 to 2017, respectively. TDRS-3 is now 27 years on orbit and continues to be a capable asset for the TDRS constellation. Lack of need for inclination control, combined with large fuel reserves and redundancy on critical elements, provides spacecraft that operate well past design life, all of which contributes to expanded TDRS constellation support capabilities. All spacecraft generations have issues. Significant issues will be summarized with the focus on the Boeing-related problems. Degradations and failures are continually assessed and provide the foundation for yearly updates to spacecraft reliability models, constellation service projections and deorbit plans (in order to meet NASA's mandate of limiting orbital debris). Even when accounting for degradations and failures, the life expectancies of the Boeing-delivered 2nd generation TDRS-8, 9, and 10 are anticipated to be 25+ years.

  7. Spanwise Spacing Effects on the Initial Structure and Decay of Axial Vortices

    NASA Technical Reports Server (NTRS)

    Wendt, B. J.; Reichert, B. A.

    1996-01-01

    The initial structure and axial decay of an array of streamwise vortices embedded in a turbulent pipe boundary layer is experimentally investigated. The vortices are shed in counter-rotating fashion from an array of equally-spaced symmetric airfoil vortex generators. Vortex structure is quantified in terms of crossplane circulation and peak streamwise vorticity. Flow conditions are subsonic and incompressible. The focus of this study is on the effect of the initial spacing between the parent vortex generators. Arrays with vortex generators spaced at 15 and 30 degrees apart are considered. When the spacing between vortex generators is decreased the circulation and peak vorticity of the shed vortices increases. Analysis indicates this strengthening results from regions of fluid acceleration in the vicinity of the vortex generator array. Decreased spacing between the constituent vortices also produces increased rates of circulation and peak vorticity decay.

  8. Using Models to Enhance Exposure Characterization for Air Pollution Health Studies

    EPA Science Inventory

    The United States and the United Kingdom are faced with increasing challenges in determining the human health impact of air pollutants emitted locally. Often, these pollutants can be toxic at relatively low doses, are highly reactive, or generate large gradients across space beca...

  9. Current Grid Generation Strategies and Future Requirements in Hypersonic Vehicle Design, Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Papadopoulos, Periklis; Venkatapathy, Ethiraj; Prabhu, Dinesh; Loomis, Mark P.; Olynick, Dave; Arnold, James O. (Technical Monitor)

    1998-01-01

    Recent advances in computational power enable computational fluid dynamic modeling of increasingly complex configurations. A review of grid generation methodologies implemented in support of the computational work performed for the X-38 and X-33 is presented. Factors considered in strategizing topological constructs and blocking structures include the geometric configuration, optimal grid size, numerical algorithms, accuracy requirements, the physics of the problem at hand, computational expense, and the available computer hardware. Also addressed are grid refinement strategies, the effects of wall spacing, and convergence. The significance of the grid is demonstrated through a comparison of computational and experimental results for the aeroheating environment experienced by the X-38 vehicle. Special topics on grid generation strategies are also addressed to model control surface deflections and material mapping.

  10. Global partnerships: Expanding the frontiers of space exploration education

    NASA Astrophysics Data System (ADS)

    MacLeish, Marlene Y.; Akinyede, Joseph O.; Goswami, Nandu; Thomson, William A.

    2012-11-01

    Globalization is creating an interdependent space-faring world and new opportunities for international partnerships that strengthen space knowledge development and transfer. These opportunities have been codified in the Global Exploration Strategy, which endorses the "inspirational and educational value of space exploration" [1]. Also, during the 2010 Heads of Space Agencies Summit celebrating the International Academy of Astronautics' (IAA) 50th Anniversary, space-faring nations from across the globe issued a collective call in support of robust international partnerships to expand the frontiers of space exploration and generate knowledge for improving life on Earth [2]. Educators play a unique role in this mission, developing strategic partnerships and sharing best educational practices to (1) further global understanding of the benefits of space exploration for life on Earth and (2) prepare the next generation of scientists required for the 21st Century space workforce. Educational Outreach (EO) programs use evidence-based, measurable outcomes strategies and cutting-edge information technologies to transfer space-based science, technology, engineering and mathematics (STEM) knowledge to new audiences; create indigenous materials with cultural resonance for emerging space societies; support teacher professional development; and contribute to workforce development initiatives that inspire and prepare new cohorts of students for space exploration careers. The National Space Biomedical Research Institute (NSBRI), the National Aeronautics and Space Administration (NASA) and Morehouse School of Medicine (MSM) have sustained a 13-year space science education partnership dedicated to these objectives. This paper briefly describes the design and achievements of NSBRI's educational programs, with special emphasis on those initiatives' involvement with IAA and the International Astronautical Congress (IAC). The IAA Commission 2 Draft Report, Space for Africa, is discussed as a model for developing sustainable partnerships and indigenous programs that support Africa's steady emergence as a global space-faring force. The IAC 2011 in South Africa will provide timely feedback to refine that report's strategies for space life sciences education and public engagement in Africa and around the globe.

  11. JIGSAW-GEO (1.0): Locally Orthogonal Staggered Unstructured Grid Generation for General Circulation Modelling on the Sphere

    NASA Technical Reports Server (NTRS)

    Engwirda, Darren

    2017-01-01

    An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.

  12. JIGSAW-GEO (1.0): locally orthogonal staggered unstructured grid generation for general circulation modelling on the sphere

    NASA Astrophysics Data System (ADS)

    Engwirda, Darren

    2017-06-01

    An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.

  13. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong including the overall approach and modeling capabilities, which encompasses force generation from contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.

  14. Stable Direct Adaptive Control of Linear Infinite-dimensional Systems Using a Command Generator Tracker Approach

    NASA Technical Reports Server (NTRS)

    Balas, M. J.; Kaufman, H.; Wen, J.

    1985-01-01

    A command generator tracker approach to model following control of linear distributed parameter systems (DPS) whose dynamics are described on infinite dimensional Hilbert spaces is presented. This method generates finite dimensional controllers capable of exponentially stable tracking of the reference trajectories when certain ideal trajectories are known to exist for the open loop DPS; we present conditions for the existence of these ideal trajectories. An adaptive version of this type of controller is also presented and shown to achieve (in some cases, asymptotically) stable finite dimensional control of the infinite dimensional DPS.

  15. Long-Term Global Morphology of Gravity Wave Activity Using UARS Data

    NASA Technical Reports Server (NTRS)

    Eckermann, Stephen D.; Bacmeister, Julio T.; Wu, Dong L.

    1998-01-01

    Progress in research into the global morphology of gravity wave activity using UARS data is described for the period March-June, 1998. Highlights this quarter include further progress in the analysis and interpretation of CRISTA temperature variances; model-generated climatologies of mesospheric gravity wave activity using the HWM-93 wind and temperature model; and modeling of gravity wave detection from space-based platforms. Preliminary interpretations and recommended avenues for further analysis are also described.

  16. Linear Sigma Model Toolshed for D-brane Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellerman, Simeon

    Building on earlier work, we construct linear sigma models for strings on curved spaces in the presence of branes. Our models include an extremely general class of brane-worldvolume gauge field configurations. We explain in an accessible manner the mathematical ideas which suggest appropriate worldsheet interactions for generating a given open string background. This construction provides an explanation for the appearance of the derived category in D-brane physics complementary to that of recent work of Douglas.

  17. Design and simulation of a lighting system for a shadowless space

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Fang, Li

    2017-10-01

    This paper describes implementing a shadowless space by two kinds of methods. The first method implements the shadowless space using principles similar to those of an integrating sphere. Rays from a built-in light source eventually evolve into uniform lighting through numerous diffuse reflections, given the spherical cavity structure and an inner surface with high reflectivity, so there is a possibility to create a shadowless space through diffuse reflections. Over a 27.4 m2 area, the illuminance uniformity achieved 88.2% in this model. The other method is analogous to the method used in medical shadowless lamps. Light falls on the object from different angles and each source generates a shadow. By changing the position distribution of multiple lights and increasing the number of light sources, the possibility of obtaining a shadowless area gradually increases. Based on these two approaches, two simple models are proposed showing the optical systems designed for the shadowless space. Using the simulation software TracePro as the design platform, this paper simulates the two systems.

  18. High-order space charge effects using automatic differentiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reusch, M.F.; Bruhwiler, D.L.

    1997-02-01

    The Northrop Grumman Topkark code has been upgraded to Fortran 90, making use of operator overloading, so the same code can be used to either track an array of particles or construct a Taylor map representation of the accelerator lattice. We review beam optics and beam dynamics simulations conducted with TOPKARK in the past and we present a new method for modeling space charge forces to high-order with automatic differentiation. This method generates an accurate, high-order, 6-D Taylor map of the phase space variable trajectories for a bunched, high-current beam. The spatial distribution is modeled as the product of a Taylor series times a Gaussian. The variables in the argument of the Gaussian are normalized to the respective second moments of the distribution. This form allows for accurate representation of a wide range of realistic distributions, including any asymmetries, and allows for rapid calculation of the space charge fields with free space boundary conditions. An example problem is presented to illustrate our approach. © 1997 American Institute of Physics.
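
    The operator-overloading idea generalizes beyond Fortran 90; the Python toy below (an assumption of this note, not the TOPKARK code) shows how a dual-number type lets the same tracking routine either push an ordinary particle or return the first-order (Taylor) map, which the production code extends to high order in all six phase-space variables.

      # Toy forward-mode automatic differentiation via operator overloading.
      class Dual:
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def thin_lens_then_drift(x, xp, f=2.0, L=1.0):
          """Toy beamline element: thin lens of focal length f followed by a drift of length L."""
          xp2 = xp + (-1.0 / f) * x
          return x + L * xp2, xp2

      # Track an ordinary particle ...
      print(thin_lens_then_drift(0.01, 0.0))
      # ... or seed with dual numbers to read off the linear map d(out)/d(x_in).
      x_out, xp_out = thin_lens_then_drift(Dual(0.01, 1.0), Dual(0.0, 0.0))
      print("R11 =", x_out.der, " R21 =", xp_out.der)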

  19. Space-charge limited photocurrent.

    PubMed

    Mihailetchi, V D; Wildeman, J; Blom, P W M

    2005-04-01

    In 1971 Goodman and Rose predicted the occurrence of a fundamental electrostatic limit for the photocurrent in semiconductors at high light intensities. Blends of conjugated polymers and fullerenes are an ideal model system to observe this space-charge limit experimentally, since they combine an unbalanced charge transport, long lifetimes, high charge carrier generation efficiencies, and low mobility of the slowest charge carrier. The experimental photocurrents reveal all the characteristics of a space-charge limited photocurrent: a one-half power dependence on voltage, a three-quarter power dependence on light intensity, and a one-half power scaling of the voltage at which the photocurrent switches into full saturation with light intensity.
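
    For reference, the scalings quoted in the abstract can be collected as follows (notation assumed here: J_ph photocurrent, V effective voltage, I light intensity, V_sat the voltage at which the photocurrent saturates):

      % Space-charge-limited photocurrent scalings reported in the abstract
      \begin{equation}
        J_{\mathrm{ph}} \propto V^{1/2}, \qquad
        J_{\mathrm{ph}} \propto I^{3/4}, \qquad
        V_{\mathrm{sat}} \propto I^{1/2}.
      \end{equation}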

  20. STEM on Station Education

    NASA Technical Reports Server (NTRS)

    Lundebjerg, Kristen

    2016-01-01

    The STEM on Station team is part of Education, which is part of the External Relations organization (ERO). ERO has traditional goals based around a BHAG (Big Hairy Audacious Goal). The BHAG model is simplified to a saying: everything we do stimulates actions by others to advance human space exploration. The STEM on Station education initiative is a project focused on bringing off-the-Earth research and learning into classrooms. Educational resources such as lesson plans, activities to connect with the space station and STEM-related contests are available and hosted by the STEM on Station team along with their partners such as Texas Instruments. These educational activities engage teachers and students in the current happenings aboard the International Space Station, inspiring the next generation of space explorers.

  1. Parametric State Space Structuring

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Tilgner, Marco

    1997-01-01

    Structured approaches based on Kronecker operators for the description and solution of the infinitesimal generator of a continuous-time Markov chain are receiving increasing interest. However, their main advantage, a substantial reduction in the memory requirements during the numerical solution, comes at a price. Methods based on the "potential state space" allocate a probability vector that might be much larger than actually needed. Methods based on the "actual state space", instead, have an additional logarithmic overhead. We present an approach that realizes the advantages of both methods with none of their disadvantages, by partitioning the local state spaces of each submodel. We apply our results to a model of software rendezvous, and show how they reduce memory requirements while, at the same time, improving the efficiency of the computation.
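
    A toy example of the Kronecker-structured description that such approaches start from is sketched below (two independent birth-death submodels, so the global generator is a Kronecker sum; synchronizing events would contribute additional Kronecker-product terms). The submodel sizes and rates are arbitrary; the point is that only the small local factors ever need to be stored.

      # Kronecker-structured CTMC generator from local submodel generators.
      import scipy.sparse as sp

      def birth_death_generator(n, lam, mu):
          """Local CTMC generator for a capacity-n birth-death submodel."""
          Q = sp.lil_matrix((n + 1, n + 1))
          for i in range(n + 1):
              if i < n:
                  Q[i, i + 1] = lam
              if i > 0:
                  Q[i, i - 1] = mu
              Q[i, i] = -Q[i].sum()          # diagonal = minus the off-diagonal row sum
          return Q.tocsr()

      Q1 = birth_death_generator(4, 1.0, 2.0)
      Q2 = birth_death_generator(9, 0.5, 1.0)
      I1 = sp.identity(Q1.shape[0])
      I2 = sp.identity(Q2.shape[0])

      # Potential state space = product of the local spaces; in the structured
      # approach only the Kronecker factors are stored, never the assembled Q.
      Q = sp.kron(Q1, I2) + sp.kron(I1, Q2)
      print("potential state space size:", Q.shape[0])   # 5 * 10 = 50 states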

  2. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    The objectives of this study were to: (1) develop projections of the NASA, DoD, and civil space power requirements for the 1980-1995 time period; (2) identify specific areas of application and space power subsystem type needs for each prospective user; (3) document the supporting and historical base, including relevant cost related measures of performance; and (4) quantify the benefits of specific technology projection advancements. The initial scope of the study included: (1) construction of likely models for NASA, DoD, and civil space systems; (2) generation of a number of future scenarios; (3) extraction of time phased technology requirements based on the scenarios; and (4) cost/benefit analyses of some of the technologies identified.

  3. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    NASA Astrophysics Data System (ADS)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization noise), and environmental effects (radiation hits with selectable angular distributions and 4-layer atmospheric turbulence model for ground based sensors). We have developed an accurate flash Light Detection and Ranging (LIDAR) model that supports reconstruction of 3-dimensional information on the RSO. PROXOR™ contains many important imaging effects such as intra-frame smear, realized by oversampling the image in time and capturing target motion and jitter during the integration time.

  4. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically it aims to build a high fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched index of refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full-scale and thereby the time scale of the model to represent 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index matching fluid (fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. Using such a system one can test environment control parameters such as core flows (axial flows) and cross flows (from registers and diffusers), examine potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments, and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next generation life support systems and configurations. Preliminary experimental and modeling work in this area will be presented. This involves testing of a single inlet-exit model with detailed 3-D flow visualization and quantitative diagnostics and computational modeling of the system.
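
    The quoted time compression follows directly from the stated scales:

      % Time-scale ratio implied by a 1/10 length scale run at 67% of full-scale velocity
      \begin{equation}
        \frac{t_m}{t_f} = \frac{L_m / V_m}{L_f / V_f}
                        = \frac{L_m}{L_f}\,\frac{V_f}{V_m}
                        = \frac{0.1}{0.67} \approx 0.15 .
      \end{equation}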

  5. Application of a rat hindlimb model: a prediction of force spaces reachable through stimulation of nerve fascicles.

    PubMed

    Johnson, Will L; Jindrich, Devin L; Zhong, Hui; Roy, Roland R; Edgerton, V Reggie

    2011-12-01

    A device to generate standing or locomotion through chronically placed electrodes has not been fully developed due in part to limitations of clinical experimentation and the high number of muscle activation inputs of the leg. We investigated the feasibility of functional electrical stimulation paradigms that minimize the input dimensions for controlling the limbs by stimulating at nerve fascicles, utilizing a model of the rat hindlimb, which combined previously collected morphological data with muscle physiological parameters presented herein. As validation of the model, we investigated the suitability of a lumped-parameter model for the prediction of muscle activation during dynamic tasks. Using the validated model, we found that the space of forces producible through activation of muscle groups sharing common nerve fascicles was nonlinearly dependent on the number of discrete muscle groups that could be individually activated (equivalently, the neuroanatomical level of activation). Seven commonly innervated muscle groups were sufficient to produce 78% of the force space producible through individual activation of the 42 modeled hindlimb muscles. This novel, neuroanatomically derived reduction in input dimension emphasizes the potential to simplify controllers for functional electrical stimulation to improve functional recovery after a neuromuscular injury.
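
    A toy numerical analogue of this comparison is sketched below; the force matrix and the grouping are random stand-ins (not the rat morphology data), so the printed ratio will not reproduce the 78% figure, but the construction, a convex-hull volume of endpoint forces reachable under grouped versus individual activation, is the same.

      # Toy comparison of grouped vs. individual muscle activation force spaces.
      import numpy as np
      from scipy.spatial import ConvexHull

      rng = np.random.default_rng(3)
      n_muscles, n_groups = 42, 7
      F = rng.normal(size=(3, n_muscles))            # endpoint force per unit activation (hypothetical)
      group_of = rng.integers(0, n_groups, size=n_muscles)
      G = np.zeros((n_muscles, n_groups))
      G[np.arange(n_muscles), group_of] = 1.0        # muscles sharing a fascicle co-activate

      a_full = rng.random((20000, n_muscles))        # individual activations in [0, 1]
      a_group = rng.random((20000, n_groups))        # one activation per commonly innervated group
      vol_full = ConvexHull(a_full @ F.T).volume
      vol_group = ConvexHull((a_group @ G.T) @ F.T).volume
      print("group-level force space / full force space = %.2f" % (vol_group / vol_full))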

  6. Application of a Rat Hindlimb Model: A Prediction of Force Spaces Reachable Through Stimulation of Nerve Fascicles

    PubMed Central

    Johnson, Will L.; Jindrich, Devin L.; Zhong, Hui; Roy, Roland R.

    2011-01-01

    A device to generate standing or locomotion through chronically placed electrodes has not been fully developed due in part to limitations of clinical experimentation and the high number of muscle activation inputs of the leg. We investigated the feasibility of functional electrical stimulation paradigms that minimize the input dimensions for controlling the limbs by stimulating at nerve fascicles, utilizing a model of the rat hindlimb which combined previously collected morphological data with muscle physiological parameters presented herein. As validation of the model we investigated the suitability of a lumped-parameter model for prediction of muscle activation during dynamic tasks. Using the validated model we found that the space of forces producible through activation of muscle groups sharing common nerve fascicles was nonlinearly dependent on the number of discrete muscle groups that could be individually activated (equivalently, the neuroanatomical level of activation). Seven commonly innervated muscle groups were sufficient to produce 78% of the force space producible through individual activation of the 42 modeled hindlimb muscles. This novel, neuroanatomically derived reduction in input dimension emphasizes the potential to simplify controllers for functional electrical stimulation to improve functional recovery after a neuromuscular injury. PMID:21244999

  7. GeneLab for High Schools: Data Mining for the Next Generation

    NASA Technical Reports Server (NTRS)

    Blaber, Elizabeth A.; Ly, Diana; Sato, Kevin Y.; Taylor, Elizabeth

    2016-01-01

    Modern biological sciences have become increasingly based on molecular biology and high-throughput molecular techniques, such as genomics, transcriptomics, and proteomics. NASA scientists and the NASA Space Biology Program have aimed to examine the fundamental building blocks of life (RNA, DNA and protein) in order to understand the response of living organisms to space and aid in fundamental research discoveries on Earth. In an effort to make NASA-funded science available to everyone, NASA has collected the data from omics studies and curated them in a data system called GeneLab. Whilst most college-level interns, academics and other scientists have had some interaction with omics data sets and analysis tools, high school students often have not. Therefore, the Space Biology Program is implementing a new Summer Program for high-school students that aims to inspire the next generation of scientists to learn about and get involved in space research using GeneLab's data system. The program consists of three main components: core learning modules, focused on developing students' knowledge of the Space Biology Program and Space Biology research, GeneLab and its data system, and previous research conducted on model organisms in space; networking and teamwork, enabling students to interact with guest lecturers from local universities and their fellow peers, and also enabling them to visit local universities and genomics centers around the Bay Area; and finally an independent learning project, whereby students form small groups, analyze a dataset on the GeneLab platform, generate a hypothesis, and develop a research plan to test it. This program will not only help inspire high-school students to become involved in space-based research but will also help them develop key critical thinking and bioinformatics skills required for most college degrees and, furthermore, will enable them to establish networks with their peers and connections with university professors that may help them achieve their educational goals.

  8. Space shuttle propellant constitutive law verification tests

    NASA Technical Reports Server (NTRS)

    Thompson, James R.

    1995-01-01

    As part of the Propellants Task (Task 2.0) on the Solid Propulsion Integrity Program (SPIP), a database of material properties was generated for the Space Shuttle Redesigned Solid Rocket Motor (RSRM) PBAN-based propellant. A parallel effort on the Propellants Task was the generation of an improved constitutive theory for the PBAN propellant suitable for use in a finite element analysis (FEA) of the RSRM. The outcome of an analysis with the improved constitutive theory would be more reliable prediction of structural margins of safety. The work described in this report was performed by Materials Laboratory personnel at Thiokol Corporation/Huntsville Division under NASA contract NAS8-39619, Mod. 3. The report documents the test procedures for the refinement and verification tests for the improved Space Shuttle RSRM propellant material model, and summarizes the resulting test data. TP-H1148 propellant obtained from mix E660411 (manufactured February 1989) which had experienced ambient igloo storage in Huntsville, Alabama since January 1990, was used for these tests.

  9. NASA's Aero-Space Technology

    NASA Technical Reports Server (NTRS)

    Milstead, Phil

    2000-01-01

    This presentation reviews the three pillars and the associated goals of NASA's Aero-Space Technology Enterprise. The three pillars for success are: (1) Global Civil Aviation, (2) Revolutionary Technology Leaps, (3) Advanced Space Transportation. The associated goals of the first pillar are to reduce accidents, emissions, and cost, and to increase the aviation system capacity. The goals of the second pillar are to reduce transoceanic travel time, revolutionize general aviation aircraft, and improve development capacity. The goals associated with the third pillar are to reduce the launch cost for low earth orbit and to reduce travel time for planetary missions. In order to meet these goals NASA must provide next-generation design capability for new and/or experimental craft, which enables a balance between reducing components of the design cycle by up to 50% and/or increasing the confidence in design by 50%. These next-generation design tools, concepts, and processes will revolutionize vehicle development. The presentation finally reviews the importance of modeling and simulation in achieving the goals.

  10. Calibration of International Space Station (ISS) Node 1 Vibro-Acoustic Model-Report 2

    NASA Technical Reports Server (NTRS)

    Zhang, Weiguo; Raveendra, Ravi

    2014-01-01

    Reported here is the capability of the Energy Finite Element Method (E-FEM) to predict the vibro-acoustic sound fields within the International Space Station (ISS) Node 1 and to compare the results with simulated leak sounds. A series of electronically generated structural ultrasonic noise sources were created in the pressure wall to emulate leak signals at different locations of the Node 1 STA module during its period of storage at Stennis Space Center (SSC). The exact sound source profiles created within the pressure wall at the source were unknown, but were estimated from the closest sensor measurement. The E-FEM method represents a reverberant sound field calculation, and of importance to this application is the requirement to correctly handle the direct field effect of the sound generation. It was also important to be able to compute the sound energy fields in the ultrasonic frequency range. This report demonstrates the capability of this technology as applied to this type of application.

  11. NGMIX: Gaussian mixture models for 2D images

    NASA Astrophysics Data System (ADS)

    Sheldon, Erin

    2015-08-01

    NGMIX implements Gaussian mixture models for 2D images. Both the PSF profile and the galaxy are modeled using mixtures of Gaussians. Convolutions are thus performed analytically, resulting in fast model generation as compared to methods that perform the convolution in Fourier space. For the galaxy model, NGMIX supports exponential disks and de Vaucouleurs and Sérsic profiles; these are implemented approximately as a sum of Gaussians using the fits from Hogg & Lang (2013). Additionally, any number of Gaussians can be fit, either completely free or constrained to be cocentric and co-elliptical.
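
    The reason mixtures of Gaussians make the convolution step cheap is that convolving two Gaussians simply adds their covariance matrices; the sketch below (not the ngmix API) renders a toy two-component galaxy convolved with a one-component PSF entirely in real space.

      # Analytic PSF convolution of a Gaussian-mixture model: covariances add.
      import numpy as np

      def render_mixture(weights, means, covs, shape):
          """Render a sum of 2-D Gaussians on a pixel grid of the given shape."""
          yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
          pts = np.stack([xx.ravel(), yy.ravel()], axis=1).astype(float)
          img = np.zeros(pts.shape[0])
          for w, m, C in zip(weights, means, covs):
              d = pts - m
              Cinv = np.linalg.inv(C)
              q = np.einsum('ni,ij,nj->n', d, Cinv, d)
              img += w * np.exp(-0.5 * q) / (2 * np.pi * np.sqrt(np.linalg.det(C)))
          return img.reshape(shape)

      # toy 2-component "galaxy" and 1-component "PSF", both centred at (32, 32)
      gal_w = [0.7, 0.3]
      gal_C = [np.diag([4.0, 2.0]), np.diag([16.0, 9.0])]
      psf_C = np.diag([2.5, 2.5])

      # analytic convolution: weights multiply, covariances add, component by component
      conv_w = [wg * 1.0 for wg in gal_w]
      conv_C = [Cg + psf_C for Cg in gal_C]
      image = render_mixture(conv_w, [np.array([32.0, 32.0])] * 2, conv_C, (64, 64))
      print("total flux of convolved model:", image.sum())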

  12. More-Realistic Digital Modeling of a Human Body

    NASA Technical Reports Server (NTRS)

    Rogge, Renee

    2010-01-01

    A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.
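
    As a hedged sketch of the thin-plate-spline idea (in Python with SciPy rather than the MATLAB program itself), the snippet below warps a stand-in point cloud using a displacement field interpolated from a handful of hypothetical landmark correspondences.

      # Thin-plate-spline repositioning of a scanned point cloud (toy data).
      import numpy as np
      from scipy.interpolate import RBFInterpolator

      rng = np.random.default_rng(0)
      body_points = rng.uniform(-1, 1, size=(5000, 3))          # stand-in for scan vertices

      # landmark positions in the scanned pose and in the desired new pose (hypothetical)
      landmarks_src = np.array([[0, 0, 0], [0, 1, 0], [1, 0, 0],
                                [0, 0, 1], [1, 1, 0], [0, 1, 1]], dtype=float)
      landmarks_dst = landmarks_src + np.array([0.0, 0.1, 0.0])
      landmarks_dst[1] += [0.2, 0.0, 0.1]                       # move one landmark a bit more

      # one thin-plate-spline interpolant for the 3-D displacement field
      warp = RBFInterpolator(landmarks_src, landmarks_dst - landmarks_src,
                             kernel='thin_plate_spline')
      body_reposed = body_points + warp(body_points)
      print(body_reposed.shape)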

  13. Operational characterisation of requirements and early validation environment for high demanding space systems

    NASA Technical Reports Server (NTRS)

    Barro, E.; Delbufalo, A.; Rossi, F.

    1993-01-01

    The definition of some modern highly demanding space systems requires a different approach to system definition and design from that adopted for traditional missions. System functionality is strongly coupled to the operational analysis, aimed at characterizing the dynamic interactions of the flight element with its surrounding environment and its ground control segment. Unambiguous functional, operational and performance requirements are to be defined for the system, thus also improving the successive development stages. This paper proposes a Petri-net-based methodology, with two related prototype applications (to ARISTOTELES orbit control and to Hermes telemetry generation), for the operational analysis of space systems through the dynamic modeling of their functions. It also presents a related computer-aided environment (ISIDE) that can execute the dynamic model, thus enabling an early validation of the system functional representation, and that provides a structured system requirements database: the shared knowledge base interconnecting static and dynamic applications, fully traceable to the models and interfaceable with the external world.

  14. DTFM Modeling and Analysis Method for Gossamer Structures

    NASA Technical Reports Server (NTRS)

    Fang, Hou-Fei; Lou, Michael; Broduer, Steve (Technical Monitor)

    2001-01-01

    Gossamer systems are mostly composed of support structures formed by highly flexible, long tubular elements and pre-tensioned thin-film membranes. These systems offer order-of-magnitude reductions in mass and launch volume and will revolutionize the architecture and design of space flight systems that require large in-orbit configurations and apertures. A great interest has been generated in recent years to fly gossamer systems on near-term and future space missions. Modeling and analysis requirements for gossamer structures are unique. Simulation of in-space performance issues of gossamer structures, such as inflation deployment of flexible booms, formation and effects of wrinkle in tensioned membranes, synthesis of tubular and membrane elements into a complete structural system, usually cannot be accomplished by using the general-purpose finite-element structural analysis codes. This has led to the need of structural modeling and analysis capabilities specifically suitable for gossamer structures. The Distributed Transfer Function Method (DTFM) can potentially meet this urgent need. Additional information is contained in the original extended abstract.

  15. The Application of the SPASE Metadata Standard in the U.S. and Worldwide

    NASA Astrophysics Data System (ADS)

    Thieman, J. R.; King, T. A.; Roberts, D.

    2012-12-01

    The Space Physics Archive Search and Extract (SPASE) Metadata standard for Heliophysics and related data is now an established standard within the NASA-funded space and solar physics community and is spreading to the international groups within that community. Development of SPASE has involved a number of international partners, and the current version of the SPASE Metadata Model (version 2.2.2) has not needed any structural modifications since January 2011. The SPASE standard has been adopted by groups such as NASA's Heliophysics division, the Canadian Space Science Data Portal (CSSDP), Canada's AUTUMN network, Japan's Inter-university Upper atmosphere Global Observation NETwork (IUGONET), Centre de Données de la Physique des Plasmas (CDPP), and the near-Earth space data infrastructure for e-Science (ESPAS). In addition, portions of the SPASE dictionary have been modeled in semantic web ontologies for use with reasoners and semantic searches. While we anticipate additional modifications to the model in the future to accommodate simulation and model data, these changes will not affect the data descriptions already generated for instrument-related datasets. Examples of SPASE descriptions can be viewed at http://www.spase-group.org/registry/explorer and data can be located using SPASE concepts by searching the Virtual Space Physics Observatory (http://vspo.gsfc.nasa.gov/websearch/dispatcher) for data of interest.

  16. Correlations in state space can cause sub-optimal adaptation of optimal feedback control models.

    PubMed

    Aprasoff, Jonathan; Donchin, Opher

    2012-04-01

    Control of our movements is apparently facilitated by an adaptive internal model in the cerebellum. It was long thought that this internal model implemented an adaptive inverse model and generated motor commands, but recently many reject that idea in favor of a forward model hypothesis. In theory, the forward model predicts the upcoming state during reaching movements so the motor cortex can generate appropriate motor commands. Recent computational models of this process rely on the optimal feedback control (OFC) framework of control theory. OFC is a powerful tool for describing motor control, but it does not describe adaptation. Some assume that adaptation of the forward model alone could explain motor adaptation, but this is widely understood to be overly simplistic. However, an adaptive optimal controller is difficult to implement. A reasonable alternative is to allow forward model adaptation to 're-tune' the controller. Our simulations show that, as expected, forward model adaptation alone does not produce optimal trajectories during reaching movements perturbed by force fields. However, they also show that re-optimizing the controller from the forward model can be sub-optimal. This is because, in a system with state correlations or redundancies, accurate prediction requires different information than optimal control. We find that adding noise to the movements that matches noise found in human data is enough to overcome this problem. However, since the state space for control of real movements is far more complex than in our simple simulations, the effects of correlations on re-adaptation of the controller from the forward model cannot be overlooked.

  17. Scalable approximate policies for Markov decision process models of hospital elective admissions.

    PubMed

    Zhu, George; Lizotte, Dan; Hoey, Jesse

    2014-05-01

    The objective of this work is to demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real-time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that, given an initial start state, generate an action on demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialties and four treatment patterns, and shown that we can generate solutions that are near-optimal in about 100 s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Space Station UCS antenna pattern computation and measurement. [UHF Communication Subsystem

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.

    1993-01-01

    The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern caused by its environment, the Space Station structure. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM), and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (GTD), in which the effects of the fields reflected and diffracted from surfaces, edges, and vertices of the Space Station structures were included. In order to validate the CEM techniques and to provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15-scale Space Station mockup. Good agreement between experimental and computed results was obtained, validating the CEM techniques used for the Space Station UCS antenna pattern predictions.
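    For intuition about why the surrounding structure reshapes an antenna pattern (this is not the MoM/GTD hybrid used in the study), the sketch below uses plain image theory for a short vertical dipole over an infinite conducting plane: the reflected "image" ray produces the same kind of pattern lobing that the GTD reflected and diffracted fields capture for the real Space Station geometry. The frequency and height are assumed values.

```python
# Image-theory stand-in: pattern of a short vertical dipole above a perfect
# conductor (assumed geometry; much cruder than the paper's MoM/GTD hybrid).
import numpy as np

c = 3.0e8
f = 400e6                     # UHF-band example frequency [Hz] (assumed)
k = 2 * np.pi * f / c         # free-space wavenumber
h = 0.5                       # dipole height above the conducting plane [m] (assumed)

theta = np.linspace(1e-3, np.pi / 2, 7)   # angle measured from the plane's normal
# Short vertical dipole: element pattern ~ sin(theta); its image in a perfect
# conductor is in phase, giving a 2*cos(k*h*cos(theta)) interference factor.
pattern = np.sin(theta) * 2.0 * np.abs(np.cos(k * h * np.cos(theta)))
pattern_db = 20 * np.log10(pattern / pattern.max() + 1e-12)

for t, p in zip(theta, pattern_db):
    print(f"theta = {np.degrees(t):5.1f} deg   relative level = {p:6.1f} dB")
```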

  19. Riemannian geometric approach to human arm dynamics, movement optimization, and invariance

    NASA Astrophysics Data System (ADS)

    Biess, Armin; Flash, Tamar; Liebermann, Dario G.

    2011-03-01

    We present a generally covariant formulation of human arm dynamics and optimization principles in Riemannian configuration space. We extend the one-parameter family of mean-squared-derivative (MSD) cost functionals from Euclidean to Riemannian space, and we show that they are mathematically identical to the corresponding dynamic costs when formulated in a Riemannian space equipped with the kinetic-energy metric. In particular, we derive the equivalence of the minimum-jerk and minimum-torque-change models in this metric space. Solutions of the one-parameter family of MSD variational problems in Riemannian space are given by (reparametrized) geodesic paths, which correspond to movements with least muscular effort. Finally, movement invariants are derived from symmetries of the Riemannian manifold. We argue that the geometrical structure imposed on the arm’s configuration space may provide insights into the emerging properties of the movements generated by the motor system.
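    A schematic rendering of the cost family described above (notation assumed, not copied from the paper): q denotes generalized joint coordinates, M(q) the configuration-dependent inertia matrix that defines the kinetic-energy metric, and D/dt the covariant derivative compatible with that metric.

```latex
% Kinetic-energy metric and the Riemannian mean-squared-derivative cost family
% (schematic; symbols and conventions are assumptions, not quoted from the paper).
\[
  g_{ij}(q) = M_{ij}(q), \qquad
  J_n[q(\cdot)] = \frac{1}{2}\int_0^T
     g\!\left(\frac{D^{\,n-1}\dot q}{dt^{\,n-1}},\;
              \frac{D^{\,n-1}\dot q}{dt^{\,n-1}}\right)\,dt,
  \qquad n = 1, 2, 3, \dots
\]
% In the Euclidean case n = 3 reduces to the classical minimum-jerk cost; with
% the kinetic-energy metric the same functional corresponds to a torque-based
% dynamic cost, and, per the abstract, the minimizing paths are (reparametrized)
% geodesics of g.
```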

  20. Three-Dimensional Modeling of the Brain's ECS by Minimum Configurational Energy Packing of Fluid Vesicles

    PubMed Central

    Nandigam, Ravi K.; Kroll, Daniel M.

    2007-01-01

    The extracellular space of the brain is the heterogeneous porous medium formed by the spaces between the brain cells. Diffusion in this interstitial space is the mechanism by which glucose and oxygen are delivered to the brain cells from the vascular system. It is also a medium for the transport of certain informational substances between cells (called volume transmission) and for drug delivery. This work involves three-dimensional modeling of the extracellular space as the void space in close-packed arrays of fluid membrane vesicles. These packings are generated by minimizing the configurational energy using a Monte Carlo procedure. Both regular and random packings of vesicles are considered. A random walk algorithm is then used to compute the geometric tortuosities, and the results are compared with published experimental data. For the random packings, although the absolute values of the tortuosities differ, the dependence of tortuosity on pore volume fraction is very similar to that observed in experiment. The tortuosities we measure are larger than those computed in previous studies of packings of convex polytopes; modeling improvements requiring higher-resolution studies and better representation of brain cell shapes and mechanical properties could help resolve the remaining discrepancies between model simulations and experiment. It is also shown that the specular reflection scheme is the appropriate technique for implementing zero-flux boundary conditions in the random walk simulations commonly encountered in diffusion problems. PMID:17307830
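    To make the random-walk tortuosity estimate and the specular-reflection boundary rule concrete, here is a deliberately simplified 2-D sketch: walkers diffuse in the void space of a periodic square array of circular obstacles, reflect specularly off obstacle surfaces, and the geometric tortuosity is taken as lambda = sqrt(D_free / D_eff) estimated from mean-squared displacements. The 2-D geometry, disk radius, step size, and walk lengths are illustrative assumptions; the study uses 3-D vesicle packings and far longer walks.

```python
# Simplified 2-D random-walk tortuosity estimate with specular reflections
# (assumed toy geometry, not the paper's 3-D vesicle packings).
import numpy as np

L, R, STEP = 1.0, 0.42, 0.02        # lattice spacing, disk radius, step length (assumed)
rng = np.random.default_rng(2)

def candidate_centers(p):
    """Centers of the four obstacle images whose surfaces p could reach this step."""
    base = np.floor(p / L) * L
    return [base + L * np.array([i, j]) for i in (0, 1) for j in (0, 1)]

def reflect_step(p, d, s):
    """Advance a walker from p along unit direction d by path length s,
    reflecting specularly off any disk surface it crosses."""
    for _ in range(10):                              # guard against rare corner cases
        t_hit, c_hit = np.inf, None
        for c in candidate_centers(p):
            oc = p - c
            b = oc @ d
            disc = b * b - (oc @ oc - R * R)
            if disc > 0:
                t = -b - np.sqrt(disc)               # first crossing of this circle
                if 1e-12 < t < t_hit:
                    t_hit, c_hit = t, c
        if t_hit > s:
            return p + s * d                         # no surface hit within this step
        p = p + t_hit * d                            # move to the contact point
        n = (p - c_hit) / np.linalg.norm(p - c_hit)  # outward surface normal
        d = d - 2 * (d @ n) * n                      # specular reflection of direction
        s -= t_hit
    return p

# Launch walkers from random void-space points and accumulate displacements.
n_walk, n_steps = 100, 3000                          # small numbers, for brevity only
starts = []
while len(starts) < n_walk:
    q = rng.uniform(-L / 2, L / 2, 2)
    if all(np.linalg.norm(q - c) > R + STEP for c in candidate_centers(q)):
        starts.append(q)
pos = np.array(starts)
origin = pos.copy()
for _ in range(n_steps):
    ang = rng.uniform(0.0, 2.0 * np.pi, n_walk)
    dirs = np.stack([np.cos(ang), np.sin(ang)], axis=1)
    pos = np.array([reflect_step(p, d, STEP) for p, d in zip(pos, dirs)])

msd = np.mean(np.sum((pos - origin) ** 2, axis=1))
msd_free = n_steps * STEP ** 2                       # MSD of an unobstructed walk
print(f"estimated geometric tortuosity ~ {np.sqrt(msd_free / msd):.2f}")
```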
