Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis that includes both geometric and material nonlinearities. Progressive failure analyses are performed on both conventional models and interface technology models of the composite panels, and the analytical results and computational effort of the two approaches are compared. The results predicted with the interface technology models correlate well with those of the conventional models while requiring significantly less computational effort.
New York State energy-analytic information system: first-stage implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allentuck, J.; Carroll, O.; Fiore, L.
1979-09-01
If state energy policy is to be formulated within the constraints imposed by policy determined at the national level, yet still reflect the diverse interests of a state's citizens, large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy supply-demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework, while a price-response model and a weather-sensitive energy demand model furnish a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.
Fire behavior modeling-a decision tool
Jack Cohen; Bill Bradshaw
1986-01-01
The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...
NASA Technical Reports Server (NTRS)
Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.
1985-01-01
An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. To ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
An Examination of Advisor Concerns in the Era of Academic Analytics
ERIC Educational Resources Information Center
Daughtry, Jeremy J.
2017-01-01
Performance-based funding models are increasingly becoming the norm for many institutions of higher learning. Such models place greater emphasis on student retention and success metrics, for example, as requirements for receiving state appropriations. To stay competitive, universities have adopted academic analytics technologies capable of…
Empirical testing of an analytical model predicting electrical isolation of photovoltaic modules
NASA Astrophysics Data System (ADS)
Garcia, A., III; Minning, C. P.; Cuddihy, E. F.
A major design requirement for photovoltaic modules is that the encapsulation system be capable of withstanding large DC potentials without electrical breakdown. Presented is a simple analytical model which can be used to estimate material thickness to meet this requirement for a candidate encapsulation system or to predict the breakdown voltage of an existing module design. A series of electrical tests to verify the model are described in detail. The results of these verification tests confirmed the utility of the analytical model for preliminary design of photovoltaic modules.
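At its simplest, the sizing logic the abstract describes reduces to dividing the design voltage by the material's dielectric strength. Below is a minimal sketch of that estimate, assuming a single uniform dielectric layer; the function names, safety factor, and material values are hypothetical and do not reproduce the paper's actual model.

```python
def min_thickness_m(system_voltage_v, dielectric_strength_v_per_m, safety_factor=2.0):
    """Minimum encapsulant thickness so the applied DC field stays below the
    material's breakdown field by the given safety factor."""
    if dielectric_strength_v_per_m <= 0:
        raise ValueError("dielectric strength must be positive")
    return safety_factor * system_voltage_v / dielectric_strength_v_per_m

def breakdown_voltage_v(thickness_m, dielectric_strength_v_per_m):
    """Estimated breakdown voltage of an existing layer of given thickness."""
    return thickness_m * dielectric_strength_v_per_m

# Illustrative numbers: 3000 V DC system, encapsulant assumed at ~20 kV/mm (2e7 V/m)
t = min_thickness_m(3000.0, 2e7)       # 3.0e-4 m, i.e. 0.3 mm
v_bd = breakdown_voltage_v(t, 2e7)     # 6000 V, giving the 2x margin
```

The same two relations support both design directions mentioned in the abstract: sizing a candidate encapsulation system, or predicting the breakdown voltage of an existing module design.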
Thermal Effects Modeling Developed for Smart Structures
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
1998-01-01
Applying smart materials in aeropropulsion systems may improve the performance of aircraft engines through a variety of vibration, noise, and shape-control applications. To facilitate the experimental characterization of these smart structures, researchers have been focusing on developing analytical models to account for the coupled mechanical, electrical, and thermal response of these materials. One focus of current research efforts has been directed toward incorporating a comprehensive thermal analysis modeling capability. Typically, temperature affects the behavior of smart materials by three distinct mechanisms: (1) induction of thermal strains because of coefficient of thermal expansion mismatch; (2) pyroelectric effects on the piezoelectric elements; and (3) temperature-dependent changes in material properties. Previous analytical models only investigated the first two thermal effects mechanisms. However, since the material properties of piezoelectric materials generally vary greatly with temperature (see the graph), incorporating temperature-dependent material properties will significantly affect the structural deflections, sensory voltages, and stresses. Thus, the current analytical model captures thermal effects arising from all three mechanisms through thermopiezoelectric constitutive equations. These constitutive equations were incorporated into a layerwise laminate theory with the inherent capability to model both the active and sensory response of smart structures in thermal environments. Corresponding finite element equations were formulated and implemented for both beam and plate elements to provide a comprehensive thermal effects modeling capability.
Application of capability indices and control charts in the analytical method control strategy.
Oliva, Alexis; Llabres Martinez, Matías
2017-08-01
In this study, we assessed the usefulness of control charts in combination with the process capability indices, Cpm and Cpk, in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in control and stable. Different criteria were used to establish the specification limits (i.e., analyst requirements) for fixed method performance (i.e., method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real data from a size-exclusion chromatography (SEC) method with light-scattering detection were used as a model, while previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
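The capability indices and individuals-chart limits named in the abstract have standard textbook definitions, which can be sketched directly; the data and limits below are hypothetical and are not the paper's SEC results.

```python
import math

def cpk(mean, sd, lsl, usl):
    """Capability index accounting for centering: the smaller distance from
    the mean to a specification limit, over three standard deviations."""
    return min(usl - mean, mean - lsl) / (3.0 * sd)

def cpm(mean, sd, lsl, usl, target):
    """Taguchi capability index: penalizes deviation of the mean from the
    target value as well as spread."""
    return (usl - lsl) / (6.0 * math.sqrt(sd ** 2 + (mean - target) ** 2))

def individuals_limits(values):
    """X (individuals) chart limits from the average moving range; the
    constant 2.66 is 3/d2 with d2 = 1.128 for subgroups of size 2."""
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    center = sum(values) / len(values)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# A centered process whose spec width equals six sigma gives Cpm = Cpk = 1,
# the "capable, but not meeting minimum requirements" case the abstract notes.
c1 = cpk(100.0, 1.0, 97.0, 103.0)           # 1.0
c2 = cpm(100.0, 1.0, 97.0, 103.0, 100.0)    # 1.0
```

When the process mean sits on target, Cpm and Cpk coincide, which is why the abstract reports similar conclusions from both indices.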
Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique
NASA Technical Reports Server (NTRS)
Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann
2010-01-01
We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.
User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs
Joseph E. Horn; E. Lee Medema; Ervin G. Schuster
1986-01-01
CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....
Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik; ...
2017-10-06
A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier-Stokes equations in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying, in additional LES, the three relevant lengths: the canopy height (h), the canopy length, and the adjustment length (Lc). Even though the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow over the full canopy domain. Finally, the model is tested against an analytical model based on a linearization approach.
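For orientation, the adjustment length Lc that the abstract varies is commonly estimated from the canopy drag coefficient and plant area density, and one-dimensional in-canopy models of the kind the paper builds on often use an exponential mean wind profile. The sketch below shows those standard relations only; it is not the paper's fitted adjustment-region model, and all numbers are illustrative.

```python
import math

def adjustment_length(cd, pad):
    """Canopy adjustment length scale, Lc = 1 / (Cd * a), with drag
    coefficient Cd and plant area density a in m^2/m^3. A standard
    scaling estimate, not the parameters fitted in the paper."""
    return 1.0 / (cd * pad)

def canopy_profile(z, h, u_h, beta):
    """Inoue-type exponential mean wind profile inside a uniform canopy:
    u(z) = u_h * exp(beta * (z/h - 1)), valid for 0 <= z <= h, where u_h
    is the wind speed at canopy top and beta an attenuation coefficient."""
    return u_h * math.exp(beta * (z / h - 1.0))

lc = adjustment_length(cd=0.25, pad=0.5)                 # 8.0 m
u_mid = canopy_profile(z=10.0, h=20.0, u_h=2.0, beta=2.0)  # attenuated mid-canopy speed
```

The ratio h/Lc that the abstract identifies as the controlling parameter follows directly from these two quantities.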
NASA Technical Reports Server (NTRS)
Van Dresar, N. T.
1992-01-01
A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluids will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal gravity, followed by a discussion of pressurization and expulsion in low gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data; these data are required to guide analytical model development and to verify code performance.
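As a point of reference for what the validated 1-D codes refine, the zeroth-order pressurant requirement follows from the ideal-gas law alone. The sketch below states that baseline with hypothetical helium and tank numbers; the heat- and mass-transfer effects that drive real requirements above this value are exactly what the codes in the review model.

```python
def ideal_pressurant_mass(p_tank_pa, expelled_volume_m3, gas_temp_k, r_specific):
    """Zeroth-order pressurant mass from the ideal-gas law, m = p * V / (R * T).
    Real requirements exceed this estimate because of heat transfer and
    condensation at the gas/liquid interface, which the 1-D codes predict."""
    return p_tank_pa * expelled_volume_m3 / (r_specific * gas_temp_k)

# Illustrative case: helium pressurant (R = 2077 J/(kg K)) expelling 10 m^3
# of liquid at a 300 kPa tank pressure with gas arriving at 250 K.
m_he = ideal_pressurant_mass(3.0e5, 10.0, 250.0, 2077.0)   # about 5.8 kg
```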
Curved Thermopiezoelectric Shell Structures Modeled by Finite Element Analysis
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
2000-01-01
"Smart" structures composed of piezoelectric materials may significantly improve the performance of aeropropulsion systems through a variety of vibration, noise, and shape-control applications. The development of analytical models for piezoelectric smart structures is an ongoing, in-house activity at the NASA Glenn Research Center at Lewis Field focused toward the experimental characterization of these materials. Research efforts have been directed toward developing analytical models that account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. Current work revolves around implementing thermal effects into a curvilinear-shell finite element code. This enhances capabilities to analyze curved structures and to account for coupling effects arising from thermal effects and the curved geometry. The current analytical model implements a unique mixed multi-field laminate theory to improve computational efficiency without sacrificing accuracy. The mechanics can model both the sensory and active behavior of piezoelectric composite shell structures. Finite element equations are being implemented for an eight-node curvilinear shell element, and numerical studies are being conducted to demonstrate capabilities to model the response of curved piezoelectric composite structures (see the figure).
Nonlinear feedback control for high alpha flight
NASA Technical Reports Server (NTRS)
Stalford, Harold
1990-01-01
Analytical aerodynamic models are derived from a high-alpha 6-DOF wind tunnel model. One detailed model requires some interpolation between nonlinear functions of alpha. A second analytical model requires no interpolation and as such is a completely continuous model. Flight path optimization is conducted on the basic maneuvers: half-loop, 90-degree pitch-up, and level turn. The optimal control analysis uses the derived analytical model in the equations of motion and is based on both moment and force equations. The maximum principle solution for the half-loop is a poststall trajectory that performs the half-loop in 13.6 seconds. The agility provided by thrust vectoring capability had minimal effect on reducing the maneuver time. By means of thrust vectoring control, the 90-degree pitch-up maneuver can be executed in a small space over a short time interval; the agility afforded by thrust vectoring is quite beneficial for pitch-up maneuvers. The level turn results are currently based on only outer-layer solutions of singular perturbation theory. Poststall solutions provide high turn rates but incur greater energy losses than classical sustained solutions.
Electronic cooling design and test validation
NASA Astrophysics Data System (ADS)
Murtha, W. B.
1983-07-01
An analytical computer model has been used to design a counterflow air-cooled heat exchanger according to the cooling, structural, and geometric requirements of a U.S. Navy shipboard electronics cabinet, emphasizing high-reliability performance through the maintenance of electronic component junction temperatures below 110 C. Environmental testing of the resulting design has verified that the analytical predictions were conservative. Model correlation to the test data furnishes an upgraded capability for the evaluation of tactical effects and has established a two-orders-of-magnitude growth potential for increased electronics capabilities through enhanced heat dissipation. Electronics cabinets of this type are destined for use with Vertical Launching System-type combatant vessel magazines.
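The junction-temperature budget behind a design limit like the 110 C cited above can be illustrated with a simple series thermal-resistance chain; the resistances and power below are hypothetical, not the cabinet's measured values.

```python
def junction_temp(t_air_c, power_w, r_jc, r_ca):
    """Steady-state junction temperature through a series thermal-resistance
    chain: junction -> case (r_jc) -> cooling air (r_ca), resistances in C/W."""
    return t_air_c + power_w * (r_jc + r_ca)

# Hypothetical part: 5 W dissipated, 4 C/W junction-to-case, 10 C/W case-to-air,
# 40 C cooling air; the result lands exactly at a 110 C design limit.
tj = junction_temp(t_air_c=40.0, power_w=5.0, r_jc=4.0, r_ca=10.0)   # 110.0 C
```

Shrinking r_ca (better heat dissipation) is what buys the growth margin the abstract describes: the same junction limit then admits more dissipated power.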
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
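The entity-network idea in this abstract can be illustrated with a toy co-occurrence builder: two known entities mentioned in the same sentence get a weighted edge. This is only a sketch with an assumed gazetteer and invented sentences, not DAC's NLP pipeline, which uses trained extractors and machine-learning disambiguation rather than string matching.

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence_network(sentences, gazetteer):
    """Build a weighted entity network: an edge (a, b) counts the sentences
    in which both entities from the gazetteer appear."""
    edges = defaultdict(int)
    for sent in sentences:
        found = sorted({e for e in gazetteer if e in sent})
        for a, b in combinations(found, 2):
            edges[(a, b)] += 1
    return dict(edges)

# Invented example sentences and entity names, for illustration only.
sents = ["Alpha Group met Bravo Cell in Kabul.",
         "Bravo Cell moved supplies to Kabul."]
net = cooccurrence_network(sents, {"Alpha Group", "Bravo Cell", "Kabul"})
# e.g. the ("Bravo Cell", "Kabul") edge has weight 2
```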
NASA Technical Reports Server (NTRS)
Turner, E. R.; Wilson, M. D.; Hylton, L. D.; Kaufman, R. M.
1985-01-01
Progress in predictive design capabilities for external heat transfer to turbine vanes was summarized. A two dimensional linear cascade (previously used to obtain vane surface heat transfer distributions on nonfilm cooled airfoils) was used to examine the effect of leading edge shower head film cooling on downstream heat transfer. The data were used to develop and evaluate analytical models. Modifications to the two dimensional boundary layer model are described. The results were used to formulate and test an effective viscosity model capable of predicting heat transfer phenomena downstream of the leading edge film cooling array on both the suction and pressure surfaces, with and without mass injection.
Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation
NASA Astrophysics Data System (ADS)
Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.
2016-03-01
Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream and upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and United States using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack in network completeness, analytical capabilities or both. To address this limitation, we outline a general framework to build as complete as possible digital river networks and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.
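The analytical capabilities built on digital elevation models typically start from flow routing; the minimal, self-contained sketch below implements the standard D8 algorithm (steepest-descent routing plus flow accumulation), not the authors' virtual-watershed toolset.

```python
def d8_directions(dem):
    """D8 flow routing: each cell drains to its steepest-descent neighbor
    among the 8 adjacent cells; None marks pits and outlets. dem is a 2-D
    list of elevations; diagonal drops are scaled by distance."""
    rows, cols = len(dem), len(dem[0])
    flow = {}
    for r in range(rows):
        for c in range(cols):
            best, target = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr, dc) == (0, 0) or not (0 <= rr < rows and 0 <= cc < cols):
                        continue
                    drop = (dem[r][c] - dem[rr][cc]) / (dr * dr + dc * dc) ** 0.5
                    if drop > best:
                        best, target = drop, (rr, cc)
            flow[(r, c)] = target
    return flow

def accumulation(flow):
    """Number of cells draining through each cell (itself included). Walking
    downstream is safe because elevation strictly decreases along D8 links,
    so the flow graph has no cycles."""
    acc = {cell: 1 for cell in flow}
    for cell in flow:
        cur = flow[cell]
        while cur is not None:
            acc[cur] += 1
            cur = flow[cur]
    return acc

# A tiny DEM sloping toward the lower-right corner, which acts as the outlet.
dem = [[9.0, 8.0, 7.0],
       [8.0, 7.0, 6.0],
       [7.0, 6.0, 5.0]]
flow = d8_directions(dem)
acc = accumulation(flow)
```

On real networks this is exactly the step that attaches upstream contributing area to each river segment, one of the analytical capabilities the abstract asks of digital river networks.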
Analytical modeling of circuit aerodynamics in the new NASA Lewis wind tunnel
NASA Technical Reports Server (NTRS)
Towne, C. E.; Povinelli, L. A.; Kunik, W. G.; Muramoto, K. K.; Hughes, C. E.; Levy, R.
1985-01-01
Rehabilitation and extension of the capability of the Altitude Wind Tunnel (AWT) were analyzed. The analytical modeling program involves the use of advanced axisymmetric and three-dimensional viscous analyses to compute the flow through the various AWT components. Results for the analytical modeling of the high-speed leg aerodynamics are presented; these include an evaluation of the flow quality at the entrance to the test section, an investigation of the effects of test section bleed for different model blockages, and an examination of three-dimensional effects in the diffuser due to reentry flow and due to the change in cross-sectional shape of the exhaust scoop.
2011-11-30
Detection of fatigue damage at an early stage, well before the onset of fracture and crack development. Analytical and numerical models of MEAS and MMI are suggested, and MEAS capability for far-field crack detection is addressed.
Analytical methods for the development of Reynolds stress closures in turbulence
NASA Technical Reports Server (NTRS)
Speziale, Charles G.
1990-01-01
Analytical methods for the development of Reynolds stress models in turbulence are reviewed in detail. Zero, one and two equation models are discussed along with second-order closures. A strong case is made for the superior predictive capabilities of second-order closure models in comparison to the simpler models. The central points are illustrated by examples from both homogeneous and inhomogeneous turbulence. A discussion of the author's views concerning the progress made in Reynolds stress modeling is also provided along with a brief history of the subject.
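The gap between the two-equation models and the second-order closures reviewed here centers on the eddy-viscosity hypothesis; the standard k-epsilon relation can be written down directly. The sketch below uses the textbook constant and arbitrary test values, and is not specific to the closures developed in this review.

```python
def eddy_viscosity_k_epsilon(k, eps, c_mu=0.09):
    """Eddy viscosity of the standard k-epsilon model, nu_t = C_mu * k^2 / eps.
    Second-order closures drop this scalar relation and instead carry modeled
    transport equations for each Reynolds stress component."""
    return c_mu * k * k / eps

def boussinesq_shear_stress(nu_t, dudy):
    """Boussinesq approximation for the turbulent shear stress in a thin
    shear layer: -<u'v'> = nu_t * dU/dy."""
    return nu_t * dudy

nu_t = eddy_viscosity_k_epsilon(k=0.5, eps=0.9)   # 0.025 with arbitrary k, eps
tau = boussinesq_shear_stress(nu_t, dudy=10.0)    # 0.25
```

The limitations of this linear stress-strain relation in flows with rotation, curvature, and anisotropy are precisely what motivates the second-order closures the review argues for.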
2005-04-01
RTO-MP-SAS-055 4 - 1 UNCLASSIFIED/UNLIMITED UNCLASSIFIED/UNLIMITED Analytical Support Capabilities of Turkish General Staff Scientific...the end failed to achieve anything commensurate with the effort. The analytical support capabilities of Turkish Scientific Decision Support Center to...percent of the İpekkan, Z.; Özkil, A. (2005) Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to
Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders
NASA Technical Reports Server (NTRS)
Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)
2002-01-01
A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
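For context on the analytical side, the classical critical buckling stress of a thin axially compressed cylinder has a closed form; real shells buckle below it, which is why design practice applies empirical knockdown factors and why nonlinear FEA of the kind described above is needed for accurate prediction. The values below are hypothetical steel-like numbers, not the RSRM geometry or the paper's results.

```python
import math

def classical_axial_buckling_stress(e_mod, t, r, nu=0.3):
    """Classical critical axial buckling stress of a thin cylinder:
    sigma_cr = E * t / (r * sqrt(3 * (1 - nu^2))), about 0.605 * E * t / r
    for nu = 0.3. Imperfection-sensitive shells buckle below this value."""
    return e_mod * t / (r * math.sqrt(3.0 * (1.0 - nu * nu)))

# Hypothetical steel-like case: E = 200 GPa, t = 12.7 mm, r = 1.83 m
sigma_cr = classical_axial_buckling_stress(200e9, 0.0127, 1.83)   # ~0.84 GPa
```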
An Analytical-Numerical Model for Two-Phase Slug Flow through a Sudden Area Change in Microchannels
Momen, A. Mehdizadeh; Sherif, S. A.; Lear, W. E.
2016-01-01
In this article, two new analytical models have been developed to calculate two-phase slug flow pressure drop in microchannels through a sudden contraction. Even though many studies have been reported on two-phase flow in microchannels, considerable discrepancies still exist, mainly due to the difficulties in experimental setup and measurements. Numerical simulations were performed to support the new analytical models and to explore in more detail the physics of the flow in microchannels with a sudden contraction. Both analytical and numerical results were compared to the available experimental data and other empirical correlations. Results show that the models, which were developed based on the slug and semi-slug assumptions, agree well with experiments in microchannels. Moreover, in contrast to previous empirical correlations, which were tuned for a specific geometry, the new analytical models are capable of taking geometrical parameters as well as flow conditions into account.
DOT National Transportation Integrated Search
1982-01-01
The Detailed Station Model (DSM) provides operational and performance measures of alternative station configurations and management policies with respect to vehicle and passenger capabilities. It provides an analytic tool to support tradeoff studies ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Uddin, Md Wasi
2015-08-24
This paper presents a nonlinear analytical model of a novel double-sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets, stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry, which makes it a good alternative for evaluating prospective designs of TFMs compared to finite element solvers that are numerically intensive and require more computation time. A single-phase, 1-kW, 400-rpm machine is analytically modeled, and its resulting flux distribution, no-load EMF, and torque are verified with finite element analysis. The results are found to be in agreement, with less than 5% error, while reducing the computation time by a factor of 25.
Analytical Modeling of a Novel Transverse Flux Machine for Direct Drive Wind Turbine Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Uddin, Md Wasi
2015-09-02
This paper presents a nonlinear analytical model of a novel double-sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets (PM), stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry, which makes it a good alternative for evaluating prospective designs of TFMs compared to finite element solvers, which are numerically intensive and require more computation time. A single-phase, 1-kW, 400-rpm machine is analytically modeled, and its resulting flux distribution, no-load EMF, and torque are verified with Finite Element Analysis (FEA). The results are found to be in agreement, with less than 5% error, while reducing the computation time by a factor of 25.
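As a rough illustration of the magnetic-equivalent-circuit idea this abstract describes, the sketch below assembles series and parallel flux-tube reluctances and solves for the flux driven by a magnetomotive force. All geometry, permeability, and MMF values are invented placeholders, not the machine parameters from the paper, and magnetic saturation is ignored.

```python
# Minimal MEC sketch: flux tubes as reluctances, combined in
# series/parallel, then flux = MMF / R_total. Values are illustrative.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def reluctance(length, area, mu_r=1.0):
    """Reluctance of a uniform flux tube: R = l / (mu0 * mu_r * A)."""
    return length / (MU0 * mu_r * area)

def series(*rs):
    return sum(rs)

def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

# Hypothetical flux path: stator core -> two parallel air-gap
# branches -> rotor core (all dimensions assumed).
r_stator = reluctance(0.10, 4e-4, mu_r=2000)
r_rotor = reluctance(0.08, 4e-4, mu_r=2000)
r_gap = parallel(reluctance(1e-3, 2e-4), reluctance(1e-3, 2e-4))

mmf = 500.0  # ampere-turns from the magnets/winding (assumed)
flux = mmf / series(r_stator, r_gap, r_rotor)
print(f"flux = {flux:.2e} Wb")
```

The appeal of the approach is visible even in this toy: evaluating a reluctance network is a few arithmetic operations, whereas an FEA solve of the same path is a full field computation.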
Manufacturing data analytics using a virtual factory representation.
Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun
2017-01-01
Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories by including multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.
Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems
NASA Technical Reports Server (NTRS)
Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.
1992-01-01
The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
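The probabilistic evaluation described above can be illustrated with a toy detectability matrix: entry d[i][j] is the probability that check j detects a fault in component i. All numbers below are invented, and the checks are assumed independent, which is a simplification of the full matrix-based model.

```python
# Illustrative probabilistic check-coverage computation.
# d[i][j]: probability that check j detects a fault in component i
# (invented values). Assuming independent checks, a fault in
# component i escapes all checks with probability prod_j (1 - d[i][j]).

d = [
    [0.9, 0.5, 0.0],  # component 0
    [0.0, 0.8, 0.7],  # component 1
    [0.6, 0.0, 0.9],  # component 2
]
fault_prob = [0.5, 0.3, 0.2]  # relative likelihood of a fault per component

def detection_coverage(d, fault_prob):
    """Expected probability that a random fault is caught by some check."""
    cov = 0.0
    for i, row in enumerate(d):
        escape = 1.0
        for p in row:
            escape *= 1.0 - p
        cov += fault_prob[i] * (1.0 - escape)
    return cov

print(f"expected detection coverage = {detection_coverage(d, fault_prob):.3f}")
```

A worst-case deterministic evaluation would treat any d[i][j] < 1 as a missed fault; the probabilistic view instead yields an expected coverage, which is the contrast the abstract draws.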
Description of a Generalized Analytical Model for the Micro-dosimeter Response
NASA Technical Reports Server (NTRS)
Badavi, Francis F.; Stewart-Sloan, Charlotte R.; Xapsos, Michael A.; Shinn, Judy L.; Wilson, John W.; Hunter, Abigail
2007-01-01
An analytical prediction capability for space radiation in Low Earth Orbit (LEO), correlated with the Space Transportation System (STS) Shuttle Tissue Equivalent Proportional Counter (TEPC) measurements, is presented. The model takes into consideration the energy loss straggling and chord length distribution of the TEPC detector, and is capable of predicting energy deposition fluctuations in a micro-volume by incoming ions through both direct and indirect ionic events. The charged particle transport calculations correlated with STS 56, 51, 110 and 114 flights are accomplished by utilizing the most recent version (2005) of the Langley Research Center (LaRC) deterministic ionized particle transport code High charge (Z) and Energy TRaNsport (HZETRN), which has been extensively validated with laboratory beam measurements and available space flight data. The agreement between the TEPC model prediction (response function) and the TEPC-measured differential and integral spectra in the lineal energy (y) domain is promising.
Performance Models for the Spike Banded Linear System Solver
Manguoglu, Murat; Saied, Faisal; Sameh, Ahmed; ...
2011-01-01
With the availability of large-scale parallel platforms comprising tens of thousands of processors and beyond, there is significant impetus for the development of scalable parallel sparse linear system solvers and preconditioners. An integral part of this design process is the development of performance models capable of predicting performance and providing accurate cost models for the solvers and preconditioners. There has been some work in the past on characterizing the performance of the iterative solvers themselves. In this paper, we investigate the problem of characterizing the performance and scalability of banded preconditioners. Recent work has demonstrated the superior convergence properties and robustness of banded preconditioners, compared to state-of-the-art ILU-family preconditioners as well as algebraic multigrid preconditioners. Furthermore, when used in conjunction with efficient banded solvers, banded preconditioners are capable of significantly faster time-to-solution. Our banded solver, the Truncated Spike algorithm, is specifically designed for parallel performance and tolerance to deep memory hierarchies. Its regular structure is also highly amenable to accurate performance characterization. Using these characteristics, we derive the following results in this paper: (i) we develop parallel formulations of the Truncated Spike solver, (ii) we develop a highly accurate pseudo-analytical parallel performance model for our solver, and (iii) we show the excellent prediction capabilities of our model, based on which we argue the high scalability of our solver. Our pseudo-analytical performance model is based on analytical performance characterization of each phase of our solver. These analytical models are then parameterized using actual runtime information on target platforms. An important consequence of our performance models is that they reveal underlying performance bottlenecks in both serial and parallel formulations.
All of our results are validated on diverse heterogeneous multiclusters, platforms for which performance prediction is particularly challenging. Finally, we predict the scalability of the Spike algorithm up to 65,536 cores with our model. In this paper we extend the results presented at the Ninth International Symposium on Parallel and Distributed Computing.
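A pseudo-analytical performance model of the kind described above has an analytical form whose coefficients are calibrated from measured runtimes. The sketch below fits a simple compute-plus-communication model to synthetic timings and extrapolates to an unmeasured core count; the model terms, timings, and coefficient grid are all invented for illustration and do not reproduce the paper's actual model.

```python
# Hedged sketch: calibrate T(p) = a*n/p + b*log2(p) + c
# (compute term + communication term + fixed overhead) against
# measured timings, then extrapolate. All numbers are synthetic.
import math

def model(p, n, a, b, c):
    return a * n / p + b * math.log2(p) + c

n = 1_000_000  # problem size (assumed)
measured = {1: 10.2, 2: 5.4, 4: 3.0, 8: 1.9, 16: 1.4}  # cores -> seconds

# Brute-force least-squares over a small coefficient grid
# (a real calibration would use a proper regression).
best = None
for a in (8e-6, 9e-6, 1e-5, 1.1e-5, 1.2e-5):
    for b in (0.05, 0.1, 0.15, 0.2):
        for c in (0.0, 0.1, 0.2, 0.3):
            sse = sum((model(p, n, a, b, c) - t) ** 2
                      for p, t in measured.items())
            if best is None or sse < best[0]:
                best = (sse, a, b, c)

sse, a, b, c = best
print(f"fit: a={a:.1e}, b={b}, c={c}, sse={sse:.3f}")
# Extrapolate beyond the measured range, as the paper does to 65,536 cores:
print(f"predicted T(64) = {model(64, n, a, b, c):.2f} s")
```

The communication term growing with log2(p) while the compute term shrinks as n/p is what lets such a model expose the scalability limit: past some p the predicted runtime stops improving.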
ERIC Educational Resources Information Center
Marceau, Kristine; Ram, Nilam; Houts, Renate M.; Grimm, Kevin J.; Susman, Elizabeth J.
2011-01-01
Pubertal development is a nonlinear process progressing from prepubescent beginnings through biological, physical, and psychological changes to full sexual maturity. To tether theoretical concepts of puberty with sophisticated longitudinal, analytical models capable of articulating pubertal development more accurately, we used nonlinear…
Analytical stability and simulation response study for a coupled two-body system
NASA Technical Reports Server (NTRS)
Tao, K. M.; Roberts, J. R.
1975-01-01
An analytical stability study and a digital simulation response study of two connected rigid bodies are documented. Relative rotation of the bodies at the connection is allowed, thereby providing a model suitable for studying system stability and response during a soft-dock regime. Provisions are made for a docking-port axis-alignment torque and for a despin torque capability to handle spinning payloads. Although the stability analysis is based on linearized equations, the digital simulation is based on nonlinear models.
Experimental investigation of elastic mode control on a model of a transport aircraft
NASA Technical Reports Server (NTRS)
Abramovitz, M.; Heimbaugh, R. M.; Nomura, J. K.; Pearson, R. M.; Shirley, W. A.; Stringham, R. H.; Tescher, E. L.; Zoock, I. E.
1981-01-01
A 4.5 percent DC-10 derivative flexible model with active controls is fabricated, developed, and tested to investigate the ability to suppress flutter and reduce gust loads with actively controlled surfaces. The model is analyzed and tested in both semispan and complete-model configurations. Analytical methods are refined, and control laws are developed and successfully tested on both versions of the model. A 15 to 25 percent increase in flutter speed due to the active system is demonstrated. The capability of an active control system to significantly reduce wing bending moments due to turbulence is also demonstrated. Good correlation is obtained between test results and analytical predictions.
Productivity and injectivity of horizontal wells. Quarterly report, October 1--December 31, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fayers, F.J.; Aziz, K.; Hewett, T.A.
1993-03-10
A number of activities have been carried out in the last three months. A list outlining these efforts is presented below, followed by a brief description of each activity in the subsequent sections of this report. Progress is being made on the development of a black oil three-phase simulator which will allow the use of a generalized Voronoi grid in the plane perpendicular to a horizontal well. The available analytical solutions in the literature for calculating productivity indices (Inflow Performance) of horizontal wells have been reviewed. The pseudo-steady-state analytic model of Goode and Kuchuk has been applied to an example problem. A general mechanistic two-phase flow model is under development. The model is capable of predicting flow-transition boundaries for a pipe at any inclination angle. It also has the capability of determining pressure drops and holdups for all the flow regimes. A large code incorporating all the features of the model has been programmed and is currently being tested.
On the pursuit of a nuclear development capability: The case of the Cuban nuclear program
NASA Astrophysics Data System (ADS)
Benjamin-Alvarado, Jonathan Calvert
1998-09-01
While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and analyzes the policy in response to three models of modernization offered to explain the trajectory of policy development. These different approaches are the politically motivated modernization model, the economic and technological modernization model and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model provides expected behaviors to external stimuli that would result in specific policy responses. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. 
Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically-motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.
Attitude Determination Error Analysis System (ADEAS) mathematical specifications document
NASA Technical Reports Server (NTRS)
Nicholson, Mark; Markley, F.; Seidewitz, E.
1988-01-01
The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.
Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.
Endert, A; Fiaux, P; North, C
2012-12-01
Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
Micromechanics Analysis Code (MAC) User Guide: Version 1.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1994-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is described for the recently developed, computationally efficient and comprehensive micromechanics analysis code, MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells, GMC, micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed; (2) different integration algorithms may be selected; (3) a variety of constituent constitutive models may be utilized and/or implemented; and (4) a variety of fiber architectures may be easily accessed through their corresponding representative volume elements.
Micromechanics Analysis Code (MAC). User Guide: Version 2.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1996-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is described for the recently developed, computationally efficient and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, (2) different integration algorithms may be selected, (3) a variety of constituent constitutive models may be utilized and/or implemented, and (4) a variety of fiber and laminate architectures may be easily accessed through their corresponding representative volume elements.
A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes
Ma, Xin; Shen, Jianping
2017-01-01
The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094
The Synergistic Engineering Environment
NASA Technical Reports Server (NTRS)
Cruz, Jonathan
2006-01-01
The Synergistic Engineering Environment (SEE) is a system of software dedicated to aiding the understanding of space mission operations. The SEE can integrate disparate sets of data with analytical capabilities, geometric models of spacecraft, and a visualization environment, all contributing to the creation of an interactive simulation of spacecraft. Initially designed to satisfy needs pertaining to the International Space Station, the SEE has been broadened in scope to include spacecraft ranging from those in low orbit around the Earth to those on deep-space missions. The SEE includes analytical capabilities in rigid-body dynamics, kinematics, orbital mechanics, and payload operations. These capabilities enable a user to perform real-time interactive engineering analyses focusing on diverse aspects of operations, including flight attitudes and maneuvers, docking of visiting spacecraft, robotic operations, impingement of spacecraft-engine exhaust plumes, obscuration of instrumentation fields of view, communications, and alternative assembly configurations.
Satellite attitude motion models for capture and retrieval investigations
NASA Technical Reports Server (NTRS)
Cochran, John E., Jr.; Lahr, Brian S.
1986-01-01
The primary purpose of this research is to provide mathematical models which may be used in the investigation of various aspects of the remote capture and retrieval of uncontrolled satellites. Emphasis has been placed on analytical models; however, to verify analytical solutions, numerical integration must be used. Also, for satellites of certain types, numerical integration may be the only practical or perhaps the only possible method of solution. First, to provide a basis for analytical and numerical work, uncontrolled satellites were categorized using criteria based on: (1) orbital motions, (2) external angular momenta, (3) internal angular momenta, (4) physical characteristics, and (5) the stability of their equilibrium states. Several analytical solutions for the attitude motions of satellite models were compiled, checked, corrected in some minor respects and their short-term prediction capabilities were investigated. Single-rigid-body, dual-spin and multi-rotor configurations are treated. To verify the analytical models and to see how the true motion of a satellite which is acted upon by environmental torques differs from its corresponding torque-free motion, a numerical simulation code was developed. This code contains a relatively general satellite model and models for gravity-gradient and aerodynamic torques. The spacecraft physical model for the code and the equations of motion are given. The two environmental torque models are described.
NASA Astrophysics Data System (ADS)
Pan, Jun-Yang; Xie, Yi
2015-02-01
With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends some previous works by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model is suited to an onboard computer, whose capability to perform calculations is limited. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station, and the spacecraft dominate the relativistic corrections in the model.
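The leading-order part of such a proper-time model can be sketched as follows. The fractional rate offset between proper time and coordinate time is approximately dτ/dt − 1 ≈ −U/c² − v²/(2c²), where U is the gravitational potential at the clock and v its coordinate speed. The orbit values below are illustrative, not those of Yinghuo-1, and the higher-order and signal-propagation effects the paper treats are omitted.

```python
# Leading-order proper-time rate offset for an orbiting clock:
# dtau/dt - 1 ~= -(mu/r + v^2/2) / c^2   (circular orbit, one body).
# Orbit parameters are assumed for illustration.
MU_MARS = 4.2828e13  # Mars gravitational parameter (m^3/s^2)
C = 299792458.0      # speed of light (m/s)

def rate_offset(r, v, mu):
    """Fractional rate offset (dtau/dt - 1) at radius r and speed v."""
    return -(mu / r + 0.5 * v * v) / C**2

r = 4.0e6                 # orbital radius (m), assumed
v = (MU_MARS / r) ** 0.5  # circular-orbit speed from v^2 = mu/r
off = rate_offset(r, v, MU_MARS)
print(f"fractional offset = {off:.3e}")
print(f"accumulated drift over one day = {off * 86400:.3e} s")
```

Even this crude estimate shows why the correction matters for deep space navigation: a fractional offset of order 10⁻¹⁰ accumulates to tens of microseconds per day, far above the timing accuracy ranging systems require.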
Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery
NASA Astrophysics Data System (ADS)
Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.
2017-12-01
Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.
Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.
Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim
2016-04-01
Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well suited to these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation, and composite-materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location in the simplified AF. They also demonstrate that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs and represents distinctive groundwork able to sustain future refinements. This paper suggests important features that may be included to improve model realism.
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
2001-01-01
Analytical formulations are developed to account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. The coupled response is captured at the material level through the thermopiezoelectric constitutive equations and leads to the inherent capability to model both the sensory and active responses of piezoelectric materials. A layerwise laminate theory is incorporated to provide more accurate analysis of the displacements, strains, stresses, electric fields, and thermal fields through-the-thickness. Thermal effects which arise from coefficient of thermal expansion mismatch, pyroelectric effects, and temperature dependent material properties are explicitly accounted for in the formulation. Corresponding finite element formulations are developed for piezoelectric beam, plate, and shell elements to provide a more generalized capability for the analysis of arbitrary piezoelectric composite structures. The accuracy of the current formulation is verified with comparisons from published experimental data and other analytical models. Additional numerical studies are also conducted to demonstrate additional capabilities of the formulation to represent the sensory and active behaviors. A future plan of experimental studies is provided to characterize the high temperature dynamic response of piezoelectric composite materials.
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)
Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain. PMID:29202125
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).
Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
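To make the GPR features concrete, here is a minimal, self-contained sketch of GPR prediction with an RBF kernel that returns both a mean and a variance, the basis for the confidence bounds PMML 4.3 can represent. It uses invented training data and a plain Gaussian elimination solve rather than any PMML tooling, so it is an illustration of the algorithm, not of the standard's serialization.

```python
# Minimal GPR predictive mean and variance with an RBF kernel.
# Training data and hyperparameters are invented for illustration.
import math

def rbf(x1, x2, length=1.0, variance=1.0):
    return variance * math.exp(-0.5 * (x1 - x2) ** 2 / length ** 2)

def solve(a_mat, b_vec):
    """Gaussian elimination for a small positive-definite system."""
    n = len(b_vec)
    a = [row[:] for row in a_mat]
    b = b_vec[:]
    for i in range(n):
        for j in range(i + 1, n):
            f = a[j][i] / a[i][i]
            for k in range(i, n):
                a[j][k] -= f * a[i][k]
            b[j] -= f * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][k] * x[k] for k in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

def gpr_predict(xs, ys, x_star, noise=1e-6):
    """Predictive mean and variance at x_star given training (xs, ys)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    k_star = [rbf(x_star, xi) for xi in xs]
    alpha = solve(K, ys)                       # K^-1 y
    mean = sum(k_star[i] * alpha[i] for i in range(n))
    v = solve(K, k_star)                       # K^-1 k_star
    var = rbf(x_star, x_star) - sum(k_star[i] * v[i] for i in range(n))
    return mean, max(var, 0.0)

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]
mean, var = gpr_predict(xs, ys, 1.0)
print(f"mean={mean:.3f}, 95% bound=+/-{1.96 * math.sqrt(var):.3f}")
```

Near the training points the variance collapses toward zero, and far from them it grows back toward the prior variance, which is exactly the uncertainty behavior the confidence-bound feature is meant to capture.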
NASA Technical Reports Server (NTRS)
Bienart, W. B.
1973-01-01
The objective of this program was to investigate analytically and experimentally the performance of heat pipes with composite wicks--specifically, those having pedestal arteries and screwthread circumferential grooves. An analytical model was developed to describe the effects of screwthreads and screen secondary wicks on the transport capability of the artery. The model describes the hydrodynamics of the circumferential flow in triangular grooves with azimuthally varying capillary menisci and liquid cross-sections. Normalized results were obtained which give the influence of evaporator heat flux on the axial heat transport capability of the arterial wick. In order to evaluate the priming behavior of composite wicks under actual load conditions, an 'inverted' glass heat pipe was designed and constructed. The results obtained from the analysis and from the tests with the glass heat pipe were applied to the OAO-C Level 5 heat pipe, and an improved correlation between predicted and measured evaporator and transport performance was obtained.
Trace level detection of analytes using artificial olfactometry
NASA Technical Reports Server (NTRS)
Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor); Wong, Bernard (Inventor)
2002-01-01
The present invention provides a device for detecting the presence of an analyte, such as for example, a lightweight device, including: a sample chamber having a fluid inlet port for the influx of the analyte; a fluid concentrator in flow communication with the sample chamber wherein the fluid concentrator has an absorbent material capable of absorbing the analyte and capable of desorbing a concentrated analyte; and an array of sensors in fluid communication with the concentrated analyte to be released from the fluid concentrator.
A conceptual snow model with an analytic resolution of the heat and phase change equations
NASA Astrophysics Data System (ADS)
Riboust, Philippe; Le Moine, Nicolas; Thirel, Guillaume; Ribstein, Pierre
2017-04-01
Compared to degree-day snow models, physically-based snow models resolve more processes in an attempt to achieve a better representation of reality. Often these physically-based models resolve the heat transport equations in snow using a vertical discretization of the snowpack. The snowpack is decomposed into several layers in which the mechanical and thermal states of the snow are calculated. A higher number of layers in the snowpack allows for better accuracy but also tends to increase the computational cost. In order to develop a snow model that estimates the temperature profile of snow with a lower computational cost, we used an analytical decomposition of the vertical profile using eigenfunctions (i.e. trigonometric functions adapted to the specific boundary conditions). The mass transfer of snow melt has also been estimated using an analytical conceptualization of runoff fingering and matrix flow. As external meteorological forcing, the model uses solar and atmospheric radiation, air temperature, atmospheric humidity and precipitation. It has been tested and calibrated at point scale at two different stations in the Alps: Col de Porte (France, 1325 m) and Weissfluhjoch (Switzerland, 2540 m). A sensitivity analysis of model parameters and model inputs will be presented together with a comparison with measured snow surface temperature, SWE, snow depth, temperature profile and snow melt data. The snow model is created in order to be ultimately coupled with hydrological models for rainfall-runoff modeling in mountainous areas. We hope to create a model faster than physically-based models but capable of estimating more physical processes than degree-day snow models. This should help to build a more reliable snow model capable of being easily calibrated by remote sensing and in situ observation, or of assimilating these data for forecasting purposes.
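The analytical eigenfunction decomposition mentioned above can be illustrated, under strong simplifying assumptions (uniform initial temperature, fixed zero-temperature boundaries, generic sine eigenfunctions rather than the model's own boundary-adapted ones), by the classical series solution of the 1-D heat equation:

```python
import numpy as np

def slab_temperature(x, t, T0=1.0, L=1.0, alpha=1e-6, n_modes=50):
    """Analytic series solution of dT/dt = alpha * d2T/dx2 on [0, L]
    with T(0, t) = T(L, t) = 0 and uniform initial temperature T0.
    Each eigenfunction sin(n*pi*x/L) decays as exp(-alpha*(n*pi/L)^2 * t),
    so no time stepping or vertical discretization is needed."""
    T = np.zeros_like(np.asarray(x, dtype=float))
    for n in range(1, n_modes + 1):
        bn = 2.0 * T0 * (1.0 - (-1) ** n) / (n * np.pi)  # Fourier sine coefficient
        T += bn * np.sin(n * np.pi * x / L) * np.exp(-alpha * (n * np.pi / L) ** 2 * t)
    return T

x = np.linspace(0.0, 1.0, 101)
T_initial = slab_temperature(x, t=0.0)
T_later = slab_temperature(x, t=2.0e5)   # roughly two days later
```

Evaluating the series at any (x, t) is what makes such a model cheaper than a multi-layer finite-difference snowpack scheme.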
Vibrations and structureborne noise in space station
NASA Technical Reports Server (NTRS)
Vaicaitis, R.
1985-01-01
Theoretical models capable of predicting structural response and noise transmission due to random point mechanical loads were developed. Fiber-reinforced composite and aluminum materials were considered. Cylindrical shells and circular plates were taken as typical representatives of structural components for space station habitability modules. Analytical formulations include double-wall and single-wall constructions. Pressurized and unpressurized models were considered. Parametric studies were conducted to determine the effect on structural response and noise transmission of fiber orientation, point load location, damping in the core and the main load-carrying structure, pressurization, interior acoustic absorption, etc. These analytical models could serve as preliminary tools for assessing noise-related problems for space station applications.
Using landscape disturbance and succession models to support forest management
Eric J. Gustafson; Brian R. Sturtevant; Anatoly S. Shvidenko; Robert M. Scheller
2010-01-01
Managers of forested landscapes must account for multiple, interacting ecological processes operating at broad spatial and temporal scales. These interactions can be of such complexity that predictions of future forest ecosystem states are beyond the analytical capability of the human mind. Landscape disturbance and succession models (LDSM) are predictive and...
NASA Technical Reports Server (NTRS)
Pieper, Jerry L.; Walker, Richard E.
1993-01-01
During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability for each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.
Tavčar, Gregor; Katrašnik, Tomaž
2014-01-01
The parallel straight channel PEM fuel cell model presented in this paper extends the innovative hybrid 3D analytic-numerical (HAN) approach previously published by the authors with capabilities to address ternary diffusion systems and counter-flow configurations. The model's core principle is modelling species transport by obtaining a 2D analytic solution for species concentration distribution in the plane perpendicular to the channel gas-flow and coupling consecutive 2D solutions by means of a 1D numerical pipe-flow model. Electrochemical and other nonlinear phenomena are coupled to the species transport by a routine that uses derivative approximation with prediction-iteration. The latter is also the core of the counter-flow computation algorithm. A HAN model of a laboratory test fuel cell is presented and evaluated against a professional 3D CFD simulation tool, showing very good agreement between results of the presented model and those of the CFD simulation. Furthermore, high accuracy results are achieved at moderate computational times, which is owed to the semi-analytic nature and to the efficient computational coupling of electrochemical kinetics and species transport.
ERIC Educational Resources Information Center
Rowe, Jeremy; Razdan, Anshuman
The Partnership for Research in Spatial Modeling (PRISM) project at Arizona State University (ASU) developed modeling and analytic tools to respond to the limitations of two-dimensional (2D) data representations perceived by affiliated discipline scientists, and to take advantage of the enhanced capabilities of three-dimensional (3D) data that…
Service line analytics in the new era.
Spence, Jay; Seargeant, Dan
2015-08-01
To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: (1) updated service line definitions; (2) the ability to analyze and trend service line net patient revenues by payment source; (3) access to accurate service line cost information across multiple dimensions with drill-through capabilities; (4) the ability to redesign key reports based on changing requirements; and (5) clear assignment of accountability.
NASA Astrophysics Data System (ADS)
Ferrara, Alessandro; Polverino, Pierpaolo; Pianese, Cesare
2018-06-01
This paper proposes an analytical model of the water content of the electrolyte of a Proton Exchange Membrane Fuel Cell. The model is designed by accounting for several simplifying assumptions, which make the model suitable for on-board/online water management applications, while ensuring a good accuracy of the considered phenomena, with respect to advanced numerical solutions. The achieved analytical solution, expressing electrolyte water content, is compared with that obtained by means of a complex numerical approach, used to solve the same mathematical problem. The achieved results show that the mean error is below 5% for electrode water content values ranging from 2 to 15 (given as boundary conditions), and does not exceed 0.26% for electrode water content above 5. These results prove the capability of the solution to correctly model electrolyte water content at any operating condition, with a view to its incorporation into more complex frameworks (e.g., cell or stack models) for fuel cell simulation, monitoring, control, diagnosis and prognosis.
Multidimensional Data Modeling for Business Process Analysis
NASA Astrophysics Data System (ADS)
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Combining Modeling and Gaming for Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riensche, Roderick M.; Whitney, Paul D.
2012-08-22
Many of our most significant challenges involve people. While human behavior has long been studied, there are recent advances in computational modeling of human behavior. With advances in computational capabilities come increases in the volume and complexity of data that humans must understand in order to make sense of and capitalize on these modeling advances. Ultimately, models represent an encapsulation of human knowledge. One inherent challenge in modeling is efficient and accurate transfer of knowledge from humans to models, and subsequent retrieval. The simulated real-world environment of games presents one avenue for these knowledge transfers. In this paper we describe our approach of combining modeling and gaming disciplines to develop predictive capabilities, using formal models to inform game development, and using games to provide data for modeling.
Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng
2018-08-01
To characterize fatigue cracks, in the undersized stage in particular, preferably in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced owing to the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This has demonstrated accuracy of the developed analytical model in interpreting contact crack-induced CAN, and spotlighted its application to quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
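The contact acoustic nonlinearity (CAN) that the model above interprets can be illustrated with a toy bilinear ("breathing") stiffness: a crack that transmits compression fully but tension only partially distorts a probing harmonic wave and injects even harmonics. The sketch below uses hypothetical signal parameters and a scalar transfer function, not the paper's 2D modal-decomposition model:

```python
import numpy as np

fs = 100_000            # sampling rate, Hz (hypothetical)
f0 = 1_000              # probing wave frequency, Hz (hypothetical)
t = np.arange(0.0, 0.1, 1.0 / fs)
x = np.sin(2 * np.pi * f0 * t)          # incident harmonic wave

# Bilinear (breathing-crack) transfer: compression transmits fully,
# tension transmits with reduced amplitude -> asymmetric distortion.
y = np.where(x > 0, x, 0.5 * x)

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1.0 / fs)

def amplitude_at(f):
    """Spectral magnitude at the bin nearest frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

fundamental = amplitude_at(f0)
second_harmonic = amplitude_at(2 * f0)  # nonzero only because of the crack
```

The second harmonic appears only because of the sign-dependent transmission; a linear (open or fully closed) path would leave it at numerical zero, which is why CAN-based damage indices can in principle work without baseline signals.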
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R
Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic, metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including highest accuracy and precision with well defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME) and several other inter-laboratory round robin exercises to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.
Stream temperature investigations: field and analytic methods
Bartholow, J.M.
1989-01-01
Alternative public domain stream and reservoir temperature models are contrasted with SNTEMP. A distinction is made between steady-flow and dynamic-flow models and their respective capabilities. Regression models are offered as an alternative approach for some situations, with appropriate mathematical formulas suggested. Appendices provide information on State and Federal agencies that are good data sources, vendors for field instrumentation, and small computer programs useful in data reduction.
Analytical and Radiochemistry for Nuclear Forensics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott
Information about nonproliferation nuclear forensics, forensics activities at Los Alamos National Laboratory, radioanalytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy between the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps of 2020 and 2030 were created. The validation findings confirm that the integration of the CA-MC model with the FR model and employing the significant driving force of urban growth in the simulation process have resulted in the improved simulation capability of the CA-MC model. This study has provided a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
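Of the two validation metrics named above, the Kappa coefficient is easily computed from a pair of categorical rasters (simulated vs. actual land cover). A minimal sketch with hypothetical toy maps, not the Seremban data:

```python
import numpy as np

def kappa(map_a, map_b):
    """Cohen's kappa between two equally sized categorical rasters:
    agreement beyond what class frequencies alone would produce by chance."""
    a = np.asarray(map_a).ravel()
    b = np.asarray(map_b).ravel()
    classes = np.union1d(a, b)
    p_observed = np.mean(a == b)
    p_expected = sum(np.mean(a == c) * np.mean(b == c) for c in classes)
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical 4x4 rasters: 1 = urban, 0 = non-urban
actual    = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [1, 1, 1, 0], [0, 0, 0, 0]])
simulated = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0]])
k = kappa(actual, simulated)
```

A kappa near 1 indicates the simulated growth map reproduces the observed map well beyond chance agreement; values near 0 mean the match is no better than the class proportions would give.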
NASA Astrophysics Data System (ADS)
Márquez, Andrés; Francés, Jorge; Martínez, Francisco J.; Gallego, Sergi; Álvarez, Mariela L.; Calzado, Eva M.; Pascual, Inmaculada; Beléndez, Augusto
2018-03-01
Simplified analytical models with predictive capability enable simpler and faster optimization of the performance in applications of complex photonic devices. We recently demonstrated the most simplified analytical model still showing predictive capability for parallel-aligned liquid crystal on silicon (PA-LCoS) devices, which provides the voltage-dependent retardance for a very wide range of incidence angles and any wavelength in the visible. We further show that the proposed model is not only phenomenological but also physically meaningful, since two of its parameters provide the correct values for important internal properties of these devices related to the birefringence, cell gap, and director profile. Therefore, the proposed model can be used as a means to inspect internal physical properties of the cell. As an innovation, we also show the applicability of the split-field finite-difference time-domain (SF-FDTD) technique for phase-shift and retardance evaluation of PA-LCoS devices under oblique incidence. As a simplified model for PA-LCoS devices, we also consider the exact description of homogeneous birefringent slabs. However, we show that, despite its higher degree of simplification, the proposed model is more robust, providing unambiguous and physically meaningful solutions when fitting its parameters.
Technology requirements for large flexible space structures
NASA Technical Reports Server (NTRS)
Wada, B. K.; Freeland, R. E.; Garcia, N. F.
1983-01-01
Research, test, and demonstration experiments necessary for establishing a data base that will permit construction of large, lightweight flexible space structures meeting on-orbit pointing and surface precision criteria are discussed. Attention is focused on the wrap-rib proof-of-concept antenna structures developed from technology used on the ATS-6 satellite. The target structure will be up to 150 m in diameter, operate at RF levels, be amenable to packaging for carriage in the Shuttle bay, be capable of being ground-tested, and permit on-orbit deployment and retraction. Graphite/epoxy has been chosen as the antenna rib material, and the antenna mesh will be gold-plated Mo wire. A 55-m diam reflector was built as proof-of-concept with ground-test capability. Tests will proceed on components, a model, the entire structure, and in-flight. An analytical model has been formulated to characterize the antenna's thermal behavior. The flight test of the 55-m prototype in-orbit offers the chance to validate the analytical model and characterize the control, mechanical, and thermal characteristics of the antenna configuration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth; Engel, Dave; Star, Keith
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
Improvements to the APBS biomolecular solvation software suite.
Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A
2018-01-01
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
Contact and Impact Dynamic Modeling Capabilities of LS-DYNA for Fluid-Structure Interaction Problems
2010-12-02
[Abstract fragment] ... The kinematic free surface condition was used to determine the intersection between the free surface and the body in the outer flow domain ... and the results were compared with analytical and numerical predictions. The predictive capability of ALE and SPH features of LS-DYNA for simulation ...
ATLAS, an integrated structural analysis and design system. Volume 1: ATLAS user's guide
NASA Technical Reports Server (NTRS)
Dreisbach, R. L. (Editor)
1979-01-01
Some of the many analytical capabilities provided by the ATLAS Version 4.0 System are described in the logical sequence in which model-definition data are prepared and the subsequent computer job is executed. The example data presented and the fundamental technical considerations that are highlighted can be used as guides during the problem-solving process. This guide does not describe the details of the ATLAS capabilities, but introduces the new user to ATLAS at a level from which the complete array of capabilities described in the ATLAS User's Manual can be exploited fully.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
Actionable data analytics in oncology: are we there yet?
Barkley, Ronald; Greenapple, Rhonda; Whang, John
2014-03-01
To operate under a new value-based paradigm, oncology providers must develop the capability to aggregate, analyze, measure, and report their value proposition--that is, their outcomes and associated costs. How are oncology providers currently positioned to perform these functions in a manner that is actionable? What is the current state of analytic capabilities in oncology? Are oncology providers prepared? This line of inquiry was the basis for the 2013 Cancer Center Business Summit annual industry research survey. This article reports on the key findings and implications of the 2013 research survey with regard to data analytic capabilities in the oncology sector. The essential finding from the study is that only a small number of oncology providers (7%) currently possess the analytic tools and capabilities necessary to satisfy internal and external demands for aggregating and reporting clinical outcome and economic data. However, there is an expectation that a majority of oncology providers (60%) will have developed such capabilities within the next 2 years.
Calculation of ground vibration spectra from heavy military vehicles
NASA Astrophysics Data System (ADS)
Krylov, V. V.; Pickup, S.; McNuff, J.
2010-07-01
The demand for reliable autonomous systems capable of detecting and identifying heavy military vehicles has become an important issue for UN peacekeeping forces in the current delicate political climate. A promising method of detection and identification uses the information extracted from the ground vibration spectra generated by heavy military vehicles, often termed their seismic signatures. This paper presents the results of a theoretical investigation of ground vibration spectra generated by heavy military vehicles, such as tanks and armored personnel carriers. A simple quarter-car model is considered to identify the resulting dynamic forces applied by a vehicle to the ground. The obtained analytical expressions for vehicle dynamic forces are then used to calculate the generated ground vibrations, predominantly Rayleigh surface waves, using the Green's function method. A comparison of the obtained theoretical results with published experimental data shows that analytical techniques based on the simplified quarter-car vehicle model are capable of producing ground vibration spectra of heavy military vehicles that reproduce the basic properties of experimental spectra.
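A quarter-car model of the kind described can be sketched as a two-degree-of-freedom mass-spring-damper system excited by a road profile, with the tire force giving the dynamic load applied to the ground. The parameter values below are hypothetical placeholders for a heavy vehicle, not those of the paper:

```python
import numpy as np

# Hypothetical heavy-vehicle quarter-car parameters
ms, mu = 5000.0, 500.0      # sprung / unsprung mass, kg
ks, cs = 4.0e5, 2.0e4       # suspension stiffness (N/m) and damping (N s/m)
kt = 2.0e6                  # tire stiffness, N/m

def ground_force(road, dt):
    """Integrate the 2-DOF quarter-car equations with semi-implicit Euler
    and return the dynamic force the tire transmits to the ground."""
    zs = vs = zu = vu = 0.0          # sprung/unsprung displacement and velocity
    force = np.empty_like(road)
    for i, zr in enumerate(road):
        f_susp = ks * (zu - zs) + cs * (vu - vs)   # suspension force on body
        f_tire = kt * (zr - zu)                     # tire compression force
        vs += dt * f_susp / ms
        vu += dt * (f_tire - f_susp) / mu
        zs += dt * vs
        zu += dt * vu
        force[i] = -f_tire            # reaction applied to the ground
    return force

dt = 1e-4
t = np.arange(0.0, 5.0, dt)
road = 0.01 * np.sin(2 * np.pi * 2.0 * t)   # 1 cm undulations at 2 Hz
f = ground_force(road, dt)
```

The spectrum of this ground force is the input to the Green's-function step in the paper, which propagates it into Rayleigh-wave ground vibration spectra.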
Interaction Junk: User Interaction-Based Evaluation of Visual Analytic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; North, Chris
2012-10-14
With the growing need for visualization to aid users in understanding large, complex datasets, the ability for users to interact and explore these datasets is critical. As visual analytic systems have advanced to leverage powerful computational models and data analytics capabilities, the modes by which users engage and interact with the information are limited. Often, users are taxed with directly manipulating parameters of these models through traditional GUIs (e.g., using sliders to directly manipulate the value of a parameter). However, the purpose of user interaction in visual analytic systems is to enable visual data exploration, where users can focus on their task, as opposed to the tool or system. As a result, users can engage freely in data exploration and decision-making, for the purpose of gaining insight. In this position paper, we discuss how evaluating visual analytic systems can be approached through user interaction analysis, where the goal is to minimize the cognitive translation between the visual metaphor and the mode of interaction (i.e., reducing the "interaction junk"). We motivate this concept through a discussion of traditional GUIs used in visual analytics for direct manipulation of model parameters, and the importance of designing interactions that support visual data exploration.
Song, Hongjun; Wang, Yi; Pant, Kapil
2013-01-01
This paper presents an analytical study of the cross-stream diffusion of an analyte in a rectangular microchannel under combined electroosmotic flow (EOF) and pressure driven flow to investigate the heterogeneous transport behavior and spatially-dependent diffusion scaling law. An analytical model capable of accurately describing 3D steady-state convection-diffusion in microchannels with arbitrary aspect ratios is developed based on the assumption of the thin Electric Double Layer (EDL). The model is verified against high-fidelity numerical simulation in terms of flow velocity and analyte concentration profiles with excellent agreement (<0.5% relative error). An extensive parametric analysis is then undertaken to interrogate the effect of the combined flow velocity field on the transport behavior in both the positive pressure gradient (PPG) and negative pressure gradient (NPG) cases. For the first time, the evolution from the spindle-shaped concentration profile in the NPG case, via the stripe-shaped profile (pure EOF), and finally to the butterfly-shaped profile in the PPG case is obtained using the analytical model along with a quantitative depiction of the spatially-dependent diffusion layer thickness and scaling law across a wide range of the parameter space.
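The combined EOF and pressure-driven velocity field underlying these concentration profiles can be illustrated in a simpler parallel-plate geometry (the paper treats the full rectangular channel): under the thin-EDL assumption the EOF contributes a plug-like slip velocity, to which the pressure gradient adds a Poiseuille parabola whose sign distinguishes the PPG and NPG cases. A sketch with hypothetical parameter values:

```python
import numpy as np

def combined_velocity(y, h, u_eo, dpdx, mu):
    """Axial velocity between parallel plates at y = -h..+h:
    plug-like EOF slip (thin-EDL Helmholtz-Smoluchowski limit)
    plus the Poiseuille contribution of the axial pressure gradient dpdx."""
    return u_eo + (-dpdx) * (h**2 - y**2) / (2.0 * mu)

# Hypothetical microchannel parameters
h = 50e-6        # half channel height, m
mu = 1e-3        # dynamic viscosity, Pa s
u_eo = 1e-3      # electroosmotic slip velocity, m/s
y = np.linspace(-h, h, 201)

u_ppg = combined_velocity(y, h, u_eo, dpdx=-1e4, mu=mu)  # favorable gradient
u_npg = combined_velocity(y, h, u_eo, dpdx=+1e4, mu=mu)  # adverse gradient
```

In the PPG case the centerline runs faster than the walls, while a strong enough adverse gradient (NPG) reverses the core flow; it is this cross-stream velocity variation that shapes the spindle, stripe, and butterfly concentration profiles.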
Song, Hongjun; Wang, Yi; Pant, Kapil
2012-01-01
This paper presents an analytical study of the cross-stream diffusion of an analyte in a rectangular microchannel under combined electroosmotic flow (EOF) and pressure driven flow to investigate the heterogeneous transport behavior and spatially-dependent diffusion scaling law. An analytical model capable of accurately describing 3D steady-state convection-diffusion in microchannels with arbitrary aspect ratios is developed based on the assumption of the thin Electric Double Layer (EDL). The model is verified against high-fidelity numerical simulation in terms of flow velocity and analyte concentration profiles with excellent agreement (<0.5% relative error). An extensive parametric analysis is then undertaken to interrogate the effect of the combined flow velocity field on the transport behavior in both the positive pressure gradient (PPG) and negative pressure gradient (NPG) cases. For the first time, the evolution from the spindle-shaped concentration profile in the NPG case, via the stripe-shaped profile (pure EOF), and finally to the butterfly-shaped profile in the PPG case is obtained using the analytical model along with a quantitative depiction of the spatially-dependent diffusion layer thickness and scaling law across a wide range of the parameter space. PMID:23554584
Analytical skin friction and heat transfer formula for compressible internal flows
NASA Technical Reports Server (NTRS)
Dechant, Lawrence J.; Tattar, Marc J.
1994-01-01
An analytic, closed-form friction formula for turbulent, internal, compressible, fully developed flow was derived by extending the incompressible law-of-the-wall relation to compressible cases. The model can treat heat transfer as a function of constant surface temperature and surface roughness, as well as adiabatic conditions. The formula reduces to Prandtl's law of friction for adiabatic, smooth, axisymmetric flow, and to the Colebrook equation for incompressible, adiabatic, axisymmetric flow with various roughnesses. Comparisons with available experiments show that the model averages roughly 12.5 percent error for adiabatic flow and 18.5 percent error for flow involving heat transfer.
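The incompressible, adiabatic limit cited above is the Colebrook equation, which is implicit in the Darcy friction factor f and is conventionally solved by iteration. A minimal sketch of that limit (not the paper's compressible formula):

```python
import math

def colebrook(Re, rel_rough, tol=1e-12, max_iter=100):
    """Solve the Colebrook equation for the Darcy friction factor f:
        1/sqrt(f) = -2*log10(rel_rough/3.7 + 2.51/(Re*sqrt(f)))
    by fixed-point iteration on x = 1/sqrt(f)."""
    x = 7.0  # initial guess for 1/sqrt(f), i.e. f ~ 0.02
    for _ in range(max_iter):
        x_new = -2.0 * math.log10(rel_rough / 3.7 + 2.51 * x / Re)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return 1.0 / x**2

# Smooth pipe at Re = 1e5: recovers a value close to Prandtl's law (~0.018)
print(colebrook(1e5, 0.0))
```

With zero relative roughness the iteration reproduces the smooth-pipe (Prandtl) result, mirroring the limiting behavior the abstract describes.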
Chernetsova, Elena S; Revelsky, Alexander I; Morlock, Gertrud E
2011-08-30
The present study is a first step towards the unexplored capabilities of Direct Analysis in Real Time (DART) mass spectrometry (MS) arising from the possibility of desorption at an angle: scanning analysis of surfaces, including the coupling of thin-layer chromatography (TLC) with DART-MS, and more sensitive analysis owing to the preliminary concentration of analytes dissolved in large volumes of liquids on glass surfaces. In order to select the most favorable conditions for DART-MS analysis, proper positioning of samples is important. Therefore, a simple and inexpensive technique for visualizing the impact region of the DART gas stream on a substrate was developed. A filter paper or TLC plate, previously loaded with the analyte, was immersed in a derivatization solution. On this substrate, owing to the impact of the hot DART gas, reaction of the analyte to a colored product occurred. An improved detection capability of DART-MS for the analysis of liquids was demonstrated by applying large volumes of model solutions of coumaphos into small glass vessels and drying these solutions prior to DART-MS analysis under ambient conditions. This allowed quantities of analyte larger by more than two orders of magnitude to be introduced compared with conventional DART-MS analysis of liquids. Through this improved detectability, the capabilities of DART-MS in trace analysis could be strengthened. Copyright © 2011 John Wiley & Sons, Ltd.
Proactive Supply Chain Performance Management with Predictive Analytics
Stefanovic, Nenad
2014-01-01
Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605
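As a toy illustration of the KPI-prediction step, here is a plain least-squares fit standing in for the paper's data-mining models. The data and the volume/on-time-delivery relationship are hypothetical, invented purely for the sketch.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical history: weekly order volume vs. on-time-delivery KPI (%)
volume = [100, 120, 140, 160, 180]
on_time = [97.0, 95.5, 94.0, 92.5, 91.0]
a, b = fit_line(volume, on_time)
print(f"predicted KPI at volume 200: {a + b * 200:.1f}%")
```

A real deployment, as the abstract notes, would train proper data-mining models on warehouse data rather than a single-feature regression, but the projection-from-history idea is the same.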
Duct flow nonuniformities for Space Shuttle Main Engine (SSME)
NASA Technical Reports Server (NTRS)
1988-01-01
Analytical capabilities for modeling hot gas flow on the fuel side of the Space Shuttle Main Engines are developed. Emphasis is placed on construction and documentation of a computational grid code for modeling an elliptical two-duct version of the fuel side hot gas manifold. Computational results for flow past a support strut in an annular channel are also presented.
Verification of Modelica-Based Models with Analytical Solutions for Tritium Diffusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rader, Jordan D.; Greenwood, Michael Scott; Humrickhouse, Paul W.
2018-03-20
Here, tritium transport in metal and molten salt fluids combined with diffusion through high-temperature structural materials is an important phenomenon in both magnetic confinement fusion (MCF) and molten salt reactor (MSR) applications. For MCF, tritium is desirable to capture for fusion fuel. For MSRs, uncaptured tritium potentially can be released to the environment. In either application, quantifying the time- and space-dependent tritium concentration in the working fluid(s) and structural components is necessary. Whereas capability exists specifically for calculating tritium transport in such systems (e.g., using TMAP for fusion reactors), it is desirable to unify the calculation of tritium transport with other system variables such as dynamic fluid and structure temperature combined with control systems such as those that might be found in a system code. Some capability for radioactive trace substance transport exists in thermal-hydraulic systems codes (e.g., RELAP5-3D); however, this capability is not coupled to species diffusion through solids. Combined calculations of tritium transport and thermal-hydraulic solution have been demonstrated with TRIDENT but only for a specific type of MSR. Researchers at Oak Ridge National Laboratory have developed a set of Modelica-based dynamic system modeling tools called TRANsient Simulation Framework Of Reconfigurable Models (TRANSFORM) that were used previously to model advanced fission reactors and associated systems. In this system, the augmented TRANSFORM library includes dynamically coupled fluid and solid trace substance transport and diffusion. Results from simulations are compared against analytical solutions for verification.
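The verification pattern described above, comparing a transport simulation against a closed-form solution, can be sketched for 1D diffusion into a semi-infinite wall (Python in place of Modelica; all parameter values are illustrative, not from the report):

```python
import math

def analytic_c(x, t, D, c0=1.0):
    """Diffusion into a semi-infinite solid with the surface held at c0:
    c(x, t) = c0 * erfc(x / (2*sqrt(D*t)))."""
    return c0 * math.erfc(x / (2.0 * math.sqrt(D * t)))

def fd_c(x_query, t_end, D, c0=1.0, L=1.0, nx=201):
    """Explicit finite-difference solution of dc/dt = D*d2c/dx2, c(0, t) = c0."""
    dx = L / (nx - 1)
    dt = 0.4 * dx * dx / D               # stability requires dt <= 0.5*dx^2/D
    c = [0.0] * nx
    c[0] = c0
    for _ in range(round(t_end / dt)):
        new = c[:]
        for i in range(1, nx - 1):
            new[i] = c[i] + D * dt / dx**2 * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = new
    return c[round(x_query / dx)]

D, t = 1e-2, 1.0                          # diffusivity and time, illustrative
for x in (0.05, 0.1, 0.2):
    print(x, round(analytic_c(x, t, D), 4), round(fd_c(x, t, D), 4))
```

The two columns agreeing to a few parts in a thousand is exactly the kind of evidence the abstract's verification exercise produces for the TRANSFORM library.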
Study of tethered satellite active attitude control
NASA Technical Reports Server (NTRS)
Colombo, G.
1982-01-01
Existing software was adapted for the study of tethered subsatellite rotational dynamics. An analytic solution for a stable configuration of a tethered subsatellite was developed; the analytic and numerical integrator (computer) solutions for this test case were compared in a two-mass tether model program (DUMBEL); the existing multiple-mass tether model (SKYHOOK) was modified to include subsatellite rotational dynamics; the analytic test case was verified; and the use of the SKYHOOK rotational dynamics capability was demonstrated with a computer run showing the effect of a single off-axis thruster on the behavior of the subsatellite. Subroutines for specific attitude control systems are developed and applied to the study of the behavior of the tethered subsatellite under realistic on-orbit conditions. The effects of all tether inputs, including pendular oscillations, air drag, and electrodynamic interactions, on the dynamic behavior of the tether are included.
The role of mechanics during brain development
NASA Astrophysics Data System (ADS)
Budday, Silvia; Steinmann, Paul; Kuhl, Ellen
2014-12-01
Convolutions are a classical hallmark of most mammalian brains. Brain surface morphology is often associated with intelligence and closely correlated with neurological dysfunction. Yet, we know surprisingly little about the underlying mechanisms of cortical folding. Here we identify the role of the key anatomic players during the folding process: cortical thickness, stiffness, and growth. To establish estimates for the critical time, pressure, and the wavelength at the onset of folding, we derive an analytical model using the Föppl-von Kármán theory. Analytical modeling provides a quick first insight into the critical conditions at the onset of folding, yet it fails to predict the evolution of complex instability patterns in the post-critical regime. To predict realistic surface morphologies, we establish a computational model using the continuum theory of finite growth. Computational modeling not only confirms our analytical estimates, but is also capable of predicting the formation of complex surface morphologies with asymmetric patterns and secondary folds. Taken together, our analytical and computational models explain why larger mammalian brains tend to be more convoluted than smaller brains. Both models provide mechanistic interpretations of the classical malformations of lissencephaly and polymicrogyria. Understanding the process of cortical folding in the mammalian brain has direct implications on the diagnostics of neurological disorders including severe retardation, epilepsy, schizophrenia, and autism.
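The analytical estimate referred to above can be illustrated with the classical stiff-film-on-soft-substrate wrinkling wavelength, a textbook relation consistent with Föppl-von Kármán plate analyses. The cortical thickness and stiffness ratio below are assumed for illustration, not taken from the paper.

```python
import math

def critical_wavelength(t, Ef, Es, nu_f=0.45, nu_s=0.45):
    """Classical film-on-substrate wrinkling estimate:
        lambda = 2*pi*t * (Ef_bar / (3*Es_bar))**(1/3),
    with plane-strain moduli E_bar = E / (1 - nu^2)."""
    Ef_bar = Ef / (1 - nu_f**2)
    Es_bar = Es / (1 - nu_s**2)
    return 2 * math.pi * t * (Ef_bar / (3 * Es_bar)) ** (1.0 / 3.0)

# Illustrative cortical values (assumed): thickness 2.5 mm, cortex 3x stiffer
lam = critical_wavelength(t=2.5e-3, Ef=3.0, Es=1.0)
print(f"critical fold wavelength ~ {lam * 1e3:.1f} mm")
```

Note the cube-root dependence on the stiffness ratio: the wavelength at the onset of folding is set far more strongly by cortical thickness than by stiffness, one reason larger (thicker-cortex) brains fold at larger wavelengths.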
Numerical investigation of band gaps in 3D printed cantilever-in-mass metamaterials
NASA Astrophysics Data System (ADS)
Qureshi, Awais; Li, Bing; Tan, K. T.
2016-06-01
In this research, the negative effective mass behavior of elastic/mechanical metamaterials is exhibited by a cantilever-in-mass structure as a proposed design for creating frequency stopping band gaps, based on local resonance of the internal structure. The mass-in-mass unit cell model is transformed into a cantilever-in-mass model using the Bernoulli-Euler beam theory. An analytical model of the cantilever-in-mass structure is derived and the effects of geometrical dimensions and material parameters to create frequency band gaps are examined. A two-dimensional finite element model is created to validate the analytical results, and excellent agreement is achieved. The analytical model establishes an easily tunable metamaterial design to realize wave attenuation based on locally resonant frequency. To demonstrate feasibility for 3D printing, the analytical model is employed to design and fabricate 3D printable mechanical metamaterial. A three-dimensional numerical experiment is performed using COMSOL Multiphysics to validate the wave attenuation performance. Results show that the cantilever-in-mass metamaterial is capable of mitigating stress waves at the desired resonance frequency. Our study successfully presents the use of one constituent material to create a 3D printed cantilever-in-mass metamaterial with negative effective mass density for stress wave mitigation purposes.
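The negative-effective-mass mechanism underlying the band gap can be sketched with the standard mass-in-mass effective-mass relation; the paper maps its cantilever onto this picture via an equivalent stiffness. Parameter values below are assumed for illustration.

```python
import math

def effective_mass(omega, m1, m2, k2):
    """Effective dynamic mass of a mass-in-mass unit cell whose outer mass m1
    carries an internal resonator (m2, k2):
        m_eff = m1 + m2*w0^2 / (w0^2 - w^2),  w0 = sqrt(k2/m2)."""
    w0sq = k2 / m2
    return m1 + m2 * w0sq / (w0sq - omega**2)

m1, m2, k2 = 1.0, 0.5, 500.0           # assumed illustrative values
w0 = math.sqrt(k2 / m2)                # local resonance frequency
w_hi = w0 * math.sqrt(1 + m2 / m1)     # upper edge of the negative-mass band
print(f"band gap (m_eff < 0): {w0:.2f} .. {w_hi:.2f} rad/s")
print(effective_mass(1.05 * w0, m1, m2, k2) < 0)   # inside the gap
```

Waves cannot propagate where m_eff < 0, which is the frequency stopping band the cantilever-in-mass design places at its local resonance.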
Determining your organization's 'risk capability'.
Hannah, Bill; Hancock, Melinda
2014-05-01
An assessment of a provider's level of risk capability should focus on three key elements: business intelligence, including sophisticated analytical models that can offer insight into the expected cost and quality of care for a given population; clinical enterprise maturity, marked by the ability to improve health outcomes and to manage utilization and costs to drive change; and revenue transformation, emphasizing the need for a revenue cycle platform that allows for risk acceptance and management and that provides incentives for performance against defined objectives.
NASA Technical Reports Server (NTRS)
Lung, Shun-fat; Pak, Chan-gi
2008-01-01
Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data as well as the mass matrix being orthogonalized.
NASA Technical Reports Server (NTRS)
Saravanos, Dimitris A.
1997-01-01
The development of aeropropulsion components that incorporate "smart" composite laminates with embedded piezoelectric actuators and sensors is expected to ameliorate critical problems in advanced aircraft engines related to vibration, noise emission, and thermal stability. To facilitate the analytical needs of this effort, the NASA Lewis Research Center has developed mechanics and multidisciplinary computational models to analyze the complicated electromechanical behavior of realistic smart-structure configurations operating in combined mechanical, thermal, and acoustic environments. The models have been developed to accommodate the particular geometries, environments, and technical challenges encountered in advanced aircraft engines, yet their unique analytical features are expected to facilitate application of this new technology in a variety of commercial applications.
The role of light microscopy in aerospace analytical laboratories
NASA Technical Reports Server (NTRS)
Crutcher, E. R.
1977-01-01
Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes, with confirmatory tests generally taking less than one hour. Light microscopy has made, and will continue to make, a very significant contribution to the analytical capabilities of aerospace laboratories.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
Comment on atomic independent-particle models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doda, D.D.; Garvey, R.H.; Green, A.E.S.
1975-08-01
The Hartree-Fock-Slater (HFS) independent-particle model in the form developed by Herman and Skillman (HS) and the Green, Sellin, and Zachor (GSZ) analytic independent-particle model are being used for many types of applications of atomic theory to avoid cumbersome, albeit more rigorous, many-body calculations. The single-electron eigenvalues obtained with these models are examined, and it is found that the GSZ model is capable of yielding energy eigenvalues for valence electrons which are substantially closer to experimental values than are the results of HS-HFS calculations. With the aid of an analytic representation of the equivalent HS-HFS screening function, the difficulty with this model is identified as a weakness of the potential in the neighborhood of the valence shell. Accurate representations of valence states are important in most atomic applications of the independent-particle model.
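One commonly quoted form of the GSZ analytic potential (in Rydberg units) makes the limiting behavior explicit; the screening parameters H and d below are illustrative placeholders, not the fitted values from the literature.

```python
import math

def gsz_potential(r, Z, H, d):
    """GSZ analytic independent-particle potential (Rydberg units, one common
    parameterization):
        V(r) = -(2/r) * [(Z - 1)*Omega(r) + 1],
        Omega(r) = 1 / (H*(exp(r/d) - 1) + 1).
    Near r=0 the electron sees the full nuclear charge Z; far away it sees
    a net charge of 1."""
    omega = 1.0 / (H * (math.exp(r / d) - 1.0) + 1.0)
    return -(2.0 / r) * ((Z - 1) * omega + 1.0)

Z, H, d = 18.0, 3.0, 0.6      # argon-like Z with assumed screening parameters
print(gsz_potential(1e-6, Z, H, d) * 1e-6 / -2.0)   # -> ~Z  (unscreened limit)
print(gsz_potential(50.0, Z, H, d) * 50.0 / -2.0)   # -> ~1  (fully screened)
```

The weakness discussed in the abstract concerns how accurately such a two-parameter screening function represents the potential near the valence shell, between these two limits.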
A Model of High-Frequency Self-Mixing in Double-Barrier Rectifier
NASA Astrophysics Data System (ADS)
Palma, Fabrizio; Rao, R.
2018-03-01
In this paper, a new model of the frequency dependence of the double-barrier THz rectifier is presented. The new structure is of interest because it can be realized with CMOS image sensor technology. Its application in a complex field such as that of THz receivers requires the availability of an analytical model which is reliable and able to highlight the dependence on the parameters of the physical structure. The model is based on the hydrodynamic semiconductor equations, solved in the small-signal approximation. The model depicts the mechanisms of the THz modulation of the charge in the depleted regions of the double-barrier device and explains the self-mixing process, the frequency dependence, and the detection capability of the structure. The model thus substantially improves on the analytical models of THz rectification available in the literature, which are mainly based on lumped equivalent circuits.
NASA Technical Reports Server (NTRS)
Gabel, R.; Lang, P. F.; Smith, L. A.; Reed, D. A.
1989-01-01
Boeing Helicopters, together with other United States helicopter manufacturers, participated in a finite element applications program to establish in the United States a superior capability to utilize finite element analysis models in support of helicopter airframe design. The activities relating to planning and creating a finite element vibrations model of the Boeing Model 360 composite airframe are summarized, along with the subsequent analytical correlation with ground shake test data.
Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC): User Guide. Version 3
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Bednarcyk, B. A.; Wilt, T. E.; Trowbridge, D.
1999-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions, it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is described for the recently developed, computationally efficient and comprehensive micromechanics analysis code, MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC/GMC is a versatile form of research software that "drives" the double or triply periodic micromechanics constitutive models based upon GMC. MAC/GMC enhances the basic capabilities of GMC by providing a modular framework wherein 1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed, 2) different integration algorithms may be selected, 3) a variety of material constitutive models (both deformation and life) may be utilized and/or implemented, 4) a variety of fiber architectures (unidirectional, laminate, and woven) may be easily accessed through their corresponding representative volume elements contained within the supplied library of RVEs or input directly by the user, and 5) graphical post-processing of the macro and/or micro field quantities is made available.
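GMC itself is far richer, but the micro-to-macro idea, effective properties computed from constituent properties and their arrangement, can be shown with the elementary Voigt/Reuss bounds. The constituent moduli below are assumed, illustrative values only.

```python
def voigt_reuss(Ef, Em, vf):
    """Voigt (iso-strain) and Reuss (iso-stress) estimates of the effective
    moduli of a two-phase composite with fiber volume fraction vf.
    Voigt bounds the axial response from above; Reuss bounds the
    transverse response from below."""
    E_voigt = vf * Ef + (1 - vf) * Em            # upper bound
    E_reuss = 1.0 / (vf / Ef + (1 - vf) / Em)    # lower bound
    return E_voigt, E_reuss

# Illustrative constituents: stiff fiber in metallic matrix (assumed, GPa)
Ev, Er = voigt_reuss(Ef=400.0, Em=110.0, vf=0.35)
print(f"Voigt: {Ev:.1f} GPa, Reuss: {Er:.1f} GPa")
```

Any physically admissible micromechanics prediction, including GMC's, must fall between these two bounds, which makes them a quick sanity check on more sophisticated models.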
Bifurcation analysis of parametrically excited bipolar disorder model
NASA Astrophysics Data System (ADS)
Nana, Laurent
2009-02-01
Bipolar II disorder is characterized by alternating hypomanic and major depressive episodes. We model the periodic mood variations of a bipolar II patient with a negatively damped harmonic oscillator. The medications administered to the patient are modeled via a forcing function that is capable of stabilizing the mood variations and of varying their amplitude. We analyze analytically, using a perturbation method, the amplitude and stability of limit cycles and check this analysis with numerical simulations.
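A minimal numerical sketch of the negatively damped oscillator idea (RK4 integration): all parameters are illustrative, and flipping the damping sign merely stands in for the stabilizing effect of the paper's forcing term.

```python
import math

def simulate(gamma, omega, forcing, x0=1.0, v0=0.0, dt=1e-3, t_end=20.0):
    """Integrate x'' = 2*gamma*x' - omega^2*x + forcing(t) with classical RK4;
    gamma > 0 gives negative damping (growing mood swings).
    Returns the oscillation amplitude at t_end."""
    def deriv(t, x, v):
        return v, 2 * gamma * v - omega**2 * x + forcing(t)
    x, v, t = x0, v0, 0.0
    while t < t_end:
        k1x, k1v = deriv(t, x, v)
        k2x, k2v = deriv(t + dt / 2, x + dt / 2 * k1x, v + dt / 2 * k1v)
        k3x, k3v = deriv(t + dt / 2, x + dt / 2 * k2x, v + dt / 2 * k2v)
        k4x, k4v = deriv(t + dt, x + dt * k3x, v + dt * k3v)
        x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
    return math.hypot(x, v / omega)

no_treatment = lambda t: 0.0
grew = simulate(0.05, 2 * math.pi, no_treatment)     # amplitude grows ~exp(gamma*t)
damped = simulate(-0.05, 2 * math.pi, no_treatment)  # sign flip: swings decay
print(f"untreated amplitude after 20 time units: {grew:.2f}")
print(f"stabilized amplitude after 20 time units: {damped:.2f}")
```

Untreated, the mood amplitude grows exponentially; any stabilizing mechanism must overcome that growth, which is what the paper's forcing-function analysis quantifies via limit cycles.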
Relative resolution: A hybrid formalism for fluid mixtures.
Chaimovich, Aviel; Peter, Christine; Kremer, Kurt
2015-12-28
We show here that molecular resolution is inherently hybrid in terms of relative separation. While nearest neighbors are characterized by a fine-grained (geometrically detailed) model, other neighbors are characterized by a coarse-grained (isotropically simplified) model. We notably present an analytical expression for relating the two models via energy conservation. This hybrid framework is correspondingly capable of retrieving the structural and thermal behavior of various multi-component and multi-phase fluids across state space.
The Space Tug economic analysis study - What we learned
NASA Technical Reports Server (NTRS)
Hopkins, C. V.
1975-01-01
This paper summarizes the scope, analytical methods, and principal findings of a recently performed Space-Tug economic analysis. Both the Shuttle/Tug transportation system and its unmanned payloads were modeled in this study. A variety of upper-stage concepts capable of fulfilling the Tug mission were evaluated against this model, and the 'best' Tug concepts were identified for a range of economic measures.
Unsteady transonic potential flow over a flexible fuselage
NASA Technical Reports Server (NTRS)
Gibbons, Michael D.
1993-01-01
A flexible fuselage capability has been developed and implemented within version 1.2 of the CAP-TSD code. The capability required adding time-dependent terms to the fuselage surface boundary conditions and the fuselage surface pressure coefficient. The new capability will allow modeling the effect of a flexible fuselage on the aeroelastic stability of complex configurations. To assess the flexible fuselage capability, several steady and unsteady calculations have been performed for slender fuselages with circular cross sections. Steady surface pressures are compared with experiment at transonic flight conditions. Unsteady cross-sectional lift is compared with other analytical results at a low subsonic speed, and a transonic case has been computed. The comparisons demonstrate the accuracy of the flexible fuselage modifications.
Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander
2014-07-01
The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which they were created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?
NASA Technical Reports Server (NTRS)
Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.
1983-01-01
Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.
Resource utilization model for the algorithm to architecture mapping model
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Patel, Rakesh R.
1993-01-01
The analytical model for resource utilization and the variable node time and conditional node model for the enhanced ATAMM model for a real-time data flow architecture are presented in this research. The Algorithm To Architecture Mapping Model, ATAMM, is a Petri net based graph theoretic model developed at Old Dominion University, and is capable of modeling the execution of large-grained algorithms on a real-time data flow architecture. Using the resource utilization model, the resource envelope may be obtained directly from a given graph and, consequently, the maximum number of required resources may be evaluated. The node timing diagram for one iteration period may be obtained using the analytical resource envelope. The variable node time model, which describes the change in resource requirement for the execution of an algorithm under node time variation, is useful for extending the applicability of the ATAMM model to heterogeneous architectures. The model also describes a method for detecting the presence of the resource-limited mode and subsequently preventing it. Graphs with conditional nodes are shown to be reducible to equivalent graphs with time-varying nodes and, subsequently, may be analyzed using the variable node time model to determine resource requirements. Case studies on three graphs illustrate the applicability of the analytical theories.
Adding a solar-radiance function to the Hošek-Wilkie skylight model.
Hošek, Lukáš; Wilkie, Alexander
2013-01-01
One prerequisite for realistic renderings of outdoor scenes is the proper capturing of the sky's appearance. Currently, an explicit simulation of light scattering in the atmosphere isn't computationally feasible, and won't be in the foreseeable future. Captured luminance patterns have proven their usefulness in practice but can't meet all user needs. To fill this capability gap, computer graphics technology has employed analytical models of sky-dome luminance patterns for more than two decades. For technical reasons, such models deal with only the sky dome's appearance, though, and exclude the solar disc. The widely used model proposed by Arcot Preetham and colleagues employed a separately derived analytical formula for adding a solar emitter of suitable radiant intensity. Although this yields reasonable results, the formula is derived in a manner that doesn't exactly match the conditions in their sky-dome model. But the more sophisticated a skylight model is and the more subtly it can represent different conditions, the more the solar radiance should exactly match the skylight's conditions. Toward that end, researchers propose a solar-radiance function that exactly matches a recently published high-quality analytical skylight model.
Evaluation of simplified stream-aquifer depletion models for water rights administration
Sophocleous, Marios; Koussis, Antonis; Martin, J.L.; Perkins, S.P.
1995-01-01
We assess the predictive accuracy of Glover's (1974) stream-aquifer analytical solutions, which are commonly used in administering water rights, and evaluate the impact of the assumed idealizations on administrative and management decisions. To achieve these objectives, we evaluate the predictive capabilities of the Glover stream-aquifer depletion model against the MODFLOW numerical standard, which, unlike the analytical model, can handle increasing hydrogeologic complexity. We rank-order and quantify the relative importance of the various assumptions on which the analytical model is based, the three most important being: (1) streambed clogging as quantified by streambed-aquifer hydraulic conductivity contrast; (2) degree of stream partial penetration; and (3) aquifer heterogeneity. These three factors relate directly to the multidimensional nature of the aquifer flow conditions. From these considerations, future efforts to reduce the uncertainty in stream depletion-related administrative decisions should primarily address these three factors in characterizing the stream-aquifer process. We also investigate the impact of progressively coarser model grid size on numerically estimating stream leakage and conclude that grid size effects are relatively minor. Therefore, when modeling is required, coarser model grids could be used thus minimizing the input data requirements.
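The Glover analytical solution evaluated above admits a one-line implementation. A minimal sketch, assuming the classic Glover-Balmer form for a fully penetrating stream in a homogeneous aquifer (variable names and values are illustrative, not the study's data):

```python
import math

def glover_depletion_fraction(d, S, T, t):
    """Fraction of the well's pumping rate supplied by stream depletion
    at time t under the Glover-Balmer idealization: homogeneous,
    semi-infinite aquifer, fully penetrating stream, no streambed clogging.

    d: well-to-stream distance, S: storativity, T: transmissivity,
    t: time since pumping began. Units must be consistent
    (e.g., m, dimensionless, m^2/day, days).
    """
    return math.erfc(math.sqrt(d * d * S / (4.0 * T * t)))

# Depletion grows monotonically toward 1 as pumping continues
early = glover_depletion_fraction(d=100.0, S=0.2, T=500.0, t=10.0)
late = glover_depletion_fraction(d=100.0, S=0.2, T=500.0, t=1000.0)
```

The three factors ranked most important in the study (streambed clogging, partial penetration, heterogeneity) are precisely the assumptions this closed form cannot represent, which is why the MODFLOW comparison is needed.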
Performing data analytics on information obtained from various sensors on an OSUS compliant system
NASA Astrophysics Data System (ADS)
Cashion, Kelly; Landoll, Darian; Klawon, Kevin; Powar, Nilesh
2017-05-01
The Open Standard for Unattended Sensors (OSUS) was developed by DIA and ARL to provide a plug-n-play platform for sensor interoperability. Our objective is to use the standardized data produced by OSUS in performing data analytics on information obtained from various sensors. Data analytics can be integrated in one of three ways: within an asset itself; as an independent plug-in designed for one type of asset (e.g., camera or seismic sensor); or as an independent plug-in designed to incorporate data from multiple assets. As a proof-of-concept, we develop a model that can be used in the second of these types - an independent component for camera images. The dataset used was collected as part of a demonstration and test of OSUS capabilities. The image data includes images of empty outdoor scenes and scenes with human or vehicle activity. We design, train, and test a convolutional neural network (CNN) to analyze these images and assess the presence of activity in the image. The resulting classifier labels input images as empty or activity with 86.93% accuracy, demonstrating the promising opportunities for deep learning, machine learning, and predictive analytics as an extension of OSUS's already robust suite of capabilities.
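The core operation of such a classifier is the convolutional filter response. A generic sketch of a single "valid" 2-D cross-correlation, not the authors' network (the image and kernel below are toy values):

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D cross-correlation, the primitive a CNN layer applies
    (here on plain nested lists for clarity)."""
    kh, kw = len(kernel), len(kernel[0])
    rows = len(image) - kh + 1
    cols = len(image[0]) - kw + 1
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(cols)]
            for i in range(rows)]

# A vertical-edge kernel responds at the boundary of a bright region
image = [[0, 0, 1],
         [0, 0, 1],
         [0, 0, 1]]
kernel = [[-1, 1],
          [-1, 1]]
response = conv2d_valid(image, kernel)  # [[0, 2], [0, 2]]
```

A trained CNN stacks many such learned kernels with nonlinearities and pooling before the final empty/activity decision.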
Simulation study of a new inverse-pinch high Coulomb transfer switch
NASA Technical Reports Server (NTRS)
Choi, S. H.
1984-01-01
A simulation study of a simplified model of a high coulomb transfer switch is performed. The switch operates in an inverse pinch geometry formed by an all metal chamber, which greatly reduces hot spot formation on the electrode surfaces. Advantages of the switch over conventional switches are longer useful life, higher current capability, and lower inductance, which improve the characteristics required for a high repetition rate switch. The simulation determines the design parameters by analytical computation and comparison with the experimentally measured risetime, current handling capability, electrode damage, and hold-off voltages. The parameters of the initial switch design can be determined for the anticipated switch performance. Results are in agreement with the experimental results. Although the model is simplified, switch characteristics such as risetime, current handling capability, electrode damage, and hold-off voltages are accurately determined.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated into the MDAO tool. Better synchronization among all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
Characterization of structural connections using free and forced response test data
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1989-01-01
The accurate prediction of system dynamic response often has been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components often are mechanically complex and difficult to model analytically with accuracy. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed, then verified using a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems, and does not require that the test data be measured directly at the connection locations.
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
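The prediction component can be illustrated with a plain additive Holt-Winters recursion. This is a generic sketch of the standard method, not the authors' implementation; the initialization scheme, smoothing constants, and synthetic residual series are illustrative:

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters smoothing of a residual series (e.g., the
    error of an analytical theory) with season length m; returns
    forecasts for `horizon` steps past the end of the series."""
    # Initialize level, trend, and seasonal indices from the first two seasons
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [series[i] - level for i in range(m)]
    for t in range(m, len(series)):
        last_level = level
        level = alpha * (series[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (series[t] - level) + (1 - gamma) * season[t % m]
    return [level + (h + 1) * trend + season[(len(series) + h) % m]
            for h in range(horizon)]

# Synthetic residuals: linear trend of 0.5/step plus a period-2 oscillation
series = [0.5 * t + (1.0 if t % 2 == 0 else -1.0) for t in range(40)]
forecast = holt_winters_additive(series, m=2, alpha=0.5, beta=0.1, gamma=0.1, horizon=2)
# forecast[0] should land near the true next value, 0.5*40 + 1 = 21.0
```

In the hybrid scheme, such forecasts of the periodic error are added back onto the analytical propagator's output.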
Analytical modeling and experimental validation of a magnetorheological mount
NASA Astrophysics Data System (ADS)
Nguyen, The; Ciocanel, Constantin; Elahinia, Mohammad
2009-03-01
Magnetorheological (MR) fluid has been increasingly researched and applied in vibration isolation devices. To date, the suspension system of several high performance vehicles has been equipped with MR fluid based dampers and research is ongoing to develop MR fluid based mounts for engine and powertrain isolation. MR fluid based devices have received attention due to the MR fluid's capability to change its properties in the presence of a magnetic field. This characteristic places MR mounts in the class of semiactive isolators making them a desirable substitution for the passive hydraulic mounts. In this research, an analytical model of a mixed-mode MR mount was constructed. The magnetorheological mount employs flow (valve) mode and squeeze mode. Each mode is powered by an independent electromagnet, so one mode does not affect the operation of the other. The analytical model was used to predict the performance of the MR mount with different sets of parameters. Furthermore, in order to produce the actual prototype, the analytical model was used to identify the optimal geometry of the mount. The experimental phase of this research was carried out by fabricating and testing the actual MR mount. The manufactured mount was tested to evaluate the effectiveness of each mode individually and in combination. The experimental results were also used to validate the ability of the analytical model in predicting the response of the MR mount. Based on the observed response of the mount, a suitable controller can be designed for it. However, the control scheme is not addressed in this study.
ERIC Educational Resources Information Center
Gray, Geraldine; McGuinness, Colm; Owende, Philip; Carthy, Aiden
2014-01-01
Increasing college participation rates, and diversity in student population, is posing a challenge to colleges in their attempts to help learners achieve their full academic potential. Learning analytics is an evolving discipline with capability for educational data analysis that could enable better understanding of the learning process, and…
F C Pan, Frank
2014-03-01
Nurses have long been relied on as the major labor force in hospitals. Given their complicated and highly labor-intensive job requirements, multiple pressures from different sources are inevitable. Success in identifying stresses and coping with them accordingly is important for the job performance of nurses and the service quality of a hospital. The purpose of this research is to identify the determinants of nurses' capabilities. A modified Analytic Hierarchy Process (AHP) was adopted. Overall, 105 nurses from several randomly selected hospitals in southern Taiwan were surveyed to generate factors. Ten experienced practitioners were included as experts in the AHP to produce weights for each criterion. Six nurses from two regional hospitals were then selected to test the model. Four factors were identified at the second level of the hierarchy. The results show that the family factor is the most important, followed by personal attributes. The top three sub-criteria contributing to a nurse's stress-coping capability are children's education, a good career plan, and a healthy family. The practical simulation provided evidence for the usefulness of this model. The study suggested including these key determinants in human-resource management practice, restructuring the hospital's organization, and creating an employee-support system as well as a family-friendly working climate. The research provided evidence that supports the usefulness of AHP in identifying the key factors that help stabilize a nursing team.
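The weighting step at the heart of AHP can be sketched briefly. This is the standard principal-eigenvector calculation via power iteration (as in Saaty's AHP), not the study's modified procedure, and the pairwise comparison values below are hypothetical:

```python
def ahp_weights(pairwise, iters=200):
    """Priority vector of a reciprocal pairwise-comparison matrix:
    the principal eigenvector, found by power iteration and
    normalized to sum to 1 (Saaty's AHP weighting step)."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Hypothetical 1-9 scale comparisons among three stress-coping factors,
# e.g. family vs. personal attributes vs. workplace conditions
matrix = [[1.0,   3.0, 5.0],
          [1/3.0, 1.0, 3.0],
          [1/5.0, 1/3.0, 1.0]]
weights = ahp_weights(matrix)  # roughly [0.64, 0.26, 0.10]
```

The resulting weights rank the criteria; a consistency check on the comparison matrix is normally applied before the weights are used.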
Reliability and maintainability assessment factors for reliable fault-tolerant systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1984-01-01
A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that can degrade system reliability, and the ways in which factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.
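Analytic reliability assessment of redundant systems builds on closed-form formulas of the kind such tools evaluate. A minimal sketch of a generic 2-of-3 majority-voter calculation under independent exponential failures; this illustrates the flavor of the analysis only, not CARE III's actual models:

```python
import math

def component_reliability(failure_rate, t):
    """Exponential failure law: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)

def tmr_reliability(r):
    """Reliability of a triple-modular-redundant (2-of-3 voting) system
    with component reliability r, assuming independent failures:
    all three survive, or exactly two survive."""
    return r**3 + 3 * r**2 * (1 - r)

# A 100-hour mission with a 1e-4/hour component failure rate
r = component_reliability(1.0e-4, 100.0)  # ~0.990
r_system = tmr_reliability(r)
```

Real assessments must also capture coverage failures, latent faults, and repair, which is where combined analytic/simulative tools like CARE III and GLOSS come in.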
Investigating Compaction by Intergranular Pressure Solution Using the Discrete Element Method
NASA Astrophysics Data System (ADS)
van den Ende, M. P. A.; Marketos, G.; Niemeijer, A. R.; Spiers, C. J.
2018-01-01
Intergranular pressure solution creep is an important deformation mechanism in the Earth's crust. The phenomenon has been frequently studied and several analytical models have been proposed that describe its constitutive behavior. These models require assumptions regarding the geometry of the aggregate and the grain size distribution in order to solve for the contact stresses and often neglect shear tractions. Furthermore, analytical models tend to overestimate experimental compaction rates at low porosities, an observation for which the underlying mechanisms remain to be elucidated. Here we present a conceptually simple, 3-D discrete element method (DEM) approach for simulating intergranular pressure solution creep that explicitly models individual grains, relaxing many of the assumptions that are required by analytical models. The DEM model is validated against experiments by direct comparison of macroscopic sample compaction rates. Furthermore, the sensitivity of the overall DEM compaction rate to the grain size and applied stress is tested. The effects of the interparticle friction and of a distributed grain size on macroscopic strain rates are subsequently investigated. Overall, we find that the DEM model is capable of reproducing realistic compaction behavior, and that the strain rates produced by the model are in good agreement with uniaxial compaction experiments. Characteristic features, such as the dependence of the strain rate on grain size and applied stress, as predicted by analytical models, are also observed in the simulations. DEM results show that interparticle friction and a distributed grain size affect the compaction rates by less than half an order of magnitude.
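The grain-size and stress dependence that the DEM results are checked against can be sketched as a simple analytical rate law. The scaling below (linear in stress, inverse-cube in grain size for diffusion control) follows the standard analytical form; the lumped prefactor is hypothetical:

```python
def ips_strain_rate(sigma, grain_size, A=1.0e-12, m=3):
    """Compaction strain rate for intergranular pressure solution in the
    linear (low-stress) regime: rate ~ A * sigma / d**m, with m = 3 for
    grain-boundary-diffusion control. A is a hypothetical lumped
    prefactor absorbing temperature, solubility, and geometry."""
    return A * sigma / grain_size**m

# Characteristic scalings checked in the DEM-analytical comparison:
rate_base = ips_strain_rate(sigma=10.0, grain_size=10.0)
rate_2x_stress = ips_strain_rate(sigma=20.0, grain_size=10.0)   # doubles
rate_2x_grain = ips_strain_rate(sigma=10.0, grain_size=20.0)    # drops 8x
```

The DEM model relaxes the geometric assumptions baked into A, which is why it can probe effects like interparticle friction and grain-size distributions that the closed form cannot.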
Orbital storage and supply of subcritical liquid nitrogen
NASA Technical Reports Server (NTRS)
Aydelott, John C.
1990-01-01
Subcritical cryogenic fluid management has long been recognized as an enabling technology for key propulsion applications, such as space transfer vehicles (STV) and the on-orbit cryogenic fuel depots which will provide STV servicing capability. The LeRC Cryogenic Fluids Technology Office (CFTO), under the sponsorship of OAST, has the responsibility of developing the required technology via a balanced program involving analytical modeling, ground based testing, and in-space experimentation. Topics covered in viewgraph form include: cryogenic management technologies; nitrogen storage and supply; cryogenic nitrogen cooling capability; and LN2 system demonstration technical objectives.
NASA Technical Reports Server (NTRS)
Kennedy, Ronald; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element solution strategy is developed to handle traveling load problems in rolling, moving, and rotating structures. The main thrust of this section consists of the development of three-dimensional and shell-type moving elements. In conjunction with this work, a compatible three-dimensional contact strategy is also developed. Based on these modeling capabilities, extensive analytical and experimental benchmarking is presented. Such testing includes traveling loads in rotating structures as well as low- and high-speed rolling contact involving standing wave-type response behavior. These results point to the excellent modeling capabilities of moving element strategies.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements of volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturizing proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors that provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
An Assessment of Current Fan Noise Prediction Capability
NASA Technical Reports Server (NTRS)
Envia, Edmane; Woodward, Richard P.; Elliott, David M.; Fite, E. Brian; Hughes, Christopher E.; Podboy, Gary G.; Sutliff, Daniel L.
2008-01-01
In this paper, the results of an extensive assessment exercise carried out to establish the current state of the art for predicting fan noise at NASA are presented. Representative codes in the empirical, analytical, and computational categories were exercised and assessed against a set of benchmark acoustic data obtained from wind tunnel tests of three model scale fans. The chosen codes were ANOPP, representing an empirical capability, RSI, representing an analytical capability, and LINFLUX, representing a computational aeroacoustics capability. The selected benchmark fans cover a wide range of fan pressure ratios and fan tip speeds, and are representative of modern turbofan engine designs. The assessment results indicate that the ANOPP code can predict fan noise spectrum to within 4 dB of the measurement uncertainty band on a third-octave basis for the low and moderate tip speed fans except at extreme aft emission angles. The RSI code can predict fan broadband noise spectrum to within 1.5 dB of the experimental uncertainty band provided the rotor-only contribution is taken into account. The LINFLUX code can predict interaction tone power levels to within experimental uncertainties at low and moderate fan tip speeds, but could deviate by as much as 6.5 dB outside the experimental uncertainty band at the highest tip speeds in some cases.
NASA Technical Reports Server (NTRS)
Sulyma, P. R.
1980-01-01
Fundamental equations and similarity definition and application are described as well as the computational steps of a computer program developed to design model nozzles for wind tunnel tests conducted to define power-on aerodynamic characteristics of the space shuttle over a range of ascent trajectory conditions. The computer code capabilities, a user's guide for the model nozzle design program, and the output format are examined. A program listing is included.
NASA Technical Reports Server (NTRS)
Gayda, J.; Srolovitz, D. J.
1989-01-01
This paper presents a specialized microstructural lattice model, MCFET (Monte Carlo finite element technique), which simulates microstructural evolution in materials in which strain energy has an important role in determining morphology. The model is capable of accounting for externally applied stress, surface tension, misfit, elastic inhomogeneity, elastic anisotropy, and arbitrary temperatures. The MCFET analysis was found to compare well with the results of analytical calculations of the equilibrium morphologies of isolated particles in an infinite matrix.
Thermal/structural design verification strategies for large space structures
NASA Technical Reports Server (NTRS)
Benton, David
1988-01-01
Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. This requires a combination of analytical and testing methods, pursued through two approaches. The first is to limit thermal testing to sub-elements of the total system only in a compact configuration (i.e., not fully deployed). The second is to use a simplified environment to correlate analytical models with test results. These models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.
Development of a bidirectional ring thermal actuator
NASA Astrophysics Data System (ADS)
Stevenson, Mathew; Yang, Peng; Lai, Yongjun; Mechefske, Chris
2007-10-01
A new planar micro electrothermal actuator capable of bidirectional rotation is presented. The ring thermal actuator has a wheel-like geometry with eight arms connecting an outer ring to a central hub. Thermal expansion of the arms results in a rotation of the outer ring about its center. An analytical model is developed for the electrothermal and thermal-mechanical aspects of the actuator's operation. Finite element analysis is used to validate the analytic study. The actuator has been fabricated using the multi-user MEMS process and experimental displacement results are compared with model predictions. Experiments show a possible displacement of 7.4 µm in each direction. Also, by switching the current between the arms it is possible to achieve an oscillating motion.
Simulation of noisy dynamical system by Deep Learning
NASA Astrophysics Data System (ADS)
Yeo, Kyongmin
2017-11-01
Deep learning has attracted huge attention due to its powerful representation capability. However, most of the studies on deep learning have been focused on visual analytics or language modeling, and the capability of deep learning in modeling dynamical systems is not well understood. In this study, we use a recurrent neural network to model noisy nonlinear dynamical systems. In particular, we use a long short-term memory (LSTM) network, which constructs an internal nonlinear dynamical system. We propose a cross-entropy loss with spatial ridge regularization to learn a non-stationary conditional probability distribution from a noisy nonlinear dynamical system. A Monte Carlo procedure to perform time-marching simulations using the LSTM is presented. The behavior of the LSTM is studied using a noisy, forced Van der Pol oscillator and the Ikeda equation.
NASA Astrophysics Data System (ADS)
Hufner, D. R.; Augustine, M. R.
2018-05-01
A novel experimental method was developed to simulate underwater explosion pressure pulses within a laboratory environment. An impact-based experimental apparatus was constructed, capable of generating pressure pulses with basic character similar to underwater explosions while also allowing the pulse to be tuned to different intensities. Having the capability to vary the shock impulse was considered essential to producing various levels of shock-induced damage without the need to modify the fixture. The experimental apparatus and test method are considered ideal for investigating the shock response of composite material systems and/or experimental validation of new material models. One such test program is presented herein, in which a series of E-glass/Vinylester laminates were subjected to a range of shock pulses that induced varying degrees of damage. Analysis-test correlations were performed using a rate-dependent constitutive model capable of representing anisotropic damage and ultimate yarn failure. Agreement between analytical predictions and experimental results was considered acceptable.
Sun, Bo; Sunkavalli, Kalyan; Ramamoorthi, Ravi; Belhumeur, Peter N; Nayar, Shree K
2007-01-01
The properties of virtually all real-world materials change with time, causing their bidirectional reflectance distribution functions (BRDFs) to be time varying. However, none of the existing BRDF models and databases take time variation into consideration; they represent the appearance of a material at a single time instance. In this paper, we address the acquisition, analysis, modeling, and rendering of a wide range of time-varying BRDFs (TVBRDFs). We have developed an acquisition system that is capable of sampling a material's BRDF at multiple time instances, with each time sample acquired within 36 sec. We have used this acquisition system to measure the BRDFs of a wide range of time-varying phenomena, which include the drying of various types of paints (watercolor, spray, and oil), the drying of wet rough surfaces (cement, plaster, and fabrics), the accumulation of dusts (household and joint compound) on surfaces, and the melting of materials (chocolate). Analytic BRDF functions are fit to these measurements and the model parameters' variations with time are analyzed. Each category exhibits interesting and sometimes nonintuitive parameter trends. These parameter trends are then used to develop analytic TVBRDF models. The analytic TVBRDF models enable us to apply effects such as paint drying and dust accumulation to arbitrary surfaces and novel materials.
Studzinski, J
2017-06-01
The Digital Imaging Adoption Model (DIAM) has been jointly developed by HIMSS Analytics and the European Society of Radiology (ESR). It helps evaluate the maturity of IT-supported processes in medical imaging, particularly in radiology. This eight-stage maturity model drives organisational, strategic, and tactical alignment in imaging-IT planning. The key audience for the model comprises hospitals with imaging centers, as well as external imaging centers that collaborate with hospitals. The assessment focuses on different dimensions relevant to digital imaging, such as software infrastructure and usage, workflow security, clinical documentation and decision support, data exchange, and analytical capabilities. With its standardised approach, it enables regional, national, and international benchmarking. All DIAM participants receive a structured report that can be used, for example, as a basis for budget planning and investment decisions at management level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, F.; Nehl, T.W.
1998-09-01
Because of its high efficiency and power density, the PM brushless dc motor is a strong candidate for electric and hybrid vehicle propulsion systems. An analytical approach is developed to predict the eddy-current losses caused by inverter high-frequency pulse-width modulation (PWM) switching in a permanent magnet brushless dc motor. The model uses polar coordinates to take curvature effects into account, and is also capable of including the space-harmonic effect of the stator magnetic field and the effect of the stator laminations on the losses. The model was applied to an existing motor design and was verified with the finite element method. Good agreement was achieved between the two approaches. Hence, the model is expected to be very helpful in predicting PWM switching losses in permanent magnet machine design.
Numerical Modeling of Ablation Heat Transfer
NASA Technical Reports Server (NTRS)
Ewing, Mark E.; Laker, Travis S.; Walker, David T.
2013-01-01
A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
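The control-volume idea behind the method — explicit energy balances on discrete cells, with the grid shrinking as the heated surface recedes — can be illustrated with a deliberately simplified sketch. This toy drops whole cells once they reach an ablation temperature, whereas the actual method moves the grid smoothly and includes decomposition, pyrolysis-gas, and thermochemical erosion terms; all names and values here are assumptions.

```python
def conduction_recession_step(T, dt, dx, alpha, rho, cp, q_surf, T_ablate):
    """One explicit control-volume update of 1-D conduction with a heated,
    receding front face. T is a list of cell temperatures, front face first."""
    n = len(T)
    Tn = T[:]
    r = alpha * dt / dx ** 2            # explicit stability requires r <= 0.5
    for i in range(1, n - 1):           # interior cells: central-difference conduction
        Tn[i] = T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
    # front cell: conduction to its neighbor plus the imposed surface heat flux
    Tn[0] = T[0] + r * (T[1] - T[0]) + dt * q_surf / (rho * cp * dx)
    Tn[-1] = Tn[-2]                     # adiabatic (zero-gradient) back face
    # crude surface-erosion proxy: remove the front cell once it reaches T_ablate
    if Tn[0] >= T_ablate and len(Tn) > 2:
        Tn = Tn[1:]
    return Tn
```

A real ablation solver would conserve the energy of the removed material and track the recession distance; this sketch only shows the grid-shortening mechanics.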
A new pneumatic suspension system with independent stiffness and ride height tuning capabilities
NASA Astrophysics Data System (ADS)
Yin, Zhihong; Khajepour, Amir; Cao, Dongpu; Ebrahimi, Babak; Guo, Konghui
2012-12-01
This paper introduces a new pneumatic spring for vehicle suspension systems, allowing independent tuning of stiffness and ride height according to different vehicle operating conditions and driver preferences. The proposed pneumatic spring comprises a double-acting pneumatic cylinder, two accumulators and a tuning subsystem. This paper presents a detailed description of the pneumatic spring and its working principle. The mathematical model is established based on principles of thermodynamics and fluid dynamics. An experimental setup has been designed and fabricated for testing and evaluating the proposed pneumatic spring. The analytical and experimental results confirm the capability of the new pneumatic spring system for independent tuning of stiffness and ride height, verifying the mathematical model and further demonstrating the capabilities of the spring. It is concluded that this new pneumatic spring provides a more flexible suspension design alternative for meeting various conflicting suspension requirements for ride comfort and performance.
2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions
2017-12-21
modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability ... to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic ... goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data
Interdisciplinary Applications of Autonomous Observation Systems
2008-01-01
analytical capabilities for describing the distributions and activities of marine microbes in relation to their physical, chemical and optical environment in ... a sensitive indicator of physiology (light acclimation status) and also a key parameter in models of primary productivity. We are now continuing
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1994-01-01
The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.
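The quasi-analytical idea compared against finite differencing above — differentiate the discrete flow equations via the implicit-function theorem and chain rule instead of re-solving the flowfield at perturbed design variables — can be shown on a toy scalar problem. The residual equation below is a made-up stand-in for a flow solver, not any equation from the project:

```python
import math

def solve_u(a):
    """Toy 'flow solver': positive root of the state equation
    R(u, a) = u**2 + a*u - 1 = 0, where a plays the role of a design variable."""
    return (-a + math.sqrt(a * a + 4.0)) / 2.0

def quasi_analytic_duda(a):
    """Sensitivity du/da from the implicit-function theorem:
    du/da = -(dR/da) / (dR/du) = -u / (2u + a), evaluated at the solution.
    No additional nonlinear solves are needed."""
    u = solve_u(a)
    return -u / (2.0 * u + a)

def finite_diff_duda(a, h=1e-6):
    """Central finite difference: requires two extra 'flow solutions'."""
    return (solve_u(a + h) - solve_u(a - h)) / (2.0 * h)
```

The efficiency argument in the abstract scales the same way: each finite-difference derivative costs extra flowfield solutions per design variable, while the quasi-analytical derivative reuses the converged solution.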
Thermal design and TDM test of the ETS-VI
NASA Astrophysics Data System (ADS)
Yoshinaka, T.; Kanamori, K.; Takenaka, N.; Kawashima, J.; Ido, Y.; Kuriyama, Y.
The Engineering Test Satellite-VI (ETS-VI) thermal design, thermal development model (TDM) test, and evaluation results are described. The allocation of the thermal control materials on the spacecraft is illustrated. The principal design approach is to minimize the interactions between the antenna tower module and the main body, and between the main body and the liquid apogee propulsion system by means of multilayer insulation blankets and low conductance graphite epoxy support structures. The TDM test shows that the thermal control subsystem is capable of maintaining the on-board components within specified temperature limits. The heat pipe network is confirmed to operate properly, and a uniform panel temperature distribution is accomplished. The thermal analytical model is experimentally verified. The validity of the thermal control subsystem design is confirmed by the modified on-orbit analytical model.
Visual analytics of brain networks.
Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming
2012-05-15
Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging.
Analysis and test for space shuttle propellant dynamics
NASA Technical Reports Server (NTRS)
Berry, R. L.; Demchak, L. J.; Tegart, J. R.
1983-01-01
This report presents the results of a study to develop an analytical model capable of predicting the dynamic interaction forces on the Shuttle External Tank due to large amplitude propellant slosh during RTLS separation. The report details low-g drop tower and KC-135 test programs that were conducted to investigate propellant reorientation during RTLS. In addition, the development of two nonlinear finite element slosh models (LAMPS2, two-dimensional, and LAMPS3, three-dimensional) is presented. Correlation between the models and test data is presented as a verification of the modeling approach.
A predictive analytic model for the solar modulation of cosmic rays
Cholis, Ilias; Hooper, Dan; Linden, Tim
2016-02-23
An important factor limiting our ability to understand the production and propagation of cosmic rays pertains to the effects of heliospheric forces, commonly known as solar modulation. The solar wind is capable of generating time- and charge-dependent effects on the spectrum and intensity of low-energy (≲10 GeV) cosmic rays reaching Earth. Previous analytic treatments of solar modulation have utilized the force-field approximation, in which a simple potential is adopted whose amplitude is selected to best fit the cosmic-ray data taken over a given period of time. Making use of recently available cosmic-ray data from the Voyager 1 spacecraft, along with measurements of the heliospheric magnetic field and solar wind, we construct a time-, charge- and rigidity-dependent model of solar modulation that can be directly compared to data from a variety of cosmic-ray experiments. Here, we provide a simple analytic formula that can be easily utilized in a variety of applications, allowing us to better predict the effects of solar modulation and reduce the number of free parameters involved in cosmic-ray propagation models.
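The force-field approximation that this work moves beyond can be sketched in a few lines. This is an illustrative sketch of the standard textbook formula, not code from the paper; the function name and the power-law interstellar spectrum in the example are assumptions.

```python
def force_field(j_lis, E_kin, phi, m=0.938):
    """Top-of-atmosphere flux under the force-field approximation.

    j_lis : callable, local interstellar flux vs kinetic energy [GeV]
    E_kin : kinetic energy observed at Earth [GeV]
    phi   : effective modulation potential [GeV] (for |Z| = 1 nuclei)
    m     : particle mass [GeV] (proton by default)
    """
    E_is = E_kin + phi  # kinetic energy the particle had outside the heliosphere
    # Liouville factor: ratio of (pc)^2 = E_kin*(E_kin + 2m) at Earth vs interstellar
    return j_lis(E_is) * (E_kin * (E_kin + 2.0 * m)) / (E_is * (E_is + 2.0 * m))
```

A single parameter phi shifts and suppresses the whole low-energy spectrum, which is exactly the rigidity- and charge-blindness the abstract's improved model addresses.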
Climate Analytics as a Service. Chapter 11
NASA Technical Reports Server (NTRS)
Schnase, John L.
2016-01-01
Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.
Waterhammer Testing and Modeling of the Ares I Upper Stage Reaction Control System
NASA Technical Reports Server (NTRS)
Williams, J. Hunter; Holt, Kimberly A.
2010-01-01
NASA's Ares I rocket is the agency's first step in completing the goals of the Constellation Program, which plans to deliver a new generation of space explorers into low earth orbit for future missions to the International Space Station, the moon, and other destinations within the solar system. Ares I is a two-stage rocket topped by the Orion crew capsule and its service module. The launch vehicle's First Stage is a single, five-segment reusable solid rocket booster (RSRB), derived from the Space Shuttle Program's four segment RSRB. The vehicle's Upper Stage, being designed at Marshall Space Flight Center (MSFC), is propelled by a single J-2X Main Engine fueled with liquid oxygen and liquid hydrogen. During active Upper Stage flight of the Ares I launch vehicle, the Upper Stage Reaction Control System (US ReCS) will perform attitude control operations for the vehicle. The US ReCS will provide three-axis attitude control capability (roll, pitch, and yaw) for the Upper Stage while the J-2X is not firing and roll control capability while the engine is firing. Because of the requirements imposed upon the system, the design must accommodate rapid pulsing of multiple thrusters simultaneously to maintain attitude control. In support of these design activities and in preparation for Critical Design Review, analytical models of the US ReCS propellant feed system have been developed using the Thermal Hydraulic Library of MSC.EASY5 v.2008, herein referred to as EASY5. EASY5 is a commercially available fluid system modeling package with significant history of modeling space propulsion systems. In Fall 2009, a series of development tests were conducted at MSFC on a cold-flow test article for the US ReCS, herein referred to as System Development Test Article (SDTA). 
A subset of those tests performed were aimed at examining the effects of waterhammer on a flight-representative system and to ensure that those effects could be quantified with analytical models and incorporated into the design of the flight system. This paper presents an overview of the test article and the test approach, along with a discussion of the analytical modeling methodology. In addition, the results of that subset of development tests, along with analytical model pre-test predictions and post-test model correlations, will also be discussed in detail.
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails: formulating generalized equations of motion, as a superset of the flow-oscillator models; and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that allows modelling of multiple-degree-of-freedom systems. The extensions derived generalize the Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model. Based on different assumptions, one can derive a variety of flow-oscillator models.
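A minimal member of the flow-oscillator class described above couples a linear structural oscillator to a van der Pol wake variable, in the spirit of classic wake-oscillator models. The specific equations, coefficients, and the slight detuning below are illustrative assumptions, not the authors' derived model:

```python
def flow_oscillator_rhs(s, zeta=0.01, ws=1.0, wf=1.3, eps=0.3, a=0.05, A=0.3):
    """RHS of a generic two-oscillator VIV model.
    s = [y, y', q, q']: structural displacement y, van der Pol wake variable q."""
    y, yd, q, qd = s
    ydd = -2.0 * zeta * ws * yd - ws * ws * y + a * q          # structure forced by wake
    qdd = -eps * wf * (q * q - 1.0) * qd - wf * wf * q + A * ydd  # wake forced by structure
    return [yd, ydd, qd, qdd]

def rk4(f, s, dt):
    """One classical Runge-Kutta step for an autonomous system."""
    k1 = f(s)
    k2 = f([si + 0.5 * dt * ki for si, ki in zip(s, k1)])
    k3 = f([si + 0.5 * dt * ki for si, ki in zip(s, k2)])
    k4 = f([si + dt * ki for si, ki in zip(s, k3)])
    return [si + dt / 6.0 * (a1 + 2.0 * a2 + 2.0 * a3 + a4)
            for si, a1, a2, a3, a4 in zip(s, k1, k2, k3, k4)]
```

The van der Pol term self-excites to a limit cycle (the vortex shedding surrogate), which in turn drives a sustained structural response; the variational framework in the abstract aims to derive such systems rather than postulate them.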
NASA Technical Reports Server (NTRS)
Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig
2018-01-01
This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.
Ball Bearing Analysis with the ORBIS Tool
NASA Technical Reports Server (NTRS)
Halpin, Jacob D.
2016-01-01
Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code provides reliable predictions of bearing internal load distributions, stiffness, deflections, and stresses.
NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System
NASA Technical Reports Server (NTRS)
Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William
2017-01-01
NASA Wrangler is a loosely-coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which is providing rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.
NASA Technical Reports Server (NTRS)
1971-01-01
The analytical models developed for the Space Propulsion Automated Synthesis Modeling (SPASM) program are presented. Weight scaling laws developed during this study are incorporated into the program's scaling data bank. A detailed listing, logic diagram, and input/output formats are supplied for the SPASM program. Two test examples, for one- to four-stage vehicles performing different types of missions, are shown to demonstrate the program's capability and versatility.
2009-01-01
Department of Homeland Security (DHS) has defined 15 National Planning Scenarios (NPSs), along with a Target Capabilities List, which describes...including those in the NPSs, is limited. Considering the range of analytical expertise and resources available in different communities, the suite of...rolled out in phases. From the start, it will include models to support some of the NPSs, as well as models that were most commonly requested in the
TH-C-BRD-02: Analytical Modeling and Dose Calculation Method for Asymmetric Proton Pencil Beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelover, E; Wang, D; Hill, P
2014-06-15
Purpose: A dynamic collimation system (DCS), which consists of two pairs of orthogonal trimmer blades driven by linear motors, has been proposed to decrease the lateral penumbra in pencil beam scanning proton therapy. The DCS reduces lateral penumbra by intercepting the proton pencil beam near the lateral boundary of the target in the beam's eye view. The resultant trimmed pencil beams are asymmetric and laterally shifted, and therefore existing pencil beam dose calculation algorithms are not capable of trimmed-beam dose calculations. This work develops a method to model and compute dose from trimmed pencil beams when using the DCS. Methods: MCNPX simulations were used to determine the dose distributions expected from various trimmer configurations using the DCS. Using these data, the lateral distribution for individual beamlets was modeled with a 2D asymmetric Gaussian function. The integral depth dose (IDD) of each configuration was also modeled by combining the IDD of an untrimmed pencil beam with a linear correction factor. The convolution of these two terms, along with the Highland approximation to account for lateral growth of the beam along the depth direction, allows a trimmed pencil beam dose distribution to be analytically generated. The algorithm was validated by computing dose for a single-energy-layer 5×5 cm² treatment field, defined by the trimmers, using both the proposed method and MCNPX beamlets. Results: The Gaussian-modeled asymmetric lateral profiles along the principal axes match the MCNPX data very well (R² ≥ 0.95 at the depth of the Bragg peak). For the 5×5 cm² treatment plan created with both the modeled and MCNPX pencil beams, the passing rate of the 3D gamma test was 98% using a standard threshold of 3%/3 mm. Conclusion: An analytical method capable of accurately computing asymmetric pencil beam dose when using the DCS has been developed.
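The core construction in the Methods section — a separable 2D asymmetric Gaussian lateral profile (different sigma on each side of the peak, since the trimmed side falls off faster) scaled by an integral depth dose — can be sketched as follows. Function names, the normalization convention, and the placeholder IDD are assumptions for illustration:

```python
import math

def asym_gauss(x, mu, s_lo, s_hi):
    """Unit-peak 1-D asymmetric Gaussian: sigma = s_lo below mu, s_hi above."""
    s = s_lo if x < mu else s_hi
    return math.exp(-0.5 * ((x - mu) / s) ** 2)

def trimmed_beamlet_dose(x, y, depth, idd,
                         mux, sx_lo, sx_hi, muy, sy_lo, sy_hi):
    """Dose = IDD(depth) times a separable 2-D asymmetric lateral profile.
    Each 1-D factor is area-normalized: integral = sqrt(2*pi)*(s_lo + s_hi)/2."""
    norm_x = 2.0 / (math.sqrt(2.0 * math.pi) * (sx_lo + sx_hi))
    norm_y = 2.0 / (math.sqrt(2.0 * math.pi) * (sy_lo + sy_hi))
    return (idd(depth)
            * norm_x * asym_gauss(x, mux, sx_lo, sx_hi)
            * norm_y * asym_gauss(y, muy, sy_lo, sy_hi))
```

A smaller sigma on the trimmed side reproduces the sharpened penumbra, while the untrimmed side keeps the ordinary Gaussian falloff; depth dependence of the sigmas (the Highland growth mentioned in the abstract) is omitted here.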
Pellegrino Vidal, Rocío B; Allegrini, Franco; Olivieri, Alejandro C
2018-03-20
Multivariate curve resolution-alternating least-squares (MCR-ALS) is the model of choice when dealing with some non-trilinear arrays, specifically when the data are of chromatographic origin. To drive the iterative procedure to chemically interpretable solutions, the use of constraints becomes essential. In this work, both simulated and experimental data have been analyzed by MCR-ALS, applying chemically reasonable constraints, and investigating the relationship between selectivity, analytical sensitivity (γ) and root mean square error of prediction (RMSEP). As the selectivity in the instrumental modes decreases, the estimated values for γ did not fully represent the predictive model capabilities, judged from the obtained RMSEP values. Since the available sensitivity expressions have been developed by error propagation theory in unconstrained systems, there is a need of developing new expressions or analytical indicators. They should not only consider the specific profiles retrieved by MCR-ALS, but also the constraints under which the latter ones have been obtained.
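The two figures of merit being compared in this abstract have simple operational definitions, sketched below. These are the generic textbook forms (slope-over-noise sensitivity and validation-set RMSEP), not the specific MCR-ALS expressions under study:

```python
import math

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction over a validation set."""
    n = len(y_ref)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(y_ref, y_pred)) / n)

def analytical_sensitivity(slope, sd_noise):
    """gamma = calibration slope / instrumental noise level; its inverse
    estimates the minimum discernible concentration difference."""
    return slope / sd_noise
```

The abstract's point is that, under constrained MCR-ALS with low selectivity, a large gamma computed this way can coexist with poor RMSEP, so gamma alone overstates predictive ability.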
NASA Technical Reports Server (NTRS)
Everett, L.
1992-01-01
This report documents the performance characteristics of a Targeting Reflective Alignment Concept (TRAC) sensor. The performance is documented for both short and long ranges. For long ranges, the sensor is used without the flat mirror attached to the target. To better understand the capabilities of TRAC-based sensors, an engineering model is required. The model can be used to better design the system for a particular application. This is necessary because there are many interrelated design variables in an application, including lens parameters, camera configuration, and target configuration. The report presents first an analytical development of the performance, and second an experimental verification of the equations. In the analytical development it is assumed that the best vision resolution is a single pixel element. The experimental results suggest, however, that the resolution is better than one pixel; hence the analytical results should be considered worst-case conditions. The report also discusses advantages and limitations of the TRAC sensor in light of the performance estimates. Finally, the report discusses potential improvements.
Energy absorption capabilities of composite sandwich panels under blast loads
NASA Astrophysics Data System (ADS)
Sankar Ray, Tirtha
As blast threats on military and civilian structures continue to be a significant concern, there remains a need for improved design strategies to increase blast resistance capabilities. The approach to blast resistance proposed here is focused on dissipating the high levels of pressure induced during a blast through maximizing the potential for energy absorption of composite sandwich panels, which are a competitive structural member type due to the inherent energy absorption capabilities of fiber reinforced polymer (FRP) composites. Furthermore, the middle core in the sandwich panels can be designed as a sacrificial layer allowing for a significant amount of deformation or progressive failure to maximize the potential for energy absorption. The research here is aimed at the optimization of composite sandwich panels for blast mitigation via energy absorption mechanisms. The energy absorption mechanisms considered include absorbed strain energy due to inelastic deformation as well as energy dissipation through progressive failure of the core of the sandwich panels. The methods employed in the research consist of a combination of experimentally-validated finite element analysis (FEA) and the derivation and use of a simplified analytical model. The key components of the scope of work then includes: establishment of quantified energy absorption criteria, validation of the selected FE modeling techniques, development of the simplified analytical model, investigation of influential core architectures and geometric parameters, and investigation of influential material properties. For the parameters that are identified as being most-influential, recommended values for these parameters are suggested in conceptual terms that are conducive to designing composite sandwich panels for various blast threats. 
Based on reviewing the energy response characteristics of the panel under blast loading, a non-dimensional parameter AET/ET (absorbed energy, AET, normalized by total energy, ET) was suggested to compare energy absorption capabilities of structures under blast loading. In addition, AEweb/ET (where AEweb is the energy absorbed by the middle core) was also employed to evaluate the energy absorption contribution from the web. Taking advantage of FEA and the simplified analytical model, the influences of material properties as well as core architectures and geometries on energy absorption capabilities (quantified by AET/ET and AEweb/ET) were investigated through parametric studies. Results from the material property investigation indicated that the density of the front face sheet and the material strength were most influential on the energy absorption capability of the composite sandwich panels under blast loading. The study to investigate the potential effectiveness of energy absorbed via inelastic deformation compared to energy absorbed via progressive failure indicated that for practical applications (where the position of the bomb is usually unknown and the panel is designed to be the same anywhere), energy absorption via inelastic deformation is the more efficient approach. Regarding the geometric optimization, it was found that a core architecture consisting of vertically-oriented webs was ideal. The optimum values for these parameters can be generally described as those which cause the most inelasticity, but not failure, of the face sheets and webs.
Hybrid methods for cybersecurity analysis :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Warren Leon,; Dunlavy, Daniel M.
2014-01-01
Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel.
The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crull, E W; Brown Jr., C G; Perkins, M P
2008-07-30
For short monopoles in this low-power case, it has been shown that a simple circuit model is capable of accurate predictions for the shape and magnitude of the antenna response to lightning-generated electric field coupling effects, provided that the elements of the circuit model have accurate values. Numerical EM simulation can be used to provide more accurate values for the circuit elements than the simple analytical formulas, since the analytical formulas are used outside of their region of validity. However, even with the approximate analytical formulas the simple circuit model produces reasonable results, which would improve if more accurate analytical models were used. This report discusses the coupling analysis approaches taken to understand the interaction between a time-varying EM field and a short monopole antenna, within the context of lightning safety for nuclear weapons at DOE facilities. It describes the validation of a simple circuit model using a laboratory study in order to understand the indirect coupling of energy into a part, and the resulting voltage. Results show that in this low-power case, the circuit model predicts peak voltages within approximately 32% using circuit component values obtained from analytical formulas and within about 13% using circuit component values obtained from numerical EM simulation. We note that the analytical formulas are used outside of their region of validity: first, the antenna is insulated rather than a bare wire, and there are perhaps fringing field effects near the termination of the outer conductor that the formula does not take into account; also, the effective height formula is for a monopole directly over a ground plane, while in the time-domain measurement setup the monopole is elevated above the ground plane by about 1.5 inches (refer to Figure 5).
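The kind of circuit model described — an electrically short monopole represented as a Thevenin source (effective height times incident field) driving the antenna capacitance into a capacitive load — can be sketched as below. The textbook formulas used (h_eff = h/2 and a thin-wire capacitance estimate) are generic approximations, not the specific element values from this report:

```python
import math

def monopole_circuit_peak(E_peak, h, a, C_load, eps0=8.854e-12):
    """Peak load voltage for a short monopole of height h and wire radius a
    over a ground plane, exposed to a quasi-static peak field E_peak [V/m],
    modeled as a capacitive divider into a load capacitance C_load."""
    h_eff = h / 2.0  # effective height of a short monopole over ground
    # thin-wire antenna capacitance estimate (valid only for h >> a)
    C_ant = 2.0 * math.pi * eps0 * h / (math.log(2.0 * h / a) - 1.0)
    v_oc = h_eff * E_peak                    # Thevenin open-circuit voltage
    return v_oc * C_ant / (C_ant + C_load)   # capacitive divider
```

As the report notes, such formulas degrade for insulated wires and elevated ground references, which is why numerically extracted element values predicted the measurements more closely.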
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.
2015-12-01
Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.
Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas
2017-01-01
Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
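As a rough illustration of the kind of pipeline TopicLens builds on (not the paper's optimized on-the-fly algorithms), standard nonnegative matrix factorization over a tiny toy corpus might look like this, with scikit-learn assumed as the library:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = [
    "stochastic topic models for text mining",
    "matrix factorization for document clustering",
    "neural embeddings visualize document corpora",
    "an interactive lens widens topic exploration",
]
# Term-document matrix, then X ~= W @ H with nonnegative factors:
# rows of H are topics over terms, rows of W are per-document topic weights.
X = TfidfVectorizer().fit_transform(docs)
nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)
H = nmf.components_
```

In a TopicLens-style system, a factorization like this would be recomputed only for the documents under the lens, and the resulting topic weights fed to a 2D embedding for display.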
Horowitz, Arthur J.
2013-01-01
Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.
Meng, X Flora; Baetica, Ania-Ariadna; Singhal, Vipul; Murray, Richard M
2017-05-01
Noise is often indispensable to key cellular activities, such as gene expression, necessitating the use of stochastic models to capture its dynamics. The chemical master equation (CME) is a commonly used stochastic model of Kolmogorov forward equations that describe how the probability distribution of a chemically reacting system varies with time. Finding analytic solutions to the CME can have benefits, such as expediting simulations of multiscale biochemical reaction networks and aiding the design of distributional responses. However, analytic solutions are rarely known. A recent method of computing analytic stationary solutions relies on gluing simple state spaces together recursively at one or two states. We explore the capabilities of this method and introduce algorithms to derive analytic stationary solutions to the CME. We first formally characterize state spaces that can be constructed by performing single-state gluing of paths, cycles or both sequentially. We then study stochastic biochemical reaction networks that consist of reversible, elementary reactions with two-dimensional state spaces. We also discuss extending the method to infinite state spaces and designing the stationary behaviour of stochastic biochemical reaction networks. Finally, we illustrate the aforementioned ideas using examples that include two interconnected transcriptional components and biochemical reactions with two-dimensional state spaces. © 2017 The Author(s).
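For a concrete instance of an analytically solvable CME, the one-species production-degradation network (production at rate k, degradation at rate γn) satisfies detailed balance, so its stationary distribution can be built state by state from the ratio π(n+1)/π(n) = k/(γ(n+1)); the result is Poisson with mean k/γ. This is a minimal sketch of an analytic stationary solution, not the gluing algorithm of the paper:

```python
import math

def stationary_birth_death(k, gamma, n_max=50):
    """Analytic stationary CME solution for  0 -> X (rate k),
    X -> 0 (rate gamma*n), via detailed balance on the truncated
    state space {0, ..., n_max}."""
    w = [1.0]
    for n in range(n_max):
        w.append(w[-1] * k / (gamma * (n + 1)))  # pi(n+1)/pi(n)
    Z = sum(w)                                   # normalization
    return [p / Z for p in w]

pi = stationary_birth_death(k=10.0, gamma=2.0)
# Detailed balance yields a Poisson distribution with mean k/gamma = 5
poisson5 = [math.exp(-5.0) * 5.0**n / math.factorial(n) for n in range(51)]
```

The truncation at n_max = 50 is harmless here because the Poisson tail beyond 50 is negligible for mean 5; for networks without detailed balance, recursions like this are exactly what the gluing constructions generalize.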
ERIC Educational Resources Information Center
Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.
2013-01-01
Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…
COBRA ATD multispectral camera response model
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data-fitting techniques were applied to these measured response curves to obtain nonlinear expressions that estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
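The data-fitting step described above can be illustrated with a generic nonlinear least-squares fit. The power-law response form, parameter values, and synthetic data below are assumptions for illustration only, not the COBRA model's actual expressions:

```python
import numpy as np
from scipy.optimize import curve_fit

def response(irradiance, a, gamma, offset):
    """Illustrative nonlinear camera response: digitized output as a
    power-law function of irradiance, with gain and exposure held fixed."""
    return a * irradiance**gamma + offset

# Synthetic "measured" response curve standing in for lab data
E = np.linspace(0.01, 1.0, 50)
rng = np.random.default_rng(0)
counts = response(E, 900.0, 0.8, 12.0) + rng.normal(0.0, 2.0, 50)

# Fit the nonlinear expression to the measured curve
popt, pcov = curve_fit(response, E, counts, p0=[800.0, 1.0, 0.0])
```

A monotonic power-law form like this one is also analytically invertible for irradiance, which is the property the abstract highlights as useful for simulation.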
Camboulives, A-R; Velluet, M-T; Poulenard, S; Saint-Antonin, L; Michau, V
2018-02-01
The performance of an optical communication link between the ground and a geostationary satellite can be impaired by scintillation, beam wandering, and beam spreading caused by propagation through atmospheric turbulence. These effects on the link performance can be mitigated by tracking and by error correction codes coupled with interleaving. Precise numerical tools capable of describing the irradiance fluctuations statistically and of creating an irradiance time series are needed to characterize the benefits of these techniques and optimize them. Wave optics propagation methods have proven their capability of modeling the effects of atmospheric turbulence on a beam, but they are known to be computationally intensive. We present an analytical-numerical model that provides good results for the probability density functions of irradiance fluctuations, as well as a time series, with an important saving of time and computational resources.
NASA Astrophysics Data System (ADS)
Devrient, M.; Da, X.; Frick, T.; Schmidt, M.
Laser transmission welding is a well-known joining technology for thermoplastics. To meet the demands of lightweight, cost-effective, and green production, thermoplastics are usually filled with glass fibers. These lead to higher absorption and more scattering within the upper joining partner, with a negative influence on the welding process. Here, an experimental method for characterizing the scattering behavior of semi-crystalline thermoplastics filled with short glass fibers is introduced, together with a finite element model of the welding process capable of considering scattering, as well as an analytical model. The experimental data are used for the numerical and analytical investigation of laser transmission welding under consideration of scattering. The scattering effects of several thermoplastics on the calculated temperature fields as well as weld seam geometries are quantified.
Analytical Tools for Behavioral Influences Operations
2003-12-01
NASIC's Investment in Analytical Capabilities ... Study Limitations ... get started. This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to ... capabilities. Though this study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing
3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Dame, L. T.; Chen, P. C.; Hartle, M. S.; Huang, H. T.
1985-01-01
The objective is to develop analytical tools capable of economically evaluating the cyclic time-dependent plasticity which occurs in hot section engine components in areas of strain concentration resulting from the combination of both mechanical and thermal stresses. Three models were developed. A simple model performs time-dependent inelastic analysis using the power-law creep equation. The second model is the classical model of Professors Walter Haisler and David Allen of Texas A&M University. The third model is the unified model of Bodner, Partom, et al. All models were customized for linear variation of loads and temperatures, with all material properties and constitutive models being temperature dependent.
Advancing Collaboration through Hydrologic Data and Model Sharing
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.
2015-12-01
HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
NASA Astrophysics Data System (ADS)
Guzmán, H. A.; Lárraga, M. E.; Alvarez-Icaza, L.; Carvajal, J.
2018-02-01
In this paper, a reliable cellular automata model is presented, oriented to faithfully reproduce deceleration and acceleration according to realistic driver reactions when vehicles with different deceleration capabilities are considered. The model focuses on describing complex traffic phenomena by coding in its rules the basic mechanisms of driver behavior and vehicle capabilities and kinetics, while preserving simplicity. In particular, vehicle kinetics is based on uniformly accelerated motion, rather than on impulsive accelerated motion as in most existing CA models. Thus, the proposed model calculates in an analytic way three safety-preserving distances to determine the best action a follower vehicle can take under a worst-case scenario. Besides, the prediction analysis guarantees that, under the proper assumptions, collisions between vehicles cannot happen at any future time. Simulation results indicate that all interactions of heterogeneous vehicles (i.e., car-truck, truck-car, car-car and truck-truck) are properly reproduced by the model. In addition, the model overcomes one of the major limitations of CA models for traffic modeling: the inability to perform a smooth approach to slower or stopped vehicles. Moreover, the model is also capable of reproducing most empirical findings, including the backward speed of the downstream front of the traffic jam and the different congested traffic patterns induced by a system with open boundary conditions with an on-ramp. Like most CA models, integer values are used to make the model run faster, which makes the proposed model suitable for real-time traffic simulation of large networks.
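For readers unfamiliar with traffic CA models, the classic Nagel-Schreckenberg update is the standard baseline that this paper improves on; it exhibits exactly the impulsive, non-smooth braking the authors criticize. A minimal sketch on a circular road:

```python
import random

def nasch_step(pos, vel, L, vmax=5, p_slow=0.3):
    """One update of the classic Nagel-Schreckenberg traffic CA:
    accelerate, brake to the gap ahead, randomize, then move,
    on a ring of L cells with parallel (synchronous) update."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_vel = vel[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % L        # empty cells to leader
        v = min(vel[i] + 1, vmax, gap)             # accelerate, then brake
        if v > 0 and random.random() < p_slow:     # random slowdown
            v -= 1
        new_vel[i] = v
    new_pos = [(pos[i] + new_vel[i]) % L for i in range(n)]
    return new_pos, new_vel

random.seed(1)
pos, vel = [0, 10, 20, 30], [0, 0, 0, 0]
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L=60)
```

Because each vehicle's speed is capped by the gap to its leader, the rule is collision-free, but braking from the gap constraint is instantaneous; the model in the abstract replaces this with uniformly accelerated motion and analytic safety distances.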
Analytical network-averaging of the tube model: Strain-induced crystallization in natural rubber
NASA Astrophysics Data System (ADS)
Khiêm, Vu Ngoc; Itskov, Mikhail
2018-07-01
In this contribution, we extend the analytical network-averaging concept (Khiêm and Itskov, 2016) to phase transition during strain-induced crystallization of natural rubber. To this end, a physically-based constitutive model describing the nonisothermal strain-induced crystallization is proposed. Accordingly, the spatial arrangement of polymer subnetworks is driven by crystallization nucleation and consequently alters the mesoscopic deformation measures. The crystallization growth is elucidated by diffusion of chain segments into crystal nuclei. The crystallization results in a change of temperature and an evolution of heat source. By this means, not only the crystallization kinetics but also the Gough-Joule effect are thoroughly described. The predictive capability of the constitutive model is illustrated by comparison with experimental data for natural rubbers undergoing strain-induced crystallization. All measurable values such as stress, crystallinity and heat source are utilized for the comparison.
Influence of Wake Models on Calculated Tiltrotor Aerodynamics
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2001-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.
Analytical methods to predict liquid congealing in ram air heat exchangers during cold operation
NASA Astrophysics Data System (ADS)
Coleman, Kenneth; Kosson, Robert
1989-07-01
Ram air heat exchangers used to cool liquids such as lube oils or Ethylene-Glycol/water solutions can be subject to congealing in very cold ambients, resulting in a loss of cooling capability. Two-dimensional, transient analytical models have been developed to explore this phenomenon with both continuous and staggered fin cores. Staggered fin predictions are compared to flight test data from the E-2C Allison T56 engine lube oil system during winter conditions. For simpler calculations, a viscosity ratio correction was introduced and found to provide reasonable cold ambient performance predictions for the staggered fin core, using a one-dimensional approach.
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed wing and rotary wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be assessed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
NASA Astrophysics Data System (ADS)
Ivanova, Bojidarka; Spiteller, Michael
2018-04-01
The problem considered in this paper is the set of quantitative correlation model equations between experimental kinetic and thermodynamic parameters of electrospray ionization (ESI) or atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) coupled with collision-induced dissociation mass spectrometry, accounting for the fact that the physical phenomena and mechanisms of ESI- and APCI-ion formation are completely different. Forty-two fragment reactions of three analytes under independent ESI- and APCI-measurements are described. The new quantitative models developed here allow the reaction kinetics and thermodynamics to be studied correlatively using the methods of mass spectrometry, whose complementary application with the methods of quantum chemistry provides 3D structural information on the analytes. Both static and dynamic quantum chemical computations are carried out. The objects of analysis are [2,3-dimethyl-4-(4-methyl-benzoyl)-2,3-di-p-tolyl-cyclobutyl]-p-tolyl-methanone (1) and the polycyclic aromatic hydrocarbon derivatives of dibenzoperylene (2) and tetrabenzo[a,c,fg,op]naphthacene (3), respectively. Since (1) is known to be a product of [2π+2π] cycloaddition reactions of chalcone (1,3-di-p-tolyl-propenone), producing cyclic derivatives with different stereoselectivity, the study provides crucial data on the capability of mass spectrometry to determine the stereoselectivity of the analytes. This work also provides the first quantitative treatment of the relations '3D molecular/electronic structure'-'quantum chemical diffusion coefficient'-'mass spectrometric diffusion coefficient', thus extending the capability of mass spectrometry for determination of the exact 3D structure of the analytes using independent measurements and computations of the diffusion coefficients.
The determination of the experimental diffusion parameters is carried out with the 'current monitoring method', evaluating the translational diffusion of charged analytes, while the theoretical modelling of MS ions and the computation of theoretical diffusion coefficients are based on the Arrhenius-type behavior of the charged species under ESI- and APCI-conditions. Although the study offers sound considerations for the quantitative relations between reaction kinetics/thermodynamics and the 3D structure of the analytes, together with the correlations between 3D molecular/electronic structure, quantum chemical diffusion coefficient, and mass spectrometric diffusion coefficient, which contribute significantly to structural analytical chemistry, the results are also important to other areas such as organic synthesis and catalysis.
NASA Technical Reports Server (NTRS)
Haste, Deepak; Azam, Mohammad; Ghoshal, Sudipto; Monte, James
2012-01-01
Health management (HM) in any engineering system requires adequate understanding of the system's functioning; a sufficient amount of monitored data; the capability to extract, analyze, and collate information; and the capability to combine understanding and information for HM-related estimation and decision-making. Rotorcraft systems are, in general, highly complex. Obtaining adequate understanding of the functioning of such systems is quite difficult, because of the proprietary (restricted access) nature of their designs and dynamic models. Development of an EIM (exact inverse map) solution for rotorcraft requires a process that can overcome the abovementioned difficulties and maximally utilize monitored information for HM facilitation via employing advanced analytic techniques. The goal was to develop a versatile HM solution for rotorcraft for facilitation of the Condition Based Maintenance Plus (CBM+) capabilities. The effort was geared towards developing analytic and reasoning techniques, and proving the ability to embed the required capabilities on a rotorcraft platform, paving the way for implementing the solution on an aircraft-level system for consolidation and reporting. The solution for rotorcraft can be used offboard or embedded directly onto a rotorcraft system. The envisioned solution utilizes available monitored and archived data for real-time fault detection and identification, failure precursor identification, and offline fault detection and diagnostics, health condition forecasting, optimal guided troubleshooting, and maintenance decision support. A variant of the onboard version is a self-contained hardware and software (HW+SW) package that can be embedded on rotorcraft systems.
The HM solution comprises components that gather/ingest data and information, perform information/feature extraction, analyze information in conjunction with the dependency/diagnostic model of the target system, facilitate optimal guided troubleshooting, and offer decision support for optimal maintenance.
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) A full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) A cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) A WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following: - New API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area average, vertical average, re-gridding, standard deviation, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., 
reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalysis for ease of inter-comparisons - Automated tools to retrieve and prepare data collections for analytic processing
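Two of the commonly used operations listed above, area averaging and anomaly computation, can be sketched with plain NumPy on a toy (time, lat, lon) array. The cosine-latitude weighting, grid shape, and random data below are illustrative assumptions, not the service's implementation:

```python
import numpy as np

# Toy monthly-mean field: 24 months on a 2-degree-ish (lat, lon) grid.
rng = np.random.default_rng(0)
lat = np.linspace(-89.0, 89.0, 90)
field = rng.random((24, 90, 180))

# Area average: weight each latitude band by cos(lat), a standard
# approximation for grid-cell area on a regular lat-lon grid.
w = np.cos(np.radians(lat))[None, :, None]
area_avg = (field * w).sum(axis=(1, 2)) / (w.sum() * field.shape[2])

# Anomaly: deviation of each month from its calendar-month climatology.
clim = field.reshape(2, 12, 90, 180).mean(axis=0)
anom = field - np.tile(clim, (2, 1, 1))
```

Server-side analytics of this kind avoid moving full reanalysis volumes to the client: only the reduced series (here, 24 area-averaged values) would cross the wire.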
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
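The semi-supervised steering idea can be sketched with plain Lee-Seung multiplicative updates in which one topic row of H is pinned to a user-supplied vector. This is an illustrative simplification of UTOPIAN's formulation (which uses weighted regularization rather than hard pinning), with all names and values below chosen for the example:

```python
import numpy as np

def ss_nmf(X, k, H_fix=None, iters=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF (X ~= W @ H, Frobenius loss) with an
    optional user-pinned topic: if H_fix is given, topic row 0 of H
    is held at that vector after every update, mimicking the kind of
    user steering UTOPIAN offers."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    if H_fix is not None:
        H[0] = H_fix
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        if H_fix is not None:
            H[0] = H_fix                 # re-pin the user topic
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

X = np.abs(np.random.default_rng(1).random((20, 12)))
W, H = ss_nmf(X, k=3, H_fix=np.ones(12))
```

Because the updates are multiplicative, nonnegativity is preserved automatically, and the pinned row acts as a fixed reference topic that the free topics organize around.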
NASA Technical Reports Server (NTRS)
Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)
2016-01-01
A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.
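The library-plus-CLI pattern this distribution package describes can be sketched as follows. Every class, method, and endpoint name here is hypothetical, invented for illustration; it is not the actual NASA climate data analytic services API.

```python
import argparse

# Hypothetical sketch of a client library wrapping a climate data analytic
# service, plus a command-line interface that forwards commands to it.

class ClimateDataService:
    """Stand-in client; a real one would issue requests to the server interface."""
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def submit(self, operation, variable, start, end):
        # A real implementation would send the request and return a job handle.
        return {"endpoint": self.endpoint, "op": operation,
                "var": variable, "period": (start, end), "status": "queued"}

def main(argv=None):
    parser = argparse.ArgumentParser(description="Toy climate-service CLI")
    parser.add_argument("operation", choices=["avg", "sum", "max", "min"])
    parser.add_argument("variable")
    parser.add_argument("--start", required=True)
    parser.add_argument("--end", required=True)
    args = parser.parse_args(argv)
    svc = ClimateDataService("https://example.invalid/cds")
    job = svc.submit(args.operation, args.variable, args.start, args.end)
    print(job["status"], job["op"], job["var"])

if __name__ == "__main__":
    main(["avg", "tas", "--start", "198001", "--end", "198912"])
```

Sample programs in such a package would play the role of the hard-coded `main(...)` call above: templates a client developer copies and adapts.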
Visual Information for the Desktop, version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-03-29
VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.
Analytical calculation on the determination of steep side wall angles from far field measurements
NASA Astrophysics Data System (ADS)
Cisotto, Luca; Pereira, Silvania F.; Urbach, H. Paul
2018-06-01
In the semiconductor industry, the performance and capabilities of the lithographic process are evaluated by measuring specific structures. These structures are often gratings of which the shape is described by a few parameters such as period, middle critical dimension, height, and side wall angle (SWA). Upon direct measurement or retrieval of these parameters, the determination of the SWA suffers from considerable inaccuracies. Although the scattering effects that steep SWAs have on the illumination can be obtained with rigorous numerical simulations, analytical models constitute a very useful tool to get insights into the problem we are treating. In this paper, we develop an approach based on analytical calculations to describe the scattering of a cliff and a ridge with steep SWAs. We also propose a detection system to determine the SWAs of the structures.
Relative motion of orbiting particles under the influence of perturbing forces. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Eades, J. B., Jr.
1974-01-01
The relative motion of orbiting vehicles under the influence of various perturbing forces has been studied to determine what influence these inputs, and others, can have. The analytical tasks are described in general terms; the force types considered are outlined, modeled, and simulated; and the capabilities of the computer programs which have evolved in support of this work are summarized.
An analytical model with flexible accuracy for deep submicron DCVSL cells
NASA Astrophysics Data System (ADS)
Valiollahi, Sepideh; Ardeshir, Gholamreza
2018-07-01
Differential cascoded voltage switch logic (DCVSL) cells are among the best candidates of circuit designers for a wide range of applications due to advantages such as low input capacitance, high switching speed, small area and noise immunity; nevertheless, a proper model has not yet been developed to analyse them. This paper analyses deep submicron DCVSL cells based on a flexible accuracy-simplicity trade-off including the following key features: (1) the model is capable of producing closed-form expressions with an acceptable accuracy; (2) model equations can be solved numerically to offer higher accuracy; (3) the short-circuit currents occurring in high-low/low-high transitions are accounted for in the analysis and (4) the changes in the operating modes of transistors during transitions, together with an efficient submicron I-V model which incorporates the most important non-ideal short-channel effects, are considered. The accuracy of the proposed model is validated in IBM 0.13 µm CMOS technology through comparisons with the accurate physically based BSIM3 model. The maximum error of the analytical solutions is below 10%, while that of the numerical solutions is below 7%.
NASA Technical Reports Server (NTRS)
Orifici, Adrian C.; Krueger, Ronald
2010-01-01
With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc(TM) and MD Nastran(TM). Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementations in Marc(TM) and MD Nastran(TM) were capable of accurately replicating the benchmark delamination growth results and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.
2012-12-01
MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]
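The MapReduce approach to storage-based computation mentioned above can be sketched generically: each mapper collapses one data chunk (for example, one file's worth of a variable) to a small partial result, and a reducer merges the partials. The chunk contents below are toy numbers, not MERRA data, and the function names are illustrative.

```python
from functools import reduce

# Generic MapReduce sketch for a global mean over chunked data:
# map each chunk to a (sum, count) partial, then reduce by merging partials.

def map_chunk(chunk):
    """Mapper: collapse one chunk of values to a (sum, count) partial."""
    return (sum(chunk), len(chunk))

def combine(p, q):
    """Reducer: merge two (sum, count) partials."""
    return (p[0] + q[0], p[1] + q[1])

chunks = [[280.0, 281.5], [279.0], [282.5, 278.0, 281.0]]  # toy values (K)
total, count = reduce(combine, map(map_chunk, chunks))
print(round(total / count, 3))
```

Because `combine` is associative, the reduction can proceed in any grouping, which is what lets the partials be computed in parallel next to the storage holding each chunk.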
Inelastic response of metal matrix composites under biaxial loading
NASA Technical Reports Server (NTRS)
Mirzadeh, F.; Pindera, Marek-Jerzy; Herakovich, Carl T.
1990-01-01
Elements of the analytical/experimental program to characterize the response of silicon carbide titanium (SCS-6/Ti-15-3) composite tubes under biaxial loading are outlined. The analytical program comprises prediction of initial yielding and subsequent inelastic response of unidirectional and angle-ply silicon carbide titanium tubes using a combined micromechanics approach and laminate analysis. The micromechanics approach is based on the method of cells model and has the capability of generating the effective thermomechanical response of metal matrix composites in the linear and inelastic region in the presence of temperature- and time-dependent properties of the individual constituents and imperfect bonding. The influence of these factors on the initial yield surfaces and inelastic response of (0) and (+ or - 45)s SCS-6/Ti-15-3 laminates loaded by different combinations of stresses is examined. The generated analytical predictions will be compared with the experimental results. The experimental program comprises generation of initial yield surfaces, subsequent stress-strain curves and determination of failure loads of the SCS-6/Ti-15-3 tubes under selected loading conditions. The results of the analytical investigation are employed to define the actual loading paths for the experimental program. A brief overview of the experimental methodology is given. This includes the test capabilities of the Composite Mechanics Laboratory at the University of Virginia, the SCS-6/Ti-15-3 composite tubes secured from McDonnell Douglas Corporation, a test fixture specifically developed for combined axial-torsional loading, and the MTS combined axial-torsion loader that will be employed in the actual testing.
NASA Astrophysics Data System (ADS)
Cabello, Violeta
2017-04-01
This communication will present the advancement of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations to find, access or estimate them will be presented alongside a reflection on the relation between analytical scales and data availability.
NASA Technical Reports Server (NTRS)
Yeager, William T., Jr.; Kvaternik, Raymond G.
2001-01-01
A historical account of the contributions of the Aeroelasticity Branch (AB) and the Langley Transonic Dynamics Tunnel (TDT) to rotorcraft technology and development since the tunnel's inception in 1960 is presented. The paper begins with a summary of the major characteristics of the TDT and a description of the unique capability offered by the TDT for testing aeroelastic models by virtue of its heavy gas test medium. This is followed by some remarks on the role played by scale models in the design and development of rotorcraft vehicles and a review of the basic scaling relationships important for designing and building dynamic aeroelastic models of rotorcraft vehicles for testing in the TDT. Chronological accounts of helicopter and tiltrotor research conducted in AB/TDT are then described in separate sections. Both experimental and analytical studies are reported and include a description of the various physical and mathematical models employed, the specific objectives of the investigations, and illustrative experimental and analytical results.
NASA Technical Reports Server (NTRS)
1975-01-01
An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Layerwise Finite Elements for Smart Piezoceramic Composite Plates in Thermal Environments
NASA Technical Reports Server (NTRS)
Saravanos, Dimitris A.; Lee, Ho-Jun
1996-01-01
Analytical formulations are presented which account for the coupled mechanical, electrical, and thermal response of piezoelectric composite laminates and plate structures. A layerwise theory is formulated with the inherent capability to explicitly model the active and sensory response of piezoelectric composite plates having arbitrary laminate configurations in thermal environments. Finite element equations are derived and implemented for a bilinear 4-noded plate element. Application cases demonstrate the capability to manage thermally induced bending and twisting deformations in symmetric and antisymmetric composite plates with piezoelectric actuators, and show the corresponding electrical response of distributed piezoelectric sensors. Finally, the resultant stresses in the thermal piezoelectric composite laminates are investigated.
Strategic analytics: towards fully embedding evidence in healthcare decision-making.
Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh
2015-01-01
Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely place the organization to contribute to not only system-wide operational reporting, but more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to help the agency's programs contextualize and inform key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.
2010-01-01
The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help in characterizing the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a representative braided composite are conducted. Overall, the developed method shows promise, but improvements needed in test and analysis methods for better predictive capability are identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, D.R.; Hutchinson, J.L.
Eagle II is a prototype analytic model derived from the integration of the low-resolution Eagle model with the high-resolution SIMNET model. This integration promises a new capability to allow for a more effective examination of proposed or existing combat systems that could not be easily evaluated using either Eagle or SIMNET alone. In essence, Eagle II becomes a multi-resolution combat model in which simulated combat units can exhibit both high- and low-fidelity behavior at different times during model execution. This capability allows a unit to behave in a high-fidelity manner only when required, thereby reducing the overall computational and manpower requirements for a given study. In this framework, the SIMNET portion enables a highly credible assessment of the performance of individual combat systems under consideration, encompassing both engineering performance and crew capabilities. However, when the assessment being conducted goes beyond system performance and extends to questions of force structure balance and sustainment, then SIMNET results can be used to "calibrate" the Eagle attrition process appropriate to the study at hand. Advancing technologies, changes in the world-wide threat, requirements for flexible response, declining defense budgets, and down-sizing of military forces motivate the development of manpower-efficient, low-cost, responsive tools for combat development studies. Eagle and SIMNET both serve as credible and useful tools. The integration of these two models promises enhanced capabilities to examine the broader, deeper, more complex battlefield of the future with higher fidelity, greater responsiveness, and lower overall cost.
Montemurro, Milagros; Pinto, Licarion; Véras, Germano; de Araújo Gomes, Adriano; Culzoni, María J; Ugulino de Araújo, Mário C; Goicoechea, Héctor C
2016-07-01
A study regarding the acquisition and analytical utilization of four-way data acquired by monitoring excitation-emission fluorescence matrices at different elution time points in a fast HPLC procedure is presented. The data were modeled with three well-known algorithms: PARAFAC, U-PLS/RTL and MCR-ALS, the latter conveniently adapted to model third-order data. The second-order advantage was exploited when analyzing samples containing uncalibrated components. The best results were furnished by the algorithm U-PLS/RTL. This fact is indicative of both the absence of peak time shifts among samples and high collinearity among spectra. Besides, this latent-variable structured algorithm is better able to handle the need for high sensitivity in the analysis of one of the analytes. In addition, a significant enhancement in both predictions and analytical figures of merit was observed for carbendazim, thiabendazole, fuberidazole, carbofuran, carbaryl and 1-naphthol when going from second- to third-order data. The LODs obtained ranged between 0.02 and 2.4 μg L(-1). Copyright © 2016 Elsevier B.V. All rights reserved.
Bringing Business Intelligence to Health Information Technology Curriculum
ERIC Educational Resources Information Center
Zheng, Guangzhi; Zhang, Chi; Li, Lei
2015-01-01
Business intelligence (BI) and healthcare analytics are emerging technologies that provide analytical capability to help the healthcare industry improve service quality, reduce cost, and manage risks. However, such a component on analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…
Electron-Beam Diagnostic Methods for Hypersonic Flow Diagnostics
NASA Technical Reports Server (NTRS)
1994-01-01
The purpose of this work was the evaluation of the use of electron-beam fluorescence for flow measurements during hypersonic flight. Both analytical and numerical models were developed in this investigation to evaluate quantitatively flow field imaging concepts based upon the electron beam fluorescence technique for use in flight research and wind tunnel applications. Specific models were developed for: (1) fluorescence excitation/emission for nitrogen, (2) rotational fluorescence spectrum for nitrogen, (3) single and multiple scattering of electrons in a variable density medium, (4) spatial and spectral distribution of fluorescence, (5) measurement of rotational temperature and density, (6) optical filter design for fluorescence imaging, and (7) temperature accuracy and signal acquisition time requirements. Application of these models to a typical hypersonic wind tunnel flow is presented. In particular, the capability of simulating the fluorescence resulting from electron impact ionization in a variable density nitrogen or air flow provides the capability to evaluate the design of imaging instruments for flow field mapping. The result of this analysis is a recommendation that quantitative measurement of hypersonic flow fields using electron-beam fluorescence is a tractable method with electron beam energies of 100 keV. With lower electron energies, electron scattering increases with significant beam divergence, which makes quantitative imaging difficult. The potential application of the analytical and numerical models developed in this work is in the design of a flow field imaging instrument for use in hypersonic wind tunnels or onboard a flight research vehicle.
Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
2000-01-01
Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.
Sato, Tatsuhiko
2015-01-01
By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
Initial study of thermal energy storage in unconfined aquifers. [UCATES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haitjema, H.M.; Strack, O.D.L.
1986-04-01
Convective heat transport in unconfined aquifers is modeled in a semi-analytic way. The transient groundwater flow is modeled by superposition of analytic functions, whereby changes in the aquifer storage are represented by a network of triangles, each with a linearly varying sink distribution. This analytic formulation incorporates the nonlinearity of the differential equation for unconfined flow and eliminates numerical dispersion in modeling heat convection. The thermal losses through the aquifer base and vadose zone are modeled rather crudely. Only vertical heat conduction is considered in these boundaries, whereby a linearly varying temperature is assumed at all times. The latter assumption appears reasonable for thin aquifer boundaries. However, assuming such thin aquifer boundaries may lead to an overestimation of the thermal losses when the aquifer base is regarded as infinitely thick in reality. The approach is implemented in the computer program UCATES, which serves as a first step toward the development of a comprehensive screening tool for ATES systems in unconfined aquifers. In its present form, the program is capable of predicting the relative effects of regional flow on the efficiency of ATES systems. However, only after a more realistic heat-loss mechanism is incorporated in UCATES will reliable predictions of absolute ATES efficiencies be possible.
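The superposition idea behind this semi-analytic approach can be sketched with the simplest analytic elements: the potential at a point is the sum of closed-form solutions for individual elements. UCATES itself uses triangles with linearly varying sink distributions; the sketch below substitutes point sinks and uniform regional flow purely to keep the example short, and the sign convention and numbers are illustrative.

```python
import math

# Superposition of analytic elements: total potential = uniform regional
# flow + sum of point-sink (well) solutions. A toy stand-in for the
# linearly varying triangular sink distributions used in UCATES.

def well_potential(x, y, xw, yw, Q):
    """Logarithmic potential of a point sink of strength Q at (xw, yw)."""
    r = math.hypot(x - xw, y - yw)
    return -Q / (2.0 * math.pi) * math.log(r)

def uniform_flow(x, y, qx):
    """Potential of uniform regional flow with specific discharge qx."""
    return -qx * x

def total_potential(x, y, wells, qx):
    phi = uniform_flow(x, y, qx)
    for xw, yw, Q in wells:
        phi += well_potential(x, y, xw, yw, Q)
    return phi

# An injection/extraction pair (equal and opposite strengths), as in an
# ATES doublet, superposed on regional flow.
wells = [(0.0, 0.0, 100.0), (50.0, 0.0, -100.0)]
print(total_potential(25.0, 10.0, wells, qx=0.5))  # -12.5
```

On the midplane between the equal-and-opposite wells their logarithmic contributions cancel exactly, so only the regional-flow term survives; superposition makes such checks immediate.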
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data.
The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Computer modeling of a two-junction, monolithic cascade solar cell
NASA Technical Reports Server (NTRS)
Lamorte, M. F.; Abbott, D.
1979-01-01
The theory and design criteria for monolithic, two-junction cascade solar cells are described. The departure from the conventional solar cell analytical method and the reasons for using the integral form of the continuity equations are briefly discussed. The results of design optimization are presented. The energy conversion efficiency that is predicted for the optimized structure is greater than 30% at 300 K, AM0, and one sun. The analytical method predicts device performance characteristics as a function of temperature over the range 300 to 600 K. While the analysis is capable of determining most of the physical processes occurring in each of the individual layers, only the more significant device performance characteristics are presented.
Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach
NASA Astrophysics Data System (ADS)
Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.
2018-04-01
A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.
DIVE: A Graph-based Visual Analytics Framework for Big Data
Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie
2014-01-01
The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197
Low velocity impact analysis of composite laminated plates
NASA Astrophysics Data System (ADS)
Zheng, Daihua
2007-12-01
In the past few decades polymer composites have been utilized more in structures where high strength and light weight are major concerns, e.g., aircraft, high-speed boats, and sporting goods. It is well known that they are susceptible to damage resulting from lateral impact by foreign objects, such as dropped tools, hail, and debris thrown up from the runway. The impact response of the structures depends not only on the material properties but also on the dynamic behavior of the impacted structure. Although commercial software is capable of analyzing such impact processes, it often requires extensive expertise and rigorous training for design and analysis. Analytical models are useful as they allow parametric studies and provide a foundation for validating the numerical results from large-scale commercial software. Therefore, it is necessary to develop analytical or semi-analytical models to better understand the behaviors of composite structures under impact and their associated failure process. In this study, several analytical models are proposed in order to analyze the impact response of composite laminated plates. Based on Meyer's power law, a semi-analytical model is obtained for small-mass impact response of infinite composite laminates by the method of asymptotic expansion. The original nonlinear second-order ordinary differential equation is transformed into two linear ordinary differential equations. This is achieved by neglecting high-order terms in the asymptotic expansion. As a result, the semi-analytical solution of the overall impact response can be applied to contact laws with varying coefficients. Then an analytical model accounting for permanent deformation based on an elasto-plastic contact law is proposed to obtain the closed-form solutions of the wave-controlled impact responses of composite laminates.
The analytical model is also used to predict the threshold velocity for delamination onset by combining it with an existing quasi-static delamination criterion. The predictions are compared with experimental data and explicit finite element LS-DYNA simulation. The comparisons show reasonable agreement. Furthermore, an analytical model is developed to evaluate the combined effects of prestresses and permanent deformation based on the linearized elasto-plastic contact law and the Laplace transform technique. It is demonstrated that prestresses do not have noticeable effects on the time history of contact force and strains, but they have significant consequences on the plate central displacement. For an impacted composite laminate with prestresses, the contact force increases with increasing impactor mass and with increasing laminate thickness and interlaminar shear strength. The combined analytical and numerical investigations provide validated models for elastic and elasto-plastic impact analysis of composite structures and shed light on the design of impact-resistant composite systems.
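In its elastic limit, a linearized contact law reduces the impact problem to the textbook spring-mass idealization, which admits a closed-form peak contact force. A minimal sketch of that special case, not the paper's wave-controlled or prestressed model; all numbers are illustrative:

```python
import math

def peak_contact_force(m, k, v0):
    """Spring-mass idealization with a linearized contact law:
    m*x'' = -k*x, x(0) = 0, x'(0) = v0, so x(t) = (v0/w) sin(w*t)
    with w = sqrt(k/m), and the peak force is
    F_max = k * v0 / w = v0 * sqrt(k * m).
    A textbook special case, not the paper's full impact model."""
    return v0 * math.sqrt(k * m)

# Illustrative values: 2 kg impactor, 8 N/m effective stiffness, 3 m/s
f_max = peak_contact_force(m=2.0, k=8.0, v0=3.0)
```

The closed form makes the parametric trends in the abstract visible directly: peak force grows with impactor mass and with contact (or plate) stiffness.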
Orbital maneuvering engine feed system coupled stability investigation
NASA Technical Reports Server (NTRS)
Kahn, D. R.; Schuman, M. D.; Hunting, J. K.; Fertig, K. W.
1975-01-01
A digital computer model used to analyze and predict engine feed system coupled instabilities over a frequency range of 10 to 1000 Hz was developed and verified. The analytical approach to modeling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure is described and the governing equations in each of the technical areas are presented. This is followed by a description of the generalized computer model, including formulation of the discrete subprograms and their integration into an overall engineering model structure. The operation and capabilities of the engineering model were verified by comparing the model's theoretical predictions with experimental data from an OMS-type engine with a known feed system/engine chugging history.
Assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry
Taylor, Howard E.; Garbarino, John R.
1988-01-01
A thorough assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry was conducted for selected analytes of importance in water quality applications and hydrologic research. A multielement calibration curve technique was designed to produce accurate and precise results in analysis times of approximately one minute. The suite of elements included Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Hg, Li, Mn, Mo, Ni, Pb, Se, Sr, V, and Zn. The effects of sample matrix composition on the accuracy of the determinations showed that matrix elements (such as Na, Ca, Mg, and K) that may be present in natural water samples at concentration levels greater than 50 mg/L resulted in as much as a 10% suppression in ion current for analyte elements. Operational detection limits are presented.
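The multielement calibration-curve technique amounts to fitting ion current against standard concentration for each element and inverting the fit for unknowns. A minimal sketch under that reading; the standards and count values below are invented for illustration, not data from the study:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Synthetic standards: concentration (mg/L) -> ion counts, illustrative only
standards = {"Pb": ([0.0, 0.05, 0.10, 0.50], [12.0, 512.0, 1015.0, 5010.0])}

# One calibration line per element
cal = {el: fit_line(conc, counts) for el, (conc, counts) in standards.items()}

def concentration(element, counts):
    """Invert the element's calibration line: counts -> mg/L."""
    slope, intercept = cal[element]
    return (counts - intercept) / slope
```

In practice each of the 19 analytes would carry its own calibration line, and matrix-induced ion-current suppression (the ~10% effect noted above) would bias the inverted concentrations low unless corrected.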
Study of helicopter roll control effectiveness criteria
NASA Technical Reports Server (NTRS)
Heffley, Robert K.; Bourne, Simon M.; Curtiss, Howard C., Jr.; Hindson, William S.; Hess, Ronald A.
1986-01-01
A study of helicopter roll control effectiveness based on closed-loop task performance measurement and modeling is presented. Roll control criteria are based on task margin, the excess of vehicle task performance capability over the pilot's task performance demand. Appropriate helicopter roll axis dynamic models are defined for use with analytic models for task performance. Both near-earth and up-and-away large-amplitude maneuvering phases are considered. The results of in-flight and moving-base simulation measurements are presented to support the roll control effectiveness criteria offered. This volume contains the theoretical analysis, simulation results, and criteria development.
Aerodynamic and acoustic test of a United Technologies model scale rotor at DNW
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Liu, Sandy R.; Jordan, Dave E.; Landgrebe, Anton J.; Lorber, Peter F.; Pollack, Michael J.; Martin, Ruth M.
1990-01-01
The UTC model scale rotors, the DNW wind tunnel, the AFDD rotary wing test stand, the UTRC and AFDD aerodynamic and acoustic data acquisition systems, and the scope of test matrices are discussed and an introduction to the test results is provided. It is pointed out that a comprehensive aero/acoustic database of several configurations of the UTC scaled model rotor has been created. The data is expected to improve understanding of rotor aerodynamics, acoustics, and dynamics, and lead to enhanced analytical methodology and design capabilities for the next generation of rotorcraft.
The NASTRAN theoretical manual
NASA Technical Reports Server (NTRS)
1981-01-01
Designed to accommodate additions and modifications, this commentary on NASTRAN describes the problem-solving capabilities of the program in a narrative fashion and presents developments of the analytical and numerical procedures that underlie the program. Seventeen major sections and numerous subsections cover: the organizational aspects of the program, utility matrix routines, static structural analysis, heat transfer, dynamic structural analysis, computer graphics, special structural modeling techniques, error analysis, interaction between structures and fluids, and aeroelastic analysis.
Control and characterization of a bistable laminate generated with piezoelectricity
NASA Astrophysics Data System (ADS)
Lee, Andrew J.; Moosavian, Amin; Inman, Daniel J.
2017-08-01
Extensive research has been conducted on utilizing smart materials such as piezoelectric and shape memory alloy actuators to induce snap through of bistable structures for morphing applications. However, there has only been limited success in initiating snap through from both stable states due to the lack of actuation authority. A novel solution in the form of a piezoelectrically generated bistable laminate consisting of only macro fiber composites (MFC), allowing complete configuration control without any external assistance, is explored in detail here. Specifically, this paper presents the full analytical, computational, and experimental results of the laminate's design, geometry, bifurcation behavior, and snap through capability. By bonding two actuated MFCs in a [0_MFC/90_MFC]_T layup and releasing the voltage post cure, piezoelectric strain anisotropy and the resulting in-plane residual stresses yield two statically stable states that are cylindrically shaped. The analytical model uses the Rayleigh-Ritz minimization of total potential energy, and finite element analysis is implemented in MSC Nastran. The [0_MFC/90_MFC]_T laminate is then manufactured and experimentally characterized for model validation. This paper demonstrates the adaptive laminate's unassisted forward and reverse snap through capability enabled by the efficiencies gained from simultaneously being the actuator and the primary structure.
Using analytic element models to delineate drinking water source protection areas.
Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve
2006-01-01
Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
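For reference, the volumetric equation compared against the AEMs above is the standard fixed-radius method: the protection radius encloses the volume of water pumped over the chosen time of travel. A sketch with illustrative parameter values (not values from the Ohio systems):

```python
import math

def volumetric_radius(Q, t, n, H):
    """Fixed protection radius (m) from the standard volumetric equation
    r = sqrt(Q * t / (pi * n * H)).
    Q: pumping rate (m^3/day), t: time of travel (days),
    n: effective porosity (-), H: aquifer/open-interval thickness (m)."""
    return math.sqrt(Q * t / (math.pi * n * H))

# Illustrative well: 500 m^3/day pumped for a 5-year time of travel
r = volumetric_radius(Q=500.0, t=5 * 365.0, n=0.25, H=20.0)
```

The equation is radially symmetric by construction, which is exactly why it cannot reproduce the upgradient-elongated capture zones the AEMs delineate under regional flow.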
Modelling shoreline evolution in the vicinity of a groyne and a river
NASA Astrophysics Data System (ADS)
Valsamidis, Antonios; Reeve, Dominic E.
2017-01-01
Analytical solutions to the equations governing shoreline evolution are well-known and have value both as pedagogical tools and for conceptual design. Nevertheless, solutions have been restricted to a fairly narrow class of conditions with limited applicability to real-life situations. We present a new analytical solution for a widely encountered situation where a groyne is constructed close to a river to control sediment movement. The solution, which employs Laplace transforms, has the advantage that a solution for time-varying conditions may be constructed from the solution for constant conditions by means of the Heaviside procedure. Solutions are presented for various combinations of wave conditions and sediment supply/removal by the river. An innovation introduced in this work is the capability to provide an analytical assessment of the accretion or erosion caused near the groyne due to its proximity to the river which may act either as a source or a sink of sediment material.
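For context, the constant-condition building block that such Laplace-transform solutions extend is the classic Pelnard-Considere accretion profile updrift of a fully blocking groyne. A sketch of that textbook special case, not the paper's combined groyne-and-river solution; all values are illustrative:

```python
import math

def shoreline(x, t, eps, tan_alpha):
    """Classic Pelnard-Considere accretion updrift of a groyne at x = 0
    under a constant wave angle alpha: the one-line model
    dy/dt = eps * d2y/dx2 with blocked transport at the groyne gives
    y(x, t) = 2 tan(alpha) [ sqrt(eps t / pi) exp(-x^2 / (4 eps t))
                             - (x / 2) erfc(x / (2 sqrt(eps t))) ].
    eps is the shoreline diffusion coefficient (m^2/s). A textbook
    special case, not the paper's groyne-and-river solution."""
    s = 2.0 * math.sqrt(eps * t)
    return 2.0 * tan_alpha * (
        math.sqrt(eps * t / math.pi) * math.exp(-(x / s) ** 2)
        - 0.5 * x * math.erfc(x / s)
    )
```

At the groyne itself the profile grows like sqrt(t), y(0, t) = 2 tan(alpha) sqrt(eps t / pi), and decays monotonically updrift; the Heaviside procedure mentioned above stacks such solutions to handle time-varying wave conditions.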
On-line soft sensing in upstream bioprocessing.
Randek, Judit; Mandenius, Carl-Fredrik
2018-02-01
This review provides an overview and a critical discussion of novel possibilities of applying soft sensors for on-line monitoring and control of industrial bioprocesses. Focus is on bio-product formation in the upstream process but also the integration with other parts of the process is addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
Techniques for sensing methanol concentration in aqueous environments
NASA Technical Reports Server (NTRS)
Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)
2001-01-01
An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Blinzler, Brina J.; Binienda, Wieslaw K.
2010-01-01
A macro-level, finite element-based model has been developed to simulate the mechanical and impact response of triaxially-braided polymer matrix composites. In the analytical model, the triaxial braid architecture is simulated by using four parallel shell elements, each of which is modeled as a laminated composite. For the current analytical approach, each shell element is considered to be a smeared homogeneous material. The commercial transient dynamic finite element code LS-DYNA is used to conduct the simulations, and a continuum damage mechanics model internal to LS-DYNA is used as the material constitutive model. The constitutive model requires stiffness and strength properties of an equivalent unidirectional composite. Simplified micromechanics methods are used to determine the equivalent stiffness properties, and results from coupon level tests on the braided composite are utilized to back out the required strength properties. Simulations of quasi-static coupon tests of several representative braided composites are conducted to demonstrate the correlation of the model. Impact simulations of representative braided composites are conducted to demonstrate the capability of the model to predict the penetration velocity and damage patterns obtained experimentally.
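Simplified micromechanics for equivalent unidirectional stiffness is commonly of the rule-of-mixtures type; whether the authors used exactly this form is an assumption here. A sketch with typical carbon/epoxy moduli (not the paper's inputs):

```python
def rule_of_mixtures(Ef, Em, Vf):
    """Equivalent unidirectional ply stiffness from constituent moduli:
    longitudinal modulus by the rule of mixtures,
    transverse modulus by the inverse rule of mixtures.
    A common first approximation, not necessarily the specific
    micromechanics method used in the paper.
    Ef, Em: fiber and matrix moduli (Pa); Vf: fiber volume fraction."""
    Vm = 1.0 - Vf
    E1 = Ef * Vf + Em * Vm            # longitudinal (fiber-dominated)
    E2 = 1.0 / (Vf / Ef + Vm / Em)    # transverse (matrix-dominated)
    return E1, E2

# Typical carbon fiber / epoxy values, illustrative only
E1, E2 = rule_of_mixtures(Ef=230e9, Em=3e9, Vf=0.6)
```

The large E1/E2 ratio this produces is what makes the smeared-shell idealization of each braid direction workable: each shell carries stiffness dominated by its own fiber orientation.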
Research and development activities in unified control-structure modeling and design
NASA Technical Reports Server (NTRS)
Nayak, A. P.
1985-01-01
Results of work to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.
A generic multi-flex-body dynamics, controls simulation tool for space station
NASA Technical Reports Server (NTRS)
London, Ken W.; Lee, John F.; Singh, Ramen P.; Schubele, Buddy
1991-01-01
An order-n multi-flex-body Space Station simulation tool is introduced. The flex multibody modeling is generic enough to model all phases of Space Station from build-up through to the Assembly Complete configuration and beyond. Multibody subsystems such as the Mobile Servicing System (MSS) undergoing a prescribed translation and rotation are also allowed. The software includes aerodynamic, gravity gradient, and magnetic field models. User-defined controllers can be discrete or continuous. Extensive preprocessing of 'body by body' NASTRAN flex data is built in. A significant aspect, too, is the integrated controls design capability, which includes model reduction and analytic linearization.
Analytical solution for shear bands in cold-rolled 1018 steel
NASA Astrophysics Data System (ADS)
Voyiadjis, George Z.; Almasri, Amin H.; Faghihi, Danial; Palazotto, Anthony N.
2012-06-01
Cold-rolled 1018 (CR-1018) carbon steel has been well known for its susceptibility to adiabatic shear banding under dynamic loadings. Analysis of these localizations highly depends on the selection of the constitutive model. To deal with this issue, a constitutive model that takes temperature and strain rate effect into account is proposed. The model is motivated by two physical-based models: the Zerilli and Armstrong and the Voyiadjis and Abed models. This material model, however, incorporates a simple softening term that is capable of simulating the softening behavior of CR-1018 steel. Instability, localization, and evolution of adiabatic shear bands are discussed and presented graphically. In addition, the effect of hydrostatic pressure is illustrated.
Using the MCNP Taylor series perturbation feature (efficiently) for shielding problems
NASA Astrophysics Data System (ADS)
Favorite, Jeffrey
2017-09-01
The Taylor series or differential operator perturbation method, implemented in MCNP and invoked using the PERT card, can be used for efficient parameter studies in shielding problems. This paper shows how only two PERT cards are needed to generate an entire parameter study, including statistical uncertainty estimates (an additional three PERT cards can be used to give exact statistical uncertainties). One realistic example problem involves a detailed helium-3 neutron detector model and its efficiency as a function of the density of its high-density polyethylene moderator. The MCNP differential operator perturbation capability is extremely accurate for this problem. A second problem involves the density of the polyethylene reflector of the BeRP ball and is an example of first-order sensitivity analysis using the PERT capability. A third problem is an analytic verification of the PERT capability.
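The post-processing behind such a parameter study is a second-order Taylor reconstruction of the tally from differential coefficients supplied by PERT tallies (in MCNP, METHOD=2 requests the first-order term and METHOD=3 the second-order term). A sketch of that arithmetic only; it is not MCNP input, and extraction of the coefficients from PERT output is assumed:

```python
def taylor_response(c0, dc1, dc2, delta):
    """Reconstruct a tally at perturbed parameter p0 + delta from the
    unperturbed tally c0 and the first- and second-order differential
    coefficients dc1, dc2 (assumed already extracted from PERT output):
        c(p0 + delta) ~= c0 + dc1*delta + 0.5*dc2*delta**2
    One pair of coefficients covers the whole parameter sweep, which is
    why only two PERT cards are needed for an entire study."""
    return c0 + dc1 * delta + 0.5 * dc2 * delta ** 2

# Sweep a density perturbation over a range with a single coefficient pair
sweep = [taylor_response(2.0, 3.0, 8.0, d / 10.0) for d in range(-5, 6)]
```

The reconstruction is exact for responses that are genuinely quadratic in the parameter, and its error elsewhere is the third-order Taylor remainder, which is why the method works so well for the moderator-density study described above.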
NASA Technical Reports Server (NTRS)
Ryan, Margaret A.; Shevade, A. V.; Taylor, C. J.; Homer, M. L.; Jewell, A. D.; Kisor, A.; Manatt, K. S .; Yen, S. P. S.; Blanco, M.; Goddard, W. A., III
2006-01-01
An array-based sensing system based on polymer/carbon composite conductometric sensors is under development at JPL for use as an environmental monitor in the International Space Station. Sulfur dioxide has been added to the analyte set for this phase of development. Using molecular modeling techniques, the interaction energy between SO2 and polymer functional groups has been calculated, and polymers selected as potential SO2 sensors. Experiment has validated the model and two selected polymers have been shown to be promising materials for SO2 detection.
Making advanced analytics work for you.
Barton, Dominic; Court, David
2012-10-01
Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.
Multi-analytical Approaches Informing the Risk of Sepsis
NASA Astrophysics Data System (ADS)
Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael
Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization, prolonged intensive care unit (ICU) and hospital stay. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive goal oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could be used to provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together could be used to provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about the respective predictive capabilities, while considering their clinical significance.
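A decision-tree analysis of the kind described reduces, at each node, to choosing the split that minimizes class impurity. A minimal one-level sketch (a decision stump with Gini impurity); the variable name and values are invented for illustration, not clinical data from the study:

```python
def gini(labels):
    """Gini impurity of a binary label list: 2*p*(1-p)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2.0 * p * (1.0 - p)

def best_stump(xs, ys):
    """One-level decision tree: pick the threshold on one variable that
    minimizes weighted Gini impurity of the two resulting groups.
    Illustrative only; the study used full decision-tree and cluster
    analyses on clinical data."""
    best_t, best_score = None, float("inf")
    n = len(ys)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Hypothetical 'lactate' values vs. outcome (1 = sepsis), invented data
lactate = [0.8, 1.1, 1.5, 2.0, 3.5, 4.0, 4.8, 5.2]
sepsis = [0, 0, 0, 0, 1, 1, 1, 1]
threshold = best_stump(lactate, sepsis)
```

A full tree recurses this split on each child node; the paper's point is that the variables such trees select can then be compared against those retained by regression and cluster analyses.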
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
2017-01-01
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; DeLoach, Richard
2003-01-01
A wind tunnel experiment for characterizing the aerodynamic and propulsion forces and moments acting on a research model airplane is described. The model airplane called the Free-flying Airplane for Sub-scale Experimental Research (FASER), is a modified off-the-shelf radio-controlled model airplane, with 7 ft wingspan, a tractor propeller driven by an electric motor, and aerobatic capability. FASER was tested in the NASA Langley 12-foot Low-Speed Wind Tunnel, using a combination of traditional sweeps and modern experiment design. Power level was included as an independent variable in the wind tunnel test, to allow characterization of power effects on aerodynamic forces and moments. A modeling technique that employs multivariate orthogonal functions was used to develop accurate analytic models for the aerodynamic and propulsion force and moment coefficient dependencies from the wind tunnel data. Efficient methods for generating orthogonal modeling functions, expanding the orthogonal modeling functions in terms of ordinary polynomial functions, and analytical orthogonal blocking were developed and discussed. The resulting models comprise a set of smooth, differentiable functions for the non-dimensional aerodynamic force and moment coefficients in terms of ordinary polynomials in the independent variables, suitable for nonlinear aircraft simulation.
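Orthogonal modeling functions of the kind described can be generated by Gram-Schmidt orthogonalization of ordinary polynomial regressor columns, after which each model term is fit by an independent projection. A minimal sketch, not the authors' specific (efficiency-oriented) generation method:

```python
def dot(u, v):
    """Inner product of two equal-length regressor columns."""
    return sum(a * b for a, b in zip(u, v))

def orthogonalize(columns):
    """Gram-Schmidt on regressor columns (lists of equal length).
    With orthogonal columns, each model coefficient is an independent
    projection of the response, so terms can be added or dropped
    without refitting the others."""
    ortho = []
    for col in columns:
        w = list(col)
        for q in ortho:
            coef = dot(w, q) / dot(q, q)
            w = [wi - coef * qi for wi, qi in zip(w, q)]
        ortho.append(w)
    return ortho

# Ordinary polynomial regressors in one variable: 1, x, x^2
x = [-1.0, -0.5, 0.0, 0.5, 1.0]
cols = [[1.0] * 5, x, [v * v for v in x]]
q = orthogonalize(cols)
```

Expanding the fitted orthogonal-function model back into ordinary polynomials, as the abstract describes, just re-accumulates the Gram-Schmidt coefficients, yielding the smooth, differentiable polynomial coefficient models suitable for simulation.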
High Performance Visualization using Query-Driven Visualizationand Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Campbell, Scott; Dart, Eli
2006-06-15
Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. These new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
Modeling Progressive Failure of Bonded Joints Using a Single Joint Finite Element
NASA Technical Reports Server (NTRS)
Stapleton, Scott E.; Waas, Anthony M.; Bednarcyk, Brett A.
2010-01-01
Enhanced finite elements are elements with an embedded analytical solution which can capture detailed local fields, enabling more efficient, mesh-independent finite element analysis. In the present study, an enhanced finite element is applied to generate a general framework capable of modeling an array of joint types. The joint field equations are derived using the principle of minimum potential energy, and the resulting solutions for the displacement fields are used to generate shape functions and a stiffness matrix for a single joint finite element. This single finite element thus captures the detailed stress and strain fields within the bonded joint, but it can function within a broader structural finite element model. The costs associated with a fine mesh of the joint can thus be avoided while still obtaining a detailed solution for the joint. Additionally, the capability to model non-linear adhesive constitutive behavior has been included within the method, and progressive failure of the adhesive can be modeled by using a strain-based failure criterion and re-sizing the joint as the adhesive fails. Results of the model compare favorably with experimental and finite element results.
Telematics Options and Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Cabell
This presentation describes the data tracking and analytical capabilities of telematics devices. Federal fleet managers can use the systems to keep their drivers safe, maintain a fuel efficient fleet, ease their reporting burden, and save money. The presentation includes an example of how much these capabilities can save fleets.
Status Report on Ex-Vessel Coolability and Water Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.; Robb, K. R.
Specific to BWR plants, current accident management guidance calls for flooding the drywell to a level of approximately 1.2 m (4 feet) above the drywell floor once vessel breach has been determined. While this action can help to submerge ex-vessel core debris, it can also result in flooding the wetwell and thereby rendering the wetwell vent path unavailable. An alternate strategy is being developed in the industry guidance for responding to the severe accident capable vent Order, EA-13-109. The alternate strategy being proposed would throttle the flooding rate to achieve a stable wetwell water level while preserving the wetwell vent path. The overall objective of this work is to upgrade existing analytical tools (i.e., MELTSPREAD and CORQUENCH, which have been used as part of the DOE-sponsored Fukushima accident analyses) in order to provide flexible, analytically capable, and validated models to support the development of water throttling strategies for BWRs that are aimed at keeping ex-vessel core debris covered with water while preserving the wetwell vent path.
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Development of PARMA: PHITS-based analytical radiation model in the atmosphere.
Sato, Tatsuhiko; Yasuda, Hiroshi; Niita, Koji; Endo, Akira; Sihver, Lembit
2008-08-01
Estimation of cosmic-ray spectra in the atmosphere has been essential for the evaluation of aviation doses. We therefore calculated these spectra by performing Monte Carlo simulation of cosmic-ray propagation in the atmosphere using the PHITS code. The accuracy of the simulation was well verified by experimental data taken under various conditions, even near sea level. Based on a comprehensive analysis of the simulation results, we proposed an analytical model for estimating the cosmic-ray spectra of neutrons, protons, helium ions, muons, electrons, positrons and photons applicable to any location in the atmosphere at altitudes below 20 km. Our model, named PARMA, enables us to calculate the cosmic radiation doses rapidly with a precision equivalent to that of the Monte Carlo simulation, which requires much more computational time. With these properties, PARMA is capable of improving the accuracy and efficiency of the cosmic-ray exposure dose estimations not only for aircrews but also for the public on the ground.
Chiap, P; Rbeida, O; Christiaens, B; Hubert, Ph; Lubda, D; Boos, K S; Crommen, J
2002-10-25
A new kind of silica-based restricted-access material (RAM) has been tested in pre-columns for the on-line solid-phase extraction (SPE) of basic drugs from directly injected plasma samples before their quantitative analysis by reversed-phase liquid chromatography (LC), using the column switching technique. The outer surface of the porous RAM particles contains hydrophilic diol groups while sulphonic acid groups are bound to the internal surface, which gives the sorbent the properties of a strong cation exchanger towards low molecular mass compounds. Macromolecules such as proteins have no access to the internal surface of the pre-column due to their exclusion from the pores and are then flushed directly out. The retention capability of this novel packing material has been tested for some hydrophilic basic drugs, such as atropine, fenoterol, ipratropium, procaine, sotalol and terbutaline, used as model compounds. The influence of the composition of the washing liquid on the retention of the analytes in the pre-column has been investigated. The elution profiles of the different compounds and the plasma matrix as well as the time needed for the transfer of the analytes from the pre-column to the analytical column were determined in order to deduce the most suitable conditions for the clean-up step and develop on-line methods for the LC determination of these compounds in plasma. The cationic exchange sorbent was also compared to another RAM, namely RP-18 ADS (alkyl diol silica) sorbent, with respect to retention capability towards basic analytes.
Magnetic cleanliness verification approach on tethered satellite
NASA Technical Reports Server (NTRS)
Messidoro, Piero; Braghin, Massimo; Grande, Maurizio
1990-01-01
Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.
2013-11-01
...by existing cyber-attack detection tools far exceeds the analysts' cognitive capabilities. Grounded in perceptual and cognitive theory, many visual... Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key... analyst are called "working hypotheses"); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new...
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
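The micro-level Bayes Net direction mentioned above reduces, in its smallest form, to a Bayesian update of license risk from an indicator. The toy below is not from the report; every number and name is invented for illustration.

```python
# Hedged toy of the micro-level idea: update the probability of
# proliferation intent given one noisy indicator (e.g., an anomalous
# end-user profile). All probabilities here are invented placeholders.

prior = 0.05                 # P(intent) before seeing the indicator
p_flag_given_intent = 0.7    # indicator sensitivity (assumed)
p_flag_given_benign = 0.1    # indicator false-positive rate (assumed)

# Bayes' rule: P(intent | flag) = P(flag | intent) P(intent) / P(flag)
evidence = prior * p_flag_given_intent + (1 - prior) * p_flag_given_benign
posterior = prior * p_flag_given_intent / evidence
print(round(posterior, 3))
```

A full Bayes Net chains many such conditional tables over suppliers, technologies, and end uses; the single-indicator update is the unit operation it repeats.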
Walker, Greg; Römann, Philipp; Poller, Bettina; Löbmann, Korbinian; Grohganz, Holger; Rooney, Jeremy S; Huff, Gregory S; Smith, Geoffrey P S; Rades, Thomas; Gordon, Keith C; Strachan, Clare J; Fraser-Miller, Sara J
2017-12-04
This study uses a multimodal analytical approach to evaluate the rates of (co)amorphization of milled drug and excipient and the effectiveness of different analytical methods in detecting these changes. Indomethacin and tryptophan were the model substances, and the analytical methods included low-frequency Raman spectroscopy (785 nm excitation and capable of measuring both low- (10 to 250 cm -1 ) and midfrequency (450 to 1800 cm -1 ) regimes, and a 830 nm system (5 to 250 cm -1 )), conventional (200-3000 cm -1 ) Raman spectroscopy, Fourier transform infrared spectroscopy (FTIR), and X-ray powder diffraction (XRPD). The kinetics of amorphization were found to be faster for the mixture, and indeed, for indomethacin, only partial amorphization occurred (after 360 min of milling). Each technique was capable of identifying the transformations, but some, such as low-frequency Raman spectroscopy and XRPD, provided less ambiguous signatures than the midvibrational frequency techniques (conventional Raman and FTIR). The low-frequency Raman spectra showed intense phonon mode bands for the crystalline and cocrystalline samples that could be used as a sensitive probe of order. Multivariate analysis has been used to further interpret the spectral changes. Overall, this study demonstrates the potential of low-frequency Raman spectroscopy, which has several practical advantages over XRPD, for probing (dis-)order during pharmaceutical processing, showcasing its potential for future development, and implementation as an in-line process monitoring method.
A model system to mimic environmentally active surface film roughness and hydrophobicity.
Grant, Jacob S; Shaw, Scott K
2017-10-01
This work presents the development and initial assessment of a laboratory platform to allow quantitative studies on model urban films. The platform consists of stearic acid and eicosane mixtures that are solution deposited from hexanes onto smooth, solid substrates. We show that this model has distinctive capabilities to better mimic a naturally occurring film's morphology and hydrophobicity, two important parameters that have not previously been incorporated into model film systems. The physical and chemical properties of the model films are assessed using a variety of analytical instruments. The film thickness and roughness are probed via atomic force microscopy while the film composition, wettability, and water uptake are analyzed by Fourier transform infrared spectroscopy, contact angle goniometry, and quartz crystal microbalance, respectively. Simulated environmental maturation is achieved by exposing the film to regulated amounts of UV/ozone. Ultimately, oxidation of the film is monitored by the analytical techniques mentioned above and proceeds as expected to produce a utile model film system. Including variable roughness and tunable surface coverage results in several key advantages over prior model systems, and will more accurately represent native urban film behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.
Applying deep bidirectional LSTM and mixture density network for basketball trajectory prediction
NASA Astrophysics Data System (ADS)
Zhao, Yu; Yang, Rennong; Chevalier, Guillaume; Shah, Rajiv C.; Romijnders, Rob
2018-04-01
Data analytics helps basketball teams to create tactics. However, manual data collection and analytics are costly and ineffective. Therefore, we applied a deep bidirectional long short-term memory (BLSTM) and mixture density network (MDN) approach. This model is not only capable of predicting a basketball trajectory based on real data, but it also can generate new trajectory samples. It is an excellent application to help coaches and players decide when and where to shoot. Its structure is particularly suitable for dealing with time series problems. BLSTM receives forward and backward information at the same time, while stacking multiple BLSTMs further increases the learning ability of the model. Combined with BLSTMs, MDN is used to generate a multi-modal distribution of outputs. Thus, the proposed model can, in principle, represent arbitrary conditional probability distributions of output variables. We tested our model with two experiments on three-pointer datasets from NBA SportVu data. In the hit-or-miss classification experiment, the proposed model outperformed other models in terms of the convergence speed and accuracy. In the trajectory generation experiment, eight model-generated trajectories at a given time closely matched real trajectories.
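The mixture density network head described above maps the recurrent backbone's output to a multi-modal distribution from which trajectory points can be sampled. The sketch below abstracts the BLSTM away behind a random vector and shows only the MDN bookkeeping; the 5-values-per-component layout and all names are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

# Hedged sketch of an MDN output layer over 3-D trajectory points.
# `raw` stands in for the BLSTM's projected final state: 5 values per
# mixture component (weight logit, mean x/y/z, shared log-sigma).

def mdn_params(raw, K):
    """Split raw outputs into mixture weights, means, and std devs."""
    raw = np.asarray(raw, dtype=float).reshape(K, 5)
    logits = raw[:, 0]
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                      # softmax -> mixture weights
    mu = raw[:, 1:4]                    # component means (x, y, z)
    sigma = np.exp(raw[:, 4])           # exp keeps std devs positive
    return pi, mu, sigma

def mdn_sample(pi, mu, sigma, rng):
    """Draw one 3-D point: pick a component, then sample its Gaussian."""
    k = rng.choice(len(pi), p=pi)
    return mu[k] + sigma[k] * rng.standard_normal(3)

rng = np.random.default_rng(0)
raw = rng.standard_normal(2 * 5)        # pretend BLSTM output, K = 2
pi, mu, sigma = mdn_params(raw, K=2)
point = mdn_sample(pi, mu, sigma, rng)
print(pi, point)
```

Sampling repeatedly from the conditioned mixture is what lets the model generate new trajectory candidates rather than a single point estimate.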
The Transfer Function Model as a Tool to Study and Describe Space Weather Phenomena
NASA Technical Reports Server (NTRS)
Porter, Hayden S.; Mayr, Hans G.; Bhartia, P. K. (Technical Monitor)
2001-01-01
The Transfer Function Model (TFM) is a semi-analytical, linear model that is designed especially to describe thermospheric perturbations associated with magnetic storms and substorm activity. It is a multi-constituent model (N2, O, He, H, Ar) that accounts for wind induced diffusion, which significantly affects not only the composition and mass density but also the temperature and wind fields. Because the TFM adopts a semianalytic approach in which the geometry and temporal dependencies of the driving sources are removed through the use of height-integrated Green's functions, it provides physical insight into the essential properties of processes being considered, which are uncluttered by the accidental complexities that arise from particular source geometries and time dependences. Extending from the ground to 700 km, the TFM eliminates spurious effects due to arbitrarily chosen boundary conditions. A database of transfer functions, computed only once, can be used to synthesize a wide range of spatial and temporal source dependencies. The response synthesis can be performed quickly in real-time using only limited computing capabilities. These features make the TFM unique among global dynamical models. Given these desirable properties, a version of the TFM has been developed for personal computers (PC) using advanced platform-independent 3D visualization capabilities. We demonstrate the model capabilities with simulations for different auroral sources, including the response of ducted gravity wave modes that propagate around the globe. The thermospheric response is found to depend strongly on the spatial and temporal frequency spectra of the storm. Such varied behavior is difficult to describe in statistical empirical models. To improve the capability of space weather prediction, the TFM thus could be grafted naturally onto existing statistical models using data assimilation.
Aerodynamic parameter studies and sensitivity analysis for rotor blades in axial flight
NASA Technical Reports Server (NTRS)
Chiu, Y. Danny; Peters, David A.
1991-01-01
An analytical capability is offered for aerodynamic parametric studies and sensitivity analyses of rotary wings in axial flight by using a 3-D undistorted wake model in curved lifting line theory. The governing equations are solved by both the Multhopp Interpolation technique and the Vortex Lattice method. The singularity from the bound vortices is eliminated through Hadamard's finite part concept. Good numerical agreement between both analytical methods and finite difference methods is found. Parametric studies were made to assess the effects of several shape variables on aerodynamic loads. It is found, e.g., that a rotor blade with out-of-plane and inplane curvature can theoretically increase lift in the inboard and outboard regions respectively without introducing an additional induced drag.
An Economical Semi-Analytical Orbit Theory for Retarded Satellite Motion About an Oblate Planet
NASA Technical Reports Server (NTRS)
Gordon, R. A.
1980-01-01
Brouwer and Brouwer-Lyddane's use of the Von Zeipel-Delaunay method is employed to develop an efficient analytical orbit theory suitable for microcomputers. A succinctly simple pseudo-phenomenologically conceptualized algorithm is introduced which accurately and economically synthesizes modeling of drag effects. The method epitomizes and manifests effortless efficient computer mechanization. Simulated trajectory data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects for microcomputer ground based or onboard predicted orbital representation. Real tracking data are used to demonstrate that the theory's orbit determination and orbit prediction capabilities are favorably adaptable to and are comparable with results obtained utilizing complex definitive Cowell method solutions on satellites experiencing significant drag effects.
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
Flat-plate solar array project. Volume 8: Project analysis and integration
NASA Technical Reports Server (NTRS)
Mcguire, P.; Henry, P.
1986-01-01
Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and important analytical models developed by PA&I for its analysis and assessment activities are reviewed.
NASA Astrophysics Data System (ADS)
Johnson, S. P.; Rohrer, M. E.
2017-12-01
The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. 
This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.
Analysis of Well-Clear Boundary Models for the Integration of UAS in the NAS
NASA Technical Reports Server (NTRS)
Upchurch, Jason M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Chamberlain, James P.; Consiglio, Maria C.
2014-01-01
The FAA-sponsored Sense and Avoid Workshop for Unmanned Aircraft Systems (UAS) defines the concept of sense and avoid for remote pilots as "the capability of a UAS to remain well clear from and avoid collisions with other airborne traffic." Hence, a rigorous definition of well clear is fundamental to any separation assurance concept for the integration of UAS into civil airspace. This paper presents a family of well-clear boundary models based on the TCAS II Resolution Advisory logic. Analytical techniques are used to study the properties and relationships satisfied by the models. Some of these properties are numerically quantified using statistical methods.
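A well-clear boundary model of the kind studied above combines a distance threshold with a time-to-approach test. The predicate below is a simplified sketch in that spirit; the thresholds and the tau-style time test are illustrative placeholders, not the paper's actual TCAS II-derived definitions.

```python
import math

# Hedged sketch of a distance-and-time well-clear predicate.
# DMOD and TAU are invented illustrative thresholds, not standard values.
DMOD = 1.0   # minimum horizontal separation (nmi), assumed
TAU = 35.0   # time-to-threshold limit (s), assumed

def well_clear(sx, sy, vx, vy):
    """Intruder relative position s (nmi) and velocity v (nmi/s)."""
    r = math.hypot(sx, sy)
    if r <= DMOD:
        return False                      # already inside the distance bound
    rdot = (sx * vx + sy * vy) / r        # range rate
    if rdot >= 0:
        return True                       # diverging traffic is well clear
    # Closing: well clear only if reaching DMOD would take longer than TAU.
    return (r - DMOD) / (-rdot) > TAU

print(well_clear(5.0, 0.0, -0.01, 0.0))   # slow closure from far away
```

The formal models in the paper refine exactly these ingredients (range, range rate, and modified tau variants) and prove inclusion relationships between the resulting boundaries.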
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cottam, Joseph A.; Blaha, Leslie M.
Systems have biases. Their interfaces naturally guide a user toward specific patterns of action. For example, modern word-processors and spreadsheets are both capable of word wrapping, checking spelling, storing tables, and calculating formulas. You could write a paper in a spreadsheet or do simple business modeling in a word-processor. However, their interfaces naturally communicate which function they are designed for. Visual analytic interfaces also have biases. In this paper, we outline why simple Markov models are a plausible tool for investigating that bias and how they might be applied. We also discuss some anticipated difficulties in such modeling and touch briefly on what some Markov model extensions might provide.
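The Markov approach proposed above amounts to estimating a transition matrix from logged interface actions and inspecting where the chain concentrates. The sketch below is one plausible reading of that idea; the action names and the toy log are invented for illustration.

```python
import numpy as np

# Hedged sketch: fit a first-order Markov model to a log of interface
# actions, then estimate its stationary distribution to see which actions
# the interface biases users toward. The log below is a toy example.

log = ["open", "filter", "plot", "filter", "plot", "plot", "export",
       "open", "filter", "plot", "export"]

actions = sorted(set(log))
idx = {a: i for i, a in enumerate(actions)}
n = len(actions)

# Count observed transitions; Laplace smoothing keeps every row valid.
counts = np.ones((n, n))
for a, b in zip(log, log[1:]):
    counts[idx[a], idx[b]] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic matrix

# Stationary distribution via power iteration: pi = pi @ P at convergence.
pi = np.full(n, 1.0 / n)
for _ in range(200):
    pi = pi @ P
print(dict(zip(actions, pi.round(3))))
```

Comparing stationary distributions fitted from two interfaces over the same task is one concrete way such a model could expose differing biases.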
Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies
2011-10-01
...is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System... and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and... analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
Zheng, Qiuling; Chen, Hao
2016-06-01
Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analytical applications in bioanalysis, proteomics, and reaction kinetics. Due to the capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Shujiang; Kline, Keith L; Nair, S. Surendran
A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not yet exist. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.
High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model
NASA Astrophysics Data System (ADS)
Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng
The emergence of the Internet has changed high technology marketing channels thoroughly in the past decade, while e-commerce has already become one of the most efficient channels through which high technology firms may skip the intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high technology products or services through the Internet is not that easy. To overcome the above mentioned problems, a novel analytic framework based on the concept of high technology customers' competence set expansion by leveraging high technology service firms' capabilities and resources, as well as novel multiple criteria decision making (MCDM) techniques, will be proposed in order to define an appropriate e-business model. An empirical example study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques will be provided for verifying the effectiveness of this novel analytic framework. The analysis successfully assisted a Taiwanese IC design service firm to define an e-business model for maximizing its customer's SIP transactions. In the future, the novel MCDM framework can be applied successfully to novel business model definitions in the high technology industry.
The Distributed Geothermal Market Demand Model (dGeo): Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O
The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Burley, Casey L.; Conner, David A.
2005-01-01
The Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) is being developed under the Quiet Aircraft Technology Project within the NASA Vehicle Systems Program. The purpose of CARMA is to provide analysis tools for the design and evaluation of efficient low-noise rotorcraft, as well as to support the development of safe, low-noise flight operations. The baseline prediction system of CARMA is presented, and current capabilities are illustrated for a model rotor in a wind tunnel, a rotorcraft in flight, and a notional coaxial rotor configuration; however, a complete validation of the CARMA system capabilities with respect to a variety of measured databases is beyond the scope of this work. For the model rotor illustration, predicted rotor airloads and acoustics for a BO-105 model rotor are compared to test data from HART-II. For the flight illustration, acoustic data from an MD-520N helicopter flight test, which was conducted at Eglin Air Force Base in September 2003, are compared with CARMA full vehicle flight predictions. Predicted acoustic metrics at three microphone locations are compared for limited level flight and descent conditions. Initial acoustic predictions using CARMA for a notional coaxial rotor system are made. The effect of increasing the vertical separation between the rotors on the predicted airloads and acoustic results is shown for both aerodynamically non-interacting and aerodynamically interacting rotors. The sensitivity of including the aerodynamic interaction effects of each rotor on the other, especially when the rotors are in close proximity to one another, is initially examined. The predicted coaxial rotor noise is compared to that of a conventional single rotor system of equal thrust, where both are of reasonable size for an unmanned aerial vehicle (UAV).
Dynamical properties of a prey-predator-scavenger model with quadratic harvesting
NASA Astrophysics Data System (ADS)
Gupta, R. P.; Chandra, Peeyush
2017-08-01
In this paper, we propose and analyze an extended prey-predator-scavenger model in the presence of harvesting, to study the effects of harvesting the predator as well as the scavenger. The positivity, boundedness, and persistence conditions are derived for the proposed model. The model undergoes a Hopf bifurcation around the coexisting equilibrium point. It is also observed that the model is capable of exhibiting a period-doubling route to chaos. It is pointed out that a suitable amount of predator harvesting can control the chaotic dynamics and make the system stable. An extensive numerical simulation is performed to validate the analytic findings. The associated control problem for the proposed model has been analyzed for optimal harvesting.
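A numerical simulation of the kind reported can be sketched with a generic three-species system integrated by SciPy. The functional forms, coefficients, and quadratic harvesting terms (h2*y**2, h3*z**2) below are hypothetical stand-ins, since the paper's exact equations are not given in the abstract.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, u, r=1.0, a=0.5, b=0.2, c=0.3, d=0.4, e=0.1, h2=0.05, h3=0.05):
    """Illustrative prey (x), predator (y), scavenger (z) dynamics with
    quadratic harvesting of the predator and scavenger (h2*y^2, h3*z^2).
    All coefficients are hypothetical; only the structure (Lotka-Volterra
    type interactions plus quadratic harvesting) mirrors the abstract."""
    x, y, z = u
    dx = r * x * (1.0 - x) - a * x * y            # logistic prey, predation loss
    dy = b * x * y - c * y - h2 * y**2            # predator growth, death, harvesting
    dz = d * y * z + e * x * z - z - h3 * z**2    # scavenger feeds on both, harvested
    return [dx, dy, dz]

sol = solve_ivp(rhs, (0.0, 200.0), [0.5, 0.3, 0.2], dense_output=True, rtol=1e-8)
```

Sweeping a harvesting coefficient such as h2 over a range and inspecting the long-time behavior of `sol` is the usual way to locate the period-doubling route to chaos numerically.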
K-ε Turbulence Model Parameter Estimates Using an Approximate Self-similar Jet-in-Crossflow Solution
DeChant, Lawrence; Ray, Jaideep; Lefantzi, Sophia; ...
2017-06-09
The k-ε turbulence model has been described as perhaps “the most widely used complete turbulence model.” This family of heuristic Reynolds-Averaged Navier-Stokes (RANS) turbulence closures is supported by a suite of model parameters that have been estimated by demanding the satisfaction of well-established canonical flows such as homogeneous shear flow, log-law behavior, etc. While this procedure does yield a set of so-called nominal parameters, it is abundantly clear that they do not provide a universally satisfactory turbulence model capable of simulating complex flows. Recent work on the Bayesian calibration of the k-ε model using jet-in-crossflow wind tunnel data has yielded parameter estimates that are far more predictive than nominal parameter values. In this paper, we develop a self-similar asymptotic solution for axisymmetric jet-in-crossflow interactions and derive analytical estimates of the parameters that were inferred using Bayesian calibration. The self-similar method utilizes a near-field approach to estimate the turbulence model parameters while retaining the classical far-field scaling to model flow field quantities. Our parameter values are seen to be far more predictive than the nominal values, as checked using RANS simulations and experimental measurements. They are also closer to the Bayesian estimates than the nominal parameters. A traditional simplified jet trajectory model is explicitly related to the turbulence model parameters and is shown to yield good agreement with measurement when utilizing the analytically derived turbulence model coefficients. Finally, the close agreement between the turbulence model coefficients obtained via Bayesian calibration and the analytically estimated coefficients derived in this paper is consistent with the contention that the Bayesian calibration approach is firmly rooted in the underlying physical description.
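For reference, traditional simplified jet-in-crossflow trajectory models of the kind mentioned above are commonly written as a power law in the jet diameter d and the jet-to-crossflow momentum-flux ratio r. The sketch below uses typical literature values for the empirical constants (A = 2.0, m = 1/3); these are illustrative, not the paper's analytically derived coefficients.

```python
def jet_trajectory(x, d, r, A=2.0, m=1.0 / 3.0):
    """Classic power-law jet-in-crossflow trajectory correlation:
        y / (r*d) = A * (x / (r*d))**m
    where x is downstream distance, y is jet penetration height, d is the
    jet diameter, and r is the jet-to-crossflow momentum-flux ratio.
    A and m are empirical constants; the values here are generic."""
    rd = r * d
    return rd * A * (x / rd) ** m
```

In the paper's framework, constants like A and m are tied back to the k-ε closure coefficients; here they are simply fixed inputs.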
NASA Technical Reports Server (NTRS)
Majumdar, Alok
2013-01-01
The purpose of this paper is to present the analytical capability developed to model no-vent chill and fill of a cryogenic tank in support of the CPST (Cryogenic Propellant Storage and Transfer) program. The Generalized Fluid System Simulation Program (GFSSP) was adapted to simulate the charge-hold-vent method of tank chilldown. GFSSP models were developed to simulate the chilldown of an LH2 tank at the K-Site Test Facility, and numerical predictions were compared with test data. The paper also describes the modeling technique for simulating the chilldown of a cryogenic transfer line; GFSSP models were developed to simulate the chilldown of a long transfer line, and the predictions were compared with test data.
Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals
NASA Astrophysics Data System (ADS)
Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.
2018-02-01
Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is one of the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process: a user places a sample on a sensor and uses an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera, and a data processor, making it ideal for untrained personnel without expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, transducing the chemical reaction into an optical signal that can be imaged. Enhancement of this fluorescence leads to an enhancement in the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous with periodic nanopores, making diatoms biological photonic crystals. Additionally, the porous nature of the frustule provides a large surface area with many analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal shows a 6× enhancement compared to sensors fabricated without diatoms.
NASA Astrophysics Data System (ADS)
Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris
2015-04-01
Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axes, multi-factor stacked bar charts, and interactive semi-automated filtering for input and output data, together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis, and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. For the first example our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. For the second example our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen.
We determine that this reaction is thermodynamically favorable under a broad range of conditions, including low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
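A minimal sketch of the automatic screening step that guides users toward significant input-output relationships might rank parameters by rank correlation across the batch of simulation runs. This is an assumption for illustration; the tool's actual sensitivity-analysis method is not specified in the abstract, and the parameter names below are hypothetical.

```python
import numpy as np
from scipy import stats

def screen_sensitivities(X, y, names):
    """Rank input parameters by the absolute Spearman rank correlation
    between each input column of X (runs x parameters) and the output
    variable y. Returns (name, score) pairs, strongest first. Spearman
    correlation captures monotone, not necessarily linear, relationships."""
    scores = [abs(stats.spearmanr(X[:, j], y)[0]) for j in range(X.shape[1])]
    order = np.argsort(scores)[::-1]
    return [(names[j], scores[j]) for j in order]
```

In a Visual Analytics setting, a ranking like this would decide which parameter axes to surface first in the interactive display.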
Packet Radio Temporary Note Index.
1984-05-07
Dynamic Control in Carrier Sense Multiple Access
180 Cross-Radio Debugger (Beeler, 06/76, BBN)
179 New Capabilities of the PR Simulation Program (Gitman, 05/76, NAC)
178 An Approximate Analytical Model for Initialization of Single Hop PRNETs (Gitman, 05/76, NAC)
177 SPP Definition (Beeler, 04/76, BBN)
176 PR Protocol ... Labeling Process (Revision 7) (Sussman, 03/79, BBN)
173 Interfacing Terminals to the PRN (Fralick, 04/76, BBN)
172 Connectivity Issues in Mobile PR (Gitman, 03/76)
Theoretical and experimental investigations of thermal conditions of household biogas plant
NASA Astrophysics Data System (ADS)
Zhelykh, Vasil; Furdas, Yura; Dzeryn, Oleksandra
2016-06-01
The construction of a domestic continuous bioreactor is proposed. The thermal modes of a household biogas plant were modeled using graph theory. A correction factor was determined, taking into account the influence of the relevant variables on its value. A system of balance equations for the desired thermal conditions in the bioreactor is presented. Graphical and analytical tools are provided that can be applied in the design of domestic biogas plants for organic waste recycling.
NASA Astrophysics Data System (ADS)
Bencherif, H.; Djeffal, F.; Ferhati, H.
2016-09-01
This paper presents a hybrid approach based on an analytical and metaheuristic investigation to study the impact of interdigitated electrode engineering on both the speed and the optical performance of an Interdigitated Metal-Semiconductor-Metal Ultraviolet Photodetector (IMSM-UV-PD). In this context, analytical models of the speed and optical performance have been developed and validated against experimental results, with good agreement. Moreover, the developed analytical models have been used as objective functions to determine the optimized design parameters, including the interdigit configuration effect, via a Multi-Objective Genetic Algorithm (MOGA). The ultimate goal of the proposed hybrid approach is to identify the design parameters that maximize electrical and optical device performance. The optimized IMSM-PD not only reveals superior performance in terms of photocurrent and response time, but also illustrates higher reliability against the optical losses due to active-area shadowing effects. The advantages offered by the proposed design methodology suggest the possibility of overcoming the most challenging problems with the communication speed and power requirements of the UV optical interconnect: high derived current and commutation speed in the UV receiver.
Dynamic Investigation of Static Divergence: Analysis and Testing
NASA Technical Reports Server (NTRS)
Heeg, Jennifer
2000-01-01
The phenomenon known as aeroelastic divergence is the focus of this work. The analyses and experiment presented here show that divergence can occur without a structural dynamic mode losing its oscillatory nature. Aeroelastic divergence occurs when the structural restorative capability or stiffness of a structure is overwhelmed by the static aerodynamic moment. This static aeroelastic coupling does not require the structural dynamic system behavior to cease, however. Aeroelastic changes in the dynamic mode behavior are governed not only by the stiffness, but by damping and inertial properties. The work presented here supports these fundamental assertions by examining a simple system: a typical section airfoil with only a rotational structural degree of freedom. Analytical results identified configurations that exhibit different types of dynamic mode behavior as the system encounters divergence. A wind tunnel model was designed and tested to examine divergence experimentally. The experimental results validate the analytical calculations and explicitly examine the divergence phenomenon where the dynamic mode persists. Three configurations of the wind tunnel model were tested. The experimental results agree very well with the analytical predictions of subcritical characteristics, divergence velocity, and behavior of the noncritical dynamic mode at divergence.
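The condition described above, the static aerodynamic moment overwhelming the structural restoring stiffness, gives the classic divergence dynamic pressure for a typical section with a single rotational spring. A one-line sketch follows; the symbols are the textbook ones and the numbers are generic, not values from the paper's wind tunnel model.

```python
def divergence_q(K_alpha, S, e, CL_alpha):
    """Static divergence dynamic pressure for a typical-section airfoil
    with rotational stiffness K_alpha: divergence occurs when the
    aerodynamic moment stiffness q * S * e * CL_alpha (S = reference area,
    e = elastic-axis offset from the aerodynamic center, CL_alpha = lift
    curve slope) equals the structural stiffness K_alpha."""
    return K_alpha / (S * e * CL_alpha)
```

At dynamic pressures approaching this value the net rotational stiffness of the aeroelastic system goes to zero, which is the static condition the paper re-examines dynamically.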
Simple Analytic Expressions for the Magnetic Field of a Circular Current Loop
NASA Technical Reports Server (NTRS)
Simpson, James C.; Lane, John E.; Immer, Christopher D.; Youngquist, Robert C.
2001-01-01
Analytic expressions for the magnetic induction (magnetic flux density, B) of a simple planar circular current loop have been published in Cartesian and cylindrical coordinates [1,2], and are also known implicitly in spherical coordinates [3]. In this paper, we present explicit analytic expressions for B and its spatial derivatives in Cartesian, cylindrical, and spherical coordinates for a filamentary current loop. These results were obtained with extensive use of Mathematica™ and are exact throughout all space outside of the conductor. The field expressions reduce to the well-known limiting cases and satisfy ∇ · B = 0 and ∇ × B = 0 outside the conductor. These results are general and applicable to any model using filamentary circular current loops. Solenoids of arbitrary size may be easily modeled by approximating the total magnetic induction as the sum of those for the individual loops. The inclusion of the spatial derivatives expands their utility to magnetohydrodynamics where the derivatives are required. The equations can be coded into any high-level programming language. It is necessary to numerically evaluate complete elliptic integrals of the first and second kind, but this capability is now available with most programming packages.
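A sketch of how such expressions are typically evaluated in cylindrical coordinates, using the complete elliptic integrals of the first and second kind (SciPy's `ellipk`/`ellipe` take the parameter m = k²). The formulas below are the standard filamentary-loop solution in SI units; the function name and defaults are illustrative.

```python
import numpy as np
from scipy.special import ellipk, ellipe

MU0 = 4e-7 * np.pi  # vacuum permeability, SI

def loop_field(rho, z, a=1.0, I=1.0):
    """(B_rho, B_z) of a filamentary circular loop of radius a carrying
    current I, centered at the origin in the z = 0 plane. Uses the
    standard elliptic-integral solution; on the axis (rho = 0) the
    well-known closed-form on-axis formula is used instead."""
    if rho == 0.0:
        return 0.0, MU0 * I * a**2 / (2.0 * (a**2 + z**2) ** 1.5)
    alpha2 = (a - rho) ** 2 + z**2
    beta = np.sqrt((a + rho) ** 2 + z**2)
    m = 4.0 * a * rho / beta**2                  # elliptic parameter m = k^2
    K, E = ellipk(m), ellipe(m)
    C = MU0 * I / (2.0 * np.pi)
    Bz = C / beta * (K + (a**2 - rho**2 - z**2) / alpha2 * E)
    Brho = C * z / (rho * beta) * (-K + (a**2 + rho**2 + z**2) / alpha2 * E)
    return Brho, Bz
```

Summing `loop_field` over a stack of loops models a solenoid, as the abstract suggests.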
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.
NASA Astrophysics Data System (ADS)
Trombetti, Tomaso
This thesis presents an experimental/analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by an MTS 407 controller, which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques.
The power spectral density of the system input and the cross power spectral density of the table input and output were estimated using Bartlett's spectral estimation method. The experimentally estimated table acceleration transfer functions obtained for different working conditions are correlated with their analytical counterparts. As a result of this comprehensive correlation study, a thorough understanding of the shaking table dynamics and its sensitivities to control and payload parameters is obtained. Moreover, the correlation study leads to a calibrated analytical model of the shaking table of high predictive ability. It is concluded that, in its present condition, the Rice shaking table is able to reproduce, with a high degree of accuracy, model earthquake acceleration time histories in the frequency bandwidth from 0 to 75 Hz. Furthermore, the exhaustive analysis performed indicates that the table transfer function is not significantly affected by the presence of a large (in terms of weight) payload with a fundamental frequency up to 20 Hz. Payloads having a higher fundamental frequency do significantly affect the shaking table performance and require a modification of the table control gain settings, which can be easily obtained using the predictive analytical model of the shaking table. The complete description of a structural dynamic experiment performed using the Rice shaking table facility is also reported herein. The objective of this experiment was twofold: (1) to verify the testing capability of the shaking table and (2) to experimentally validate a simplified theory developed by the author, which predicts the maximum rotational response developed by seismically isolated building structures characterized by non-coincident centers of mass and rigidity, when subjected to strong earthquake ground motions.
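The spectral identification step described above (input PSD and input-output cross PSD combined into a transfer function estimate, H(f) = S_xy(f) / S_xx(f)) can be sketched as follows. Segment length and sampling rate are placeholders, and non-overlapping averaged periodograms stand in for Bartlett's method.

```python
import numpy as np
from scipy import signal

def estimate_transfer_function(x, y, fs, nperseg=1024):
    """H1 estimate of the transfer function from input x to output y:
        H(f) = S_xy(f) / S_xx(f)
    with the auto- and cross-spectra estimated by averaging periodograms
    of non-overlapping segments (Bartlett's method corresponds to
    noverlap=0 with a rectangular-style segmentation)."""
    f, Sxx = signal.welch(x, fs=fs, nperseg=nperseg, noverlap=0)
    _, Sxy = signal.csd(x, y, fs=fs, nperseg=nperseg, noverlap=0)
    return f, Sxy / Sxx
```

Applying this to the commanded (target) and measured table accelerations yields the empirical transfer function that is then correlated with the analytical model.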
Modeling AWSoM CMEs with EEGGL: A New Approach for Space Weather Forecasting
NASA Astrophysics Data System (ADS)
Jin, M.; Manchester, W.; van der Holst, B.; Sokolov, I.; Toth, G.; Vourlidas, A.; de Koning, C. A.; Gombosi, T. I.
2015-12-01
The major source of destructive space weather is coronal mass ejections (CMEs). However, our understanding of CMEs and their propagation in the heliosphere is limited by insufficient observations. Therefore, the development of first-principles numerical models plays a vital role both in theoretical investigation and in providing space weather forecasts. Here, we present results of simulating CME propagation from the Sun to 1 AU by combining the analytical Gibson & Low (GL) flux rope model with the state-of-the-art solar wind model AWSoM. We also provide an approach for transferring this research model to a space weather forecasting tool by demonstrating how the free parameters of the GL flux rope can be prescribed based on remote observations via the new Eruptive Event Generator by Gibson-Low (EEGGL) toolkit. This capability allows us to predict the long-term evolution of the CME in interplanetary space. We perform proof-of-concept case studies to show the capability of the model to capture the physical processes that determine CME evolution while also reproducing many observed features both in the corona and at 1 AU. We discuss the potential and limitations of this model as a future space weather forecasting tool.
A simulation technique for predicting thickness of thermal sprayed coatings
NASA Technical Reports Server (NTRS)
Goedjen, John G.; Miller, Robert A.; Brindley, William J.; Leissler, George W.
1995-01-01
The complexity of many of the components coated today using the thermal spray process makes the trial-and-error approach traditionally followed in depositing a uniform coating inadequate, necessitating a more analytical approach to developing robotic trajectories. A two-dimensional finite-difference simulation model has been developed to predict the thickness of coatings deposited using the thermal spray process. The model couples robotic and component trajectories and thermal spraying parameters to predict coating thickness. Simulations and experimental verification were performed on a rotating disk to evaluate the predictive capabilities of the approach.
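A much-reduced, one-dimensional sketch of the idea behind such a simulation: treat the spray footprint as a Gaussian deposition profile and accumulate thickness as the gun trajectory sweeps the surface. The footprint shape, deposition rate, and trajectory here are hypothetical stand-ins, not the paper's two-dimensional finite-difference model.

```python
import numpy as np

def sweep_thickness(x, passes, rate=1.0, sigma=2.0, dt=0.1):
    """Accumulate coating thickness on a 1-D surface grid x from a spray
    gun whose deposition footprint is modeled as a Gaussian of width
    sigma. `passes` lists the gun center position at each time step of
    duration dt; `rate` is the peak deposition rate (thickness/time)."""
    t = np.zeros_like(np.asarray(x, dtype=float))
    for x0 in passes:
        t += rate * dt * np.exp(-((x - x0) ** 2) / (2.0 * sigma**2))
    return t
```

Coupling a model like this to the actual robot trajectory (instead of a fixed list of positions) is what turns it into a thickness-prediction tool.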
Simulating ground water-lake interactions: Approaches and insights
Hunt, R.J.; Haitjema, H.M.; Krohelski, J.T.; Feinstein, D.T.
2003-01-01
Approaches for modeling lake-ground water interactions have evolved significantly from early simulations that used fixed lake stages specified as constant head to sophisticated LAK packages for MODFLOW. Although model input can be complex, the LAK package capabilities and output are superior to methods that rely on a fixed lake stage and compare well to other simple methods where lake stage can be calculated. Regardless of the approach, guidelines presented here for model grid size, location of three-dimensional flow, and extent of vertical capture can facilitate the construction of appropriately detailed models that simulate important lake-ground water interactions without adding unnecessary complexity. In addition to MODFLOW approaches, lake simulation has been formulated in terms of analytic elements. The analytic element lake package showed acceptable agreement with a published LAK1 problem, even though there were differences in the total lake conductance and number of layers used in the two models. The grid size used in the original LAK1 problem, however, violated a grid size guideline presented in this paper. Grid sensitivity analyses demonstrated that an appreciable discrepancy in the distribution of stream and lake flux was related to the large grid size used in the original LAK1 problem. This artifact is expected regardless of the MODFLOW LAK package used. When the grid size was reduced, the finite-difference formulation approached the analytic element results. These insights and guidelines can help ensure that the proper lake simulation tool is being selected and applied.
NASA Technical Reports Server (NTRS)
Heeg, Jennifer
1991-01-01
The objective was to analytically and experimentally study the capabilities of adaptive material plate actuators for suppressing flutter. The validity of analytical modeling techniques for piezoelectric materials was also investigated. Piezoelectrics are materials characterized by their ability to produce voltage when subjected to a mechanical strain. The converse piezoelectric effect can be utilized to actuate a structure by applying a voltage. For this investigation, a two degree of freedom wind tunnel model was designed, analyzed, and tested. The model consisted of a rigid airfoil and a flexible mount system which permitted a translational and a rotational degree of freedom. It was designed such that flutter was encountered within the testing envelope of the wind tunnel. Actuators made of piezoelectric material were affixed to leaf springs of the mount system. Each degree of freedom was controlled by a separate leaf spring. Command signals applied to the piezoelectric actuators exerted control over the damping and stiffness properties. A mathematical aeroservoelastic model was constructed using finite element methods, laminated plate theory, and aeroelastic analysis tools. Plant characteristics were determined from this model and verified by open loop experimental tests. A flutter suppression control law was designed and implemented on a digital control computer. Closed loop flutter testing was conducted. The experimental results represent the first time that adaptive materials have been used to actively suppress flutter. They demonstrate that small, carefully placed actuating plates can be used effectively to control aeroelastic response.
Analytical optical scattering in clouds
NASA Technical Reports Server (NTRS)
Phanord, Dieudonne D.
1989-01-01
An analytical optical model for scattering of light due to lightning by clouds of different geometry is being developed. The self-consistent approach and the equivalent medium concept of Twersky were used to treat the case corresponding to outside illumination. Thus, the resulting multiple scattering problem is transformed, with knowledge of the bulk parameters, into scattering by a single obstacle in isolation. Based on the size parameter of a typical water droplet as compared to the incident wavelength, the problem for the single scatterer equivalent to the distribution of cloud particles can be solved by either Mie or Rayleigh scattering theory. The supercomputing code of Wiscombe can be used immediately to produce results that can be compared to the Monte Carlo computer simulation for outside incidence. A fairly reasonable inverse approach using the solution of the outside illumination case was proposed to model analytically the situation for point sources located inside the optically thick cloud. Its mathematical details are still being investigated. When finished, it will provide scientists an enhanced capability to study more realistic clouds. For testing purposes, the direct approach to the inside illumination of clouds by lightning is under consideration. An analytical solution for the cubic cloud will soon be obtained. For cylindrical or spherical clouds, preliminary results are needed for scattering by bounded obstacles above or below a penetrable surface interface.
Comparison of closed loop model with flight test results
NASA Technical Reports Server (NTRS)
George, F. L.
1981-01-01
An analytic technique capable of predicting the landing characteristics of proposed aircraft configurations in the early stages of design was developed. In this analysis, a linear pilot-aircraft closed loop model was evaluated using experimental data generated with the NT-33 variable stability in-flight simulator. The pilot dynamics are modeled as inner and outer servo loop closures around aircraft pitch attitude and altitude rate of change, respectively. The landing flare maneuver is of particular interest, as recent experience with military and other highly augmented vehicles shows this task to be relatively demanding, and potentially a critical design point. A unique feature of the pilot model is the incorporation of an internal model of the pilot's desired flight path for the flare maneuver.
Interfacing the Generalized Fluid System Simulation Program with the SINDA/G Thermal Program
NASA Technical Reports Server (NTRS)
Schallhorn, Paul; Palmiter, Christopher; Farmer, Jeffery; Lycans, Randall; Tiller, Bruce
2000-01-01
A general-purpose, one-dimensional fluid flow code has been interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady state and transient flow in a complex network. The flow code can model several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal force), and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development was conducted in two phases. This paper describes the first, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling. The second phase of the interface development (full transient conjugate heat transfer modeling) will be addressed in a later paper. Phase 1 development has been benchmarked against an analytical solution with excellent agreement. Additional test cases for each development phase demonstrate desired features of the interface. The results of the benchmark case, three additional test cases, and a practical application are presented herein.
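Conjugate heat transfer interfaces of this kind are often benchmarked against closed-form solutions. One classic example (illustrative only; the paper does not detail its benchmark case in the abstract) is the outlet temperature of steady flow through a pipe with a constant wall temperature:

```python
import math

def pipe_outlet_temp(T_in, T_wall, h, A, mdot, cp):
    """Analytical outlet temperature for steady flow through a pipe with
    constant wall temperature T_wall:
        T_out = T_wall + (T_in - T_wall) * exp(-h*A / (mdot*cp))
    where h is the convective coefficient, A the wetted area, mdot the
    mass flow rate, and cp the fluid specific heat (consistent units)."""
    return T_wall + (T_in - T_wall) * math.exp(-h * A / (mdot * cp))
```

A coupled fluid/thermal network solution of the same configuration should reproduce this exponential approach of the fluid temperature toward the wall temperature.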
Large Angle Transient Dynamics (LATDYN) user's manual
NASA Technical Reports Server (NTRS)
Abrahamson, A. Louis; Chang, Che-Wei; Powell, Michael G.; Wu, Shih-Chin; Bingel, Bradford D.; Theophilos, Paula M.
1991-01-01
A computer code for modeling the large angle transient dynamics (LATDYN) of structures was developed to investigate techniques for analyzing flexible deformation and control/structure interaction problems associated with large angular motions of spacecraft. This type of analysis is beyond the routine capability of conventional analytical tools without simplifying assumptions. In some instances, the motion may be sufficiently slow and the spacecraft (or component) sufficiently rigid to simplify analyses of dynamics and controls by making pseudo-static and/or rigid body assumptions. The LATDYN introduces a new approach to the problem by combining finite element structural analysis, multi-body dynamics, and control system analysis in a single tool. It includes a type of finite element that can deform and rotate through large angles at the same time, and which can be connected to other finite elements either rigidly or through mechanical joints. The LATDYN also provides symbolic capabilities for modeling control systems which are interfaced directly with the finite element structural model. Thus, the nonlinear equations representing the structural model are integrated along with the equations representing sensors, processing, and controls as a coupled system.
Analysis of gene network robustness based on saturated fixed point attractors
2014-01-01
The analysis of gene network robustness to noise and mutation is important for fundamental and practical reasons. Robustness refers to the stability of the equilibrium expression state of a gene network to variations of the initial expression state and network topology. Numerical simulation of these variations is commonly used for the assessment of robustness. Since there exists a great number of possible gene network topologies and initial states, even millions of simulations may still be too small to give reliable results. When the initial and equilibrium expression states are restricted to being saturated (i.e., their elements can only take values 1 or −1, corresponding to maximum activation and maximum repression of genes), an analytical gene network robustness assessment is possible. We present this analytical treatment based on determination of the saturated fixed point attractors for sigmoidal function models. The analysis can determine (a) for a given network, which and how many saturated equilibrium states exist and which and how many saturated initial states converge to each of these saturated equilibrium states and (b) for a given saturated equilibrium state or a given pair of saturated equilibrium and initial states, which and how many gene networks, referred to as viable, share this saturated equilibrium state or the pair of saturated equilibrium and initial states. We also show that the viable networks sharing a given saturated equilibrium state must follow certain patterns. These capabilities of the analytical treatment make it possible to properly define and accurately determine robustness to noise and mutation for gene networks. Previous network research conclusions drawn from performing millions of simulations follow directly from the results of our analytical treatment. Furthermore, the analytical results provide criteria for the identification of model validity and suggest modified models of gene network dynamics.
The yeast cell-cycle network is used as an illustration of the practical application of this analytical treatment. PMID:24650364
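The saturated update rule described above can be sketched directly: a saturated state s in {-1, +1}^n is an equilibrium when every gene's sign matches the sign of its net input. A minimal brute-force sketch follows; the 3-gene interaction matrix W below is hypothetical, not the yeast cell-cycle network from the paper, and exhaustive enumeration is exactly the baseline the analytical treatment is meant to replace.

```python
import itertools

def step(W, s):
    """One saturated update: each gene takes the sign of its net input."""
    n = len(W)
    return tuple(1 if sum(W[i][j] * s[j] for j in range(n)) > 0 else -1
                 for i in range(n))

def saturated_fixed_points(W):
    """Saturated states s in {-1, +1}^n that are equilibria of s -> sign(W s)."""
    n = len(W)
    return [s for s in itertools.product((-1, 1), repeat=n) if step(W, s) == s]

def basin_size(W, s_star, n_steps=100):
    """Count saturated initial states that converge to the equilibrium s_star."""
    n = len(W)
    count = 0
    for s0 in itertools.product((-1, 1), repeat=n):
        s = s0
        for _ in range(n_steps):
            s = step(W, s)
        if s == s_star:
            count += 1
    return count

# hypothetical 3-gene network: genes 0 and 1 activate each other,
# and both repress gene 2 (NOT the yeast network of the paper)
W = [[0, 1, 0],
     [1, 0, 0],
     [-1, -1, 0]]
```

For this toy W there are two saturated equilibria, (1, 1, -1) and (-1, -1, 1), each reached from 2 of the 8 saturated initial states; the remaining states fall into a period-2 cycle, illustrating why attractor structure and basin sizes are the natural robustness quantities.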
Analytical Chemistry Laboratory
NASA Technical Reports Server (NTRS)
Anderson, Mark
2013-01-01
The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.
INTEGRATING BIOANALYTICAL CAPABILITY IN AN ENVIRONMENTAL ANALYTICAL LABORATORY
The product is a book chapter serving as an introductory and summary chapter for the reference work "Immunoassays and Other Bioanalytical Techniques" to be published by CRC Press, Taylor and Francis Books. The chapter provides analytical chemists information on new techni...
Manipulability, force, and compliance analysis for planar continuum manipulators
NASA Technical Reports Server (NTRS)
Gravagne, Ian A.; Walker, Ian D.
2002-01-01
Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
Manipulability, force, and compliance analysis for planar continuum manipulators.
Gravagne, Ian A; Walker, Ian D
2002-06-01
Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
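The ellipsoid machinery referenced above can be illustrated with a conventional rigid-link example (continuum-arm Jacobians are more involved and are the subject of the paper): the manipulability ellipsoid's semi-axes are the singular values of the Jacobian J, and Yoshikawa's scalar measure is w = sqrt(det(J J^T)), which for a square J reduces to |det J|. A sketch for a standard 2-link planar arm:

```python
import math

def jacobian_2link(t1, t2, l1=1.0, l2=1.0):
    """Analytic Jacobian of a 2-link planar revolute arm (a rigid-link
    stand-in; continuum-robot Jacobians differ and are derived in the paper)."""
    s1, c1 = math.sin(t1), math.cos(t1)
    s12, c12 = math.sin(t1 + t2), math.cos(t1 + t2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def manipulability(J):
    """Yoshikawa measure w = sqrt(det(J J^T)); for square J this is |det J|.
    For this arm w = l1 * l2 * |sin(t2)|: zero when the arm is straight."""
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return abs(det)
```

The measure vanishes at the straight-arm singularity (t2 = 0), where the manipulability ellipsoid collapses to a line; the force ellipsoid, by duality, is the inverse image of the same Jacobian.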
Analytical coupled modeling of a magneto-based acoustic metamaterial harvester
NASA Astrophysics Data System (ADS)
Nguyen, H.; Zhu, R.; Chen, J. K.; Tracy, S. L.; Huang, G. L.
2018-05-01
Membrane-type acoustic metamaterials (MAMs) have demonstrated unusual capacity in controlling low-frequency sound transmission, reflection, and absorption. In this paper, an analytical vibro-acoustic-electromagnetic coupling model is developed to study the sound absorption, energy conversion, and energy harvesting behavior of a MAM harvester under a normal sound incidence. The MAM harvester is composed of a prestressed membrane with an attached rigid mass, a magnet coil, and a permanent magnet coin. To accurately capture the effects of the finite-dimension rigid mass on the membrane deformation under the variable magnet force, a theoretical model based on the deviating acoustic surface Green's function approach is developed by considering the acoustic near field and the distributed effective shear force along the interfacial boundary between the mass and the membrane. The accuracy and capability of the theoretical model are verified through comparison with the finite element method. In particular, the sound absorption, acoustic-electric energy conversion, and harvesting coefficient are quantitatively investigated by varying the weight and size of the attached mass and the prestress and thickness of the membrane. It is found that the highest achievable conversion and harvesting coefficients can reach up to 48% and 36%, respectively. The developed model can serve as an efficient tool for designing MAM harvesters.
Ultrasensitive SERS Flow Detector Using Hydrodynamic Focusing
Negri, Pierre; Jacobs, Kevin T.; Dada, Oluwatosin O.; Schultz, Zachary D.
2013-01-01
Label-free, chemical specific detection in flow is important for high throughput characterization of analytes in applications such as flow injection analysis, electrophoresis, and chromatography. We have developed a surface-enhanced Raman scattering (SERS) flow detector capable of ultrasensitive optical detection on the millisecond time scale. The device employs hydrodynamic focusing to improve SERS detection in a flow channel, where a sheath flow confines analyte molecules eluted from a fused silica capillary over a planar SERS-active substrate. Increased analyte interactions with the SERS substrate significantly improve detection sensitivity. The performance of this flow detector was investigated using a combination of finite element simulations, fluorescence imaging, and Raman experiments. Computational fluid dynamics based on finite element analysis was used to optimize the flow conditions. The modeling indicates that a number of factors, such as the capillary dimensions and the ratio of the sheath flow to analyte flow rates, are critical for obtaining optimal results. Sample confinement resulting from the flow dynamics was confirmed using wide-field fluorescence imaging of rhodamine 6G (R6G). Raman experiments at different sheath flow rates showed increased sensitivity compared with the modeling predictions, suggesting increased adsorption. Using 50-millisecond acquisitions, a sheath flow rate of 180 μL/min, and a sample flow rate of 5 μL/min, a linear dynamic range from nanomolar to micromolar concentrations of R6G with an LOD of 1 nM is observed. At low analyte concentrations, rapid analyte desorption is observed, enabling repeated and high-throughput SERS detection. The flow detector offers substantial advantages over conventional SERS-based assays, such as minimal sample volumes and high detection efficiency. PMID:24074461
Development of computer-based analytical tool for assessing physical protection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com
Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
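The network approach to adversary-path modelling can be sketched as a shortest-path search: transforming each segment's detection probability with -log(1 - p) turns "find the path most likely to evade detection" into a standard Dijkstra minimization. The facility graph, node names, and probabilities below are hypothetical illustrations, not data from the tool described in the abstract.

```python
import heapq
import math

def most_critical_path(edges, start, target):
    """Return (path, evasion probability) for the adversary path with the
    lowest cumulative detection probability.
    edges: {(u, v): p_detect along that segment}, all p_detect < 1."""
    graph = {}
    for (u, v), p in edges.items():
        graph.setdefault(u, []).append((v, p))
    # minimize sum of -log(1 - p_detect)  <=>  maximize prod of (1 - p_detect)
    best, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        cost, u = heapq.heappop(pq)
        if u == target:
            break
        if cost > best.get(u, math.inf):
            continue
        for v, p in graph.get(u, []):
            c = cost - math.log(1.0 - p)
            if c < best.get(v, math.inf):
                best[v], prev[v] = c, u
                heapq.heappush(pq, (c, v))
    path, node = [target], target
    while node != start:
        node = prev[node]
        path.append(node)
    path.reverse()
    return path, math.exp(-best[target])

# hypothetical facility: fence -> yard -> (alarmed door | window) -> vault
edges = {('fence', 'yard'): 0.3,
         ('yard', 'door'): 0.9,
         ('yard', 'window'): 0.5,
         ('door', 'vault'): 0.2,
         ('window', 'vault'): 0.2}
```

Here the window route is the most critical path (evasion probability 0.7 x 0.5 x 0.8 = 0.28), so upgrading window detection yields the largest gain in system effectiveness; a fuller treatment would also fold in delay times and response-force timing, as EASI-style models do.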
NASA Astrophysics Data System (ADS)
Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.
2017-12-01
Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).
Development of computer-based analytical tool for assessing physical protection system
NASA Astrophysics Data System (ADS)
Mardhi, Alim; Pengvanich, Phongphaeth
2016-01-01
Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
Block, Stephan
2018-05-22
The capability of lipid bilayers to exhibit fluid-phase behavior is a fascinating property, which enables, for example, membrane-associated components, such as lipids (domains) and transmembrane proteins, to diffuse within the membrane. These diffusion processes are of paramount importance for cells, as they are involved, for example, in cell signaling processes or the recycling of membrane components, but also for recently developed analytical approaches, which use differences in mobility for certain analytical purposes, such as in-membrane purification of membrane proteins or the analysis of multivalent interactions. Here, models describing the Brownian motion of membrane inclusions (lipids, peptides, proteins, and complexes thereof) in model bilayers (giant unilamellar vesicles, black lipid membranes, supported lipid bilayers) are summarized, and model predictions are compared with the available experimental data, thereby allowing the validity of the introduced models to be evaluated. It is shown that models describing diffusion in freestanding (Saffman-Delbrück and Hughes-Pailthorpe-White models) and supported bilayers (the Evans-Sackmann model) are well supported by experiments, though only a few experimental studies have been published so far for the latter case, calling for additional tests to reach the same level of experimental confirmation that is currently available for freestanding bilayers.
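The Saffman-Delbrück model mentioned above gives a closed form for the lateral diffusion coefficient of a cylindrical inclusion of radius R in a freestanding bilayer: D = kT / (4 pi mu_m h) * (ln(mu_m h / (mu_w R)) - gamma), valid for R much smaller than the Saffman-Delbrück length mu_m h / mu_w. A sketch with typical, assumed membrane parameters (the viscosity and thickness values are illustrative, not taken from the review):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def saffman_delbruck_D(radius_m, T=298.0,
                       mu_m=0.1,    # membrane surface viscosity factor, Pa*s (assumed)
                       h=4e-9,      # bilayer thickness, m (assumed)
                       mu_w=1e-3):  # surrounding water viscosity, Pa*s
    """Saffman-Delbrueck diffusion coefficient (m^2/s) of a cylindrical
    inclusion in a freestanding bilayer; valid for radius << mu_m*h/mu_w."""
    kB = 1.380649e-23                      # Boltzmann constant, J/K
    L_SD = mu_m * h / mu_w                 # Saffman-Delbrueck length
    return (kB * T / (4.0 * math.pi * mu_m * h)) * (
        math.log(L_SD / radius_m) - EULER_GAMMA)
```

With these parameters a lipid-sized inclusion (R about 0.5 nm) comes out at a few square micrometres per second, and the logarithmic radius dependence makes D only weakly size-sensitive, the hallmark of the model that the Hughes-Pailthorpe-White extension and the Evans-Sackmann supported-bilayer model then modify.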
SmartR: an open-source platform for interactive visual analytics for translational research data
Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard
2017-01-01
Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291
SmartR: an open-source platform for interactive visual analytics for translational research data.
Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard
2017-07-15
In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Effects of floor location on response of composite fuselage frames
NASA Technical Reports Server (NTRS)
Carden, Huey D.; Jones, Lisa E.; Fasanella, Edwin L.
1992-01-01
Experimental and analytical results are presented which show the effect of floor placement on the structural response and strength of circular fuselage frames constructed of graphite-epoxy composite material. The research was conducted to study the behavior of conventionally designed advanced composite aircraft components. To achieve desired new designs which incorporate improved energy absorption capabilities requires an understanding of how these conventional designs behave under crash type loadings. Data are presented on the static behavior of the composite structure through photographs of the frame specimen, experimental strain distributions, and through analytical data from composite structural models. An understanding of this behavior can aid the dynamist in predicting the crash behavior of these structures and may assist the designer in achieving improved designs for energy absorption and crash behavior of future structures.
Failure behavior of generic metallic and composite aircraft structural components under crash loads
NASA Technical Reports Server (NTRS)
Carden, Huey D.; Robinson, Martha P.
1990-01-01
Failure behavior results are presented from crash dynamics research using concepts of aircraft elements and substructure not necessarily designed or optimized for energy absorption or crash loading considerations. To achieve desired new designs incorporating improved energy absorption capabilities often requires an understanding of how more conventional designs behave under crash loadings. Experimental and analytical data are presented which indicate some general trends in the failure behavior of a class of composite structures including individual fuselage frames, skeleton subfloors with stringers and floor beams without skin covering, and subfloors with skin added to the frame-stringer arrangement. Although the behavior is complex, a strong similarity in the static/dynamic failure behavior among these structures is illustrated through photographs of the experimental results and through analytical data of generic composite structural models.
Flow reversal power limit for the HFBR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Lap Y.; Tichler, P.R.
The High Flux Beam Reactor (HFBR) undergoes a buoyancy-driven reversal of flow in the reactor core following certain postulated accidents. Uncertainties about the afterheat removal capability during the flow reversal have limited the reactor operating power to 30 MW. An experimental and analytical program to address these uncertainties is described in this report. The experiments were single-channel flow reversal tests under a range of conditions. The analytical phase involved simulations of the tests to benchmark the physical models and development of a criterion for dryout. The criterion is then used in simulations of reactor accidents to determine a safe operating power level. It is concluded that the limit on the HFBR operating power with respect to the issue of flow reversal is in excess of 60 MW.
Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania
2015-04-15
In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption-kinetics rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.
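The well-mixed limit the authors exploit can be sketched with the standard first-order surface-binding ODE, dB/dt = k_on * C * (B_max - B) - k_off * B, which applies once mixing removes the diffusion limitation so the bead surface sees the bulk analyte concentration directly. The rate constants and concentrations below are illustrative assumptions, not fitted values from the paper.

```python
def bead_binding(c_analyte, k_on=1e5, k_off=1e-3, b_max=1.0,
                 dt=0.1, t_end=2000.0):
    """Euler integration of well-mixed bead-surface capture:
        dB/dt = k_on * C * (B_max - B) - k_off * B
    c_analyte in mol/L, k_on in 1/(M*s), k_off in 1/s; B is fractional
    occupancy of binding sites. All parameter values are illustrative."""
    B = 0.0
    for _ in range(int(round(t_end / dt))):
        B += dt * (k_on * c_analyte * (b_max - B) - k_off * B)
    return B
```

At equilibrium B approaches B_max * C / (C + K_D) with K_D = k_off / k_on (here 10 nM), and the observed rate k_on * C + k_off grows with concentration, which is why a continuously sampled, well-mixed device can track rising and falling cytokine levels in near real time.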
EPA Region 6 Laboratory Method Specific Analytical Capabilities with Sample Concentration Range
EPA Region 6 Environmental Services Branch (ESB) Laboratory is capable of analyzing a wide range of samples with concentrations ranging from low part-per-trillion (ppt) to low percent levels, depending on the sample matrix.
Direct Laser Writing of Single-Material Sheets with Programmable Self-Rolling Capability
NASA Astrophysics Data System (ADS)
Bauhofer, Anton; Krödel, Sebastian; Bilal, Osama; Daraio, Chiara; Constantinescu, Andrei
Direct laser writing, a sub-class of two-photon polymerization, facilitates 3D-printing of single-material microstructures with inherent residual stresses. Here we show that controlled distribution of these stresses allows for fast and cost-effective fabrication of structures with programmable self-rolling capability. We investigate 2D sheets that evolve into versatile 3D structures. Precise control over the shape-morphing potential is achieved through variations in geometry and writing parameters. Effects of capillary action and gravity were shown to be relevant for very thin sheets (thickness < 1.5 μm) and have been quantified analytically and experimentally. In contrast, the deformations of sheets with larger thickness (> 1.5 μm) are dominated by residual stresses and adhesion forces. The presented structures develop local tensile stresses of up to 180 MPa, causing rolling curvatures of up to 25 × 10³ m⁻¹. A comprehensive analytical model that captures the relevant influence factors was developed based on laminate plate theory. The predicted curvature and directionality correspond well with the experimentally obtained data. Potential applications are found in drug encapsulation and particle traps for emulsions with differing surface energies. This work was supported by the Swiss National Science Foundation.
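As a rough consistency check on the stress-to-curvature step: the paper uses a full laminate plate model, but the simpler Stoney thin-film formula, kappa = 6 sigma_f t_f (1 - nu_s) / (E_s t_s^2), gives the order of magnitude. All parameter values below are assumptions for illustration, not data from the paper.

```python
def stoney_curvature(sigma_f, t_f, E_s, nu_s, t_s):
    """Stoney estimate of curvature (1/m) induced by a thin stressed surface
    layer (stress sigma_f in Pa, thickness t_f in m) on a substrate sheet
    (Young's modulus E_s in Pa, Poisson ratio nu_s, thickness t_s in m).
    A stand-in for the laminate plate model; all inputs are assumptions."""
    return 6.0 * sigma_f * t_f * (1.0 - nu_s) / (E_s * t_s ** 2)

# assumed values: 180 MPa residual stress confined to a 0.2 um surface skin
# of a 1 um polymer sheet with E ~ 3 GPa, nu ~ 0.35
kappa = stoney_curvature(180e6, 0.2e-6, 3e9, 0.35, 1e-6)
```

With these assumed inputs kappa is of order 10^4 m^-1, the same order as the reported rolling curvatures, and the t_s^-2 scaling shows why thin sheets roll so tightly while thicker sheets stay nearly flat.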
Estimated Benefits of Variable-Geometry Wing Camber Control for Transport Aircraft
NASA Technical Reports Server (NTRS)
Bolonkin, Alexander; Gilyard, Glenn B.
1999-01-01
Analytical benefits of variable-camber capability on subsonic transport aircraft are explored. Using aerodynamic performance models, including drag as a function of deflection angle for the control surfaces of interest, optimal performance benefits of variable camber are calculated. Results demonstrate that if all wing trailing-edge surfaces are available for optimization, drag can be significantly reduced at most points within the flight envelope. The optimization approach developed and illustrated for flight uses variable camber for optimization of aerodynamic efficiency (maximizing the lift-to-drag ratio). Most transport aircraft have significant latent capability in this area. Wing camber control that can affect performance optimization for transport aircraft includes symmetric use of ailerons and flaps. In this paper, drag characteristics for aileron and flap deflections are computed based on analytical and wind-tunnel data. All calculations are based on predictions for the subject aircraft, and the optimal surface deflection is obtained by simple interpolation for given conditions. An algorithm is also presented for computation of the optimal surface deflection for given conditions. Benefits of variable camber for a transport configuration using a simple trailing-edge control surface system can exceed 10 percent, especially for nonstandard flight conditions. In the cruise regime, the benefit is 1-3 percent.
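The "simple interpolation" step can be sketched as a quadratic fit of drag coefficient versus surface deflection, with the optimum at the parabola's vertex; near a drag bucket, CD(delta) is well approximated by a parabola. The sample points below are hypothetical, not data for any aircraft.

```python
def optimal_deflection(samples):
    """Fit CD(delta) = a*delta^2 + b*delta + c through three (delta, CD)
    samples and return the drag-minimizing deflection (parabola vertex).
    Sample points are illustrative, not flight or wind-tunnel data."""
    (x0, y0), (x1, y1), (x2, y2) = samples
    # Newton divided differences give the quadratic's leading coefficients
    a = ((y2 - y1) / (x2 - x1) - (y1 - y0) / (x1 - x0)) / (x2 - x0)
    b = (y1 - y0) / (x1 - x0) - a * (x0 + x1)
    if a <= 0:
        raise ValueError("samples do not bracket a drag minimum")
    return -b / (2.0 * a)
```

In practice such a fit would be evaluated per flight condition (Mach, weight, altitude) from stored drag tables, and the scheduled deflection commanded symmetrically to the ailerons and flaps.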
SPECIATION OF ARSENIC IN EXPOSURE ASSESSMENT MATRICES
The speciation of arsenic in water, food, and urine is an analytical capability which is an essential part of arsenic risk assessment. The cancer risk associated with arsenic has been the driving force in generating the analytical research in each of these matrices. This presentat...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
Flow over Canopies with Complex Morphologies
NASA Astrophysics Data System (ADS)
Rubol, S.; Ling, B.; Battiato, I.
2017-12-01
Quantifying and predicting how submerged vegetation affects the velocity profile of riverine systems is crucial in ecohydraulics to properly assess the water quality and ecological functions of rivers. The state of the art includes a plethora of models to study flow and transport over submerged canopies. However, most of them are validated against data collected in flume experiments with rigid cylinders. With the objective of investigating the capability of a simple analytical solution for vegetated flow to reproduce and predict the velocity profile of complex-shaped flexible canopies, we use the flow model proposed by Battiato and Rubol [WRR 2013] as the analytical approximation of the mean velocity profile above and within the canopy layer. This model has the advantages of (i) treating the canopy layer as a porous medium, whose geometrical properties are associated with a macroscopic effective permeability, and (ii) using input parameters that can be estimated by remote sensing techniques, such as the heights of the water level and the canopy. The analytical expressions for the average velocity profile and the discharge are tested against data collected across a wide range of canopy morphologies commonly encountered in riverine systems, such as grasses, woody vegetation, and bushes. Results indicate good agreement between the analytical expressions and the data for both simple and complex plant geometries. The rescaled low-submergence velocities in the canopy layer followed the same scaling found in arrays of rigid cylinders. In addition, for the dataset analyzed, the Darcy friction factor scaled with the inverse of the bulk Reynolds number multiplied by the ratio of the fluid to turbulent viscosity.
An efficient approach for treating composition-dependent diffusion within organic particles
O'Meara, Simon; Topping, David O.; Zaveri, Rahul A.; ...
2017-09-07
Mounting evidence demonstrates that under certain conditions the rate of component partitioning between the gas and particle phase in atmospheric organic aerosol is limited by particle-phase diffusion. To date, however, particle-phase diffusion has not been incorporated into regional atmospheric models. An analytical rather than numerical solution to diffusion through organic particulate matter is desirable because of its comparatively small computational expense in regional models. Current analytical models assume diffusion to be independent of composition and therefore use a constant diffusion coefficient. To realistically model diffusion, however, it should be composition-dependent (e.g. due to the partitioning of components that plasticise, vitrify or solidify). This study assesses the modelling capability of an analytical solution to diffusion corrected to account for composition dependence against a numerical solution. Results show reasonable agreement when the gas-phase saturation ratio of a partitioning component is constant and particle-phase diffusion limits partitioning rate (<10% discrepancy in estimated radius change). However, when the saturation ratio of the partitioning component varies, a generally applicable correction cannot be found, indicating that existing methodologies are incapable of deriving a general solution. Until such time as a general solution is found, caution should be given to sensitivity studies that assume constant diffusivity. Furthermore, the correction was implemented in the polydisperse, multi-process Model for Simulating Aerosol Interactions and Chemistry (MOSAIC) and is used to illustrate how the evolution of number size distribution may be accelerated by condensation of a plasticising component onto viscous organic particles.
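The constant-diffusivity analytical baseline that the correction is judged against can be sketched for a sphere with fixed surface concentration: Crank's series solution for fractional uptake, M(t)/M_inf = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / a^2) / n^2, compared with an explicit finite-difference solution of the same problem. Composition-dependent diffusivity, the paper's actual subject, is deliberately omitted; this is only the constant-D reference case.

```python
import math

def uptake_analytical(tau, n_terms=200):
    """Crank's series for fractional mass uptake of a sphere with constant
    diffusivity and fixed surface concentration; tau = D*t/a^2."""
    s = sum(math.exp(-(n * math.pi) ** 2 * tau) / n ** 2
            for n in range(1, n_terms + 1))
    return 1.0 - (6.0 / math.pi ** 2) * s

def uptake_numerical(tau_end, nr=50, dtau=1e-4):
    """Explicit finite differences on u = r*c, which maps radial diffusion
    in a sphere onto the 1-D heat equation du/dtau = d2u/dr2 with
    u(0) = 0 and u(1) = 1 (fixed surface concentration)."""
    dr = 1.0 / nr
    u = [0.0] * (nr + 1)
    u[nr] = 1.0
    for _ in range(int(round(tau_end / dtau))):
        new = u[:]
        for i in range(1, nr):
            new[i] = u[i] + dtau * (u[i + 1] - 2.0 * u[i] + u[i - 1]) / dr ** 2
        u = new
    # M/Minf = 3 * integral c r^2 dr = 3 * integral u r dr  (trapezoid rule)
    return 3.0 * sum(0.5 * dr * (u[i] * i * dr + u[i + 1] * (i + 1) * dr)
                     for i in range(nr))
```

The two agree to within a couple of percent at moderate dimensionless times, which is why the analytical route is so attractive for regional models; the paper's question is whether a correction can preserve that cheapness once D varies with composition.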
An efficient approach for treating composition-dependent diffusion within organic particles
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Meara, Simon; Topping, David O.; Zaveri, Rahul A.
Mounting evidence demonstrates that under certain conditions the rate of component partitioning between the gas and particle phase in atmospheric organic aerosol is limited by particle-phase diffusion. To date, however, particle-phase diffusion has not been incorporated into regional atmospheric models. An analytical rather than numerical solution to diffusion through organic particulate matter is desirable because of its comparatively small computational expense in regional models. Current analytical models assume diffusion to be independent of composition and therefore use a constant diffusion coefficient. To realistically model diffusion, however, it should be composition-dependent (e.g. due to the partitioning of components that plasticise, vitrify or solidify). This study assesses the modelling capability of an analytical solution to diffusion corrected to account for composition dependence against a numerical solution. Results show reasonable agreement when the gas-phase saturation ratio of a partitioning component is constant and particle-phase diffusion limits partitioning rate (<10% discrepancy in estimated radius change). However, when the saturation ratio of the partitioning component varies, a generally applicable correction cannot be found, indicating that existing methodologies are incapable of deriving a general solution. Until such time as a general solution is found, caution should be given to sensitivity studies that assume constant diffusivity. Furthermore, the correction was implemented in the polydisperse, multi-process Model for Simulating Aerosol Interactions and Chemistry (MOSAIC) and is used to illustrate how the evolution of number size distribution may be accelerated by condensation of a plasticising component onto viscous organic particles.
NASA Astrophysics Data System (ADS)
Hudspeth, W. B.; Sanchez-Silva, R.; Cavner, J. A.
2010-12-01
New Mexico's Environmental Public Health Tracking System (EPHTS), funded by the Centers for Disease Control (CDC) Environmental Public Health Tracking Network (EPHTN), aims to improve health awareness and services by linking health effects data with levels and frequency of environmental exposure. As a public health decision-support system, EPHTS includes: state-of-the-art statistical analysis tools; geospatial visualization tools; data discovery, extraction, and delivery tools; and environmental/public health linkage information. As part of its mandate, EPHTS issues public health advisories and forecasts of environmental conditions that have consequences for human health. Through a NASA-funded partnership between the University of New Mexico and the University of Arizona, NASA Earth Science results are fused into two existing models (the Dust Regional Atmospheric Model (DREAM) and the Community Multiscale Air Quality (CMAQ) model) in order to improve forecasts of atmospheric dust, ozone, and aerosols. The results and products derived from the outputs of these models are made available to an Open Source mapping component of the New Mexico EPHTS. In particular, these products are integrated into a Django content management system using GeoDjango, GeoAlchemy, and other OGC-compliant geospatial libraries written in the Python and C++ programming languages. The resultant mapping system provides indicator-based thematic mapping, data delivery, and analytical capabilities. DREAM and CMAQ outputs can be inspected, via REST calls, through temporal and spatial subsetting of the atmospheric concentration data across analytical units employed by the public health community. This paper describes details of the architecture and the integration of NASA Earth Science into the EPHTS decision-support system.
A Simulated Geochemical Rover Mission to the Taurus-Littrow Valley of the Moon
NASA Technical Reports Server (NTRS)
Korotev, Randy L.; Haskin, Larry A.; Jolliff, Bradley L.
1995-01-01
We test the effectiveness of using an alpha-backscatter, alpha-proton, X-ray spectrometer on a remotely operated rover to analyze soils and provide geologically useful information about the Moon during a simulated mission to a hypothetical site resembling the Apollo 17 landing site. On the mission, 100 soil samples are "analyzed" for major elements at moderate analytical precision (e.g., typical relative sample standard deviation from counting statistics: Si[11%], Al[18%], Fe[6%], Mg[20%], Ca[5%]). Simulated compositions of soils are generated by combining compositions of components representing the major lithologies occurring at the site in known proportions. Simulated analyses are generated by degrading the simulated compositions according to the expected analytical precision of the analyzer. Compositions obtained from the simulated analyses are modeled by least squares mass balance as mixtures of the components, and the relative proportions of those components as predicted by the model are compared with the actual proportions used to generate the simulated composition. Boundary conditions of the modeling exercise are that all important lithologic components of the regolith are known and are represented by model components, and that the compositions of these components are well known. The effect of having the capability of determining one incompatible element at moderate precision (25%) is compared with the effect of the lack of this capability. We discuss likely limitations and ambiguities that would be encountered, but conclude that much of our knowledge about the Apollo 17 site (based on the returned samples) regarding the distribution and relative abundances of lithologies in the regolith could be obtained. This success requires, however, that at least one incompatible element be determined.
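The least squares mass balance step described above can be sketched as a weighted linear unmixing. The end-member compositions, mixing proportions, and precision values below are hypothetical, chosen only to mirror the quoted per-element precisions:

```python
import numpy as np

# Hypothetical end-member compositions (wt% Si, Al, Fe, Mg, Ca) for three
# illustrative lithologic components; rows are elements, columns components.
A = np.array([
    [21.0, 19.5, 23.0],   # Si
    [ 5.0, 14.0,  7.0],   # Al
    [15.0,  4.0,  8.0],   # Fe
    [ 6.0,  3.0,  9.0],   # Mg
    [ 8.0, 11.0,  9.0],   # Ca
])

true_p = np.array([0.5, 0.3, 0.2])            # actual mixing proportions
soil = A @ true_p                             # "true" soil composition

# degrade the composition using the quoted 1-sigma relative precisions
rng = np.random.default_rng(0)
rel_sigma = np.array([0.11, 0.18, 0.06, 0.20, 0.05])
measured = soil * (1.0 + rel_sigma * rng.standard_normal(5))

# weighted least squares: scale each element row by 1/sigma, then solve
w = 1.0 / (rel_sigma * soil)
p, *_ = np.linalg.lstsq(A * w[:, None], measured * w, rcond=None)
p = np.clip(p, 0.0, None)                     # enforce physical proportions
p /= p.sum()                                  # renormalise to unit total
```

Comparing p with true_p shows how analytical precision limits the recoverable component proportions.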
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
NASA Astrophysics Data System (ADS)
Hou, X. Y.; Koh, C. G.; Kuang, K. S. C.; Lee, W. H.
2017-07-01
This paper investigates the capability of a novel piezoelectric sensor for low-frequency and low-amplitude vibration measurement. The proposed design effectively amplifies the input acceleration via two amplifying mechanisms and thus eliminates the need for the external charge amplifier or conditioning amplifier typically employed in measurement systems. The sensor is also self-powered, i.e. no external power unit is required. Consequently, wiring and electrical insulation for on-site measurement are considerably simpler. In addition, the design also greatly reduces the interference from rotational motion which often accompanies the translational acceleration to be measured. An analytical model is developed based on a set of piezoelectric constitutive equations and beam theory. A closed-form expression is derived to correlate sensor geometry and material properties with its dynamic performance. Experimental calibration is then carried out to validate the analytical model. After calibration, experiments are carried out to check the feasibility of the new sensor in structural vibration detection. From experimental results, it is concluded that the proposed sensor is suitable for measuring low-frequency and low-amplitude vibrations.
Modeling and Evaluation of Miles-in-Trail Restrictions in the National Air Space
NASA Technical Reports Server (NTRS)
Grabbe, Shon; Sridhar, Banavar
2003-01-01
Miles-in-trail restrictions impact flights in the national air space on a daily basis and these restrictions routinely propagate between adjacent Air Route Traffic Control Centers. Since overly restrictive or ineffective miles-in-trail restrictions can reduce the overall efficiency of the national air space, decision support capabilities that model miles-in-trail restrictions should prove to be very beneficial. This paper presents both an analytical formulation and a linear programming approach for modeling the effects of miles-in-trail restrictions. A methodology for monitoring the conformance of an existing miles-in-trail restriction is also presented. These capabilities have been implemented in the Future ATM Concepts Evaluation Tool for testing purposes. To allow alternative restrictions to be evaluated in post-operations, a new mode of operation, which is referred to as the hybrid-playback mode, has been implemented in the simulation environment. To demonstrate the capabilities of these new algorithms, the miles-in-trail restrictions, which were in effect on June 27, 2002 in the New York Terminal Radar Approach Control, are examined. Results from the miles-in-trail conformance monitoring functionality are presented for the ELIOT, PARKE and WHITE departure fixes. In addition, the miles-in-trail algorithms are used to assess the impact of alternative restrictions at the PARKE departure fix.
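As a toy illustration of the conformance-monitoring idea (not the FACET algorithm itself), a miles-in-trail check reduces to comparing along-stream gaps between successive aircraft with the restriction; the function name and data below are invented for illustration:

```python
def mit_conformance(positions_nm, required_mit_nm):
    """Given along-stream distances (nautical miles) of aircraft from a fix,
    compute leader-trailer gaps and flag the pairs violating the
    miles-in-trail restriction. Names and units are illustrative."""
    ordered = sorted(positions_nm, reverse=True)          # leader first
    gaps = [lead - trail for lead, trail in zip(ordered, ordered[1:])]
    violations = [i for i, gap in enumerate(gaps) if gap < required_mit_nm]
    return gaps, violations
```

For example, aircraft at 50, 30, 22, and 5 nm from a fix under a 10-MIT restriction yield one violating pair (the 8 nm gap).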
NASA Technical Reports Server (NTRS)
Tilley, Roger; Dowla, Farid; Nekoogar, Faranak; Sadjadpour, Hamid
2012-01-01
Conventional use of Ground Penetrating Radar (GPR) is hampered by variations in background environmental conditions, such as water content in soil, resulting in poor repeatability of results over long periods of time when the radar pulse characteristics are kept the same. Target object types include voids, tunnels, unexploded ordnance, etc. The long-term objective of this work is to develop methods that would extend the use of GPR under various environmental and soil conditions, provided an optimal set of radar parameters (such as frequency, bandwidth, and sensor configuration) is adaptively employed based on the ground conditions. Towards that objective, developing Finite Difference Time Domain (FDTD) GPR models, verified by experimental results, would allow us to develop analytical and experimental techniques to control radar parameters to obtain consistent GPR images with changing ground conditions. Reported here is an attempt at developing 2-D and 3-D FDTD models of buried targets verified by two different radar systems capable of operating over different soil conditions. Experimental radar data employed were from a custom-designed high-frequency (200 MHz) multi-static sensor platform capable of producing 3-D images, and a longer-wavelength (25 MHz) COTS radar (Pulse EKKO 100) capable of producing 2-D images. Our results indicate that different types of radar can produce consistent images.
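A minimal 1-D FDTD update conveys the idea behind such models: a pulse propagating through a soil-like permittivity profile. Real GPR models are 2-D/3-D with absorbing boundaries and antenna models, so this is only a schematic sketch with invented parameters:

```python
import numpy as np

def fdtd_1d(eps_r, n_steps, src_pos=10):
    """Minimal 1-D FDTD (Yee) update: a Gaussian pulse is launched into a
    medium whose relative-permittivity profile eps_r stands in for layered
    soil. Normalised units, Courant number 0.5, simple PEC ends."""
    nz = len(eps_r)
    ez = np.zeros(nz)          # electric field at integer grid points
    hy = np.zeros(nz - 1)      # magnetic field at half grid points
    for n in range(n_steps):
        hy += 0.5 * (ez[1:] - ez[:-1])                        # H update
        ez[1:-1] += (0.5 / eps_r[1:-1]) * (hy[1:] - hy[:-1])  # E update
        ez[src_pos] += np.exp(-0.5 * ((n - 30) / 8.0) ** 2)   # soft Gaussian source
    return ez
```

Changing eps_r (e.g. a step to a wetter, higher-permittivity layer) changes the reflected signal, which is the sensitivity to ground conditions discussed above.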
Ion Beam Analysis of Diffusion in Diamondlike Carbon Films
NASA Astrophysics Data System (ADS)
Chaffee, Kevin Paul
The Van de Graaff accelerator facility at Case Western Reserve University was developed into an analytical research center capable of performing Rutherford Backscattering Spectrometry, Elastic Recoil Detection Analysis for hydrogen profiling, Proton Enhanced Scattering, and 4He resonant scattering for 16O profiling. These techniques were applied to the study of Au, Na+, Cs+, and H2O diffusion in a-C:H films. The results are consistent with the fully constrained network model of the microstructure as described by Angus and Jansen.
ERIC Educational Resources Information Center
Bock, H. Darrell
The hardware and software system used to create the National Opinion Research Center/Center for Research on Evaluation, Standards, and Student Testing (NORC/CRESST) item databases and test booklets for the 12th-grade science assessment are described. A general description of the capabilities of the system is given, with some specific information…
Modeling the Fault Tolerant Capability of a Flight Control System: An Exercise in SCR Specification
NASA Technical Reports Server (NTRS)
Alexander, Chris; Cortellessa, Vittorio; DelGobbo, Diego; Mili, Ali; Napolitano, Marcello
2000-01-01
In life-critical and mission-critical applications, it is important to make provisions for a wide range of contingencies, by providing means for fault tolerance. In this paper, we discuss the specification of a flight control system that is fault tolerant with respect to sensor faults. Redundancy is provided by analytical relations that hold between sensor readings; depending on the conditions, this redundancy can be used to detect, identify and accommodate sensor faults.
2003-09-18
NASA Dryden's Automated Aerial Refueling (AAR) project evaluated the capability of an F/A-18A aircraft as an in-flight refueling tanker with the objective of developing analytical models for an automated aerial refueling system for unmanned air vehicles. The F/A-18 "tanker" aircraft (No. 847) underwent flight test envelope expansion with an aerodynamic pod containing air-refueling equipment carried beneath the fuselage. The second aircraft flew as the receiver aircraft during the study to assess the free-stream hose and drogue dynamics on the F/A-18A.
Heat storage capability of a rolling cylinder using Glauber's salt
NASA Technical Reports Server (NTRS)
Herrick, C. S.; Zarnoch, K. P.
1980-01-01
The rolling cylinder phase change heat storage concept was developed to the point where a prototype design was completed and a cost analysis prepared. A series of experimental and analytical tasks was defined to establish the thermal, mechanical, and materials behavior of rolling cylinder devices. These tasks include: analyses of internal and external heat transfer; performance and lifetime testing of the phase change materials; corrosion evaluation; development of a mathematical model; and design of a prototype and associated test equipment.
Modeling of ion acceleration through drift and diffusion at interplanetary shocks
NASA Technical Reports Server (NTRS)
Decker, R. B.; Vlahos, L.
1986-01-01
A test particle simulation designed to model ion acceleration through drift and diffusion at interplanetary shocks is described. The technique consists of integrating along exact particle orbits in a system where the angle between the shock normal and mean upstream magnetic field, the level of magnetic fluctuations, and the energy of injected particles can assume a range of values. The technique makes it possible to study time-dependent shock acceleration under conditions not amenable to analytical techniques. To illustrate the capability of the numerical model, proton acceleration was considered under conditions appropriate for interplanetary shocks at 1 AU, including large-amplitude transverse magnetic fluctuations derived from power spectra of both ambient and shock-associated MHD waves.
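Integrating along exact particle orbits of this kind is commonly done with the Boris scheme; the sketch below is a generic non-relativistic pusher under assumed field functions, not the authors' code:

```python
import numpy as np

def boris_push(x, v, e_field, b_field, q_over_m, dt, n_steps):
    """Boris particle pusher, the standard scheme for integrating
    charged-particle orbits in prescribed E and B fields, of the kind used
    in test-particle shock models. e_field(x) and b_field(x) return field
    vectors at position x; non-relativistic, normalised units."""
    for _ in range(n_steps):
        e, b = e_field(x), b_field(x)
        v_minus = v + 0.5 * q_over_m * e * dt          # first half electric kick
        t = 0.5 * q_over_m * b * dt                    # rotation vector
        s = 2.0 * t / (1.0 + t @ t)
        v_prime = v_minus + np.cross(v_minus, t)       # magnetic rotation...
        v_plus = v_minus + np.cross(v_prime, s)        # ...completed (norm-preserving)
        v = v_plus + 0.5 * q_over_m * e * dt           # second half electric kick
        x = x + v * dt
    return x, v
```

In a shock model the fields would encode the upstream/downstream jump plus superposed magnetic fluctuations; here the scheme itself is the point, since its exact energy conservation in a pure magnetic field is what makes long orbit integrations trustworthy.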
Aerodynamic stability analysis of NASA J85-13/planar pressure pulse generator installation
NASA Technical Reports Server (NTRS)
Chung, K.; Hosny, W. M.; Steenken, W. G.
1980-01-01
A digital computer simulation model for the J85-13/Planar Pressure Pulse Generator (P3G) test installation was developed by modifying an existing General Electric compression system model. This modification included the incorporation of a novel method for describing the unsteady blade lift force. This approach significantly enhanced the capability of the model to handle unsteady flows. In addition, the frequency response characteristics of the J85-13/P3G test installation were analyzed in support of selecting instrumentation locations to avoid standing wave nodes within the test apparatus and, thus, low signal levels. The feasibility of employing an explicit analytical expression for surge prediction was also studied.
System performance predictions for Space Station Freedom's electric power system
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Hojnicki, Jeffrey S.; Green, Robert D.; Follo, Jeffrey C.
1993-01-01
Space Station Freedom Electric Power System (EPS) capability to effectively deliver power to housekeeping and user loads continues to strongly influence Freedom's design and planned approaches for assembly and operations. The EPS design consists of silicon photovoltaic (PV) arrays, nickel-hydrogen batteries, and direct current power management and distribution hardware and cabling. To properly characterize the inherent EPS design capability, detailed system performance analyses must be performed for early stages as well as for the fully assembled station up to 15 years after beginning of life. Such analyses were repeatedly performed using the FORTRAN code SPACE (Station Power Analysis for Capability Evaluation) developed at the NASA Lewis Research Center over a 10-year period. SPACE combines orbital mechanics routines, station orientation/pointing routines, PV array and battery performance models, and a distribution system load-flow analysis to predict EPS performance. Time-dependent, performance degradation, low earth orbit environmental interactions, and EPS architecture build-up are incorporated in SPACE. Results from two typical SPACE analytical cases are presented: (1) an electric load driven case and (2) a maximum EPS capability case.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.
2014-01-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus
NASA Astrophysics Data System (ADS)
Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.
2017-12-01
Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will also support uncertainty visualization, the exploration of data provenance, and machine learning discoveries, rendering diverse types of geospatial data and facilitating interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current data reduction programs were assessed, and a search was made for a more comprehensive system with greater data-analytic capability. Results of the investigation are presented.
INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS
A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...
Anticipating Surprise: Analysis for Strategic Warning
2002-12-01
Contents include: Intentions versus Capabilities; Introduction to the Analytical Method; Specifics of the Analytical Method. From the text: Why is it that "no one"—a slight but not great exaggeration—believes in the indications method, despite its demonstrably good record in these
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...
Delin, G.N.; Almendinger, James Edward
1991-01-01
Hydrogeologic mapping and numerical modeling were used to delineate zones of contribution to wells, defined as all parts of a ground-water-flow system that could supply water to a well. The zones of contribution delineated by use of numerical modeling have similar orientation (parallel to regional flow directions) but significantly different areas than the zones of contribution delineated by use of hydrogeologic mapping. Differences in computed areas of recharge are attributed to the capability of the numerical model to more accurately represent (1) the three-dimensional flow system, (2) hydrologic boundaries like streams, (3) variable recharge, and (4) the influence of nearby pumped wells, compared to the analytical models.
Delin, G.N.; Almendinger, James Edward
1993-01-01
Hydrogeologic mapping and numerical modeling were used to delineate zones of contribution to wells, defined as all parts of a ground-water-flow system that could supply water to a well. The zones of contribution delineated by use of numerical modeling have similar orientation (parallel to regional flow directions) but significantly different areas than the zones of contribution delineated by use of hydrogeologic mapping. Differences in computed areas of recharge are attributed to the capability of the numerical model to more accurately represent (1) the three-dimensional flow system, (2) hydrologic boundaries such as streams, (3) variable recharge, and (4) the influence of nearby pumped wells, compared to the analytical models.
Methods for evaluating the predictive accuracy of structural dynamic models
NASA Technical Reports Server (NTRS)
Hasselman, T. K.; Chrostowski, Jon D.
1990-01-01
Uncertainty of frequency response using the fuzzy set method and on-orbit response prediction using laboratory test data to refine an analytical model are emphasized with respect to large space structures. Two aspects of the fuzzy set approach were investigated relative to its application to large structural dynamics problems: (1) minimizing the number of parameters involved in computing possible intervals; and (2) the treatment of extrema which may occur in the parameter space enclosed by all possible combinations of the important parameters of the model. Extensive printer graphics were added to the SSID code to help facilitate model verification, and an application of this code to the LaRC Ten Bay Truss is included in the appendix to illustrate this graphics capability.
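The enclosure of a response over all corner combinations of the parameter intervals, i.e. point (2) above, can be sketched with a simple vertex method. The function below is a generic illustration under stated assumptions, not the SSID implementation, and it inherits the very limitation the text notes: interior extrema are missed unless checked separately:

```python
import itertools

def vertex_intervals(response, param_ranges):
    """Vertex-method sketch for fuzzy/interval propagation: evaluate the
    response at every corner combination of the parameter intervals and
    return the enclosing [min, max]. Exact only when the response is
    monotone in each parameter; interior extrema need a separate check."""
    corners = itertools.product(*param_ranges)
    values = [response(*corner) for corner in corners]
    return min(values), max(values)
```

For a bilinear response such as a*b the corner evaluation is exact, which is why minimizing the number of interval parameters (point (1) above) matters: the corner count grows as 2^n.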
Models for randomly distributed nanoscopic domains on spherical vesicles
NASA Astrophysics Data System (ADS)
Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John
2018-06-01
The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
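The Monte Carlo treatment of interdomain correlations can be sketched by randomly placing non-overlapping caps on a sphere and histogramming centre separations. The function below is a generic sketch with invented parameter names and values, not the authors' method:

```python
import numpy as np

def cap_pair_correlations(n_domains, cap_angle, n_samples=500, bins=30, seed=2):
    """Randomly place n_domains non-overlapping circular domains (spherical
    caps of angular radius cap_angle) on a unit sphere, and histogram the
    angular separations of their centres over many configurations: a Monte
    Carlo proxy for the interdomain pair distribution."""
    rng = np.random.default_rng(seed)
    seps = []
    for _ in range(n_samples):
        centres = []
        while len(centres) < n_domains:        # rejection-sample cap centres
            v = rng.standard_normal(3)
            v /= np.linalg.norm(v)             # uniform point on the sphere
            if all(np.arccos(np.clip(v @ c, -1.0, 1.0)) > 2 * cap_angle
                   for c in centres):          # reject overlapping caps
                centres.append(v)
        for i in range(n_domains):
            for j in range(i + 1, n_domains):
                seps.append(np.arccos(np.clip(centres[i] @ centres[j], -1.0, 1.0)))
    hist, edges = np.histogram(seps, bins=bins, range=(0.0, np.pi), density=True)
    return hist, edges
```

The excluded-separation hole below twice the cap radius is the hard-disc correlation that the spherical Percus-Yevick analog captures analytically.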
Heterogeneous network epidemics: real-time growth, variance and extinction of infection.
Ball, Frank; House, Thomas
2017-09-01
Recent years have seen a large amount of interest in epidemics on networks as a way of representing the complex structure of contacts capable of spreading infections through the modern human population. The configuration model is a popular choice in theoretical studies since it combines the ability to specify the distribution of the number of contacts (degree) with analytical tractability. Here we consider the early real-time behaviour of the Markovian SIR epidemic model on a configuration model network using a multitype branching process. We find closed-form analytic expressions for the mean and variance of the number of infectious individuals as a function of time and the degree of the initially infected individual(s), and write down a system of differential equations for the probability of extinction by time t that are numerically fast compared to Monte Carlo simulation. We show that these quantities are all sensitive to the degree distribution; in particular, we confirm that the mean prevalence of infection depends on the first two moments of the degree distribution and the variance in prevalence depends on the first three moments of the degree distribution. In contrast to most existing analytic approaches, the accuracy of these results does not depend on having a large number of infectious individuals, meaning that in the large population limit they would be asymptotically exact even for one initial infectious individual.
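The branching-process picture of the early epidemic can be sketched by Monte Carlo: the initial case's contacts follow the degree distribution, while later cases follow the excess degree (size-biased degree minus one). This generic sketch, with all names and parameters invented, estimates the ultimate extinction probability rather than the paper's closed-form time-dependent expressions:

```python
import random

def extinction_prob(degree_dist, p_transmit, trials=2000, cutoff=200, seed=1):
    """Monte Carlo estimate of the extinction probability of the
    early-epidemic branching process on a configuration-model network.
    degree_dist is a list of (degree, probability) pairs; p_transmit is the
    per-contact transmission probability. Chains reaching `cutoff` total
    cases are treated as established epidemics."""
    rng = random.Random(seed)
    degs = [d for d, _ in degree_dist]
    probs = [p for _, p in degree_dist]
    size_biased = [d * p for d, p in degree_dist]   # unnormalised weights

    def offspring(excess):
        if excess:   # non-initial case: excess degree = size-biased degree - 1
            d = rng.choices(degs, size_biased)[0] - 1
        else:        # initial case: ordinary degree
            d = rng.choices(degs, probs)[0]
        return sum(rng.random() < p_transmit for _ in range(d))

    extinct = 0
    for _ in range(trials):
        cases, total = offspring(False), 1
        while cases > 0 and total < cutoff:
            total += cases
            cases = sum(offspring(True) for _ in range(cases))
        extinct += (cases == 0)
    return extinct / trials
```

The sensitivity to the first moments of the degree distribution shows up directly: the mean excess offspring is p_transmit * (E[D(D-1)] / E[D]), so distributions with the same mean but different second moments give different extinction probabilities.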
Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun
2017-11-01
This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules implement typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis for preliminary evaluation and an auto-computing process for real-time computation. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused by either environmental heavy metal exposure or clinical complications in hospital. The validity of the computations was verified against commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for more than 230K data sets. Both setups reached a throughput of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.
Human performance modeling for system of systems analytics: combat performance-shaping factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.; Miller, Dwight Peter
The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of performance-shaping factors (PSFs) that might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.
NASA Astrophysics Data System (ADS)
Phipps, Marja; Capel, David; Srinivasan, James
2014-06-01
Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look towards a technology application and commercial adoption model which will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources - providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment; employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.
A Learning Framework for Control-Oriented Modeling of Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.
Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and can be adapted continuously to account for changing conditions as new data become available. Data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to satisfy all of these requirements simultaneously. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology significantly outperforms other data-driven modeling techniques. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep learning framework that can drive several use cases related to building energy management.
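The recurrent-network idea behind the abstract can be illustrated with a vanilla RNN forward pass over a sequence of building features (e.g., weather and past consumption). This is a minimal NumPy sketch under stated assumptions, not the authors' architecture; all names and shapes are hypothetical.

```python
import numpy as np

def rnn_forecast(inputs, Wx, Wh, Wy, bh, by):
    """Vanilla RNN forward pass.

    inputs: (T, n_in) sequence of feature vectors; returns one prediction
    per time step (e.g., next-interval energy consumption).
    """
    h = np.zeros(Wh.shape[0])
    predictions = []
    for x in inputs:
        h = np.tanh(Wx @ x + Wh @ h + bh)   # recurrent hidden-state update
        predictions.append(Wy @ h + by)      # linear readout
    return np.array(predictions)
```

In practice the weights would be trained on historical building data; here they are shown only to make the recurrence explicit.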
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
NASA Astrophysics Data System (ADS)
James, C. M.; Gildfind, D. E.; Lewis, S. W.; Morgan, R. G.; Zander, F.
2018-03-01
Expansion tubes are an important type of test facility for the study of planetary entry flow-fields, being the only type of impulse facility capable of simulating the aerothermodynamics of superorbital planetary entry conditions from 10 to 20 km/s. However, the complex flow processes involved in expansion tube operation make it difficult to fully characterise flow conditions, with two-dimensional full facility computational fluid dynamics simulations often requiring tens or hundreds of thousands of computational hours to complete. In an attempt to simplify this problem and provide a rapid flow condition prediction tool, this paper presents a validated and comprehensive analytical framework for the simulation of an expansion tube facility. It identifies central flow processes and models them from state to state through the facility using established compressible and isentropic flow relations, and equilibrium and frozen chemistry. How the model simulates each section of an expansion tube is discussed, as well as how the model can be used to simulate situations where flow conditions diverge from ideal theory. The model is then validated against experimental data from the X2 expansion tube at the University of Queensland.
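Among the "established compressible and isentropic flow relations" such a state-to-state framework chains together are the perfect-gas normal-shock jump conditions, sketched below. This is background illustration only: a calorically perfect gas with fixed gamma is assumed, whereas the actual model also handles equilibrium and frozen chemistry.

```python
import math

def normal_shock(mach1, gamma=1.4):
    """Perfect-gas normal-shock jump conditions for upstream Mach number mach1 > 1.

    Returns (p2/p1, rho2/rho1, T2/T1, downstream Mach number).
    """
    m2 = mach1 * mach1
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (m2 - 1.0)
    rho_ratio = (gamma + 1.0) * m2 / ((gamma - 1.0) * m2 + 2.0)
    t_ratio = p_ratio / rho_ratio
    mach2 = math.sqrt((1.0 + 0.5 * (gamma - 1.0) * m2)
                      / (gamma * m2 - 0.5 * (gamma - 1.0)))
    return p_ratio, rho_ratio, t_ratio, mach2
```

For Mach 2 in air (gamma = 1.4) these give the textbook values p2/p1 = 4.5 and M2 ≈ 0.577.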
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosnitskiy, P., E-mail: pavrosni@yandex.ru; Yuldashev, P., E-mail: petr@acs366.phys.msu.ru; Khokhlova, V., E-mail: vera@acs366.phys.msu.ru
2015-10-28
An equivalent source model was proposed as a boundary condition to the nonlinear parabolic Khokhlov-Zabolotskaya (KZ) equation to simulate high intensity focused ultrasound (HIFU) fields generated by medical ultrasound transducers with the shape of a spherical shell. The boundary condition was set in the initial plane; the aperture, the focal distance, and the initial pressure of the source were chosen based on the best match of the axial pressure amplitude and phase distributions in the Rayleigh integral analytic solution for a spherical transducer and the linear parabolic approximation solution for the equivalent source. Analytic expressions for the equivalent source parameters were derived. It was shown that the proposed approach allowed us to transfer the boundary condition from the spherical surface to the plane and to achieve a very good match between the linear field solutions of the parabolic and full diffraction models even for highly focused sources with F-number less than unity. The proposed method can be further used to expand the capabilities of the KZ nonlinear parabolic equation for efficient modeling of HIFU fields generated by strongly focused sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stagg, Alan K; Yoon, Su-Jong
This report describes the Consortium for Advanced Simulation of Light Water Reactors (CASL) work conducted for completion of the Thermal Hydraulics Methods (THM) Level 3 Milestone THM.CFD.P11.02: Hydra-TH Extensions for Multispecies and Thermosolutal Convection. A critical requirement for modeling reactor thermal hydraulics is to account for species transport within the fluid. In particular, this capability is needed for modeling transport and diffusion of boric acid within water for emergency, reactivity-control scenarios. To support this need, a species transport capability has been implemented in Hydra-TH for binary systems (for example, solute within a solvent). A species transport equation is solved for the species (solute) mass fraction, and both thermal and solutal buoyancy effects are handled with specification of a Boussinesq body force. Species boundary conditions can be specified with a Dirichlet condition on mass fraction or a Neumann condition on diffusion flux. To enable enhanced species/fluid mixing in turbulent flow, the molecular diffusivity for the binary system is augmented with a turbulent diffusivity in the species transport calculation. The new capabilities are demonstrated by comparison of Hydra-TH calculations to the analytic solution for a thermosolutal convection problem, and excellent agreement is obtained.
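Two ingredients of the species-transport capability described above — solving for solute mass fraction with Dirichlet boundaries, and augmenting molecular diffusivity with a turbulent diffusivity — can be sketched in one dimension with an explicit finite-difference (FTCS) update. This is an illustrative toy, not Hydra-TH's finite-volume discretization, and it omits advection and buoyancy coupling.

```python
def diffuse_mass_fraction(Y, d_mol, d_turb, dx, dt, steps, y_left, y_right):
    """Explicit FTCS update of solute mass fraction Y on a 1D grid.

    Effective diffusivity is d_mol + d_turb (molecular plus turbulent);
    Dirichlet values y_left/y_right are enforced at the ends each step.
    """
    d_eff = d_mol + d_turb
    r = d_eff * dt / dx**2              # stability requires r <= 0.5
    Y = list(Y)
    for _ in range(steps):
        Ynew = Y[:]
        for i in range(1, len(Y) - 1):
            Ynew[i] = Y[i] + r * (Y[i+1] - 2.0 * Y[i] + Y[i-1])
        Ynew[0], Ynew[-1] = y_left, y_right
        Y = Ynew
    return Y
```

Run long enough, the solution relaxes to the expected linear steady-state profile between the two Dirichlet values.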
The Case for Adopting Server-side Analytics
NASA Astrophysics Data System (ADS)
Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.
2017-12-01
The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm intended to replace this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue.
This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for future applications.
Analytical Chemistry Developmental Work Using a 243Am Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.
2015-02-24
This project seeks to reestablish our analytical capability to characterize Am bulk material and develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and, additionally, developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.
NASA Astrophysics Data System (ADS)
Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai
2018-01-01
This study reported Grade 8 students' analytical thinking and attitude toward science in teaching and learning about soil and its pollution through a science, technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out over 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were assessed during their learning by participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students could develop their capability for analytical thinking. They could demonstrate characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated students. The paper will discuss implications of these findings for science teaching and learning through STS in Thailand.
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
A method for modeling laterally asymmetric proton beamlets resulting from collimation
Gelover, Edgar; Wang, Dongxu; Hill, Patrick M.; Flynn, Ryan T.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark; Hyer, Daniel E.
2015-01-01
Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1,σx2,σy1,σy2) together with the spatial location of the maximum dose (μx,μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets. PMID:25735287
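The fluence model described above — a Gaussian whose standard deviation differs on each side of the peak along each BEV axis, parameterized by (σx1, σx2, σy1, σy2) and the peak location (μx, μy) — can be sketched directly. This is an illustrative evaluation of that functional form only; the depth-dependent correction and divergence terms from the paper are omitted.

```python
import math

def asymmetric_gaussian_fluence(x, y, mu_x, mu_y, sx1, sx2, sy1, sy2):
    """BEV fluence: Gaussian with side-dependent sigmas along each axis
    (sx1 applies for x < mu_x, sx2 for x >= mu_x; likewise sy1/sy2 in y)."""
    sigma_x = sx1 if x < mu_x else sx2
    sigma_y = sy1 if y < mu_y else sy2
    return (math.exp(-0.5 * ((x - mu_x) / sigma_x) ** 2)
            * math.exp(-0.5 * ((y - mu_y) / sigma_y) ** 2))
```

By construction the fluence is continuous at the peak (value 1 at (μx, μy)) while its lateral falloff differs on the trimmed and untrimmed sides.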
Propeller aircraft interior noise model. II - Scale-model and flight-test comparisons
NASA Technical Reports Server (NTRS)
Willis, C. M.; Mayes, W. H.
1987-01-01
A program for predicting the sound levels inside propeller driven aircraft arising from sidewall transmission of airborne exterior noise is validated through comparisons of predictions with both scale-model test results and measurements obtained in flight tests on a turboprop aircraft. The program produced unbiased predictions for the case of the scale-model tests, with a standard deviation of errors of about 4 dB. For the case of the flight tests, the predictions revealed a bias of 2.62-4.28 dB (depending upon whether or not the data for the fourth harmonic were included) and the standard deviation of the errors ranged between 2.43 and 4.12 dB. The analytical model is shown to be capable of taking changes in the flight environment into account.
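The two summary statistics quoted above — the bias and the standard deviation of the prediction errors in dB — amount to the mean and sample standard deviation of the prediction-minus-measurement differences. A minimal sketch (hypothetical function name):

```python
from statistics import mean, stdev

def prediction_error_stats(predicted_db, measured_db):
    """Bias (mean error) and sample standard deviation of errors, in dB."""
    errors = [p - m for p, m in zip(predicted_db, measured_db)]
    return mean(errors), stdev(errors)
```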
Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...
2016-09-18
This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.
Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem
NASA Astrophysics Data System (ADS)
Doyle, R. J.; Crichton, D.
2017-12-01
NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.
Definition of ground test for verification of large space structure control
NASA Technical Reports Server (NTRS)
Seltzer, S. M.; Doane, G. B., III
1985-01-01
Directions regarding the analytical models were received. A counter balance arm with weights was added at the top of the ASTROMAST to offset the arm with the gimbals. In addition to this model, three more models were requested from MSFC: structure as in the revised model with the addition of lumped masses at bays 46 and 91 of the ASTROMAST; cantilevered cruciform structure with lumped masses at bays 46 and 91, and an all up cruciform structure with lumped masses at bays 46 and 91. Figures for each model and their corresponding natural frequencies and general mode shapes associated with these frequencies are included. The drawbar in use in the cruciform models must be incorporated into the antenna and ASTROMAST models. The total tensile load carrying capability of the ASTROMAST is approximately 840 pounds.
A comparison between numerically modelled and experimentally measured loss mechanisms in wave rotors
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.
1993-01-01
A numerical model has been developed which is capable of predicting the performance of a wave rotor (pressure exchanger) of specified geometry over a wide range of operating conditions. The model can account for the major loss mechanisms of leakage from the tube ends, fluid viscosity, heat transfer to the tube walls, finite tube opening time, shock waves, and non-uniform port flows. It is a one dimensional flow model which follows a single tube as it rotates past the various stationary ports. Since the model is relatively simple (i.e., one dimensional) it uses little computer time. This makes it suitable for design as well as analytical purposes. This paper will present a brief description of the model then discuss a comparison between the model predictions and several wave rotor experiments.
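A one-dimensional gas-dynamics model of the kind described can be sketched with a first-order Lax-Friedrichs update of the 1D Euler equations in a single tube. This is a generic illustration under a perfect-gas assumption; the actual wave rotor model additionally handles leakage, viscosity, heat transfer, and gradual port opening.

```python
GAMMA = 1.4  # perfect-gas assumption

def euler_flux(u):
    """Flux of the 1D Euler equations for conserved state (rho, rho*u, E)."""
    rho, mom, ene = u
    vel = mom / rho
    p = (GAMMA - 1.0) * (ene - 0.5 * mom * vel)
    return (mom, mom * vel + p, vel * (ene + p))

def lax_friedrichs(states, dt_over_dx, steps):
    """First-order Lax-Friedrichs march; end cells act as fixed reservoirs,
    a crude stand-in for the stationary port conditions."""
    for _ in range(steps):
        fluxes = [euler_flux(u) for u in states]
        new = states[:]
        for i in range(1, len(states) - 1):
            new[i] = tuple(
                0.5 * (states[i-1][k] + states[i+1][k])
                - 0.5 * dt_over_dx * (fluxes[i+1][k] - fluxes[i-1][k])
                for k in range(3))
        states = new
    return states
```

Initialized with a pressure discontinuity (a shock-tube state), the scheme captures the shock and expansion waves that drive pressure exchange in the rotor tubes, albeit with first-order numerical diffusion.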
A viscoelastic fluid-structure interaction model for carotid arteries under pulsatile flow.
Wang, Zhongjie; Wood, Nigel B; Xu, Xiao Yun
2015-05-01
In this study, a fluid-structure interaction model (FSI) incorporating viscoelastic wall behaviour is developed and applied to an idealized model of the carotid artery under pulsatile flow. The shear and bulk moduli of the arterial wall are described by Prony series, where the parameters can be derived from in vivo measurements. The aim is to develop a fully coupled FSI model that can be applied to realistic arterial geometries with normal or pathological viscoelastic wall behaviour. Comparisons between the numerical and analytical solutions for wall displacements demonstrate that the coupled model is capable of predicting the viscoelastic behaviour of carotid arteries. Comparisons are also made between the solid only and FSI viscoelastic models, and the results suggest that the difference in radial displacement between the two models is negligible. Copyright © 2015 John Wiley & Sons, Ltd.
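The Prony-series description of the wall moduli mentioned above has the form G(t) = G_inf + sum_i G_i * exp(-t / tau_i), a sum of decaying exponentials plus an equilibrium modulus. A minimal sketch with hypothetical coefficients (not values from the paper):

```python
import math

def prony_modulus(t, g_inf, terms):
    """Relaxation modulus from a Prony series.

    g_inf is the long-time (equilibrium) modulus; terms is a list of
    (g_i, tau_i) pairs of modulus coefficients and relaxation times.
    """
    return g_inf + sum(g_i * math.exp(-t / tau_i) for g_i, tau_i in terms)
```

The instantaneous modulus is g_inf plus the sum of all g_i, and the modulus relaxes monotonically toward g_inf, which is what makes this form convenient for fitting in vivo measurements.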
Shifting from Stewardship to Analytics of Massive Science Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.
2015-12-01
Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis from future remote sensing missions, be it from earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems due to more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches towards data analytics where users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.
Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.
Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark
2016-03-16
The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
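Principal component regression, as used in FSCV analysis, projects the measured voltammograms onto the top-k principal components of a training set and then regresses the known concentrations on those scores. A minimal hypothetical sketch of that pipeline (not the authors' implementation):

```python
import numpy as np

def pcr_fit(X, Y, n_components):
    """Principal component regression.

    X: (samples, potentials) training voltammogram currents.
    Y: (samples, analytes) known concentrations.
    Returns a (potentials, analytes) matrix mapping currents to concentrations.
    """
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Vk = Vt[:n_components].T                     # top-k principal directions
    T = X @ Vk                                   # scores of training data
    B = np.linalg.lstsq(T, Y, rcond=None)[0]     # regression in score space
    return Vk @ B

def pcr_predict(F, X):
    """Predict concentrations for new voltammograms X."""
    return X @ F
```

The abstract's central point falls out of this structure: both `Vk` and `B` are estimated from the training set, so training data collected under different conditions (electrodes, animals, equipment) misassigns the current-concentration relationship across the potential window.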
Acoustic Prediction State of the Art Assessment
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2007-01-01
The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state of the art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction, both for systems and for components. These include semi-empirical, statistical, analytical, and numerical codes. System-level results are shown for both aircraft and engines. Component-level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.
NASA Astrophysics Data System (ADS)
Linseis, V.; Völklein, F.; Reith, H.; Woias, P.; Nielsch, K.
2018-06-01
An analytical study has been performed on the measurement capabilities of a 100-nm thin suspended membrane setup for the in-plane thermal conductivity measurement of thin film samples using the 3ω measurement technique, utilizing a COMSOL Multiphysics simulation. The maximum measurement range under observance of given boundary conditions has been studied. Three exemplary sample materials, with thicknesses from the nanometer to the micrometer range and thermal conductivities from 0.4 W/mK up to 100 W/mK, have been investigated as showcase studies. The results of the simulations have been compared to a previously published evaluation model in order to determine the deviation between the two and thereby the measurement limit. As thermal transport properties are temperature dependent, all calculations refer to constant room-temperature conditions.
Custom-oriented wavefront sensor for human eye properties measurements
NASA Astrophysics Data System (ADS)
Galetskiy, Sergey; Letfullin, Renat; Dubinin, Alex; Cherezova, Tatyana; Belyakov, Alexey; Kudryashov, Alexis
2005-12-01
The problem of correctly measuring human eye aberrations has become increasingly important with the growing prevalence of LASIK (laser-assisted in situ keratomileusis), a surgical procedure for reducing refractive error in the eye. In this paper we show the capabilities of the aberrometer built in our lab together with Active Optics Ltd to measure such aberrations. We discuss the calibration of the aberrometer and show that the analytical equation based on the thin-lens formula is not valid for ophthalmic calibration purposes. We show that a proper analytical equation suitable for calibration should depend on the square of the distance increment, and we illustrate this both by experiment and by Zemax ray-tracing modeling. We also discuss the error caused by the inhomogeneous intensity distribution of the beam imaged onto the aberrometer's Shack-Hartmann sensor.
Realini, Marco; Botteon, Alessandra; Colombo, Chiara; Noll, Sarah; Elliott, Stephen R.; Matousek, Pavel
2016-01-01
A recently developed micrometer-scale spatially offset Raman spectroscopy (μ-SORS) method provides a new analytical capability for investigating non-destructively the chemical composition of sub-surface, micrometer-scale thickness, diffusely scattering layers at depths beyond the reach of conventional confocal Raman microscopy. Here, we demonstrate experimentally, for the first time, the capability of μ-SORS to determine whether two detected chemical components originate from two separate layers or whether the two components are mixed together in a single layer. Such information is important in a number of areas, including conservation of cultural heritage objects, and is not available, for highly turbid media, from conventional Raman microscopy, where axial (confocal) scanning is not possible due to an inability to facilitate direct imaging within the highly scattering sample. This application constitutes an additional capability for μ-SORS, complementing its basic capacity to determine the overall chemical make-up of layers in a turbid system. PMID:26767641
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, J.P.
The use of laser-based analytical methods in nuclear-fuel processing plants is considered. The species and locations for accountability, process control, and effluent control measurements in the Coprocessing, Thorex, and reference Purex fuel processing operations are identified and the conventional analytical methods used for these measurements are summarized. The laser analytical methods based upon Raman, absorption, fluorescence, and nonlinear spectroscopy are reviewed and evaluated for their use in fuel processing plants. After a comparison of the capabilities of the laser-based and conventional analytical methods, the promising areas of application of the laser-based methods in fuel processing plants are identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint-services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior-level decision makers to better understand their enterprise and required future investments.
A wetting and drying scheme for ROMS
Warner, John C.; Defne, Zafer; Haas, Kevin; Arango, Hernan G.
2013-01-01
The processes of wetting and drying have many important physical and biological impacts on shallow water systems. Inundation and dewatering effects on coastal mud flats and beaches occur on various time scales, ranging from storm surge and the periodic rise and fall of the tide to infragravity wave motions. Correctly simulating these physical processes with a numerical model requires the capability of the computational cells to become inundated and dewatered. In this paper, we describe a method for wetting and drying based on an approach consistent with a cell-face blocking algorithm. The method allows water to always flow into any cell, but prevents outflow from a cell when the total depth in that cell is less than a user-defined critical value. We describe the method and its implementation into the three-dimensional Regional Ocean Modeling System (ROMS), and exhibit the new capability under three scenarios: an analytical expression for shallow water flows, a dam break test case, and a realistic application to part of a wetland area along the Georgia Coast, USA.
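The blocking rule described above can be sketched for a single cell face as follows. This is a schematic of the rule only, not the ROMS implementation; the function name and the 1-D left/right convention are illustrative.

```python
def limited_flux(q, h_left, h_right, h_crit):
    """Cell-face blocking rule for wetting and drying (sketch).

    q:       tentative volume flux through the face; q > 0 means flow
             from the left cell into the right cell
    h_left:  total water depth in the left cell
    h_right: total water depth in the right cell
    h_crit:  user-defined critical depth

    Water may always flow INTO a cell, but outflow from a cell is
    blocked when that cell's total depth falls below h_crit.
    """
    if q > 0 and h_left < h_crit:
        return 0.0   # left cell too shallow: block its outflow
    if q < 0 and h_right < h_crit:
        return 0.0   # right cell too shallow: block its outflow
    return q
```

Because inflow is never blocked, a dry cell can still re-wet as the surrounding water level rises, which is what allows cells to cycle between inundated and dewatered states.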
Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts
NASA Technical Reports Server (NTRS)
Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.
1997-01-01
ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed for this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1.
3. The findings of Steps 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast-time' analytical and simulation models. 'Real-time' models, which typically involve humans in the loop, comprise another extensive class that is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report, and the potential benefits from the combined use of these two classes of models, a very important subject, are discussed in Chapters 4 and 7.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the state space and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
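As a toy illustration of the zero-order phenotypes (not the paper's formal generation method, which operates on task-analytic models inside a model checker), single-error variants of a normative action sequence can be enumerated as follows. The "jump" is simplified here to an adjacent-order swap, and all names are illustrative.

```python
def erroneous_variants(task, intrusions=()):
    """Enumerate single-error variants of a normative action sequence,
    following a simplified reading of Hollnagel's zero-order phenotypes:
    omission, repetition, intrusion, and (as an adjacent swap) jump.
    """
    variants = set()
    for i in range(len(task)):
        variants.add(tuple(task[:i] + task[i + 1:]))        # omission of step i
        variants.add(tuple(task[:i + 1] + task[i:]))        # repetition of step i
        for act in intrusions:                              # intrusion before step i
            variants.add(tuple(task[:i] + [act] + task[i:]))
    for i in range(len(task) - 1):                          # jump: swap adjacent steps
        s = list(task)
        s[i], s[i + 1] = s[i + 1], s[i]
        variants.add(tuple(s))
    return variants
```

Higher-order phenotypes, as described in the abstract, would correspond to applying these single-error transformations repeatedly.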
Shear joint capability versus bolt clearance
NASA Technical Reports Server (NTRS)
Lee, H. M.
1992-01-01
The results of a conservative analysis of shear joint strength capability for typical space-flight hardware, as a function of the bolt-hole clearance specified in the design, are presented. These joints comprise high-strength steel fasteners and abutments made of aluminum alloys familiar to the aerospace industry. A general analytical expression was first derived relating bolt-hole clearance to the bolt shear load required to place all joint fasteners into a shear-transferring position. Extension of this work allowed the analytical development of joint load capability as a function of the number of fasteners, the shear strength of the bolt, the bolt-hole clearance, and the desired factor of safety. The analysis results clearly indicate that a typical space-flight hardware joint can withstand significant loading even when less than ideal bolt-hole clearances are used in the design.
Thermal fatigue durability for advanced propulsion materials
NASA Technical Reports Server (NTRS)
Halford, Gary R.
1989-01-01
A review is presented of thermal and thermomechanical fatigue (TMF) crack initiation life prediction and cyclic constitutive modeling efforts sponsored recently by the NASA Lewis Research Center in support of advanced aeronautical propulsion research. A brief description is provided of the more significant material durability models that were created to describe TMF fatigue resistance of both isotropic and anisotropic superalloys, with and without oxidation resistant coatings. The two most significant crack initiation models are the cyclic damage accumulation model and the total strain version of strainrange partitioning. Unified viscoplastic cyclic constitutive models are also described. A troika of industry, university, and government research organizations contributed to the generation of these analytic models. Based upon current capabilities and established requirements, an attempt is made to project which TMF research activities most likely will impact future generation propulsion systems.
Size separation of analytes using monomeric surfactants
Yeung, Edward S.; Wei, Wei
2005-04-12
A sieving medium for use in the separation of analytes in a sample containing at least one such analyte comprises a monomeric non-ionic surfactant of the general formula B-A, wherein A is a hydrophilic moiety and B is a hydrophobic moiety, present in a solvent at a concentration forming a self-assembled micelle configuration under selected conditions and having an aggregation number providing an equivalent weight capable of effecting the size separation of the sample solution so as to resolve a target analyte(s) in a solution containing the same, the size separation taking place in a chromatography or electrophoresis separation system.
The analytical representation of viscoelastic material properties using optimization techniques
NASA Technical Reports Server (NTRS)
Hill, S. A.
1993-01-01
This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was used to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
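The idea of fitting all the Prony constants at once, including the exponential time constants, can be sketched in Python with scipy (PRONY itself used the VMA optimization tool; the initial-guess heuristics and the log-space parameterization below are illustrative assumptions, not the report's method):

```python
import numpy as np
from scipy.optimize import least_squares

def prony(t, g_inf, g, tau):
    """Prony series: G(t) = G_inf + sum_i G_i * exp(-t / tau_i)."""
    return g_inf + sum(gi * np.exp(-t / ti) for gi, ti in zip(g, tau))

def fit_prony(t, data, n_terms=2):
    """Fit ALL Prony constants by nonlinear least squares.

    The time constants are fitted in log space to keep them positive.
    Assumes t is sorted with t[1] > 0.
    """
    def residual(p):
        g_inf, g, log_tau = p[0], p[1:1 + n_terms], p[1 + n_terms:]
        return prony(t, g_inf, g, np.exp(log_tau)) - data

    p0 = np.concatenate((
        [data.min()],                                # long-time modulus guess
        np.full(n_terms, np.ptp(data) / n_terms),    # term-weight guesses
        np.log(np.geomspace(t[1], t[-1], n_terms)),  # spread the time constants
    ))
    sol = least_squares(residual, p0)
    return sol.x[0], sol.x[1:1 + n_terms], np.exp(sol.x[1 + n_terms:])
```

This contrasts with the traditional approach the abstract describes: fixing the exponential constants a priori and solving only a linear least-squares problem for the remaining coefficients.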
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.
2000-01-01
There has been no accurate procedure for modeling the high-speed impact of composite materials, but such an analytical capability will be required in designing reliable lightweight engine-containment systems. The majority of the models in use assume a linear elastic material response that does not vary with strain rate. However, for containment systems, polymer matrix composites incorporating ductile polymers are likely to be used. For such a material, the deformation response is likely to be nonlinear and to vary with strain rate. An analytical model has been developed at the NASA Glenn Research Center at Lewis Field that incorporates both of these features. A set of constitutive equations that was originally developed to analyze the viscoplastic deformation of metals (Ramaswamy-Stouffer equations) was modified to simulate the nonlinear, rate-dependent deformation of polymers. Specifically, the effects of hydrostatic stresses on the inelastic response, which can be significant in polymers, were accounted for by a modification of the definition of the effective stress. The constitutive equations were then incorporated into a composite micromechanics model based on the mechanics of materials theory. This theory predicts the deformation response of a composite material from the properties and behavior of the individual constituents. In this manner, the nonlinear, rate-dependent deformation response of a polymer matrix composite can be predicted.
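As a rough illustration of the modification described above, an effective stress built from the deviatoric invariant J2 alone can be augmented with a hydrostatic term. The particular form and coefficient below are illustrative assumptions, not necessarily the exact expression used in the NASA model:

```latex
\sigma_{\mathrm{eff}} \;=\; \sqrt{3 J_2} \;+\; \alpha\,\sigma_{kk},
\qquad
J_2 = \tfrac{1}{2}\, s_{ij} s_{ij},
\qquad
\sigma_{kk} = \sigma_{11} + \sigma_{22} + \sigma_{33},
```

where s_ij is the deviatoric stress and α controls the strength of the hydrostatic contribution; α = 0 recovers the classical von Mises effective stress conventionally used for metals, in which hydrostatic stress has no effect on inelastic flow.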
Analytical and experimental investigation of flutter suppression by piezoelectric actuation
NASA Technical Reports Server (NTRS)
Heeg, Jennifer
1993-01-01
The objective of this research was to analytically and experimentally study the capabilities of piezoelectric plate actuators for suppressing flutter. Piezoelectric materials are characterized by their ability to produce voltage when subjected to a mechanical strain. The converse piezoelectric effect can be utilized to actuate a structure by applying a voltage. For this investigation, a two-degree-of-freedom wind tunnel model was designed, analyzed, and tested. The model consisted of a rigid wing and a flexible mount system that permitted a translational and a rotational degree of freedom. The model was designed such that flutter was encountered within the testing envelope of the wind tunnel. Actuators made of piezoelectric material were affixed to leaf springs of the mount system. Command signals, applied to the piezoelectric actuators, exerted control over the damping and stiffness properties. A mathematical aeroservoelastic model was constructed by using finite element methods, laminated plate theory, and aeroelastic analysis tools. Plant characteristics were determined from this model and verified by open loop experimental tests. A flutter suppression control law was designed and implemented on a digital control computer. Closed loop flutter testing was conducted. The experimental results represent the first time that adaptive materials have been used to actively suppress flutter. They demonstrate that small, carefully placed actuating plates can be used effectively to control aeroelastic response.
An analytical and experimental investigation of flutter suppression via piezoelectric actuation
NASA Technical Reports Server (NTRS)
Heeg, Jennifer
1992-01-01
The objective of this research was to analytically and experimentally study the capabilities of adaptive material plate actuators for suppressing flutter. Piezoelectrics are materials characterized by their ability to produce voltage when subjected to a mechanical strain. The converse piezoelectric effect can be utilized to actuate a structure by applying a voltage. For this investigation, a two-degree-of-freedom wind-tunnel model was designed, analyzed, and tested. The model consisted of a rigid wing and a flexible mount system which permitted translational and rotational degrees of freedom. Actuators made of piezoelectric material were affixed to leaf springs on the mount system. Command signals, applied to the piezoelectric actuators, exerted control over the closed-loop damping and stiffness properties. A mathematical aeroservoelastic model was constructed using finite element methods, laminated plate theory, and aeroelastic analysis tools. A flutter suppression control law was designed, implemented on a digital control computer, and tested to conditions 20 percent above the passive flutter speed of the model. The experimental results represent the first time that adaptive materials have been used to actively suppress flutter. They demonstrate that small, carefully placed actuating plates can be used effectively to control aeroelastic response.
A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.
Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher
2017-08-01
The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
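As a point of reference for the kind of model being compared across these programs, here is a minimal cohort-style Markov model in Python. The three states, transition probabilities, costs, and utilities are made-up illustrative numbers, not taken from the paper:

```python
import numpy as np

# Illustrative 3-state Markov cohort model: Healthy, Sick, Dead.
# Row i of P holds the per-cycle transition probabilities FROM state i.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])          # Dead is absorbing; rows sum to 1
cost    = np.array([100.0, 1000.0, 0.0])    # cost per cycle, per person
utility = np.array([1.0, 0.6, 0.0])         # QALYs per cycle, per person

def run_cohort(P, cost, utility, n_cycles=40, discount=0.035):
    """Advance a cohort through the Markov model, accumulating
    discounted total costs and QALYs."""
    state = np.array([1.0, 0.0, 0.0])       # whole cohort starts Healthy
    total_cost = total_qaly = 0.0
    for t in range(n_cycles):
        d = 1.0 / (1.0 + discount) ** t     # discount factor for cycle t
        total_cost += d * state @ cost
        total_qaly += d * state @ utility
        state = state @ P                   # advance the cohort one cycle
    return total_cost, total_qaly
```

A cohort simulation like this is a few lines in a programming language, whereas spreadsheet and tree-based tools express the same computation through cell formulas or graphical trees, which is the transparency/capability trade-off the comparison examines.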
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1990-01-01
The design, implementation, and empirical evaluation of task-analytic models and intelligent aids for operators in the control of complex dynamic systems, specifically aerospace systems, are studied. Three related activities are included: (1) models of operator decision making in complex and predominantly automated space systems were developed and applied; (2) the Operator Function Model (OFM) was used to represent operator activities; and (3) the Operator Function Model Expert System (OFMspert), a stand-alone knowledge-based system that interacts with a human operator in a manner similar to a human assistant in the control of aerospace systems, was developed. OFMspert is an architecture for an operator's assistant that uses the OFM as its system and operator knowledge base and a blackboard paradigm of problem solving to dynamically generate expectations about upcoming operator activities and to interpret actual operator actions. An experiment validated OFMspert's intent-inferencing capability, showing that it inferred the intentions of operators in ways comparable to both a human expert and the operators themselves. OFMspert was also augmented with control capabilities. An interface allowed the operator to interact with OFMspert, delegating as much or as little control responsibility as the operator chose. With its design based on the OFM, OFMspert's control capabilities were available at multiple levels of abstraction and allowed the operator a great deal of discretion over the amount and level of delegated control. An experiment showed that overall system performance was comparable for teams consisting of two human operators and teams consisting of a human operator and OFMspert.
Capabilities for Intercultural Dialogue
ERIC Educational Resources Information Center
Crosbie, Veronica
2014-01-01
The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…
Dynamics of Atmospheric Boundary Layers: Large-Eddy Simulations and Reduced Analytical Models
NASA Astrophysics Data System (ADS)
Momen, Mostafa
Real-world atmospheric and oceanic boundary layers (ABL) involve many inherent complexities, the understanding and modeling of which manifestly exceeds our current capabilities. Previous studies largely focused on the "textbook ABL", which is (quasi) steady and barotropic. However, it is evident that the "real-world ABL", even over flat terrain, rarely meets such simplifying assumptions. The present thesis aims to illustrate and model four complicating features of ABLs that have been overlooked thus far despite their ubiquity: 1) unsteady pressure gradients in neutral ABLs (Chapters 2 and 3), 2) interacting effects of unsteady pressure gradients and static stability in diabatic ABLs (Chapter 4), 3) time-variable buoyancy fluxes (Chapter 5), and 4) impacts of baroclinicity in neutral and diabatic ABLs (Chapter 6). State-of-the-art large-eddy simulations will be used as a tool to explain the underlying physics and to validate the analytical models we develop for these features. Chapter 2 focuses on turbulence equilibrium: when the forcing time scale is comparable to the turbulence time scale, the turbulence is shown to be out of equilibrium, and the velocity profiles depart from the log-law; however, for longer, and surprisingly for shorter, forcing times, quasi-equilibrium is maintained. In Chapter 3, a reduced analytical model, based on the Navier-Stokes equations, is introduced and shown to be analogous to a damped oscillator in which the inertial, Coriolis, and friction forces mirror the mass, spring, and damper, respectively. When a steady buoyancy (stable or unstable) is superposed on the unsteady pressure gradient, the same model structure can be maintained, but the damping term, corresponding to friction forces and vertical coupling, needs to account for stability. However, for the reverse case with variable buoyancy flux and stability, the model needs to be extended to allow a time-variable damper coefficient.
These extensions of the analytical model are presented in Chapters 4 and 5, respectively. Chapter 6 investigates the interacting effects of baroclinicity (direction and strength) and stability on ABLs. Cold advection and positive shear increased the friction velocity and the low-level jet elevation and strength, while warm advection and negative shear had the opposite effect. Finally, Chapter 7 provides a synthesis and a future outlook.
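The damped-oscillator analogy of Chapter 3 can be written schematically in generic oscillator notation (the symbols below are generic, not the thesis's variables):

```latex
\underbrace{m\,\ddot{x}}_{\text{inertial term}}
\;+\;
\underbrace{c\,\dot{x}}_{\text{friction / vertical coupling}}
\;+\;
\underbrace{k\,x}_{\text{Coriolis term}}
\;=\; F(t),
```

with the mass m, damper c, and spring k mirroring the inertial, friction, and Coriolis forces respectively, and F(t) playing the role of the unsteady pressure-gradient forcing. In this reading, the extensions in Chapters 4 and 5 correspond to making the damper coefficient c stability-dependent and then time-variable.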
Deformation of Polymer Composites in Force Protection Systems
NASA Astrophysics Data System (ADS)
Nazarian, Oshin
Systems used for protecting personnel, vehicles and infrastructure from ballistic and blast threats derive their performance from a combination of the intrinsic properties of the constituent materials and the way in which the materials are arranged and attached to one another. The present work addresses outstanding issues in both the intrinsic properties of high-performance fiber composites and the consequences of how such composites are integrated into force protection systems. One aim is to develop a constitutive model for the large-strain intralaminar shear deformation of an ultra-high molecular weight polyethylene (UHMWPE) fiber-reinforced composite. To this end, an analytical model based on a binary representation of the constituent phases is developed and validated using finite element analyses. The model is assessed through comparisons with experimental measurements on cross-ply composite specimens in the +/-45° orientation. The hardening behavior and the limiting tensile strain are attributable to rotations of fibers in the plastic domain and the effects of these rotations on the internal stress state. The model is further assessed through quasi-static punch experiments and dynamic impact tests using metal foam projectiles. A finite element implementation of the model accurately captures both the back-face deflection-time history and the final plate profile (especially the changes caused by fiber pull-in). A separate analytical framework is also presented for describing the accelerations caused by head impact, for example during the secondary collision of a vehicle occupant with the cabin interior in an external event. The severity of impact, characterized by the Head Injury Criterion (HIC), is used to assess the efficacy of crushable foams in mitigating head injury. The framework is used to identify the optimal foam strength that minimizes the HIC for a prescribed mass and velocity, subject to constraints on foam thickness.
The predictive capability of the model is evaluated through comparisons with a series of experimental measurements from impacts of an instrumented headform onto several commercial foams. Additional comparisons are made with the results of finite element simulations. An analytical model for the planar impact of a cylindrical mass on a foam is also developed. This model sets a theoretical bound for the reduction in HIC by utilizing a "plate-on-foam" design. Experimental results of impact tests on foams coupled with stiff composite plates are presented, with comparisons to the theoretical limits predicted by the analytical model. Design maps are developed from the analytical models, illustrating the variations in the HIC with foam strength and impact velocity.
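The Head Injury Criterion used above has a standard definition: over all time windows [t1, t2] (commonly capped at 15 ms), HIC = max (t2 - t1) * [(1/(t2 - t1)) * integral of a dt]^2.5, with acceleration in g and time in seconds. A direct Python sketch of that window search follows, assuming a uniformly sampled, non-negative resultant acceleration trace:

```python
import numpy as np

def hic(t, a, max_window=0.015):
    """Head Injury Criterion from an acceleration trace.

    t: time in seconds (uniformly sampled, ascending)
    a: resultant acceleration in g (assumed non-negative)
    HIC = max over windows (t2 - t1) <= max_window of
          (t2 - t1) * [ (1/(t2 - t1)) * integral of a dt ] ** 2.5
    """
    dt = t[1] - t[0]
    # running integral of a (left Riemann sum), so that
    # cum[j] - cum[i] approximates the integral from t[i] to t[j]
    cum = np.concatenate(([0.0], np.cumsum(a) * dt))
    best = 0.0
    n = len(t)
    for i in range(n):
        for j in range(i + 1, n):
            w = t[j] - t[i]
            if w > max_window:
                break                      # windows only grow with j
            a_mean = (cum[j] - cum[i]) / w
            best = max(best, w * a_mean ** 2.5)
    return best
```

For a constant pulse the criterion reduces to duration times the 2.5th power of the acceleration level, which gives a quick sanity check on the implementation.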
Benefit-cost evaluation of an intra-regional air service in the Bay area
NASA Technical Reports Server (NTRS)
Haefner, L. E.
1977-01-01
Utilization of an iterative statistical model is presented to evaluate combinations of commuter airport sites and surface transportation facilities in conjunction with service by a given commuter aircraft type, in light of Bay Area regional growth alternatives and peak and off-peak regional travel patterns. The model evaluates such transportation options with respect to criteria of airline profitability, public acceptance, and public and private nonuser costs. It incorporates information on modal split, peak and off-peak use of the air commuter fleet, terminal and airport costs, development costs and uses of land in proximity to the airport sites, regional population shifts, and induced zonal shifts in travel demand. The model is multimodal in its analytical capability and performs exhaustive sensitivity analyses.
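The abstract does not specify the modal-split formulation; a binary logit share is the textbook form for this step and can serve as an illustrative sketch (the utility inputs here are hypothetical, not the report's calibrated values):

```python
from math import exp, log

def logit_share(utility_air, utility_surface):
    """Binary logit mode-choice probability for the air commuter option:
    P(air) = e^U_air / (e^U_air + e^U_surface).
    The utilities would bundle fare, access time, and service frequency."""
    ea, es = exp(utility_air), exp(utility_surface)
    return ea / (ea + es)
```

With equal utilities the air share is 0.5; a utility advantage of ln 3 for air yields a 75% share.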
Lessons Learned from Deploying an Analytical Task Management Database
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen
2007-01-01
Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of the probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it aligns more closely with the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
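For a simple single-rule QC procedure, the error-detection probability has a closed form under a Gaussian model. A sketch for the common 1-3s rule (an illustrative rule, not necessarily one of those examined in the paper), together with the classic critical systematic error formula:

```python
from math import erfc, sqrt

def p_reject_13s(shift_sd, n=1):
    """Probability that an n-observation 1-3s QC rule rejects a run when
    the process mean has shifted by `shift_sd` standard deviations.

    A single control value rejects if it falls outside +/- 3 SD of the
    in-control mean; results are assumed Gaussian and independent.
    """
    p_one = 0.5 * erfc((3.0 - shift_sd) / sqrt(2.0)) \
          + 0.5 * erfc((3.0 + shift_sd) / sqrt(2.0))
    return 1.0 - (1.0 - p_one) ** n

def critical_systematic_error(tea_sd):
    """Critical systematic error in SD units for an allowable total error
    TEa (also in SD units): the classic dSEcrit = TEa/SD - 1.65."""
    return tea_sd - 1.65
```

With no shift the rejection probability is the two-tailed 3-sigma tail area (about 0.27%); a 4 SD shift is detected with probability of roughly 0.84 per control value.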
Bedekar, Vivek; Morway, Eric D.; Langevin, Christian D.; Tonkin, Matthew J.
2016-09-30
MT3D-USGS, a U.S. Geological Survey updated release of the groundwater solute transport code MT3DMS, includes new transport modeling capabilities to accommodate flow terms calculated by MODFLOW packages that were previously unsupported by MT3DMS and to provide greater flexibility in the simulation of solute transport and reactive solute transport. Unsaturated-zone transport and transport within streams and lakes, including solute exchange with connected groundwater, are among the new capabilities included in the MT3D-USGS code. MT3D-USGS also includes the capability to route a solute through dry cells that may occur in the Newton-Raphson formulation of MODFLOW (that is, MODFLOW-NWT). New Chemical Reaction Package options include the ability to simulate inter-species reactions and parent-daughter chain reactions. A new pump-and-treat recirculation package enables the simulation of dynamic recirculation, with or without treatment, for combinations of wells that are represented in the flow model, mimicking the above-ground treatment of extracted water. A reformulation of the treatment of transient mass storage improves conservation of mass and yields solutions in better agreement with analytical benchmarks.
Several additional features of MT3D-USGS are (1) the separate specification of the partitioning coefficient (Kd) within mobile and immobile domains; (2) the capability to assign prescribed concentrations to the top-most active layer; (3) the change in mass storage owing to the change in water volume now appearing as its own budget item in the global mass balance summary; (4) the ability to ignore cross-dispersion terms; (5) the definition of Hydrocarbon Spill-Source Package (HSS) mass loading zones using regular and irregular polygons, in addition to the currently supported circular zones; and (6) the ability to specify an absolute minimum thickness rather than the default percent minimum thickness in dry-cell circumstances. Benchmark problems that implement the new features and packages test the accuracy of the new code through comparison to analytical benchmarks, as well as to solutions from other published codes. The input file structure for MT3D-USGS adheres to MT3DMS conventions for backward compatibility: the new capabilities and packages described herein are readily invoked by adding three-letter package name acronyms to the name file or by setting input flags as needed. Memory is managed in MT3D-USGS using FORTRAN modules in order to simplify code development and expansion.
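The analytical benchmarks used to test transport codes are typically closed-form solutions of the advection-dispersion equation; the one-dimensional Ogata-Banks solution for a continuous source is a standard example (offered here as a generic illustration, not one of the report's specific test problems):

```python
from math import erfc, exp, sqrt

def ogata_banks(x, t, v, D, c0=1.0):
    """Ogata-Banks analytical solution for 1-D advection-dispersion with
    a constant concentration c0 held at x = 0 from time zero:

        dC/dt = D d2C/dx2 - v dC/dx,  C(0, t) = c0,  C(x, 0) = 0

    x : distance (m), t : time (s), v : seepage velocity (m/s),
    D : dispersion coefficient (m^2/s).
    """
    denom = 2.0 * sqrt(D * t)
    term1 = erfc((x - v * t) / denom)
    # exp() can overflow for very large v*x/D even though the product
    # with the vanishing erfc() is finite; guard it for this sketch.
    try:
        term2 = exp(v * x / D) * erfc((x + v * t) / denom)
    except OverflowError:
        term2 = 0.0
    return 0.5 * c0 * (term1 + term2)
```

Well behind the advancing front the solution approaches c0, and far ahead of it the concentration is essentially zero, which is how a numerical solution is typically checked against it.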
Structural Dynamics Modeling of HIRENASD in Support of the Aeroelastic Prediction Workshop
NASA Technical Reports Server (NTRS)
Wieseman, Carol; Chwalowski, Pawel; Heeg, Jennifer; Boucke, Alexander; Castro, Jack
2013-01-01
An Aeroelastic Prediction Workshop (AePW) was held in April 2012 using three aeroelasticity case-study wind tunnel tests for assessing the capabilities of various codes in making aeroelasticity predictions. One of these case studies was known as the HIRENASD model, which was tested in the European Transonic Wind Tunnel (ETW). This paper summarizes the development of a standardized, enhanced analytical HIRENASD structural model for use in the AePW effort. The modifications to the HIRENASD finite element model were validated by comparing modal frequencies, evaluating modal assurance criteria, comparing leading-edge, trailing-edge, and twist deflections of the wing with experiment, and performing steady and unsteady CFD analyses for one of the test conditions on the same grid with identical processing of results.
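The modal assurance criterion used in the validation has a standard definition: a normalized scalar comparing two mode-shape vectors, equal to 1 for identical (or merely rescaled) shapes and near 0 for unrelated ones. A minimal sketch:

```python
import numpy as np

def mac(phi1, phi2):
    """Modal Assurance Criterion between two real mode-shape vectors:
    MAC = |phi1 . phi2|^2 / ((phi1 . phi1) * (phi2 . phi2)).
    Insensitive to scaling; 1.0 means perfectly correlated shapes."""
    phi1 = np.asarray(phi1, dtype=float)
    phi2 = np.asarray(phi2, dtype=float)
    num = np.dot(phi1, phi2) ** 2
    return num / (np.dot(phi1, phi1) * np.dot(phi2, phi2))
```

Comparing a finite element mode against a measured one, a MAC above roughly 0.9 is usually taken as good shape correlation.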
A new analytical compact model for two-dimensional finger photodiodes
NASA Astrophysics Data System (ADS)
Naeve, T.; Hohenbild, M.; Seegebrecht, P.
2008-02-01
A new physically based circuit simulation model for finger photodiodes is proposed. The approach is based on the solution of the transport and continuity equations for generated carriers within the two-dimensional structure. As an example, we present results for a diode consisting of N+ fingers located in a P-well on top of an N-type buried layer integrated in a P-type silicon substrate (N+/PW/NBL/Psub finger photodiode). The model is capable of predicting the sensitivity of the diode very accurately over a wide spectral range. The structure under consideration was fabricated in an industrial 0.6 μm BiCMOS process. The good agreement of simulated sensitivity data with results of measurements and numerical simulations demonstrates the high quality of our model.
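Photodiode sensitivity (responsivity) relates photocurrent to incident optical power; for an ideal diode it follows directly from the quantum efficiency and wavelength. This is offered as a generic illustration of the quantity being modeled, not the paper's two-dimensional compact model:

```python
def responsivity(wavelength_nm, quantum_efficiency):
    """Ideal photodiode responsivity in A/W:
    R = eta * q * lambda / (h * c), with lambda given in nm."""
    q = 1.602176634e-19   # elementary charge, C
    h = 6.62607015e-34    # Planck constant, J s
    c = 2.99792458e8      # speed of light, m/s
    return quantum_efficiency * q * wavelength_nm * 1e-9 / (h * c)
```

At 1240 nm with unit quantum efficiency the responsivity is about 1 A/W, reflecting the familiar hc/q ≈ 1240 eV·nm rule of thumb.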
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.
2004-04-01
There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1 to 2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
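Quantitating an unknown from a sensor with a linear response like the one described reduces to a least-squares calibration and its inverse. A generic sketch (the concentration range mirrors the abstract; the signal values are hypothetical):

```python
import numpy as np

def calibrate(conc, signal):
    """Ordinary least-squares fit of signal = slope * conc + intercept
    to calibration standards; returns (slope, intercept)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

def quantitate(signal, slope, intercept):
    """Invert the calibration line to estimate the concentration
    corresponding to a measured signal."""
    return (signal - intercept) / slope
```

A reading is then converted to concentration by running it back through the fitted line, with standards spanning the working range (here 1 to 2000 ppb).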
The expected results method for data verification
NASA Astrophysics Data System (ADS)
Monday, Paul
2016-05-01
The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process to load the data. OneSAF has redesigned their input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly.
Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-08
T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.
Hybrid boosters for future launch vehicles
NASA Astrophysics Data System (ADS)
Dargies, E.; Lo, R. E.
1987-10-01
Hybrid rocket propulsion systems offer much higher safety levels, owing both to the capability to shut down a unit in case of ignition failure and to the potential choice of nontoxic propellant combinations, such as LOX/polyethylene; they nevertheless yield performance levels comparable or superior to those of solid rocket boosters. Results are presented from DFVLR analytical model studies of hybrid propulsion systems, with emphasis on solid fuel grain geometrical design and propellant grain surface ablation rate. The safety of hybrid rockets recommends them for use with manned spacecraft.
Boundary cooled rocket engines for space storable propellants
NASA Technical Reports Server (NTRS)
Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.
1972-01-01
An evaluation of an existing analytical heat transfer model was made to extend the technology of boundary film/conduction-cooled rocket thrust chambers to the space-storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short-duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat-absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.
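The heat-absorbing capability of a coolant film can be bounded by a simple energy balance: sensible heating of the liquid, latent heat of vaporization, then superheating of the vapor. A hedged sketch of that bound (the property values in the test are round-number placeholders, not oxygen difluoride/diborane data):

```python
def film_heat_absorption(mdot, cp_liq, dT_liq, h_vap, cp_vap, dT_vap):
    """Upper-bound heat absorption rate (W) of a boundary coolant film:
    liquid sensible heat + latent heat + vapor superheat, per unit time.

    mdot           : film coolant flow rate, kg/s
    cp_liq, cp_vap : liquid and vapor specific heats, J/(kg K)
    dT_liq, dT_vap : liquid and vapor temperature rises, K
    h_vap          : latent heat of vaporization, J/kg
    """
    return mdot * (cp_liq * dT_liq + h_vap + cp_vap * dT_vap)
```

Comparing this bound against calorimeter heat-flux data is one way to judge how effectively the vapor film is being exploited.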
2002-12-19
NASA Dryden's Automated Aerial Refueling (AAR) project evaluated the capability of an F/A-18A aircraft as an in-flight refueling tanker with the objective of developing analytical models for an automated aerial refueling system for unmanned air vehicles. The F/A-18 "tanker" aircraft (No. 847) underwent flight test envelope expansion with an aerodynamic pod containing air-refueling equipment carried beneath the fuselage. The second aircraft (No. 843) flew as the receiver aircraft during the study to assess the free-stream hose and drogue dynamics on the F/A-18A.
Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
Introducing Chemometrics to the Analytical Curriculum: Combining Theory and Lab Experience
ERIC Educational Resources Information Center
Gilbert, Michael K.; Luttrell, Robert D.; Stout, David; Vogt, Frank
2008-01-01
Beer's law is an idealized relationship that holds only in certain situations. A method for dealing with more complex conditions needs to be integrated into the analytical chemistry curriculum. For that reason, the capabilities and limitations of two common chemometric algorithms, classical least squares (CLS) and principal component regression (PCR),…
USDA-ARS?s Scientific Manuscript database
Most analytical methods for persistent organic pollutants (POPs) focus on targeted analytes. Therefore, analysis of multiple classes of POPs typically entails several sample preparations, fractionations, and injections, whereas other chemicals of possible interest are neglected. To analyze a wider...
Give Me a Customizable Dashboard: Personalized Learning Analytics Dashboards in Higher Education
ERIC Educational Resources Information Center
Roberts, Lynne D.; Howell, Joel A.; Seaman, Kristen
2017-01-01
With the increased capability of learning analytics in higher education, more institutions are developing or implementing student dashboards. Despite the emergence of dashboards as an easy way to present data to students, students have had limited involvement in the dashboard development process. As part of a larger program of research examining…
NASA Astrophysics Data System (ADS)
Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.
2017-12-01
Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets be assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected by sensors and analytical tools from multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user structure in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components.
PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical data ingestion and processing, and for co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.
Differential homogeneous immunosensor device
Malmros, Mark K.; Gulbinski, III, Julian
1990-04-10
There is provided a novel method of testing for the presence of an analyte in a fluid suspected of containing the same. In this method, in the presence of the analyte, a substance capable of modifying certain characteristics of the substrate is bound to the substrate and the change in these characteristics is measured. While the method may be modified for carrying out quantitative differential analyses, it eliminates the need for washing analyte from the substrate, which is characteristic of prior art methods.
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process, a methodology for their prioritization has been developed. It has been applied in practical simulation-based decision-making games.
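The Analytic Hierarchy Process derives priority weights as the principal eigenvector of a reciprocal pairwise-comparison matrix, with Saaty's consistency ratio guarding against incoherent expert judgments. A minimal sketch of that core step (generic AHP, not the paper's specific capability hierarchy):

```python
import numpy as np

# Saaty's Random Index values used in the consistency ratio.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_priorities(A):
    """Priority weights from a reciprocal pairwise-comparison matrix A
    (A[i][j] = judged importance of criterion i over j, A[j][i] = 1/A[i][j]).

    Returns the principal eigenvector normalized to sum to 1, and the
    consistency ratio CR (CR < 0.1 is conventionally acceptable)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))          # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    cr = ci / RI[n] if RI.get(n, 0.0) > 0 else 0.0
    return w, cr
```

A perfectly consistent matrix (every entry the exact ratio of two underlying weights) recovers those weights with a consistency ratio of zero.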
ERIC Educational Resources Information Center
Burton, Hilary D.
TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…
Intelligent Vehicle Mobility M&S Capability Development (FY13 innovation Project) (Briefing Charts)
2014-05-19
Intelligent Vehicle Mobility M&S Capability Development (FY13 Innovation Project). Briefing charts by P. Jayakumar and J. Raymond, Analytics, 19 May 2014.
Experimental and Analytical Evaluation of a Composite Honeycomb Deployable Energy Absorber
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Kellas, Sotiris; Horta, Lucas G.; Annett, Martin S.; Polanco, Michael A.; Littell, Justin D.; Fasanella, Edwin L.
2011-01-01
In 2006, the NASA Subsonic Rotary Wing Aeronautics Program sponsored the experimental and analytical evaluation of an externally deployable composite honeycomb structure that is designed to attenuate impact energy during helicopter crashes. The concept, which is designated the Deployable Energy Absorber (DEA), utilizes an expandable Kevlar honeycomb structure to dissipate kinetic energy through crushing. The DEA incorporates a unique flexible hinge design that allows the honeycomb to be packaged and stowed flat until needed for deployment. A variety of deployment options such as linear, radial, and/or hybrid methods can be used. Experimental evaluation of the DEA utilized a building block approach that included material characterization testing of its constituent Kevlar-129 fabric/epoxy, and flexural testing of single hexagonal cells. In addition, the energy attenuation capabilities of the DEA were demonstrated through multi-cell component dynamic crush tests, and vertical drop tests of a composite fuselage section, retrofitted with DEA blocks, onto concrete, water, and soft soil. During each stage of the DEA evaluation process, finite element models of the test articles were developed and simulations were performed using the explicit, nonlinear transient dynamic finite element code, LS-DYNA. This report documents the results of the experimental evaluation that was conducted to assess the energy absorption capabilities of the DEA.
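The energy attenuation measured in crush tests is the area under the force-displacement curve, often idealized for honeycombs as a constant plateau stress acting over the usable stroke. A hedged sketch of both forms (the numbers in the test are illustrative, not DEA test data):

```python
import numpy as np

def absorbed_energy(displacement, force):
    """Energy absorbed by a crushable element (J): the area under the
    measured force-displacement curve, via the trapezoidal rule."""
    displacement = np.asarray(displacement, dtype=float)
    force = np.asarray(force, dtype=float)
    return float(np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(displacement)))

def ideal_crush_energy(plateau_stress, crush_area, stroke):
    """Idealized absorbed energy (J) for a honeycomb crushing at a
    constant plateau stress (Pa) over its usable stroke (m)."""
    return plateau_stress * crush_area * stroke
```

A flat 1 kN crush force sustained over 50 mm absorbs 50 J, matching the idealized plateau-stress estimate for the same element.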
Cross-Disciplinary Consultancy to Enhance Predictions of Asthma Exacerbation Risk in Boston.
Reid, Margaret; Gunn, Julia; Shah, Snehal; Donovan, Michael; Eggo, Rosalind; Babin, Steven; Stajner, Ivanka; Rogers, Eric; Ensor, Katherine B; Raun, Loren; Levy, Jonathan I; Painter, Ian; Phipatanakul, Wanda; Yip, Fuyuen; Nath, Anjali; Streichert, Laura C; Tong, Catherine; Burkom, Howard
2016-01-01
This paper continues an initiative conducted by the International Society for Disease Surveillance with funding from the Defense Threat Reduction Agency to connect near-term analytical needs of public health practice with technical expertise from the global research community. The goal is to enhance investigation capabilities of day-to-day population health monitors. A prior paper described the formation of consultancies for requirements analysis and dialogue regarding costs and benefits of sustainable analytic tools. Each funded consultancy targets a use case of near-term concern to practitioners. The consultancy featured here focused on improving predictions of asthma exacerbation risk in demographic and geographic subdivisions of the city of Boston, Massachusetts, USA based on the combination of known risk factors for which evidence is routinely available. A cross-disciplinary group of 28 stakeholders attended the consultancy on March 30-31, 2016 at the Boston Public Health Commission. Known asthma exacerbation risk factors are upper respiratory virus transmission, particularly in school-age children, harsh or extreme weather conditions, and poor air quality. Meteorological subject matter experts described availability and usage of data sources representing these risk factors. Modelers presented multiple analytic approaches including mechanistic models, machine learning approaches, simulation techniques, and hybrids. Health department staff and local partners discussed surveillance operations, constraints, and operational system requirements. Attendees valued the direct exchange of information among public health practitioners, system designers, and modelers. Discussion finalized design of an 8-year de-identified dataset of Boston ED patient records for modeling partners who sign a standard data use agreement.
Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.
2011-01-01
The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097
Development of a robust space power system decision model
NASA Astrophysics Data System (ADS)
Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert
2001-02-01
NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which uses a formal Two-step Analytical Hierarchy Process (TAHP) methodology to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which structures decision factors hierarchically by weights and ranks system design options relative to one another on a consistent basis. The decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface applications. This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.
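The weighting step at the heart of an AHP-based trade study can be sketched in a few lines. The pairwise-comparison judgments below are hypothetical, not taken from the paper; the sketch only illustrates how a Saaty-style comparison matrix is turned into priority weights via its principal eigenvector.

```python
# Illustrative sketch of the Analytical Hierarchy Process (AHP) weighting
# step underlying a TAHP-style trade study. The pairwise matrix below is
# hypothetical, not the paper's data.

def ahp_weights(M, iters=100):
    """Principal-eigenvector priority weights via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Saaty-scale pairwise comparisons for three criteria:
# benefits vs. safety vs. risk (invented judgments).
M = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     2.0],
     [1.0 / 5, 1.0 / 2, 1.0]]

w = ahp_weights(M)   # normalized priority weights, summing to 1
```

A full AHP implementation would also compute a consistency ratio from the principal eigenvalue to flag contradictory judgments; that check is omitted here for brevity.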
Bending of an Infinite beam on a base with two parameters in the absence of a part of the base
NASA Astrophysics Data System (ADS)
Aleksandrovskiy, Maxim; Zaharova, Lidiya
2018-03-01
With the rapid development of high-rise construction and the continuing refinement of models of the joint behavior of high-rise structures and their foundations, the choice among available calculation methods has become a topical question. The rigor of analytical methods permits a more detailed and accurate characterization of structural behavior, which improves the reliability of structures and can reduce their cost. In this article, a two-parameter model is used as the computational model of the base; it can effectively account for the distributive properties of the base by varying the coefficient reflecting the shear parameter. The paper constructs an effective analytical solution of the problem of a beam of infinite length interacting with a two-parameter base containing a void. Using Fourier integral transforms, the original differential equation is reduced to a Fredholm integral equation of the second kind with a degenerate kernel, and all the integrals are evaluated analytically and explicitly, which increases the accuracy of the computations in comparison with approximate methods. The paper considers the problem of a beam loaded with a concentrated force applied at the origin, with a fixed length of the dip section. The results obtained are analyzed for various values of the coefficient accounting for ground cohesion.
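For orientation, the classical special case of this problem (shear parameter and void both absent, i.e. an infinite beam on a Winkler foundation under a point load) has a simple closed form, sketched below. This is a textbook baseline, not the paper's two-parameter solution, and all numeric values are illustrative.

```python
import math

# Classical baseline: deflection of an infinite beam on a Winkler
# foundation under a point load P at x = 0. This is the special case
# obtained when the second (shear) foundation parameter is zero and the
# base is continuous; the paper's solution generalizes it to a
# two-parameter base with a void.
def deflection(x, P=1.0, EI=1.0, k=1.0):
    b = (k / (4.0 * EI)) ** 0.25     # characteristic wavenumber
    ax = abs(x)
    return (P * b / (2.0 * k)) * math.exp(-b * ax) * (
        math.cos(b * ax) + math.sin(b * ax))
```

The solution is symmetric about the load and decays exponentially with oscillation; the two-parameter (shear) term modifies the characteristic roots and hence the decay and wavelength.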
A Numerical Study of the Effects of Curvature and Convergence on Dilution Jet Mixing
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Reynolds, R.; White, C.
1987-01-01
An analytical program was conducted to assemble and assess a three-dimensional turbulent viscous flow computer code capable of analyzing the flow field in the transition liners of small gas turbine engines. This code is of the TEACH type with hybrid numerics, and uses the power law and SIMPLER algorithms, an orthogonal curvilinear coordinate system, and an algebraic Reynolds stress turbulence model. The assessments performed in this study, consistent with results in the literature, showed that in its present form this code is capable of predicting trends and qualitative results. The assembled code was used to perform a numerical experiment to investigate the effects of curvature and convergence in the transition liner on the mixing of single and opposed rows of cool dilution jets injected into a hot mainstream flow.
Resolved-particle simulation by the Physalis method: Enhancements and new capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierakowski, Adam J., E-mail: sierakowski@jhu.edu; Prosperetti, Andrea; Faculty of Science and Technology and J.M. Burgers Centre for Fluid Dynamics, University of Twente, P.O. Box 217, 7500 AE Enschede
2016-03-15
We present enhancements and new capabilities of the Physalis method for simulating disperse multiphase flows using particle-resolved simulation. The current work enhances the previous method by incorporating a new type of pressure-Poisson solver that couples with a new Physalis particle pressure boundary condition scheme and a new particle interior treatment to significantly improve overall numerical efficiency. Further, we implement a more efficient method of calculating the Physalis scalar products and incorporate short-range particle interaction models. We provide validation and benchmarking for the Physalis method against experiments of a sedimenting particle and of normal wall collisions. We conclude with an illustrative simulation of 2048 particles sedimenting in a duct. In the appendix, we present a complete and self-consistent description of the analytical development and numerical methods.
NASA Technical Reports Server (NTRS)
Hess, R. A.
1977-01-01
A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.
Mapping the magnonic landscape in patterned magnetic structures
NASA Astrophysics Data System (ADS)
Davies, C. S.; Poimanov, V. D.; Kruglyak, V. V.
2017-09-01
We report the development of a hybrid numerical/analytical model capable of mapping the spatially varying distributions of the local ferromagnetic resonance (FMR) frequency and dynamic magnetic susceptibility in a wide class of patterned and compositionally modulated magnetic structures. Starting from the numerically simulated static micromagnetic state, the magnetization is deliberately deflected orthogonally to its equilibrium orientation, and the magnetic fields generated in response to this deflection are evaluated using micromagnetic software. This allows us to calculate the elements of the effective demagnetizing tensor, which are then used within a linear analytical formalism to map the local FMR frequency and dynamic magnetic susceptibility. To illustrate the typical results that one can obtain using this model, we analyze three micromagnetic systems boasting nonuniformity in either one or two dimensions, and successfully explain the spin-wave emission observed in each case, demonstrating the ubiquitous nature of the Schlömann excitation mechanism underpinning the observations. Finally, the developed model of local FMR frequency can be used to explain how spin waves could be confined and steered using magnetic nonuniformities of various origins, rendering it a powerful tool for the mapping of the graded magnonic index in magnonics.
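For a uniformly magnetized ellipsoid, the local-FMR mapping described above reduces to the classical Kittel formula built from the demagnetizing factors. The sketch below is that uniform special case in SI units (the paper's spatially varying effective tensor generalizes it); the gyromagnetic constant is the usual approximate electron value.

```python
import math

GAMMA_OVER_2PI = 28.0e9  # Hz/T, electron gyromagnetic ratio / (2*pi), approx.

def kittel_fmr(H, Ms, Nx, Ny, Nz, mu0=4e-7 * math.pi):
    """Kittel FMR frequency (Hz) for a bias field H (A/m) along z,
    saturation magnetization Ms (A/m), and demagnetizing factors
    (Nx, Ny, Nz). Uniform-ellipsoid special case of the spatially
    varying demagnetizing tensor used in the paper."""
    f1 = H + (Nx - Nz) * Ms
    f2 = H + (Ny - Nz) * Ms
    return GAMMA_OVER_2PI * mu0 * math.sqrt(f1 * f2)
```

For a sphere (Nx = Ny = Nz = 1/3) the shape terms cancel and the formula collapses to f = (gamma/2pi) * mu0 * H, a convenient sanity check.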
Simulation and analysis of airborne antenna radiation patterns
NASA Technical Reports Server (NTRS)
Kim, J. J.; Burnside, Walter D.
1984-01-01
The objective is to develop an accurate and efficient analytic solution for predicting high frequency radiation patterns of fuselage-mounted airborne antennas. This is an analytic study of airborne antenna patterns using the Uniform Geometrical Theory of Diffraction (UTD). The aircraft is modeled in its most basic form so that the solution is applicable to general-type aircraft. The fuselage is modeled as a perfectly conducting composite ellipsoid; whereas, the wings, stabilizers, nose, fuel tanks, and engines, are simulated as perfectly conducting flat plates that can be attached to the fuselage and/or to each other. The composite-ellipsoid fuselage model is necessary to successfully simulate the wide variety of real world fuselage shapes. Since the antenna is mounted on the fuselage, it has a dominant effect on the resulting radiation pattern so it must be simulated accurately, especially near the antenna. Various radiation patterns are calculated for commercial, private, and military aircraft, and the Space Shuttle Orbiter. The application of this solution to numerous practical airborne antenna problems illustrates its versatility and design capability. In most cases, the solution accuracy is verified by the comparisons between the calculated and measured data.
NASA Astrophysics Data System (ADS)
Valdivia, V.; Barrado, A.; Lazaro, A.; Rueda, P.; Tonicello, F.; Fernandez, A.; Mourra, O.
2011-10-01
Solar array simulators (SASs) are hardware devices commonly used in place of actual solar arrays (SAs) during the design of spacecraft power conditioning and distribution units (PCDUs) and during spacecraft assembly, integration, and testing. However, the dynamic responses of SASs and actual SAs usually differ. This fact plays an important role, since the dynamic response of the SAS may significantly influence the dynamic behaviour of the PCDU under certain conditions, even leading to instability. This paper deals with the dynamic interactions between SASs and PCDUs. Several methods for dynamic characterization of SASs are discussed, and the response of commercial SASs widely used in the space industry is compared to that of actual SAs. The interactions are then analyzed experimentally using a boost converter connected to the aforementioned SASs, demonstrating their critical importance. The interactions are first tackled analytically by means of small-signal models, and finally a black-box modelling method for SASs is proposed as a useful tool to analyze the interactions by means of simulation. The capabilities of both the analytical method and the black-box model to predict the interactions are demonstrated.
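The small-signal interaction criterion can be sketched numerically. The model below is a minimal, invented illustration (not the paper's): the SAS output is idealized as a resistance in series with an inductance, the regulated converter input as a low-frequency negative resistance -V^2/P (constant-power behavior), and the interaction margin is read from the Middlebrook-style minor-loop gain Zout/Zin. All parameter values are hypothetical.

```python
import math

# Hedged sketch (invented values): Middlebrook-style check of the
# SAS-converter interaction. Interaction is benign while |Zout/Zin| << 1;
# a ratio approaching or exceeding 1 signals potential instability.
def minor_loop_gain(f, Rs=0.5, Ls=100e-6, V=50.0, P=100.0):
    s = 2j * math.pi * f
    Zout = Rs + s * Ls          # idealized SAS output impedance
    Zin = -(V ** 2) / P         # converter input impedance (low-frequency CPL)
    return Zout / Zin

# Margin at a few frequencies (Hz): small at low f, large once s*Ls dominates.
margins = {f: abs(minor_loop_gain(f)) for f in (10.0, 1e3, 1e5)}
```

With these numbers the ratio is tiny at 10 Hz but exceeds unity at 100 kHz, which is the qualitative signature of the SAS/PCDU interactions the paper analyzes.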
Nagy, Brigitta; Farkas, Attila; Gyürkés, Martin; Komaromy-Hiller, Szofia; Démuth, Balázs; Szabó, Bence; Nusser, Dávid; Borbás, Enikő; Marosi, György; Nagy, Zsombor Kristóf
2017-09-15
The integration of the Process Analytical Technology (PAT) initiative into the continuous production of pharmaceuticals is indispensable for reliable production. The present paper reports the implementation of in-line Raman spectroscopy in a continuous blending and tableting process of a three-component model pharmaceutical system, containing caffeine as model active pharmaceutical ingredient (API), glucose as model excipient and magnesium stearate as lubricant. The real-time analysis of API content, blend homogeneity, and tablet content uniformity was performed using a Partial Least Squares (PLS) quantitative method. The in-line Raman spectroscopic monitoring showed that the continuous blender was capable of producing blends with high homogeneity and that technological malfunctions could be detected by the proposed PAT method. The Raman spectroscopy-based feedback control of the API feeder was also established, creating a 'Process Analytically Controlled Technology' (PACT), which guarantees the required API content in the produced blend. This is, to the best of the authors' knowledge, the first application of Raman spectroscopy in continuous blending and the first Raman-based feedback control in the formulation technology of solid pharmaceuticals. Copyright © 2017 Elsevier B.V. All rights reserved.
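The feedback idea behind such a PACT loop can be sketched with a toy proportional controller: a spectroscopic estimate of API content drives corrections to the feeder rate until the setpoint is reached. The plant model, gain, and setpoint below are invented for illustration only; the paper's controller acts on a real feeder with Raman/PLS measurements.

```python
# Hypothetical sketch of a Raman-driven feedback loop on an API feeder.
# Plant model (api_content = 1.8 * feeder_rate), gain and setpoint are
# invented; only the control structure is illustrated.
def run_control(setpoint=10.0, kp=0.5, steps=50):
    feeder = 5.0                          # API feeder rate (arbitrary units)
    history = []
    for _ in range(steps):
        api_content = 1.8 * feeder        # crude steady-state plant model
        error = setpoint - api_content    # from the in-line Raman estimate
        feeder += kp * error              # proportional correction
        history.append(api_content)
    return history
```

Starting below target, the measured API content converges geometrically to the setpoint; an integral term would additionally reject steady disturbances such as feeder drift.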
NASA Astrophysics Data System (ADS)
Perelló, Josep; Masoliver, Jaume; Kasprzak, Andrzej; Kutner, Ryszard
2008-09-01
Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differs from the Poissonian profile: it is long-tailed, with resting and active periods interwoven. Understanding the mechanisms that generate such consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to recently proposed models of nontrivial priority. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that the empirical data exhibit a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics, where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid over a certain limited range, and a suggested heuristic analytical profile is capable of covering a broader region.
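The superstatistics construction can be made concrete with a kernel chosen for analytic convenience. Below, the pausing-time density psi(t) = integral of rho(mu) * mu * exp(-mu*t) dmu is evaluated with a gamma mixing density rho(mu), which closes in analytic form as the heavy-tailed Lomax law; the paper's stretched-exponential kernel is different and is used here only as motivation for the mixture structure. Parameter values are illustrative.

```python
# Superstatistical pausing-time density, illustrative special case:
# mixing exponentials over a gamma-distributed rate mu gives the
# heavy-tailed Lomax density in closed form:
#   psi(t) = alpha * beta**alpha / (beta + t)**(alpha + 1)
# with survival function S(t) = (beta / (beta + t))**alpha.
def psi(t, alpha=1.5, beta=1.0):
    return alpha * beta ** alpha / (beta + t) ** (alpha + 1)

def survival(t, alpha=1.5, beta=1.0):
    return (beta / (beta + t)) ** alpha

# Trapezoidal check that psi is normalized: area on [0, T] plus the
# analytic tail mass S(T) should be 1.
T, n = 1000.0, 100000
h = T / n
area = 0.5 * h * (psi(0.0) + psi(T)) + h * sum(psi(i * h) for i in range(1, n))
```

The power-law tail (exponent alpha) is what replaces the Poissonian exponential decay and produces the long resting periods discussed above.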
Firing patterns in the adaptive exponential integrate-and-fire model.
Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram
2008-11-01
For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to real experiments of cortical neurons under step current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
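The two-equation AdEx model is compact enough to integrate with forward Euler in a few lines. The sketch below uses typical illustrative parameter values, not the fits reported in the paper: membrane voltage V follows leak plus an exponential spike-initiation term minus the adaptation current w; at a spike, V is reset and w jumps by b.

```python
import math

# Minimal Euler integration of the adaptive exponential integrate-and-fire
# (AdEx) model under a constant step current. Parameter values are typical
# illustrative choices, not the paper's experimental fits.
def simulate_adex(I=0.5e-9, T=0.5, dt=1e-5):
    C, gL, EL = 200e-12, 10e-9, -70e-3      # capacitance, leak, rest (SI)
    VT, DT = -50e-3, 2e-3                   # threshold, slope factor
    a, tau_w, b = 2e-9, 100e-3, 20e-12      # adaptation parameters
    Vr, Vpeak = -58e-3, 0.0                 # reset, numerical spike cutoff
    V, w, spikes, t = EL, 0.0, [], 0.0
    while t < T:
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vpeak:                      # spike: reset, adaptation jump
            spikes.append(t)
            V, w = Vr, w + b
        t += dt
    return spikes
```

Sweeping (a, b, tau_w, Vr) moves the model between the firing classes named above (tonic spiking, adapting, bursting); this run sits well above rheobase and fires repetitively.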
Scalable Visual Analytics of Massive Textual Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.
2007-04-01
This paper describes the first scalable implementation of the text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enable visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.
1999-01-01
As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.
Retention in porous layer pillar array planar separation platforms
Lincoln, Danielle R.; Lavrik, Nickolay V.; Kravchenko, Ivan I.; ...
2016-08-11
Here, this work presents the retention capabilities and surface area enhancement of highly ordered, high-aspect-ratio, open-platform, two-dimensional (2D) pillar arrays when coated with a thin layer of porous silicon oxide (PSO). Photolithographically prepared pillar arrays were coated with 50–250 nm of PSO via plasma-enhanced chemical vapor deposition and then functionalized with either octadecyltrichlorosilane or n-butyldimethylchlorosilane. Theoretical calculations indicate that a 50 nm layer of PSO increases the surface area of a pillar nearly 120-fold. Retention capabilities were tested by observing capillary-action-driven development under various conditions, as well as by running one-dimensional separations on varying thicknesses of PSO. Increasing the thickness of PSO on an array clearly resulted in greater retention of the analyte(s) in question in both experiments. In culmination, a two-dimensional separation of fluorescently derivatized amines was performed to further demonstrate the capabilities of these fabricated platforms.
NASA Astrophysics Data System (ADS)
Lakra, Suchita; Mandal, Sanjoy
2017-06-01
A quadruple micro-optical ring resonator (QMORR) with multiple output bus waveguides is mathematically modeled and analyzed by making use of the delay-line signal processing approach in Z-domain and Mason's gain formula. The performances of QMORR with two output bus waveguides with vertical coupling are analyzed. This proposed structure is capable of providing wider free spectral response from both the output buses with appreciable cross talk. Thus, this configuration could provide increased capacity to insert a large number of communication channels. The simulated frequency response characteristic and its dispersion and group delay characteristics are graphically presented using the MATLAB environment.
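The elementary building block of such a delay-line Z-domain analysis is the single-ring section; for a lossless ring coupled to one bus it reduces to a first-order all-pass filter. The sketch below is that generic single-ring case (not the QMORR with its multiple buses), with a hypothetical coupling value c; the unit-magnitude response on the unit circle is the all-pass property, while resonance shows up as rapid phase variation (group delay).

```python
import cmath

# Z-domain sketch of a single lossless ring resonator as a first-order
# all-pass section, the elementary building block of the delay-line /
# Mason's-gain analysis. The self-coupling value c is hypothetical.
def ring_allpass(z, c=0.9):
    zinv = 1.0 / z
    return (zinv - c) / (1.0 - c * zinv)

def magnitude_response(omega, c=0.9):
    """|H(e^{j omega})|; identically 1 for a lossless all-pass ring."""
    return abs(ring_allpass(cmath.exp(1j * omega), c))
```

Cascading several such sections with inter-ring coupling, and adding drop-port bus couplings, is what Mason's gain formula organizes for the quadruple-ring structure analyzed in the paper.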
Modeling and Visualizing Flow of Chemical Agents Across Complex Terrain
NASA Technical Reports Server (NTRS)
Kao, David; Kramer, Marc; Chaderjian, Neal
2005-01-01
Release of chemical agents across complex terrain presents a real threat to homeland security. Modeling and visualization tools are being developed that capture fluid flow-terrain interaction as well as the downstream flow paths of point-source dispersal. These analytic tools, when coupled with UAV atmospheric observations, provide predictive capabilities that allow for rapid emergency response as well as development of a comprehensive preemptive counter-threat evacuation plan. The visualization tools involve high-end computing and massively parallel processing combined with texture mapping. We demonstrate our approach across a mountainous portion of Northern California under two contrasting meteorological conditions. Animations depicting flow over this geographical location provide immediate assistance in decision support and crisis management.
Minimal two-sphere model of the generation of fluid flow at low Reynolds numbers.
Leoni, M; Bassetti, B; Kotar, J; Cicuta, P; Cosentino Lagomarsino, M
2010-03-01
Locomotion and generation of flow at low Reynolds number are subject to severe limitations due to the irrelevance of inertia: the "scallop theorem" requires that the system have at least two degrees of freedom, which move in non-reciprocal fashion, i.e. breaking time-reversal symmetry. We show here that a minimal model consisting of just two spheres driven by harmonic potentials is capable of generating flow. In this pump system the two degrees of freedom are the mean and relative positions of the two spheres. We have performed and compared analytical predictions, numerical simulation and experiments, showing that a time-reversible drive is sufficient to induce flow.
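The scallop-theorem bookkeeping behind this pump can be made explicit: net flow requires the two degrees of freedom to trace a loop of nonzero area in configuration space, and two harmonic drives with a phase shift do exactly that. The sketch below computes the signed area enclosed by the trajectory (x1(t), x2(t)) over one period via the shoelace formula; amplitudes and phase are hypothetical, and the hydrodynamics itself is not modeled here.

```python
import math

# Scallop-theorem sketch for the two-sphere pump: phase-shifted harmonic
# drives x1 = A sin(t), x2 = B sin(t + phi) enclose a configuration-space
# loop of area pi*A*B*sin(phi); zero phase shift (time-reversible drive of
# a single shape variable) encloses zero area. Values are illustrative.
def enclosed_area(A=1.0, B=1.0, phi=math.pi / 2, n=20000):
    """Signed shoelace area of the loop (x1(t), x2(t)) over one period."""
    area = 0.0
    for i in range(n):
        t0 = 2 * math.pi * i / n
        t1 = 2 * math.pi * (i + 1) / n
        x0, y0 = A * math.sin(t0), B * math.sin(t0 + phi)
        x1, y1 = A * math.sin(t1), B * math.sin(t1 + phi)
        area += 0.5 * (x0 * y1 - x1 * y0)
    return area
```

Note the paper's stronger point: for the pump (as opposed to a swimmer), even a time-reversible drive suffices, because the relevant degrees of freedom are the mean and relative positions of the spheres rather than a single internal shape coordinate.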
NASA Astrophysics Data System (ADS)
Agrawal, Ankit; Choudhary, Alok
2016-05-01
Our ability to collect "big data" has greatly surpassed our capability to analyze it, underscoring the emergence of the fourth paradigm of science, which is data-driven discovery. The need for data informatics is also emphasized by the Materials Genome Initiative (MGI), further boosting the emerging field of materials informatics. In this article, we look at how data-driven techniques are playing a big role in deciphering processing-structure-property-performance relationships in materials, with illustrative examples of both forward models (property prediction) and inverse models (materials discovery). Such analytics can significantly reduce time-to-insight and accelerate cost-effective materials discovery, which is the goal of MGI.
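A forward model in the sense used above is simply a learned mapping from material descriptors to a property. The self-contained sketch below uses invented synthetic data and a plain least-squares fit via the normal equations; a real materials-informatics pipeline would use richer descriptors and nonlinear learners, but the structure is the same.

```python
# Minimal forward-model sketch: learn property = f(descriptors) from data.
# Synthetic linear data and ordinary least squares stand in for a real
# materials dataset and model; all values are invented.
def fit_linear(X, y):
    """Least squares via normal equations + Gaussian elimination."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):                      # forward elimination w/ pivoting
        p = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                             # back substitution
    for i in reversed(range(n)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

# Hypothetical descriptors per sample: [1, composition, process temperature].
X = [[1.0, 0.1, 300.0], [1.0, 0.2, 350.0], [1.0, 0.3, 300.0],
     [1.0, 0.4, 400.0], [1.0, 0.5, 350.0]]
y = [2.0 + 3.0 * r[1] + 0.01 * r[2] for r in X]   # property = 2 + 3c + 0.01T
w = fit_linear(X, y)
```

Inverting such a model (searching descriptor space for a target property) is the "inverse model" direction the article pairs with this one.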
Atomic Oxygen Energy in Low Frequency Hyperthermal Plasma Ashers
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Miller, Sharon K. R.; Kneubel, Christian A.
2014-01-01
Experimental and analytical analysis of the atomic oxygen erosion of pyrolytic graphite as well as Monte Carlo computational modeling of the erosion of Kapton H (DuPont, Wilmington, DE) polyimide was performed to determine the hyperthermal energy of low frequency (30 to 35 kHz) plasma ashers operating on air. It was concluded that hyperthermal energies in the range of 0.3 to 0.9 eV are produced in the low frequency air plasmas which results in texturing similar to that in low Earth orbit (LEO). Monte Carlo computational modeling also indicated that such low energy directed ions are fully capable of producing the experimentally observed textured surfaces in low frequency plasmas.
Synergia: an accelerator modeling tool with 3-D space charge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amundson, James F.; Spentzouris, P.; /Fermilab
2004-07-01
High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three dimensional space-charge capabilities and a higher order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab booster accelerator.
Sensitivity analysis of a wing aeroelastic response
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.
1991-01-01
A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
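The GSE idea of combining local sensitivities into global ones can be shown on a toy coupled system. The two "disciplines" below are invented linear functions standing in for the aerodynamic and structural analyses: local partial derivatives are assembled into a small linear system whose solution is the total (global) derivative, then cross-checked against finite differences on the converged coupled solution.

```python
# Toy Global Sensitivity Equations (GSE) illustration with hypothetical
# coupled disciplines:
#   Aero:      a = f(x, s) = 2*x + 0.5*s
#   Structure: s = g(x, a) = x + 0.3*a
# Local partials:
df_dx, df_ds = 2.0, 0.5
dg_dx, dg_da = 1.0, 0.3

# GSE system  [ 1      -df_ds ] [da_dx]   [df_dx]
#             [-dg_da   1     ] [ds_dx] = [dg_dx]
det = 1.0 - df_ds * dg_da
da_dx = (df_dx + df_ds * dg_dx) / det
ds_dx = (dg_dx + dg_da * df_dx) / det

# Finite-difference check on the converged coupled solution.
def solve_coupled(x, iters=200):
    a = s = 0.0
    for _ in range(iters):
        a = 2.0 * x + 0.5 * s
        s = x + 0.3 * a
    return a, s

h = 1e-6
fd = (solve_coupled(1.0 + h)[0] - solve_coupled(1.0 - h)[0]) / (2 * h)
```

The payoff in the paper's setting is the same: each discipline supplies only cheap local partials, and the coupled total derivatives fall out of one small linear solve instead of repeated full coupled reanalyses.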
High-accuracy 3D Fourier forward modeling of gravity field based on the Gauss-FFT technique
NASA Astrophysics Data System (ADS)
Zhao, Guangdong; Chen, Bo; Chen, Longwei; Liu, Jianxin; Ren, Zhengyong
2018-03-01
The 3D Fourier forward modeling of 3D density sources is capable of providing 3D gravity anomalies coincident with the meshed density distribution within the whole source region. This paper first derives a set of analytical expressions, employing 3D Fourier transforms, for calculating the gravity anomalies of a 3D density source approximated by right rectangular prisms. To reduce the errors due to aliasing, imposed periodicity, and edge effects in Fourier domain modeling, we apply the 3D Gauss-FFT technique to 3D gravity anomaly forward modeling. The capability and adaptability of this scheme are tested on simple synthetic models. The results show that the accuracy of the Fourier forward methods using the Gauss-FFT with four Gaussian nodes (or more) is comparable to that of spatial-domain modeling. In addition, the "ghost" source effects in the 3D Fourier forward gravity field due to the imposed periodicity of the standard FFT algorithm are substantially suppressed by the application of the 3D Gauss-FFT algorithm. More importantly, the execution times of the four-node Gauss-FFT modeling are reduced by two orders of magnitude compared with the spatial forward method. This demonstrates that the improved Fourier method is an efficient and accurate forward modeling tool for the gravity field.
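The "ghost source" artifact itself is easy to demonstrate in one dimension. Fourier-domain convolution is circular: it implicitly tiles the source periodically, so the response of a source near the domain edge wraps around to the opposite side. The sketch below shows the wraparound with plain circular convolution of an invented decaying kernel, and that zero-padding the domain removes it; this illustrates the periodicity error only, not the Gauss-FFT algorithm, which suppresses the same error by evaluating the Fourier integrals at shifted (Gaussian) wavenumbers.

```python
# 1D "ghost source" sketch: circular convolution wraps responses around
# the domain; padding with zeros recovers the aperiodic result. Kernel
# and source are invented for illustration.
def circular_conv(x, h):
    n = len(x)
    return [sum(x[j] * h[(i - j) % n] for j in range(n)) for i in range(n)]

def linear_conv(x, h):
    n = len(x)
    out = [0.0] * (2 * n - 1)
    for i in range(n):
        for j in range(n):
            out[i + j] += x[i] * h[j]
    return out[:n]

n = 64
h = [1.0 / (1 + i) for i in range(n)]   # decaying "Green's function" kernel
x = [0.0] * n
x[n - 1] = 1.0                          # unit source at the domain edge

plain = circular_conv(x, h)             # wraps the tail to the far side
exact = linear_conv(x, h)               # aperiodic reference
pad = circular_conv(x + [0.0] * n, h + [0.0] * n)[:n]  # padded: no wraparound
```

Zero-padding doubles the transform size; the paper's point is that a few Gauss nodes achieve comparable artifact suppression at lower cost.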
Integration of GIS and Bim for Indoor Geovisual Analytics
NASA Astrophysics Data System (ADS)
Wu, B.; Zhang, S.
2016-06-01
This paper presents an effort to integrate GIS (Geographical Information System) and BIM (Building Information Modelling) for indoor geovisual analytics. The merits of the two technologies, GIS and BIM, are first analysed in the context of indoor environments. GIS has well-developed capabilities for spatial analysis such as network analysis, while BIM has advantages for indoor 3D modelling and dynamic simulation. The paper then investigates the important aspects of integrating GIS and BIM. Different data standards and formats such as IFC (Industry Foundation Classes) and GML (Geography Markup Language) are discussed, and their merits and limitations in data transformation between GIS and BIM are analysed in terms of semantic and geometric information. An optimized approach for data exchange between GIS and BIM datasets is then proposed. After that, a strategy of using BIM for 3D indoor modelling, GIS for spatial analysis, and BIM again for visualization and dynamic simulation of the analysis results is presented. Based on these developments, the paper selects a typical problem, optimized indoor emergency evacuation, to demonstrate the integration of GIS and BIM for indoor geovisual analytics. Block Z of the Hong Kong Polytechnic University is selected as a test site. Detailed indoor and outdoor 3D models of block Z are created using the BIM software Revit. The 3D models are transferred to the GIS software ArcGIS to carry out spatial analysis. Optimized evacuation plans considering dynamic constraints are generated based on network analysis in ArcGIS, assuming a fire accident inside the building. The analysis results are then transferred back to the BIM software for visualization and dynamic simulation. The developed methods and results are of significance for facilitating future development of integrated GIS and BIM solutions in various applications.
Bapiro, Tashinga E; Richards, Frances M; Goldgraben, Mae A; Olive, Kenneth P; Madhu, Basetti; Frese, Kristopher K; Cook, Natalie; Jacobetz, Michael A; Smith, Donna-Michelle; Tuveson, David A; Griffiths, John R; Jodrell, Duncan I
2011-11-01
To develop a sensitive analytical method to quantify gemcitabine (2',2'-difluorodeoxycytidine, dFdC) and its metabolites 2',2'-difluorodeoxyuridine (dFdU) and 2',2'-difluorodeoxycytidine-5'-triphosphate (dFdCTP) simultaneously from tumour tissue. Pancreatic ductal adenocarcinoma tumour tissue from genetically engineered mouse models of pancreatic cancer (KP(FL/FL)C and KP(R172H/+)C) was collected after dosing the mice with gemcitabine. (19)F NMR spectroscopy and LC-MS/MS protocols were optimised to detect gemcitabine and its metabolites in homogenates of the tumour tissue. A (19)F NMR protocol was developed, which was capable of distinguishing the three analytes in tumour homogenates. However, it required at least 100 mg of the tissue in question and a long acquisition time per sample, making it impractical for use in large PK/PD studies or clinical trials. The LC-MS/MS protocol was developed using porous graphitic carbon to separate the analytes, enabling simultaneous detection of all three analytes from as little as 10 mg of tissue, with a sensitivity for dFdCTP of 0.2 ng/mg tissue. Multiple pieces of tissue from single tumours were analysed, showing little intra-tumour variation in the concentrations of dFdC or dFdU (both intra- and extra-cellular). Intra-tumoural variation was observed in the concentration of dFdCTP, an intra-cellular metabolite, which may reflect regions of different cellularity within a tumour. We have developed a sensitive LC-MS/MS method capable of quantifying gemcitabine, dFdU and dFdCTP in pancreatic tumour tissue. The requirement for only 10 mg of tissue enables this protocol to be used to analyse multiple areas from a single tumour and to spare tissue for additional pharmacodynamic assays.
FIER: Software for analytical modeling of delayed gamma-ray spectra
NASA Astrophysics Data System (ADS)
Matthews, E. F.; Goldblum, B. L.; Bernstein, L. A.; Quiter, B. J.; Brown, J. A.; Younes, W.; Burke, J. T.; Padgett, S. W.; Ressler, J. J.; Tonchev, A. P.
2018-05-01
A new software package, the Fission Induced Electromagnetic Response (FIER) code, has been developed to analytically predict delayed γ-ray spectra following fission. FIER uses evaluated nuclear data and solutions to the Bateman equations to calculate the time-dependent populations of fission products and their decay daughters resulting from irradiation of a fissionable isotope. These populations are then used in the calculation of γ-ray emission rates to obtain the corresponding delayed γ-ray spectra. FIER output was compared to experimental data obtained by irradiation of a 235U sample in the Godiva critical assembly. This investigation illuminated discrepancies in the input nuclear data libraries, showcasing the usefulness of FIER as a tool to address nuclear data deficiencies through comparison with experimental data. FIER provides traceability between γ-ray emissions and their contributing nuclear species, decay chains, and parent fission fragments, yielding a new capability for the nuclear science community.
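The analytic backbone of a code like FIER is the classical Bateman (1910) solution of the decay-chain equations. As a hedged sketch (the standard textbook solution for a linear chain with distinct decay constants, not FIER's actual implementation; the decay constants in the test case are illustrative):

```python
import math

def bateman(t, lambdas, n0=1.0):
    """Population of the last member of a linear decay chain at time t.

    lambdas: decay constants of chain members 1..n (assumed pairwise
    distinct), starting from n0 atoms of the first member and zero of
    the rest. Implements the classical Bateman solution:
        N_n(t) = n0 * (prod_{i<n} lambda_i)
                    * sum_i exp(-lambda_i t) / prod_{j != i}(lambda_j - lambda_i)
    """
    n = len(lambdas)
    prefactor = n0
    for lam in lambdas[:-1]:
        prefactor *= lam
    total = 0.0
    for i in range(n):
        denom = 1.0
        for j in range(n):
            if j != i:
                denom *= lambdas[j] - lambdas[i]
        total += math.exp(-lambdas[i] * t) / denom
    return prefactor * total
```

Time-dependent γ-ray emission rates then follow by multiplying each nuclide's population by its decay constant and branching-weighted γ intensities, which is where evaluated nuclear data enter.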
Methodology for the systems engineering process. Volume 3: Operational availability
NASA Technical Reports Server (NTRS)
Nelson, J. H.
1972-01-01
A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. In attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. Emphasis is placed upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, and the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
Wu, J.S.; Kim, A. M.; Bleher, R.; Myers, B.D.; Marvin, R. G.; Inada, H.; Nakamura, K.; Zhang, X.F.; Roth, E.; Li, S.Y.; Woodruff, T. K.; O'Halloran, T. V.; Dravid, Vinayak P.
2013-01-01
A dedicated analytical scanning transmission electron microscope (STEM) with dual energy dispersive spectroscopy (EDS) detectors has been designed for complementary high-performance imaging as well as high-sensitivity elemental analysis and mapping of biological structures. The performance of this new design, based on a Hitachi HD-2300A model, was evaluated using a variety of biological specimens. With three imaging detectors, both the surface and internal structure of cells can be examined simultaneously. Whole-cell elemental mapping, especially of heavier metal species that have a low cross-section for electron energy loss spectroscopy (EELS), can be faithfully obtained. Optimization of STEM imaging conditions is applied to both thick and thin sections of biological cells under low-dose conditions at room and cryogenic temperatures. Such multimodal capabilities applied to soft/biological structures usher in a new era for analytical studies of biological systems. PMID:23500508
NASA Astrophysics Data System (ADS)
Suresh, Deivarajan
Secondary concentrators operate in the focal plane of a point-focusing system such as a paraboloidal dish or a tower and, when properly designed, are capable of enhancing the overall concentration ratio of the optical system by a factor of at least two to five. The viability of different shapes has been demonstrated both analytically and experimentally in recent years, including Compound Parabolic Concentrators (CPCs) of circular cross section and 'trumpets' as secondaries. The current research effort is centered on a hexagonal CPC (HCPC). Major areas addressed include an overview of the state of development of secondary concentrators, background information related to the design of an HCPC, the results of an analytical study of the thermal behavior of this HCPC under concentrated-flux conditions, and a computer model for assessing the possible thermal interactions between the secondary and a high-temperature receiver.
Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S
2016-03-01
Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. Multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS) in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative tools of analysis as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
Polymorphic Evolutionary Games.
Fishman, Michael A
2016-06-07
In this paper, I present an analytical framework for polymorphic evolutionary games suitable for explicitly modeling evolutionary processes in diploid populations with sexual reproduction. The principal aspect of the proposed approach is adding diploid genetics cum sexual recombination to a traditional evolutionary game, and switching from phenotypes to haplotypes as the new game's pure strategies. Here, the relevant pure-strategy payoffs are derived by summing the payoffs of all the phenotypes capable of producing gametes containing that particular haplotype, weighted by the pertinent probabilities. The resulting game is structurally identical to the familiar evolutionary games with non-linear pure-strategy payoffs (Hofbauer and Sigmund, 1998. Cambridge University Press), and can be analyzed in terms of an established analytical framework for such games. These results can then be translated into terms of genotypic, and thence phenotypic, evolutionary stability pertinent to the original game. Copyright © 2016 Elsevier Ltd. All rights reserved.
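The haplotype game inherits the standard machinery of evolutionary game dynamics. As background only (a textbook phenotypic Hawk-Dove example under discrete replicator dynamics, not the diploid haplotype construction of the paper; the payoffs and baseline fitness are illustrative):

```python
def replicator_step(p, payoff, w0=2.0):
    """One generation of discrete replicator dynamics for a 2-strategy game.

    p: frequency of strategy 0; payoff[i][j]: payoff to strategy i
    against strategy j; w0: baseline fitness keeping fitnesses positive.
    """
    f0 = w0 + payoff[0][0] * p + payoff[0][1] * (1.0 - p)
    f1 = w0 + payoff[1][0] * p + payoff[1][1] * (1.0 - p)
    mean = p * f0 + (1.0 - p) * f1
    return p * f0 / mean  # frequency grows in proportion to relative fitness

# Hawk-Dove with resource V = 2 and cost C = 4: mixed ESS at p = V/C = 0.5.
hawk_dove = [[-1.0, 2.0],
             [0.0, 1.0]]
p = 0.2
for _ in range(2000):
    p = replicator_step(p, hawk_dove)
```

In the paper's framework the pure strategies would be haplotypes and the fitnesses would be the probability-weighted phenotypic payoffs described above, but the iteration scheme is of the same shape.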
Andreyev, Dmitry; Arriaga, Edgar A
2007-07-15
This technical note describes a detector capable of simultaneously monitoring scattering and fluorescence signals of individual particles separated by capillary electrophoresis. Due to its nonselective nature, scattering alone is not sufficient to identify analyte particles. However, when the analyte particles are fluorescent, the detector described here is able to identify simultaneously occurring scattering and fluorescent signals, even when contaminating particles (i.e., nonfluorescent) are present. Both fluorescent polystyrene particles and 10-nonyl acridine orange (NAO)-labeled mitochondria were used as models. Fluorescence versus scattering (FVS) plots made it possible to identify two types of particles and a contaminant in a mixture of polystyrene particles. We also analyzed NAO-labeled mitochondria before and after cryogenic storage; the mitochondria FVS plots changed with storage, which suggests that the detector reported here is suitable for monitoring subtle changes in mitochondrial morphology that would not be revealed by monitoring only fluorescence or scattering signals.
Heave-pitch-roll analysis and testing of air cushion landing systems
NASA Technical Reports Server (NTRS)
Boghani, A. B.; Captain, K. M.; Wormley, D. N.
1978-01-01
The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general-purpose computer simulation to evaluate the landing and taxi performance of an ACLS-equipped aircraft; and verification and refinement of the analysis by comparison with test data obtained through laboratory testing of a prototype cushion. Simulation capabilities are demonstrated through typical landing and taxi simulations of an ACLS aircraft. Initial results show that fan dynamics have a major effect on system performance. Comparison with laboratory test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.
Good Chemical Measurements, Good Public Policies
NASA Astrophysics Data System (ADS)
Faulkner, Larry R.
2005-02-01
At every turn now, one encounters sharply debated issues and important public policies that rest on chemical information. This is true in practically any arena where public interest intersects with the material world: health care practice and public health; energy; quality of air, water, and food; manufacturing standards and product liability; criminal justice; national and international security, including the defense against terrorism. The scale can be truly global, as in the case of the current debate over climate change, which extends into international efforts to regulate gaseous emissions. Sometimes the relevant chemical measurements and applicable theory are sound and their scope is appropriate to the policy; often they are inadequate and a policy or debate overreaches the analytical capability needed to support it. In the decades ahead, the issues with us today will become even more pressing and will drive a still greater reliance on analytical chemistry. This presentation will have four parts covering (a) illustrations of the impact of analytical chemistry on public debate and public policy, including instances where analytical capabilities actually gave rise to new issues and policies, (b) the manner in which chemical information is handled and understood in public debates, (c) areas of analytical chemistry that will be critical to sound public policy in the future, and (d) implications for the education of leaders and general citizens of modern societies.
Study of a tri-trophic prey-dependent food chain model of interacting populations.
Haque, Mainul; Ali, Nijamuddin; Chakravarty, Santabrata
2013-11-01
The current paper accounts for the influence of intra-specific competition among predators in a prey-dependent tri-trophic food chain model of interacting populations. We offer a detailed mathematical analysis of the proposed food chain model to illustrate some of the significant results that have arisen from the interplay of deterministic ecological phenomena and processes. Biologically feasible equilibria of the system are observed and the behaviour of the system around each of them is described. In particular, persistence, stability (local and global) and bifurcation (saddle-node, transcritical, Hopf-Andronov) analyses of this model are obtained. Relevant results from previous well-known food chain models are compared with the current findings. Global stability analysis is also carried out by constructing appropriate Lyapunov functions. Numerical simulations show that the present system is capable of producing chaotic dynamics when the rate of self-interaction is very low, whereas such chaotic behaviour disappears beyond a certain value of the rate of self-interaction. In addition, numerical simulations with experimental parameter values confirm the analytical results and show that intra-specific competition bears a potential role in controlling the chaotic dynamics of the system; the role of self-interaction in food chain models is thus illustrated for the first time. Finally, a discussion of the ecological applications of the analytical and numerical findings concludes the paper. Copyright © 2013 Elsevier Inc. All rights reserved.
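Numerical experiments of the kind described are straightforward to reproduce for a generic member of this model class. The sketch below integrates a tri-trophic prey-dependent chain with Holling type-II responses and a quadratic predator self-interaction term; the functional forms, parameter values, and initial condition are illustrative (in the spirit of Hastings-Powell-type models), not the authors' exact system:

```python
def derivs(s, a1=5.0, b1=3.0, a2=0.1, b2=2.0, d1=0.4, d2=0.01, c=0.05):
    """RHS of a tri-trophic chain: prey x, predator y, top predator z,
    with Holling type-II responses and intra-specific competition terms
    c*y^2 and c*z^2 among the predators."""
    x, y, z = s
    f1 = a1 * x / (1.0 + b1 * x)   # prey -> predator functional response
    f2 = a2 * y / (1.0 + b2 * y)   # predator -> top-predator response
    return (x * (1.0 - x) - f1 * y,
            f1 * y - d1 * y - f2 * z - c * y * y,
            f2 * z - d2 * z - c * z * z)

def rk4_step(s, h, f):
    """One classical 4th-order Runge-Kutta step."""
    k1 = f(s)
    k2 = f(tuple(v + 0.5 * h * k for v, k in zip(s, k1)))
    k3 = f(tuple(v + 0.5 * h * k for v, k in zip(s, k2)))
    k4 = f(tuple(v + h * k for v, k in zip(s, k3)))
    return tuple(v + h / 6.0 * (a + 2 * b + 2 * c_ + d)
                 for v, a, b, c_, d in zip(s, k1, k2, k3, k4))

state = (0.8, 0.2, 8.0)
for _ in range(20000):           # integrate to t = 200 with h = 0.01
    state = rk4_step(state, 0.01, derivs)
```

Sweeping c from near zero upward in such a simulation is the numerical experiment that reveals the transition away from chaotic dynamics.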
A quantitative, comprehensive analytical model for "fast" magnetic reconnection in Hall MHD
NASA Astrophysics Data System (ADS)
Simakov, Andrei N.
2008-11-01
Magnetic reconnection in nature usually happens on fast (e.g., dissipation-independent) time scales. While such scales have been observed computationally [1], a fundamental analytical model capable of explaining them has been lacking. Here, we propose such a quantitative model for 2D Hall MHD reconnection without a guide field. The model recovers the Sweet-Parker and the electron MHD [2] results in the appropriate limits of the ion inertial length, di, and is valid everywhere in between [3]. The model predicts the dissipation region aspect ratio and the reconnection rate Ez in terms of dissipation and inertial parameters, and has been found to be in excellent agreement with non-linear simulations. It confirms a number of long-standing empirical results and resolves several controversies. In particular, we find that both open X-point and elongated dissipation regions allow "fast" reconnection and that Ez depends on di. Moreover, when applied to electron-positron plasmas, the model demonstrates that fast dispersive waves are not instrumental for "fast" reconnection [4]. [1] J. Birn et al., J. Geophys. Res. 106, 3715 (2001). [2] L. Chacón, A. N. Simakov, and A. Zocco, Phys. Rev. Lett. 99, 235001 (2007). [3] A. N. Simakov and L. Chacón, submitted to Phys. Rev. Lett. [4] L. Chacón, A. N. Simakov, V. Lukin, and A. Zocco, Phys. Rev. Lett. 101, 025003 (2008).
NASA Technical Reports Server (NTRS)
Huang, N. E.; Parsons, C. L.; Long, S. R.; Bliven, L. F.
1983-01-01
Wave breaking is proposed as the primary energy dissipation mechanism for the gravity wave field. The energy dissipation rate is calculated based on the statistical model proposed by Longuet-Higgins (1969) with a modification of the breaking criterion incorporating the surface stress according to Phillips and Banner (1974). From this modified model, an analytic expression is found for the wave attenuation rate and the half-life time of the wave field, which depend only on the significant slope of the wave field and the ratio of friction velocity to initial wave phase velocity. These expressions explain why a freshly generated wave field does not last long, and why swells are capable of propagating long distances without substantial change in energy density. It is shown that breaking is many orders of magnitude more effective at dissipating wave energy than molecular viscosity if the significant slope is higher than 0.01. Limited observational data from satellite and laboratory measurements are compared with the analytic results and show good agreement.
Garay-Avendaño, Roger L; Zamboni-Rached, Michel
2014-07-10
In this paper, we propose a method that is capable of describing in exact and analytic form the propagation of nonparaxial scalar and electromagnetic beams. The main features of the method presented here are its mathematical simplicity and the fast convergence in the cases of highly nonparaxial electromagnetic beams, enabling us to obtain high-precision results without the necessity of lengthy numerical simulations or other more complex analytical calculations. The method can be used in electromagnetism (optics, microwaves) as well as in acoustics.
Differential homogeneous immunosensor device
Malmros, M.K.; Gulbinski, J. III.
1990-04-10
There is provided a novel method of testing for the presence of an analyte in a fluid suspected of containing the same. In this method, in the presence of the analyte, a substance capable of modifying certain characteristics of the substrate is bound to the substrate and the change in these characteristics is measured. While the method may be modified for carrying out quantitative differential analyses, it eliminates the need for washing the analyte from the substrate, which is characteristic of prior-art methods. 12 figs.
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated on the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency of samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
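In its simplest form, monitoring a new batch against control limits learned from normal batches reduces to standardized-deviation tests per variable. The sketch below is a deliberately simplified per-variable z-score chart with variable-contribution diagnosis, not the multi-way PLS model of the study, and the fingerprint intensities are made up:

```python
import math

def fit_limits(normal_batches):
    """Per-variable mean and standard deviation over the normal batches."""
    n = len(normal_batches)
    p = len(normal_batches[0])
    means = [sum(b[j] for b in normal_batches) / n for j in range(p)]
    sds = [math.sqrt(sum((b[j] - means[j]) ** 2 for b in normal_batches) / (n - 1))
           for j in range(p)]
    return means, sds

def fault_contributions(batch, means, sds, limit=3.0):
    """Indices of variables whose standardized deviation exceeds `limit`
    sigma, mimicking a variable-contribution fault diagnosis."""
    return [j for j, (x, m, s) in enumerate(zip(batch, means, sds))
            if s > 0 and abs(x - m) / s > limit]

# Made-up fingerprint intensities (3 variables) for eight normal batches.
normal = [[100 + i, 50 - i, 20 + 0.5 * i] for i in (-2, -1, 0, 1, 2, -1, 1, 0)]
means, sds = fit_limits(normal)
faulty = [100.0, 50.0, 45.0]      # variable 2 deviates strongly
```

A multi-way PLS model additionally exploits correlations between variables and across batch time, which this univariate sketch ignores.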
Rotational modes of a simple Earth model
NASA Astrophysics Data System (ADS)
Seyed-Mahmoud, B.; Rochester, M. G.; Rogister, Y. J. G.
2017-12-01
We study the tilt-over mode (TOM), the spin-over mode (SOM), the free core nutation (FCN), and their relationships to each other using a simple Earth model with a homogeneous and incompressible liquid core and a rigid mantle. Analytical solutions for the periods of these modes, as well as that of the Chandler wobble, are found for the Earth model. We show that the FCN is the same mode as the SOM of a wobbling Earth. The reduced pressure, in terms of which the vector momentum equation is known to reduce to a scalar second-order differential equation (the so-called Poincaré equation), is used as the independent variable. Analytical solutions are then found for the displacement eigenfunctions in a meridional plane of the liquid core for the aforementioned modes. We show that the magnitude of motion in the mantle during the FCN is comparable to that in the liquid core, hence very small. The displacement eigenfunctions for these modes, as well as those for the free inner core nutation (FICN), computed numerically, are also given for a three-layer Earth model that further includes a rigid inner core capable of wobbling. We discuss the slow convergence of the period of the FICN in terms of the characteristic surfaces of the Poincaré equation.
Monte Carlo simulation of liquid bridge rupture: Application to lung physiology
NASA Astrophysics Data System (ADS)
Alencar, Adriano M.; Wolfe, Elie; Buldyrev, Sergey V.
2006-08-01
In the course of certain lung diseases, the surface properties and the amount of fluid coating the airways change, and liquid bridges may form in the small airways, blocking the flow of air and impairing gas exchange. During inhalation, these liquid bridges may rupture due to mechanical instability and emit a discrete sound event called a pulmonary crackle, which can be heard using a simple stethoscope. We hypothesize that this sound is a result of the acoustical release of energy that had been stored in the surface of liquid bridges prior to their rupture. We develop a lattice gas model capable of describing these phenomena. As a step toward modeling this process, we address a simpler but related problem, that of a liquid bridge between two planar surfaces. This problem has been analytically solved, and we use this solution as a validation of the lattice gas model of liquid bridge rupture. Specifically, we determine the surface free energy and critical stability conditions in a system containing a liquid bridge of volume Ω formed between two parallel planes, separated by a distance 2h, with a contact angle Θ, using both Monte Carlo simulation of a lattice gas model and variational calculus based on minimization of the surface area under the volume and contact angle constraints. In order to simulate systems with different contact angles, we vary the interaction parameters between the constitutive elements of the lattice gas. We numerically and analytically determine the phase diagram of the system as a function of the dimensionless parameters hΩ^(-1/3) and Θ. The regions of this phase diagram correspond to the mechanical and thermodynamical stability of the liquid bridge. We also determine the conditions for symmetrical versus asymmetrical rupture of the bridge. We numerically and analytically compute the release of free energy during rupture. The simulation results are in agreement with the analytical solution.
Furthermore, we discuss the results in connection with the rupture of similar bridges that exist in diseased lungs.
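The Monte Carlo ingredient of such a study can be sketched at toy scale. Below is a minimal 2D Metropolis lattice gas confined between two walls; the coupling constants, lattice size, and temperature are illustrative (their ratio j_lw/j_ll plays the role of the contact-angle control described above), and this toy omits the bridge-volume and stability analysis of the paper:

```python
import math
import random

def lattice_gas_mc(width=24, height=8, n=50, j_ll=1.0, j_lw=1.5,
                   beta=1.5, sweeps=200, seed=0):
    """Metropolis sampling of a 2D lattice gas between two walls.

    Rows 0 and height-1 are solid walls; j_ll is liquid-liquid cohesion,
    j_lw liquid-wall adhesion, beta the inverse temperature. Moves are
    nonlocal particle hops, so the particle number (the 'volume' of the
    liquid) is conserved."""
    rng = random.Random(seed)
    occ = [[0] * width for _ in range(height)]
    interior = [(x, y) for y in range(1, height - 1) for x in range(width)]
    for x, y in rng.sample(interior, n):
        occ[y][x] = 1

    def bond_energy(x, y):
        # Bond energy of a particle at (x, y): periodic in x, walls in y.
        e = 0.0
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = (x + dx) % width, y + dy
            if ny in (0, height - 1):
                e -= j_lw
            elif occ[ny][nx]:
                e -= j_ll
        return e

    for _ in range(sweeps * n):
        x1, y1 = rng.choice(interior)
        if not occ[y1][x1]:
            continue                 # need an occupied source site
        x2, y2 = rng.choice(interior)
        if occ[y2][x2]:
            continue                 # need an empty destination site
        e_old = bond_energy(x1, y1)
        occ[y1][x1] = 0              # tentatively remove the particle
        de = bond_energy(x2, y2) - e_old
        if de <= 0 or rng.random() < math.exp(-beta * de):
            occ[y2][x2] = 1          # accept the move
        else:
            occ[y1][x1] = 1          # reject: restore the particle
    return occ

config = lattice_gas_mc()
```

At low temperature and large j_lw the particles condense into a wall-spanning slab, the lattice analogue of a liquid bridge; free energies would then be estimated from the sampled bond counts.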
State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups
Plata, María Reyes; Contento, Ana María; Ríos, Angel
2010-01-01
(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors in routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the more representative achievements carried out for these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. The capabilities to be used in different applied areas are also critically discussed. PMID:22319260
Computer search for binary cyclic UEP codes of odd length up to 65
NASA Technical Reports Server (NTRS)
Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu
1990-01-01
Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distance at least 3 are found. For those codes for which only upper bounds on the unequal error protection capabilities can be computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.
Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan
2015-01-01
Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline, and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using text mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights and establish an index system for evaluating the transformation capability of the pillar industry of China's steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities' abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns. PMID:26422266
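The AHP weighting step mentioned above has a standard computational core: extracting priority weights from a pairwise comparison matrix. A hedged sketch (the textbook principal-eigenvector method via power iteration with Saaty's consistency index; the 3x3 matrix is a made-up, perfectly consistent example, not the paper's 53-indicator system):

```python
def ahp_weights(matrix, iters=200):
    """Priority weights of an AHP pairwise-comparison matrix via power
    iteration on the principal eigenvector, plus Saaty's consistency
    index CI = (lambda_max - n) / (n - 1)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]          # renormalize each iteration
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n        # estimate of lambda_max
    return w, (lam - n) / (n - 1)

# Hypothetical, perfectly consistent comparison of three indicators.
pairwise = [[1.0, 2.0, 4.0],
            [0.5, 1.0, 2.0],
            [0.25, 0.5, 1.0]]
weights, ci = ahp_weights(pairwise)
```

In practice CI is divided by a random index to give the consistency ratio, and expert judgments are revised when that ratio exceeds about 0.1.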
2011-09-01
Evaluation Process through Capabilities-Based Analysis
Lednicky, Eric J.
Naval Postgraduate School, Monterey, CA 93943-5000
Badal, Sunil P; Michalak, Shawn D; Chan, George C-Y; You, Yi; Shelley, Jacob T
2016-04-05
Plasma-based ambient desorption/ionization (ADI) sources are versatile in that they enable direct ionization of gaseous samples as well as desorption/ionization of analytes from liquid and solid samples. However, ionization matrix effects, caused by competitive ionization processes, can worsen sensitivity or even inhibit detection altogether. The present study is focused on expanding the analytical capabilities of the flowing atmospheric-pressure afterglow (FAPA) source by exploring additional types of ionization chemistry. Specifically, it was found that the abundance and type of reagent ions produced by the FAPA source, and thus the corresponding ionization pathways of analytes, can be altered by changing the source working conditions. A high abundance of proton-transfer reagent ions was observed at relatively high gas flow rates and low discharge currents. Conversely, charge-transfer reagent species were most abundant at low gas flows and high discharge currents. A rather nonpolar model analyte, biphenyl, was found to change ionization pathway significantly based on source operating parameters. Different analyte ions (e.g., MH(+) via proton transfer and M(+.) via charge transfer) were formed under distinct operating parameters, demonstrating two different operating regimes. These tunable ionization modes of the FAPA were used to enable or enhance detection of analytes that traditionally exhibit low sensitivity in plasma-based ADI-MS analyses. In one example, 2,2'-dichloroquaterphenyl was detected under charge-transfer FAPA conditions but was difficult or impossible to detect with proton-transfer FAPA or direct analysis in real time (DART). Overall, this unique mode of operation increases the number and range of detectable analytes and has the potential to lessen ionization matrix effects in ADI-MS analyses.
A data model and database for high-resolution pathology analytical image informatics.
Wang, Fusheng; Kong, Jun; Cooper, Lee; Pan, Tony; Kurc, Tahsin; Chen, Wenjin; Sharma, Ashish; Niedermayr, Cristobal; Oh, Tae W; Brat, Daniel; Farris, Alton B; Foran, David J; Saltz, Joel
2011-01-01
The systematic analysis of imaged pathology specimens often results in a vast amount of morphological information at both the cellular and sub-cellular scales. While microscopy scanners and computerized analysis are capable of capturing and analyzing data rapidly, microscopy image data remain underutilized in research and clinical settings. One major obstacle which tends to reduce wider adoption of these new technologies throughout the clinical and scientific communities is the challenge of managing, querying, and integrating the vast amounts of data resulting from the analysis of large digital pathology datasets. This paper presents a data model, which addresses these challenges, and demonstrates its implementation in a relational database system. This paper describes a data model, referred to as Pathology Analytic Imaging Standards (PAIS), and a database implementation, which are designed to support the data management and query requirements of detailed characterization of micro-anatomic morphology through many interrelated analysis pipelines on whole-slide images and tissue microarrays (TMAs). (1) Development of a data model capable of efficiently representing and storing virtual slide related image, annotation, markup, and feature information. (2) Development of a database, based on the data model, capable of supporting queries for data retrieval based on analysis and image metadata, queries for comparison of results from different analyses, and spatial queries on segmented regions, features, and classified objects. The work described in this paper is motivated by the challenges associated with characterization of micro-scale features for comparative and correlative analyses involving whole-slide tissue images and TMAs. Technologies for digitizing tissues have advanced significantly in the past decade. Slide scanners are capable of producing high-magnification, high-resolution images from whole slides and TMAs within several minutes.
Hence, it is becoming increasingly feasible for basic, clinical, and translational research studies to produce thousands of whole-slide images. Systematic analysis of these large datasets requires efficient data management support for representing and indexing results from hundreds of interrelated analyses generating very large volumes of quantifications, such as shape and texture, and of classifications of the quantified features. We have designed a data model and a database to address the data management requirements of detailed characterization of micro-anatomic morphology through many interrelated analysis pipelines. The data model represents virtual-slide-related image, annotation, markup, and feature information. The database supports a wide range of metadata and spatial queries on images, annotations, markups, and features. We currently have three databases running on a Dell PowerEdge T410 server with the CentOS 5.5 Linux operating system. The database server is IBM DB2 Enterprise Edition 9.7.2. The set of databases consists of 1) a TMA database containing image analysis results from 4740 cases of breast cancer, with 641 MB storage size; 2) an algorithm validation database, which stores markups and annotations from two segmentation algorithms and two parameter sets on 18 selected slides, with 66 GB storage size; and 3) an in silico brain tumor study database comprising results from 307 TCGA slides, with 365 GB storage size. The latter two databases also contain human-generated annotations and markups for regions and nuclei. Modeling and managing pathology image analysis results in a database provides immediate benefits to the value and usability of data in a research study. The database provides powerful query capabilities, which are otherwise difficult or cumbersome to support with other approaches, such as programming languages.
Standardized, semantic annotated data representation and interfaces also make it possible to more efficiently share image data and analysis results.
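The spatial queries described above can be sketched in miniature. The schema below is hypothetical, not the actual PAIS DDL: each segmented nucleus is stored as a bounding box plus one computed morphology feature, and a window query is answered with plain SQL range predicates.

```python
import sqlite3

# Hypothetical toy schema: one row per segmented nucleus markup.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE markup ("
    " id INTEGER PRIMARY KEY, slide TEXT,"
    " xmin REAL, ymin REAL, xmax REAL, ymax REAL,"  # bounding box (pixels)
    " area REAL)"                                   # example feature
)
rows = [
    (1, "TCGA-01", 10, 10, 20, 22, 95.0),
    (2, "TCGA-01", 100, 105, 130, 140, 610.0),
    (3, "TCGA-02", 15, 12, 24, 25, 120.0),
]
conn.executemany("INSERT INTO markup VALUES (?,?,?,?,?,?,?)", rows)

def nuclei_in_window(slide, x0, y0, x1, y1):
    """Return ids of nuclei whose bounding box lies inside a query window."""
    cur = conn.execute(
        "SELECT id FROM markup WHERE slide=? AND xmin>=? AND ymin>=?"
        " AND xmax<=? AND ymax<=? ORDER BY id",
        (slide, x0, y0, x1, y1),
    )
    return [r[0] for r in cur]

print(nuclei_in_window("TCGA-01", 0, 0, 50, 50))  # → [1]
```

A production system like the one in the paper would use true spatial indexing and geometry types rather than bounding-box predicates, but the query pattern is the same.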
Model studies of crosswind landing-gear configurations for STOL aircraft
NASA Technical Reports Server (NTRS)
Stubbs, S. M.; Byrdsong, T. A.
1973-01-01
A dynamic model was used to directly compare four different crosswind landing gear mechanisms. The model was landed as a free body onto a laterally sloping runway used to simulate a crosswind side force. A radio control system was used for steering to oppose the side force as the model rolled to a stop. The configuration in which the landing gears are aligned by the pilot and locked in the direction of motion prior to touchdown gave the smoothest runout behavior, with the vehicle maintaining its crab angle throughout the landing roll. Nose wheel steering was confirmed to be better than steering with nose and main gears differentially or together. Testing is continuing to obtain quantitative data to establish an experimental data base for validation of an analytical program that will be capable of predicting full-scale results.
Smalø, Hans S; Astrand, Per-Olof; Jensen, Lasse
2009-07-28
The electronegativity equalization model (EEM) has been combined with a point-dipole interaction model to obtain a molecular mechanics model consisting of atomic charges, atomic dipole moments, and two-atom relay tensors to describe molecular dipole moments and molecular dipole-dipole polarizabilities. The EEM has been phrased as an atom-atom charge-transfer model, allowing the charge-transfer terms to be modified so that the polarizability does not diverge, either for two particles at infinite separation or for long chains. In the present work, these shortcomings have been resolved by adding an energy term for transporting charges through individual atoms. A Gaussian distribution is adopted for the atomic charge distributions, resulting in a damping of the electrostatic interactions at short distances. Assuming that an interatomic exchange term may be described as the overlap between two electronic charge distributions, the EEM has also been extended by a short-range exchange term. The result is a molecular mechanics model in which the difference in charge transfer between insulating and metallic systems is modeled in terms of the difference in bond length between different types of systems. For example, the model is capable of modeling charge transfer in both alkanes and alkenes with alternating double bonds with the same set of carbon parameters, relying only on the difference in bond length between carbon sigma- and pi-bonds. Analytical results have been obtained for the polarizability of a long linear chain. These results show that the model is capable of describing the polarizability scaling both linearly and nonlinearly with the size of the system. Similarly, a linear chain with an end atom with a high electronegativity has been analyzed analytically. The dipole moment of this model system can either be independent of the length or increase linearly with the length of the chain.
In addition, the model has been parametrized for alkane and alkene chains with data from density functional theory calculations, where the polarizability behaves differently with the chain length. For the molecular dipole moment, the same two systems have been studied with an aldehyde end group. Both the molecular polarizability and the dipole moment are well described as a function of the chain length for both alkane and alkene chains demonstrating the power of the presented model.
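The basic charge-equalization step underlying an EEM can be sketched as follows. This minimal version keeps only the atomic electronegativity and hardness terms and omits the interatomic Coulomb, charge-transport, and exchange terms the abstract describes; the parameter values are purely illustrative.

```python
# Minimal EEM sketch: charges q_i minimize sum(chi_i*q_i + 0.5*eta_i*q_i^2)
# subject to a total-charge constraint, so chi_i + eta_i*q_i equals a common
# chemical potential mu on every atom. With no interatomic terms, mu has a
# closed form.
def eem_charges(chi, eta, total_charge=0.0):
    """Charges that equalize chi_i + eta_i*q_i = mu subject to sum(q) = Q."""
    mu = (sum(c / e for c, e in zip(chi, eta)) + total_charge) / \
         sum(1.0 / e for e in eta)
    return [(mu - c) / e for c, e in zip(chi, eta)]

# Two-atom example: positive partial charge develops on the less
# electronegative atom, negative on the more electronegative one.
q = eem_charges(chi=[2.0, 3.0], eta=[7.0, 7.0])
print(q)  # q[0] > 0 > q[1], and q[0] + q[1] == 0
```

The divergence problems mentioned in the abstract arise precisely because this bare formulation lets charge flow freely between distant atoms; the paper's charge-transport energy term is what suppresses that.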
The Analysis of Adhesively Bonded Advanced Composite Joints Using Joint Finite Elements
NASA Technical Reports Server (NTRS)
Stapleton, Scott E.; Waas, Anthony M.
2012-01-01
The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model nonlinear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails.
Results predicted using the joint FE were compared with experimental results for various joint configurations, including double cantilever beam and single lap joints.
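The kind of closed-form joint solution from which such shape functions are built can be illustrated with the classic Volkersen shear-lag model for a balanced single lap joint. This is a textbook stand-in, not necessarily the exact formulation used in the paper, and the input values are illustrative (N-mm units).

```python
from math import cosh, sinh, sqrt

def volkersen_tau(x, P, b, c, G, ta, E, t):
    """Adhesive shear stress at x for a balanced lap joint, overlap [-c, c].

    P: applied load, b: joint width, c: half overlap length,
    G/ta: adhesive shear modulus and thickness, E/t: adherend modulus and
    thickness (both adherends identical, hence 'balanced')."""
    w = sqrt(2.0 * G / (ta * E * t))   # shear-lag parameter
    return (P / b) * (w / 2.0) * cosh(w * x) / sinh(w * c)

P, b, c = 1000.0, 25.0, 12.5
args = dict(P=P, b=b, c=c, G=1100.0, ta=0.2, E=70000.0, t=1.6)
peak = volkersen_tau(c, **args)        # stress at the overlap end
avg = P / (b * 2 * c)                  # naive uniform-stress estimate
print(peak > avg)                      # shear concentrates at the ends
```

Integrating tau over the overlap recovers P/b exactly, which is the load-balance property a joint element's analytical shape functions must preserve.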
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
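The qualitative core of an FFM, propagating failure effects along paths from failure modes to observation points, can be sketched as reachability on a directed graph. The tiny system graph below is hypothetical, purely for illustration.

```python
from collections import deque

# Directed failure-effect propagation graph (hypothetical example system).
edges = {
    "valve_stuck":   ["low_flow"],
    "pump_degraded": ["low_flow"],
    "low_flow":      ["low_pressure"],
    "low_pressure":  ["sensor_P1"],     # effect observable at a sensor
}
sensors = {"sensor_P1"}

def observable_effects(failure_mode):
    """Sensors that can observe effects of a failure mode (BFS reachability)."""
    seen, queue, hits = {failure_mode}, deque([failure_mode]), set()
    while queue:
        node = queue.popleft()
        if node in sensors:
            hits.add(node)
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits

print(observable_effects("valve_stuck"))  # → {'sensor_P1'}
```

Note that both failure modes here reach the same sensor, which is exactly the kind of diagnostic ambiguity an FFM analysis is meant to expose early in design, when sensor placement can still be changed.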
PhreeqcRM: A reaction module for transport simulators based on the geochemical model PHREEQC
Parkhurst, David L.; Wissmeier, Laurin
2015-01-01
PhreeqcRM is a geochemical reaction module designed specifically to perform equilibrium and kinetic reaction calculations for reactive transport simulators that use an operator-splitting approach. The basic function of the reaction module is to take component concentrations from the model cells of the transport simulator, run geochemical reactions, and return updated component concentrations to the transport simulator. If multicomponent diffusion is modeled (e.g., Nernst–Planck equation), then aqueous species concentrations can be used instead of component concentrations. The reaction capabilities are a complete implementation of the reaction capabilities of PHREEQC. In each cell, the reaction module maintains the composition of all of the reactants, which may include minerals, exchangers, surface complexers, gas phases, solid solutions, and user-defined kinetic reactants. PhreeqcRM assigns initial and boundary conditions for model cells based on standard PHREEQC input definitions (files or strings) of chemical compositions of solutions and reactants. Additional PhreeqcRM capabilities include methods to eliminate reaction calculations for inactive parts of a model domain, transfer concentrations and other model properties, and retrieve selected results. The module demonstrates good scalability for parallel processing by using multiprocessing with MPI (message passing interface) on distributed memory systems, and limited scalability using multithreading with OpenMP on shared memory systems. PhreeqcRM is written in C++, but interfaces allow methods to be called from C or Fortran. By using the PhreeqcRM reaction module, an existing multicomponent transport simulator can be extended to simulate a wide range of geochemical reactions. Results of the implementation of PhreeqcRM as the reaction engine for transport simulators PHAST and FEFLOW are shown by using an analytical solution and the reactive transport benchmark of MoMaS.
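The operator-splitting loop that PhreeqcRM is built to serve can be sketched as follows. The function names are invented, not the actual PhreeqcRM API, and the "reaction" is a stand-in first-order decay rather than a PHREEQC calculation; the point is the alternation of transport and reaction operators over per-cell concentrations.

```python
def advect(conc, inflow):
    """Transport operator: upwind advection by one cell (unit Courant number)."""
    return [inflow] + conc[:-1]

def react(conc, k=0.1):
    """Stand-in reaction operator: first-order decay, C -> C*(1-k) per step.
    In a real simulator this is where the reaction module is called with the
    cell concentrations and returns updated values."""
    return [c * (1.0 - k) for c in conc]

conc = [0.0] * 5                      # 5 model cells, initially solute-free
for step in range(5):
    conc = advect(conc, inflow=1.0)   # transport code's half of the split
    conc = react(conc)                # reaction module's half of the split
print(conc)  # concentrations decrease monotonically down the column
```

Because the two operators only exchange per-cell concentration arrays, the reaction step parallelizes naturally over cells, which is what makes the MPI/OpenMP scalability described above possible.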
Determination of thermal diffusivities of cylindrical bodies being cooled
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dincer, I.
1996-09-01
This paper presents an analytical model for determining the thermal diffusivities of individual solid cylindrical bodies subjected to cooling. Applications of this model were made using experimental center-temperature data obtained from cylindrical products (e.g., cucumber and grape) during air cooling at a flow velocity of 2 m/s. From these experiments, the thermal diffusivities of the products were found to be 1.45×10⁻⁷ m²/s for cucumber and 1.68×10⁻⁷ m²/s for grape. It can be concluded that the present model is capable of determining the thermal diffusivities of cylindrical bodies during cooling in a simple and effective form.
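The approach can be sketched numerically: fit the slope of ln(dimensionless center temperature) versus time and convert it to a diffusivity through the one-term series solution for an infinite cylinder. The eigenvalue below assumes the large-Biot limit (first zero of the Bessel function J0), and all numbers are illustrative, not Dincer's data.

```python
from math import exp, log

LAM1 = 2.4048                          # first root of J0 (large-Biot limit)
R = 0.02                               # cylinder radius (m), illustrative

def diffusivity(times, theta):
    """alpha from the least-squares slope of ln(theta) vs t:
    theta ~ A1*exp(-LAM1**2 * alpha * t / R**2), so
    alpha = -slope * R**2 / LAM1**2."""
    n = len(times)
    y = [log(th) for th in theta]
    tbar, ybar = sum(times) / n, sum(y) / n
    slope = sum((t - tbar) * (v - ybar) for t, v in zip(times, y)) / \
            sum((t - tbar) ** 2 for t in times)
    return -slope * R * R / LAM1 ** 2

# Synthetic cooling curve generated with alpha = 1.5e-7 m^2/s, then recovered.
alpha_true = 1.5e-7
ts = [600.0 * i for i in range(1, 11)]
th = [1.04 * exp(-LAM1 ** 2 * alpha_true * t / R ** 2) for t in ts]
print(diffusivity(ts, th))  # → ~1.5e-7
```

With measured data the intercept (the lag factor A1) and a finite-Biot eigenvalue would also be fitted, but the slope-to-diffusivity conversion is the same.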
NASA Astrophysics Data System (ADS)
Jaenisch, Holger M.; Handley, James W.; Hicklen, Michael L.
2007-04-01
This paper describes a novel capability for modeling known idea propagation transformations and predicting responses to new ideas from geopolitical groups. Ideas are captured using semantic words that are text based and bear cognitive definitions. We demonstrate a unique algorithm for converting these into analytical predictive equations. Using the illustrative idea of proposing a gasoline price increase of $1 per gallon from $2, and its changing perceived impact across 5 demographic groups, we identify 13 cost-of-living Diplomatic, Information, Military, and Economic (DIME) features common across all 5 demographic groups. This enables the modeling and monitoring of the Political, Military, Economic, Social, Information, and Infrastructure (PMESII) effects of each group's response to this idea and how their "perception" of this proposal changes. Our algorithm and results are summarized in this paper.
Model of care transformation: a health care system CNE's journey.
Swick, Maureen; Doulaveris, Phyllis; Christensen, Patricia
2012-01-01
In 2001, the Institute of Medicine released the report "Crossing the Quality Chasm: A New Health System for the 21st Century." The report criticizes our health care system and argues that we are failing to provide Americans with the high-quality and affordable health care they deserve and need. While incremental progress has been made, we continue to strive for improved care quality, and our rising costs are potentially catastrophic. Consistent with the Institute of Medicine report, and its reputation for innovation, Inova Health System identified care model transformation as a system priority. Given that the organization is replacing its electronic health record and introducing advanced analytic capabilities, the opportunity to transform the model of care in tandem with core clinical platform enhancement was a compelling reason to move forward.
Nonlinear Dynamic Analysis of Disordered Bladed-Disk Assemblies
NASA Technical Reports Server (NTRS)
McGee, Oliver G., III
1997-01-01
In an effort to address current needs for efficient air propulsion systems, we have developed new analytical predictive tools for understanding and alleviating aircraft engine instabilities, which have led to accelerated high-cycle fatigue and catastrophic failures of these machines during flight. A frequent cause of failure in jet engines is excessive resonant vibration and stall flutter instability. The likelihood of these phenomena is reduced when designers employ the analytical models we have developed. These prediction models will ultimately increase the nation's competitiveness in producing high-performance jet engines with enhanced operability, energy economy, and safety. The objectives of our current threads of research in the final year are directed along two lines. First, we want to improve the current state of blade stress and aeromechanical reduced-order modeling of high-bypass engine fans. Specifically, a new reduced-order iterative redesign tool for passively controlling the mechanical authority of shroudless, wide-chord, laminated composite transonic bypass engine fans has been developed. Second, we aim to advance current understanding of aeromechanical feedback control of dynamic flow instabilities in axial-flow compressors. A systematic theoretical evaluation of several approaches to aeromechanical feedback control of rotating stall in axial compressors has been conducted. Attached are abstracts of two papers under preparation for the 1998 ASME Turbo Expo in Stockholm, Sweden, sponsored under Grant No. NAG3-1571. Our goals during the final year under Grant No. NAG3-1571 are to enhance NASA's capabilities for forced-response analysis of turbomachines (such as NASA FREPS). We will continue our development of the reduced-order, three-dimensional component synthesis models for aeromechanical evaluation of integrated bladed-disk assemblies (i.e., the disk, non-identical blading, etc.).
We will complete our development of component systems design optimization strategies for specified vibratory stresses and increased fatigue-life prediction of assembly components, and for specified frequency margins on the Campbell diagrams of turbomachines. Finally, we will integrate the developed codes with NASA's turbomachinery aeromechanics prediction capability (such as NASA FREPS).
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although government is able to issue mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the level of readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9], and the Analytic Hierarchy Process (AHP), this model is used to measure a firm's capability to fulfill a government standard in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes are collected and processed to find the technological capabilities that should be improved by the firm to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
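The AHP step described above can be sketched as follows: priority weights for a few technology-readiness criteria are derived from a pairwise-comparison matrix via power iteration on its principal eigenvector. The criteria names and comparison values are invented for illustration.

```python
def ahp_weights(A, iters=100):
    """Principal-eigenvector priority weights of a pairwise-comparison
    matrix A, computed by power iteration and normalized to sum to 1."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical expert judgments: "precision" is 3x as important as
# "automation" and 5x as important as "cost", etc. (reciprocal matrix).
A = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     3.0],
     [1 / 5.0, 1 / 3.0, 1.0]]
w = ahp_weights(A)
print(w)  # weights sum to 1, ordered precision > automation > cost
```

In the full method these weights would then scale the technoware gap scores obtained from the QFD/Technometric assessment; a real application would also check the matrix's consistency ratio before trusting the weights.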
Exploration Laboratory Analysis
NASA Technical Reports Server (NTRS)
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical System's lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability, and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
2016-02-01
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
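CPAT itself is a multiphase mixed-integer linear program; the following is only a toy single-period analogue of the underlying portfolio trade-off, choosing a subset of hypothetical modernization programs that maximizes capability value within a budget, solved by brute force so it stays dependency-free.

```python
from itertools import combinations

# Hypothetical programs: name -> (cost in $B, capability value). These are
# invented numbers, not Army data.
programs = {
    "vehicle_A": (10.0, 7.0),
    "vehicle_B": (20.0, 12.0),
    "upgrade_C": (8.0, 6.5),
    "upgrade_D": (5.0, 3.0),
}

def best_portfolio(budget):
    """Enumerate all subsets; return (selection, value) of the best one
    whose total cost fits the budget."""
    best, best_val = (), 0.0
    names = list(programs)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(programs[n][0] for n in combo)
            val = sum(programs[n][1] for n in combo)
            if cost <= budget and val > best_val:
                best, best_val = combo, val
    return set(best), best_val

print(best_portfolio(25.0))  # → ({'vehicle_A', 'upgrade_C', 'upgrade_D'}, 16.5)
```

A real multiphase formulation adds time-indexed decision variables, phasing and synchronization constraints, and an MILP solver; enumeration is exponential and only viable at toy scale.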
Cunningham, Virginia L; D'Aco, Vincent J; Pfeiffer, Danielle; Anderson, Paul D; Buzby, Mary E; Hannah, Robert E; Jahnke, James; Parke, Neil J
2012-07-01
This article presents the capability expansion of the PhATE™ (pharmaceutical assessment and transport evaluation) model to predict concentrations of trace organics in sludges and biosolids from municipal wastewater treatment plants (WWTPs). PhATE was originally developed as an empirical model to estimate potential concentrations of active pharmaceutical ingredients (APIs) in US surface and drinking waters that could result from patient use of medicines. However, many compounds, including pharmaceuticals, are not completely transformed in WWTPs and remain in biosolids that may be applied to land as a soil amendment. This practice leads to concerns about potential exposures of people who may come into contact with amended soils and also about potential effects to plants and animals living in or contacting such soils. The model estimates the mass of API in WWTP influent based on the population served, the API per capita use, and the potential loss of the compound associated with human use (e.g., metabolism). The mass of API on the treated biosolids is then estimated based on partitioning to primary and secondary solids, potential loss due to biodegradation in secondary treatment (e.g., activated sludge), and potential loss during sludge treatment (e.g., aerobic digestion, anaerobic digestion, composting). Simulations using 2 surrogate compounds show that predicted environmental concentrations (PECs) generated by PhATE are in very good agreement with measured concentrations, i.e., well within 1 order of magnitude. Model simulations were then carried out for 18 APIs representing a broad range of chemical and use characteristics. 
These simulations yielded 4 categories of results: 1) PECs are in good agreement with measured data for 9 compounds with high analytical detection frequencies, 2) PECs are greater than measured data for 3 compounds with high analytical detection frequencies, possibly as a result of as yet unidentified depletion mechanisms, 3) PECs are less than analytical reporting limits for 5 compounds with low analytical detection frequencies, and 4) the PEC is greater than the analytical method reporting limit for 1 compound with a low analytical detection frequency, possibly again as a result of insufficient depletion data. Overall, these results demonstrate that PhATE has the potential to be a very useful tool in the evaluation of APIs in biosolids. Possible applications include: prioritizing APIs for assessment even in the absence of analytical methods; evaluating sludge processing scenarios to explore potential mitigation approaches; use in risk assessments; and developing realistic nationwide concentrations, because PECs can be represented as a cumulative probability distribution. Finally, comparison of PECs to measured concentrations can also be used to identify the need for fate studies of compounds of interest in biosolids. Copyright © 2011 SETAC.
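The PhATE-style mass balance the article describes can be sketched as a chain of multiplicative factors: API mass reaching the plant, partitioning to solids, then losses in secondary and sludge treatment. All parameter values below are invented for illustration, not taken from the article.

```python
def api_on_biosolids(population, per_capita_use_mg_day, f_excreted,
                     f_to_solids, f_biodegraded, f_sludge_loss):
    """Daily API mass (g/day) remaining on treated biosolids.

    influent  = population * per-capita use * fraction surviving human use
    on_solids = influent * fraction partitioning to primary/secondary solids
    remaining = on_solids after biodegradation and sludge-treatment losses
    """
    influent = population * per_capita_use_mg_day * f_excreted / 1000.0
    on_solids = influent * f_to_solids
    return on_solids * (1.0 - f_biodegraded) * (1.0 - f_sludge_loss)

m = api_on_biosolids(population=100_000, per_capita_use_mg_day=2.0,
                     f_excreted=0.6, f_to_solids=0.3,
                     f_biodegraded=0.5, f_sludge_loss=0.2)
print(m)  # → 14.4 g/day on biosolids for this hypothetical API
```

Dividing such a mass by sludge production gives the predicted biosolids concentration (the PEC); running the chain across distributions of the factors is what yields the cumulative probability distributions mentioned above.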
Fault Injection and Monitoring Capability for a Fault-Tolerant Distributed Computation System
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo; Yates, Amy M.; Malekpour, Mahyar R.
2010-01-01
The Configurable Fault-Injection and Monitoring System (CFIMS) is intended for the experimental characterization of effects caused by a variety of adverse conditions on a distributed computation system running flight control applications. A product of research collaboration between NASA Langley Research Center and Old Dominion University, the CFIMS is the main research tool for generating actual fault response data with which to develop and validate analytical performance models and design methodologies for the mitigation of fault effects in distributed flight control systems. Rather than a fixed design solution, the CFIMS is a flexible system that enables the systematic exploration of the problem space and can be adapted to meet the evolving needs of the research. The CFIMS has the capabilities of system-under-test (SUT) functional stimulus generation, fault injection and state monitoring, all of which are supported by a configuration capability for setting up the system as desired for a particular experiment. This report summarizes the work accomplished so far in the development of the CFIMS concept and documents the first design realization.
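The pairing of fault injection and state monitoring at the heart of the CFIMS concept can be sketched in miniature: corrupt a data word in transit with a single bit flip and let a CRC-based monitor flag the effect. The frame format here is invented for illustration, not the actual CFIMS design.

```python
import zlib

def make_frame(payload: bytes) -> bytes:
    """Append a CRC-32 check word to a payload (hypothetical frame format)."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def inject_bit_flip(frame: bytes, bit_index: int) -> bytes:
    """Fault injection: flip one bit of the frame in transit."""
    data = bytearray(frame)
    data[bit_index // 8] ^= 1 << (bit_index % 8)
    return bytes(data)

def monitor_ok(frame: bytes) -> bool:
    """State monitor: recompute the CRC and compare with the check word."""
    payload, crc = frame[:-4], frame[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc

frame = make_frame(b"pitch_cmd=+2.5deg")
print(monitor_ok(frame))                        # → True (healthy frame)
print(monitor_ok(inject_bit_flip(frame, 10)))   # → False (fault detected)
```

The CFIMS generalizes this pattern: configurable stimulus generation feeds the system under test, injectors perturb it, and monitors log the resulting state so fault-response data can be compared against analytical models.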
Research and development activities in unified control-structure modeling and design
NASA Technical Reports Server (NTRS)
Nayak, A. P.
1985-01-01
Results of work sponsored by JPL and other organizations to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization. Recommendations are also presented for near-term research activities at JPL. The key recommendation is to continue the development of integrated dynamic modeling/control design techniques, with special attention given to the development of structural models specially tailored to support design.
Chemoviscosity modeling for thermosetting resins, 2
NASA Technical Reports Server (NTRS)
Hou, T. H.
1985-01-01
A new analytical model for simulating the chemoviscosity of thermosetting resins was formulated. The model is developed by modifying the Williams-Landel-Ferry (WLF) theory in polymer rheology for thermoplastic materials. By assuming a linear relationship between the glass transition temperature and the degree of cure of the resin system under cure, the WLF theory can be modified to account for the factor of reaction time. Temperature-dependent functions of the modified WLF theory constants were determined from the isothermal cure data of Lee, Loos, and Springer for the Hercules 3501-6 resin system. Theoretical predictions of the model for the resin under dynamic heating cure cycles were shown to compare favorably with the experimental data reported by Carpenter. A chemoviscosity model was thus established that not only describes viscosity profiles accurately under various cure cycles, but also correlates viscosity data with the changes in physical properties associated with the structural transformations of thermosetting resin systems during cure.
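The modified-WLF idea described above can be sketched as follows: the glass-transition temperature rises linearly with degree of cure, and the WLF shift then gives viscosity relative to its value at Tg. All constants here are illustrative placeholders, not the fitted Hercules 3501-6 values.

```python
def viscosity(T, alpha, eta_g=1e12, C1=40.0, C2=50.0, Tg0=280.0, k=120.0):
    """Chemoviscosity eta(T, alpha) via a modified WLF shift.

    Tg(alpha) = Tg0 + k*alpha links degree of cure to the glass transition;
    the WLF factor then shifts viscosity from its value eta_g at Tg.
    Temperatures in K; valid for C2 + (T - Tg) > 0."""
    dT = T - (Tg0 + k * alpha)
    return eta_g * 10.0 ** (-C1 * dT / (C2 + dT))

# At a fixed processing temperature, viscosity climbs as the resin cures,
# because Tg(alpha) approaches the processing temperature.
print(viscosity(400.0, 0.2) < viscosity(400.0, 0.6))  # → True
```

The full model additionally makes C1 and C2 temperature-dependent functions fitted to isothermal cure data, but the structure, reaction time entering only through Tg(alpha), is the same.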
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
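The condensation step at the heart of substructuring can be sketched in a few lines. This is a single-level Guyan reduction of a stiffness/mass pair onto a set of boundary degrees of freedom, an illustrative simplification rather than the multilevel dynamic implementation the paper develops:

```python
import numpy as np

def guyan_reduce(K, M, boundary):
    """Condense interior DOFs of (K, M) onto the boundary DOF set.
    Exact for statics; an approximation (Guyan) for dynamics."""
    n = K.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    # Static condensation: interior motion follows u_i = -Kii^{-1} Kib u_b
    T = np.zeros((n, len(b)))
    T[i, :] = -np.linalg.solve(Kii, Kib)
    T[b, np.arange(len(b))] = 1.0
    return T.T @ K @ T, T.T @ M @ T
```

For a chain of two unit springs in series, condensing out the middle node yields the familiar effective stiffness of 0.5 between the end nodes.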
Comprehensive modeling of a liquid rocket combustion chamber
NASA Technical Reports Server (NTRS)
Liang, P.-Y.; Fisher, S.; Chang, Y. M.
1985-01-01
An analytical model for the simulation of detailed three-phase combustion flows inside a liquid rocket combustion chamber is presented. The three phases involved are: a multispecies gaseous phase, an incompressible liquid phase, and a particulate droplet phase. The gas and liquid phases are continua, described in an Eulerian fashion. A two-phase solution capability for these continuum media is obtained through a marriage of the Implicit Continuous Eulerian (ICE) technique and the fractional Volume of Fluid (VOF) free surface description method. The particulate phase, on the other hand, is given a discrete treatment and described in a Lagrangian fashion. All three phases are hence treated rigorously. Semi-empirical physical models are used to describe all interphase coupling terms as well as the chemistry among gaseous components. Sample calculations using the model are given. The results show promising application to truly comprehensive modeling of complex liquid-fueled engine systems.
NASA Astrophysics Data System (ADS)
Wang, Baoming; Haque, M. A.
2015-08-01
With atomic-scale imaging and analytical capabilities such as electron diffraction and energy-loss spectroscopy, the transmission electron microscope has allowed access to the internal microstructure of materials like no other microscopy technique. It has been mostly a passive or post-mortem analysis tool, but that trend is changing with in situ straining, heating and electrical biasing. In this study, we design and demonstrate a multi-functional microchip that integrates actuators, sensors, heaters and electrodes with freestanding electron transparent specimens. In addition to mechanical testing at elevated temperatures, the chip can actively control microstructures (grain growth and phase change) of the specimen material. Using nano-crystalline aluminum, nickel and zirconium as specimen materials, we demonstrate these novel capabilities inside the microscope. Our approach of active microstructural control and quantitative testing with real-time visualization can influence mechanistic modeling by providing direct and accurate evidence of the fundamental mechanisms behind materials behavior.
Centaur Standard Shroud (CSS) static ultimate load structural tests
NASA Technical Reports Server (NTRS)
1975-01-01
A series of tests was conducted on the jettisonable metallic shroud used on the Titan/Centaur launch vehicle to verify its structural capabilities and to evaluate its structural interaction with the Centaur stage. A flight-configured shroud and the interfacing Titan/Centaur structural assemblies were subjected to tests consisting of combinations of applied axial and shear loads to design ultimate values, including a set of tests under thermal conditions and two dynamic response tests to verify the analytical stiffness model. The strength capabilities were demonstrated at ultimate (125 percent of design limit) loads. It was also verified that the spring rate of the flight-configured shroud-to-Centaur forward structure, as reflected in structural deflections of the specimen, became nonlinear, as expected, above limit load values. This test series qualification program verified that the Titan/Centaur shroud and the Centaur and Titan interface components are qualified structurally at design ultimate loads.
A general low frequency acoustic radiation capability for NASTRAN
NASA Technical Reports Server (NTRS)
Everstine, G. C.; Henderson, F. M.; Schroeder, E. A.; Lipman, R. R.
1986-01-01
A new capability called NASHUA is described for calculating the radiated acoustic sound pressure field exterior to a harmonically-excited arbitrary submerged 3-D elastic structure. The surface fluid pressures and velocities are first calculated by coupling a NASTRAN finite element model of the structure with a discretized form of the Helmholtz surface integral equation for the exterior fluid. After the fluid impedance is calculated, most of the required matrix operations are performed using the general matrix manipulation package (DMAP) available in NASTRAN. Far field radiated pressures are then calculated from the surface solution using the Helmholtz exterior integral equation. Other output quantities include the maximum sound pressure levels in each of the three coordinate planes, the rms and average surface pressures and normal velocities, the total radiated power and the radiation efficiency. The overall approach is illustrated and validated using known analytic solutions for submerged spherical shells subjected to both uniform and nonuniform applied loads.
Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun
2018-03-01
The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications in radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
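The maximum-likelihood expectation-maximization (MLEM) update used for the reconstruction step has a standard multiplicative form, sketched generically below. The system matrix A here is a stand-in for the authors' analytical RMC source-detector model:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM image reconstruction.
    A: system matrix (measurement bins x image pixels), y: measured counts.
    Generic Poisson-likelihood update, not the paper's specific model."""
    x = np.ones(A.shape[1])            # flat initial image
    sens = A.sum(axis=0)               # per-pixel sensitivity
    for _ in range(n_iter):
        proj = A @ x                   # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

With noise-free, consistent data the iteration has the true image as a fixed point, which makes a small synthetic case easy to check.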
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen
2016-01-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Astrophysics Data System (ADS)
Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.
2016-12-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
Safina, Gulnara
2012-01-27
Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Collaborative visual analytics of radio surveys in the Big Data era
NASA Astrophysics Data System (ADS)
Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.
2017-06-01
Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel
2016-09-07
We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Goh, Swee C.; Elliott, Catherine; Quon, Tony K.
2012-01-01
Purpose: The purpose of this paper is to present a meta-analysis of a subset of published empirical research papers that measure learning capability and link it to organizational performance. It also seeks to examine both financial and non-financial performance. Design/methodology/approach: In a search of published research on learning capability…
ERIC Educational Resources Information Center
Tao, Sharon
2013-01-01
Tanzanian teachers have been criticised for a variety of behaviours such as absenteeism, lack of preparation and rote-teaching. This paper introduces an analytical framework that attempts to provide explanations for these behaviours by locating Capability Approach concepts within a Critical Realist theory of causation. Qualitative data from three…
Panel methods: An introduction
NASA Technical Reports Server (NTRS)
Erickson, Larry L.
1990-01-01
Panel methods are numerical schemes for solving the Prandtl-Glauert equation for linear, inviscid, irrotational flow about aircraft flying at subsonic or supersonic speeds. The tools at the panel-method user's disposal are (1) surface panels of source-doublet-vorticity distributions that can represent nearly arbitrary geometry, and (2) extremely versatile boundary condition capabilities that can frequently be used for creative modeling. Panel-method capabilities and limitations, basic concepts common to all panel-method codes, different choices that were made in the implementation of these concepts into working computer programs, and various modeling techniques involving boundary conditions, jump properties, and trailing wakes are discussed. An approach for extending the method to nonlinear transonic flow is also presented. Three appendices supplement the main text. In appendix 1, additional detail is provided on how the basic concepts are implemented into a specific computer program (PANAIR). In appendix 2, it is shown how to evaluate analytically the fundamental surface integral that arises in the expressions for influence coefficients, and to evaluate its jump property. In appendix 3, a simple example is used to illustrate the so-called finite part of the improper integrals.
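The core idea, singularity distributions whose strengths are found by enforcing flow tangency, can be illustrated with a deliberately simplified 2D model: discrete point sources placed inside a unit circle, with the no-penetration condition collocated at surface control points. This is a method-of-fundamental-solutions sketch, not PANAIR's surface source-doublet panels:

```python
import numpy as np

def cylinder_source_model(n=16, u_inf=1.0):
    """Point sources just inside a unit circle; solve for strengths so the
    total normal velocity vanishes at n surface control points."""
    th = 2 * np.pi * np.arange(n) / n
    ctrl = np.c_[np.cos(th), np.sin(th)]   # control points on r = 1
    nrm = ctrl.copy()                       # outward unit normals
    src = 0.8 * ctrl                        # source locations inside body
    r = ctrl[:, None, :] - src[None, :, :]
    r2 = (r ** 2).sum(-1)
    # Normal-velocity influence of unit source j at control point i
    A = (r * nrm[:, None, :]).sum(-1) / (2 * np.pi * r2)
    q = np.linalg.solve(A, -u_inf * nrm[:, 0])  # cancel freestream normal flow
    return src, q

def velocity(pt, src, q, u_inf=1.0):
    """Total velocity at a field point: freestream plus induced sources."""
    r = pt - src
    r2 = (r ** 2).sum(-1, keepdims=True)
    v = (q[:, None] * r / (2 * np.pi * r2)).sum(0)
    v[0] += u_inf
    return v
```

For a closed body the source strengths must sum to (nearly) zero, and the classical result for a cylinder in uniform flow, a doubled tangential speed at the top of the surface, is recovered approximately.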
Tetrahedral Finite-Volume Solutions to the Navier-Stokes Equations on Complex Configurations
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Pirzadeh, Shahyar Z.
1998-01-01
A review of the algorithmic features and capabilities of the unstructured-grid flow solver USM3Dns is presented. This code, along with the tetrahedral grid generator, VGRIDns, is being extensively used throughout the U.S. for solving the Euler and Navier-Stokes equations on complex aerodynamic problems. Spatial discretization is accomplished by a tetrahedral cell-centered finite-volume formulation using Roe's upwind flux difference splitting. The fluxes are limited by either a Superbee or MinMod limiter. Solution reconstruction within the tetrahedral cells is accomplished with a simple, but novel, multidimensional analytical formula. Time is advanced by an implicit backward-Euler time-stepping scheme. Flow turbulence effects are modeled by the Spalart-Allmaras one-equation model, which is coupled with a wall function to reduce the number of cells in the near-wall region of the boundary layer. The issues of accuracy and robustness of USM3Dns Navier-Stokes capabilities are addressed for a flat-plate boundary layer, and a full F-16 aircraft with external stores at transonic speed.
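The two flux limiters named above have standard closed forms; a minimal Python sketch (the solver itself is not Python), with a and b the adjacent solution differences being limited:

```python
def minmod(a, b):
    """MinMod limiter: picks the smaller-magnitude slope (most diffusive
    TVD choice); returns 0 when the slopes disagree in sign."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def superbee(a, b):
    """Superbee limiter: the least diffusive second-order TVD choice,
    maxmod of minmod(2a, b) and minmod(a, 2b)."""
    if a * b <= 0.0:
        return 0.0
    s = 1.0 if a > 0 else -1.0
    return s * max(min(2 * abs(a), abs(b)), min(abs(a), 2 * abs(b)))
```

Both return zero at extrema (sign change), which is what enforces the TVD property; Superbee sharpens discontinuities at the cost of slightly squaring smooth profiles.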
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kazantsev, Victor; Pimashkin, Alexey; Department of Neurodynamics and Neurobiology, Nizhny Novgorod State University, 23 Gagarin Ave., 603950 Nizhny Novgorod
We propose a two-layer architecture of an associative memory oscillatory network with directional interlayer connectivity. The network is capable of storing information in the form of phase-locked (in-phase and antiphase) oscillatory patterns. The first (input) layer takes an input pattern to be recognized, and its units are unidirectionally connected with all units of the second (control) layer. The connection strengths are weighted using the Hebbian rule. The output (retrieved) patterns appear as forced phase-locked states of the control layer. The conditions for pattern retrieval in response to an incoming stimulus are found and expressed analytically. It is shown that the system is capable of recovering patterns with a certain level of distortion or noise in their profiles. The architecture is implemented with the Kuramoto phase model and using synaptically coupled neural oscillators with spikes. It is found that the spiking model is capable of retrieving patterns using the spiking phase, which translates memorized patterns into spiking phase shifts at different time scales.
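A minimal single-layer sketch of the Hebbian/Kuramoto mechanism (not the paper's two-layer architecture): ±1 patterns set the coupling weights, and the phase dynamics relax a noisy initial condition to the in-phase/antiphase locked state encoding the pattern:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian coupling matrix for +/-1 patterns (phases 0 or pi)."""
    P = np.asarray(patterns, dtype=float)   # shape: (num_patterns, N)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def kuramoto_retrieve(W, theta0, dt=0.05, steps=2000):
    """Forward-Euler integration of the phase dynamics
    dtheta_i/dt = sum_j W_ij sin(theta_j - theta_i)."""
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        theta += dt * np.sum(W * np.sin(theta[None, :] - theta[:, None]), axis=1)
    return theta
```

Starting near a stored pattern (here with phase distortions of order 0.1-0.2 rad), the relative phases lock to 0 or pi so that the sign of cos(theta_i - theta_1) reproduces the stored ±1 pattern.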
NASA Astrophysics Data System (ADS)
Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.
2016-07-01
The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of 2015 of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal based on Z-score distributions showed that most results had |Z-scores| ≤ 3.
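The Z-score criterion reduces to a one-line formula. The interpretation bands below (|Z| ≤ 2 satisfactory, 2 < |Z| ≤ 3 questionable) are the common proficiency-testing convention, assumed here rather than taken from Wepal's documentation; only the |Z| ≤ 3 cutoff appears in the abstract:

```python
def z_score(result, assigned, sigma_p):
    """Deviation of a lab result from the assigned value, in units of
    the target standard deviation sigma_p of the proficiency test."""
    return (result - assigned) / sigma_p

def rating(z):
    """Conventional proficiency-test bands for |Z| (assumed convention)."""
    z = abs(z)
    if z <= 2.0:
        return "satisfactory"
    if z <= 3.0:
        return "questionable"
    return "unsatisfactory"
```

For example, a reported concentration one target standard deviation above the assigned value scores Z = 1 and is rated satisfactory.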
General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft
NASA Technical Reports Server (NTRS)
Dove, Edwin; Hughes, Steve
2007-01-01
The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and an importable script language similar to that of MATLAB.
A dashboard-based system for supporting diabetes care.
Dagliati, Arianna; Sacchi, Lucia; Tibollo, Valentina; Cogni, Giulia; Teliti, Marsida; Martinez-Millana, Antonio; Traver, Vicente; Segagni, Daniele; Posada, Jorge; Ottaviano, Manuel; Fico, Giuseppe; Arredondo, Maria Teresa; De Cata, Pasquale; Chiovato, Luca; Bellazzi, Riccardo
2018-05-01
To describe the development, as part of the European Union MOSAIC (Models and Simulation Techniques for Discovering Diabetes Influence Factors) project, of a dashboard-based system for the management of type 2 diabetes and assess its impact on clinical practice. The MOSAIC dashboard system is based on predictive modeling, longitudinal data analytics, and the reuse and integration of data from hospitals and public health repositories. Data are merged into an i2b2 data warehouse, which feeds a set of advanced temporal analytic models, including temporal abstractions, care-flow mining, drug exposure pattern detection, and risk-prediction models for type 2 diabetes complications. The dashboard has 2 components, designed for (1) clinical decision support during follow-up consultations and (2) outcome assessment on populations of interest. To assess the impact of the clinical decision support component, a pre-post study was conducted considering visit duration, number of screening examinations, and lifestyle interventions. A pilot sample of 700 Italian patients was investigated. Judgments on the outcome assessment component were obtained via focus groups with clinicians and health care managers. The use of the decision support component in clinical activities produced a reduction in visit duration (P < .01) and an increase in the number of screening exams for complications (P < .01). We also observed a relevant, although nonstatistically significant, increase in the proportion of patients receiving lifestyle interventions (from 69% to 77%). Regarding the outcome assessment component, focus groups highlighted the system's capability of identifying and understanding the characteristics of patient subgroups treated at the center.
Our study demonstrates that decision support tools based on the integration of multiple-source data and visual and predictive analytics do improve the management of a chronic disease such as type 2 diabetes by enacting a successful implementation of the learning health care system cycle.
NASA Astrophysics Data System (ADS)
Giuffre, Christopher James
In the natural world there is no such thing as a perfectly sharp edge; through either wear or machining imprecision, at the macroscopic scale all edges have curvature. This curvature can have significant impact when comparing results with theory. Both numerical and analytic models for the contact of an object with a sharp edge predict infinite stresses, which are not present in the physical world. It is for this reason that the influence of rounded edges must be studied to better understand how they affect model response. Using a commercially available finite element package, this influence is studied in two different problems: how edge geometry affects the shape of a contusion (bruise), and the accuracy of analytic models for the shaft-loaded blister test (SLBT). The contusion study presents work that can be used to enable medical examiners to better determine if the object in question was capable of causing the contusions present. Using a simple layered tissue model which represents a generic location on the human body, a sweep of objects with different edge properties is studied using a simple strain-based injury metric. This analysis aims to examine the role that contact area and energy have on the formation, location, and shape of the resulting contusion. In studying the SLBT with finite element analysis and cohesive zone modeling, the assessment of various analytic models provides insight into how to accurately measure the fracture energy for both the simulation and the experiment. This provides insight into the interactions between a film, the substrate it is bonded to, and the loading plug. In addition, parametric studies are used to examine potential experimental designs and enable future work in this field. The final product of this project provides tools and insight for future study of the effect rounded edges have on contact, and this work enables more focused studies within desired regimes of interest.
A method for modeling laterally asymmetric proton beamlets resulting from collimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.
2015-03-15
Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX, and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam's eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σ_x1, σ_x2, σ_y1, σ_y2) together with the spatial location of the maximum dose (μ_x, μ_y). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong's fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets.
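The asymmetric-Gaussian BEV fluence can be sketched directly: a different sigma on each side of the peak along each axis. Parameter names are illustrative, and the depth-dependent corrections and divergence terms from the abstract are omitted:

```python
import numpy as np

def asym_gauss_fluence(x, y, mu, sig_lo, sig_hi):
    """Asymmetric-Gaussian BEV fluence: sigma switches at the peak.
    mu = (mu_x, mu_y); sig_lo/sig_hi = (sigma_x, sigma_y) below/above
    the peak along each axis. An illustrative sketch, not the paper's
    fitted parameterization."""
    mux, muy = mu
    sx = np.where(x < mux, sig_lo[0], sig_hi[0])
    sy = np.where(y < muy, sig_lo[1], sig_hi[1])
    return np.exp(-0.5 * ((x - mux) / sx) ** 2) * np.exp(-0.5 * ((y - muy) / sy) ** 2)
```

The function peaks at 1 at (mu_x, mu_y), and one "lower" sigma on one side gives the same falloff as one "upper" sigma on the other, which is exactly the trimmed-penumbra asymmetry the model represents.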
Effect of load introduction on graphite epoxy compression specimens
NASA Technical Reports Server (NTRS)
Reiss, R.; Yao, T. M.
1981-01-01
Compression testing of modern composite materials is affected by the manner in which the compressive load is introduced. Two such effects are investigated: (1) the constrained edge effect, which prevents transverse expansion and is common to all compression testing in which the specimen is gripped in the fixture; and (2) nonuniform gripping, which induces bending into the specimen. An analytical model capable of quantifying the foregoing effects, based upon the principle of minimum complementary energy, was developed. For pure compression, the stresses are approximated by Fourier series. For pure bending, the stresses are approximated by Legendre polynomials.