Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis
NASA Technical Reports Server (NTRS)
Lindstrom, D. G.; Normand, E.; Wilcox, A. D.
1972-01-01
In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared, and the limitations and advantages of the coupling techniques are discussed.
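The core of such a coupling is converting the angular flux that a discrete ordinates code tabulates on a coupling surface into sampled source particles for the Monte Carlo code. The sketch below illustrates that conversion in a minimal, hedged form; the array layout, quadrature set, and all numerical values are invented assumptions, not the NERVA codes' actual interfaces.

```python
import numpy as np

rng = np.random.default_rng(42)

# Angular flux psi[m, g] on one coupling-surface cell: M quadrature
# directions (weights w[m], direction cosines mu[m]) and G energy
# groups, as a discrete ordinates code would tabulate it.
M, G = 8, 4
psi = rng.random((M, G))          # placeholder angular flux values
w = np.full(M, 1.0 / M)           # quadrature weights
mu = np.linspace(-0.94, 0.94, M)  # direction cosines of the S_N set

# The outward partial current w*mu*psi over outgoing directions gives
# each (direction, group) bin's relative source strength.
p = np.zeros((M, G))
out = mu > 0.0
p[out, :] = w[out, None] * mu[out, None] * psi[out, :]
p /= p.sum()

# Sample Monte Carlo source particles from the binned distribution.
idx = rng.choice(p.size, size=10_000, p=p.ravel())
m_idx, g_idx = np.unravel_index(idx, p.shape)
print("source particles per energy group:", np.bincount(g_idx, minlength=G))
```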
Automating a Detailed Cognitive Task Analysis for Structuring Curriculum
1991-08-01
Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: To date we have completed task... The Institute for Management Sciences. Although the particular application of the modified GOMS cognitive task analysis technique under development is... Research Plan, Year 1: Task 1.0 Design; Task 1.1 Conduct...
Enamel paint techniques in archaeology and their identification using XRF and micro-XRF
NASA Astrophysics Data System (ADS)
Hložek, M.; Trojek, T.; Komoróczy, B.; Prokeš, R.
2017-08-01
This investigation focuses in detail on the analysis of discoveries in South Moravia - important sites from the Roman period in Pasohlávky and Mušov. Using X-ray fluorescence analysis and micro-analysis, we help identify the techniques of enamel paint and give a thorough chemical analysis at a level of detail that would not be possible by means of macroscopic examination. We thus address the influence of elemental composition on the final colour of the enamel paint and describe the less-known technique of combining enamel with millefiori. The material analyses of the metal artefacts decorated with enamel paint contribute significantly to our knowledge of the technology used during the Roman period.
Facilitating the exploitation of ERTS imagery using snow enhancement techniques
NASA Technical Reports Server (NTRS)
Wobber, F. J. (Principal Investigator); Martin, K. R.; Sheffield, C.; Russell, O.; Amato, R. V.
1972-01-01
The author has identified the following significant results. Analysis of all available (Gemini, Apollo, Nimbus, NASA aircraft) small-scale snow-covered imagery has been conducted to develop and refine snow enhancement techniques. A detailed photographic interpretation of ERTS-simulation imagery covering the Feather River/Lake Tahoe area was completed, and the 580-680 nm band was determined to be the optimum band for fracture detection. ERTS-1 MSS bands 5 and 7 are best suited for detailed fracture mapping, and the two bands should provide more complete fracture detail when utilized in combination. Analysis of early ERTS-1 data along with U-2 ERTS-simulation imagery indicates that snow enhancement is a viable technique for geological fracture mapping. A wealth of fracture detail on snow-free terrain was noted during preliminary analysis of ERTS-1 images 1077-15005-6 and 7, 1077-15011-5 and 7, and 1079-15124-5 and 7. A direct comparison of data yield on snow-free versus snow-covered terrain will be conducted within these areas following receipt of snow-covered ERTS-1 imagery.
An Efficient Analysis Methodology for Fluted-Core Composite Structures
NASA Technical Reports Server (NTRS)
Oremont, Leonard; Schultz, Marc R.
2012-01-01
The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes and uses shell elements to capture the detailed mechanical response that would normally require solid elements. The shell thicknesses and offsets in this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.
Biomedical surface analysis: Evolution and future directions (Review)
Castner, David G.
2017-01-01
This review describes some of the major advances made in biomedical surface analysis over the past 30–40 years. Starting from single-technique analysis of homogeneous surfaces, it has developed into a complementary, multitechnique approach for obtaining detailed, comprehensive information about a wide range of surfaces and interfaces of interest to the biomedical community. Significant advances have been made in each surface analysis technique, as well as in how the techniques are combined to provide detailed information about biological surfaces and interfaces. The driving force for these advances has been that the surface of a biomaterial is the interface between the biological environment and the biomaterial, and so the state of the art in instrumentation, experimental protocols, and data analysis methods needs to be developed so that the detailed surface structure and composition of biomedical devices can be determined and related to their biological performance. Examples of these advances, as well as areas for future developments, are described for immobilized proteins, complex biomedical surfaces, nanoparticles, and 2D/3D imaging of biological materials. PMID:28438024
Nuclear reactor descriptions for space power systems analysis
NASA Technical Reports Server (NTRS)
Mccauley, E. W.; Brown, N. J.
1972-01-01
For the small, high-performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for a mission analysis study. It has been found possible, after generating only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure the desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.
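As an illustration of this curve-fit shortcut, the sketch below fits a low-order surrogate through a handful of detailed design points and then inverts it, as a fast sizing loop might. All values, and the choice of k_eff versus core radius as the fitted relation, are invented for illustration; they are not NERVA-era design data.

```python
import numpy as np

# (core radius [cm], k_eff) from a few hypothetical "elaborate"
# neutronic calculations of one reactor family
radius = np.array([20.0, 25.0, 30.0, 35.0])
keff = np.array([0.95, 1.00, 1.04, 1.07])

coeffs = np.polyfit(radius, keff, deg=2)   # a low-order fit is adequate

def keff_estimate(r):
    """Fast surrogate for the detailed neutronics calculation."""
    return np.polyval(coeffs, r)

# Invert the fit to find the radius giving a desired k_eff, as a
# mission-analysis sizing loop might do thousands of times.
target = 1.02
a, b, c = coeffs
roots = np.roots([a, b, c - target])
print([round(r.real, 2) for r in roots
       if abs(r.imag) < 1e-9 and 20.0 <= r.real <= 35.0])
```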
Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lee Kenneth
2017-03-01
This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
NASA Astrophysics Data System (ADS)
Sinha, Mangalika; Modi, Mohammed H.
2017-10-01
In-depth compositional analysis of a 240 Å thick aluminium oxide thin film has been carried out using soft x-ray reflectivity (SXR) and the x-ray photoelectron spectroscopy (XPS) technique. The compositional details of the film are estimated by modelling the optical index profile obtained from the SXR measurements over the 60-200 Å wavelength region. The SXR measurements were carried out at the Indus-1 reflectivity beamline. The method suggests that the principal film region comprises Al2O3 and AlOx (x = 1.6) phases, whereas the interface region comprises a mixture of SiO2 and AlOx (x = 1.6). The soft x-ray reflectivity technique combined with XPS measurements explains the compositional details of the principal layer. Since the interface region cannot be analyzed non-destructively with XPS, the SXR technique is a powerful tool for nondestructive compositional analysis of the interface region.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
Solid State Audio/Speech Processor Analysis.
1980-03-01
techniques. The techniques were demonstrated to be worthwhile in an efficient real-time AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD
A constant current charge technique for low Earth orbit life testing
NASA Technical Reports Server (NTRS)
Glueck, Peter
1991-01-01
A constant current charge technique for low Earth orbit testing of nickel cadmium cells is presented. The method mimics the familiar taper charge of the constant potential technique while maintaining cell independence for statistical analysis. A detailed example application is provided, and the advantages and disadvantages of the technique are discussed.
Electrolytic preconcentration in instrumental analysis.
Sioda, R E; Batley, G E; Lund, W; Wang, J; Leach, S C
1986-05-01
The use of electrolytic deposition as a separation and preconcentration step in trace metal analysis is reviewed. Both the principles and applications of the technique are dealt with in some detail. Electrolytic preconcentration can be combined with a variety of instrumental techniques. Special attention is given to stripping voltammetry, potentiometric stripping analysis, different combinations with atomic-absorption spectrometry, and the use of flow-through porous electrodes. It is pointed out that the electrolytic preconcentration technique deserves more extensive use as well as fundamental investigation.
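A quick, hedged calculation shows why the preconcentration step pays off. Assuming a first-order depletion model for mass-transport-limited deposition in a stirred cell (a standard idealization, not a result taken from this review), the collected fraction grows with deposition time as follows; all parameter values are arbitrary illustrations.

```python
import math

def deposited_fraction(t_s, m_cm_s=1e-3, area_cm2=0.1, volume_cm3=10.0):
    """Fraction of analyte plated after t_s seconds of deposition,
    assuming mass-transport-limited deposition with mass-transfer
    coefficient m, electrode area A, and cell volume V:
        C(t)/C0 = exp(-m*A*t/V)."""
    return 1.0 - math.exp(-m_cm_s * area_cm2 * t_s / volume_cm3)

for t in (60, 300, 900):  # 1, 5, and 15 min deposition steps
    print(f"{t:4d} s -> {100 * deposited_fraction(t):5.2f} % collected")
```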
NASA Technical Reports Server (NTRS)
Succi, G. P.
1983-01-01
The techniques of helicopter rotor noise prediction attempt to describe precisely the details of the noise field and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The Farassat noise prediction technique was studied, and high-speed helicopter noise prediction using more detailed representations of the thickness and loading noise sources was investigated. These predictions were based on the measured blade surface pressures on an AH-1G rotor and compared to the measured sound field. Although refinements in the representation of the thickness and loading noise sources improve the calculation, there are still discrepancies between the measured and predicted sound fields. Analysis of the blade surface pressure data indicates shocks on the blades, which are probably responsible for these discrepancies.
Space Construction System Analysis. Part 2: Executive summary
NASA Technical Reports Server (NTRS)
1980-01-01
A detailed, end-to-end analysis of the activities, techniques, equipment and Shuttle provisions required to construct a reference project system is described. Included are: platform definition; construction analysis; cost and programmatics; and space construction experiment concepts.
Mulware, Stephen Juma
2015-01-01
The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed. The details of sample preparation, detection, and data collection and analysis are discussed. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.
NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.
Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus
2014-12-01
We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.
Transportation Network Analysis and Decomposition Methods
DOT National Transportation Integrated Search
1978-03-01
The report outlines research in transportation network analysis using decomposition techniques as a basis for problem solutions. Two transportation network problems were considered in detail: a freight network flow problem and a scheduling problem fo...
Some failure modes and analysis techniques for terrestrial solar cell modules
NASA Technical Reports Server (NTRS)
Shumka, A.; Stern, K. H.
1978-01-01
Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.
ERIC Educational Resources Information Center
Gray, John S.
1994-01-01
A detailed analysis and computer-based solution to a puzzle addressing the arrangement of dominoes on a grid is presented. The problem is one used in a college-level data structures or algorithms course. The solution uses backtracking to generate all possible answers. Details of the use of backtracking and techniques for mapping abstract problems…
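The backtracking strategy the abstract names is easy to make concrete. Below is a generic sketch for the classic variant of the problem — enumerating domino tilings of a small grid — since the article's exact puzzle rules are not reproduced here.

```python
# Backtracking enumerator for domino tilings of a rows x cols grid.
def count_tilings(rows, cols):
    filled = [[False] * cols for _ in range(rows)]

    def first_empty():
        for r in range(rows):
            for c in range(cols):
                if not filled[r][c]:
                    return r, c
        return None

    def backtrack():
        cell = first_empty()
        if cell is None:
            return 1                        # every cell covered: one tiling
        r, c = cell
        total = 0
        # try a horizontal domino, then a vertical one; undo each placement
        if c + 1 < cols and not filled[r][c + 1]:
            filled[r][c] = filled[r][c + 1] = True
            total += backtrack()
            filled[r][c] = filled[r][c + 1] = False
        if r + 1 < rows and not filled[r + 1][c]:
            filled[r][c] = filled[r + 1][c] = True
            total += backtrack()
            filled[r][c] = filled[r + 1][c] = False
        return total

    return backtrack()

print(count_tilings(2, 3))   # 3
print(count_tilings(4, 4))   # 36
```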
ERIC Educational Resources Information Center
Loehlin, James H.; Norton, Alexandra P.
1988-01-01
Describes a crystallography experiment using both diffraction-angle and diffraction-intensity information to determine the lattice constant and a lattice-independent molecular parameter, while still employing standard X-ray powder diffraction techniques. Details the method, experimental procedure, and analysis for this activity. (CW)
Exploitation of ERTS-1 imagery utilizing snow enhancement techniques
NASA Technical Reports Server (NTRS)
Wobber, F. J.; Martin, K. R.
1973-01-01
Photogeological analysis of ERTS-simulation and ERTS-1 imagery of snow-covered terrain within the ERAP Feather River site and within the New England (ERTS) test area provided new fracture detail which does not appear on available geological maps. Comparative analysis of snow-free ERTS-1 images has demonstrated that MSS Bands 5 and 7 supply the greatest amount of geological fracture detail. Interpretation of the first snow-covered ERTS-1 images in correlation with ground snow depth data indicates that a heavy blanket of snow (more than 9 inches) accentuates major structural features, while a light "dusting" (less than 1 inch) accentuates more subtle topographic expressions. An effective mail-based method for acquiring timely ground-truth (snow depth) information was established and provides a ready correlation of fracture detail with snow depth so as to establish the working limits of the technique. The method is both efficient and inexpensive compared with the cost of similarly scaled direct field observations.
Validation of helicopter noise prediction techniques
NASA Technical Reports Server (NTRS)
Succi, G. P.
1981-01-01
The current techniques of helicopter rotor noise prediction attempt to describe the details of the noise field precisely and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The purpose of this paper is to review those techniques in general and the Farassat/Nystrom analysis in particular. The predictions of the Farassat/Nystrom noise computer program, using both measured and calculated blade surface pressure data, are compared to measured noise level data. This study is based on a contract from NASA to Bolt Beranek and Newman Inc. with measured data from the AH-1G Helicopter Operational Loads Survey flight test program supplied by Bell Helicopter Textron.
In-situ identification of anti-personnel mines using acoustic resonant spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, R L; Roberts, R S
1999-02-01
A new technique for identifying buried anti-personnel mines is described, and a set of preliminary experiments designed to assess the feasibility of this technique is presented. Analysis of the experimental results indicates that the technique has potential, but additional work is required to bring the technique to fruition. In addition to the experimental results presented here, a technique used to characterize the sensor employed in the experiments is detailed.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
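A hedged sketch of that two-part scheme follows: a one-pass sequential clustering creates the initial clusters, which then seed an iterative K-means refinement. The distance threshold stands in for the paper's sequential variance test, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic two-band "multispectral" pixels drawn around three covers
pixels = np.vstack([rng.normal(c, 0.7, size=(100, 2))
                    for c in ([0, 0], [5, 5], [0, 6])])

def sequential_pass(data, threshold=2.0):
    """Part 1: one-pass clustering -- start a new cluster whenever a
    sample lies farther than `threshold` from every existing mean."""
    means, counts = [data[0].copy()], [1]
    for x in data[1:]:
        d = [np.linalg.norm(x - m) for m in means]
        k = int(np.argmin(d))
        if d[k] < threshold:
            counts[k] += 1
            means[k] += (x - means[k]) / counts[k]  # running mean update
        else:
            means.append(x.copy())
            counts.append(1)
    return np.array(means)

def kmeans_pass(data, means, iters=20):
    """Part 2: generalized K-means refinement seeded by part 1."""
    for _ in range(iters):
        dists = ((data[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        means = np.array([data[labels == k].mean(axis=0)
                          if np.any(labels == k) else means[k]
                          for k in range(len(means))])
    return means, labels

seeds = sequential_pass(pixels)
centers, labels = kmeans_pass(pixels, seeds)
print(f"{len(seeds)} initial clusters refined by K-means")
```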
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
NASA Astrophysics Data System (ADS)
Shoukry, Samir N.; William, Gergis W.; Riad, Mourad Y.; McBride, Kevyn C.
2006-08-01
Dynamic relaxation is a technique developed to solve static problems through explicit time integration in a finite element framework. The main advantage of such a technique is the ability to solve a large problem in a relatively short time compared with traditional implicit techniques, especially when using nonlinear material models. This paper describes the use of such a technique in analyzing large transportation structures such as dowel-jointed concrete pavements and a 306-m-long reinforced concrete bridge superstructure under the effect of temperature variations. The main feature of the pavement model is the detailed modeling of dowel bars and their interfaces with the surrounding concrete using an extremely fine mesh of solid elements, while in the bridge structure it is the detailed modeling of the girder-deck interface as well as the bracing members between the girders. The 3DFE results were found to be in good agreement with experimentally measured data obtained from instrumented pavement sections and a highway bridge constructed in West Virginia. Thus, such a technique provides a good tool for analyzing the response of large structures to static loads in a fraction of the time required by traditional, implicit finite element methods.
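The essence of dynamic relaxation can be shown in a few lines: the static solution of K u = f is reached by explicitly integrating a fictitiously damped dynamic system until motion dies out. The sketch below uses a small spring chain as a stand-in for the large pavement and bridge models; all parameters are illustrative.

```python
import numpy as np

n = 10
# stiffness matrix of a fixed-end spring chain (stand-in for an FE model)
K = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))
f = np.ones(n)                       # static load vector

u = np.zeros(n)                      # displacements
v = np.zeros(n)                      # pseudo-velocities
dt, mass, damping = 0.3, 1.0, 0.5    # explicit pseudo-time parameters

for step in range(20000):
    residual = f - K @ u             # out-of-balance force
    if np.linalg.norm(residual) < 1e-8:
        break                        # motion has died out: static solution
    a = (residual - damping * v) / mass
    v += dt * a                      # explicit (semi-implicit Euler) update
    u += dt * v

print("steps:", step,
      " max error:", np.abs(u - np.linalg.solve(K, f)).max())
```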
Multidisciplinary aeroelastic analysis of a generic hypersonic vehicle
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Petersen, K. L.
1993-01-01
This paper presents details of a flutter and stability analysis of aerospace structures such as hypersonic vehicles. Both structural and aerodynamic domains are discretized by the common finite element technique. A vibration analysis is first performed by the STARS code employing a block Lanczos solution scheme. This is followed by the generation of a linear aerodynamic grid for subsequent linear flutter analysis within subsonic and supersonic regimes of the flight envelope; the doublet lattice and constant pressure techniques are employed to generate the unsteady aerodynamic forces. Flutter analysis is then performed for several representative flight points. The nonlinear flutter solution is effected by first implementing a CFD solution of the entire vehicle. Thus, a 3-D unstructured grid for the entire flow domain is generated by a moving front technique. A finite element Euler solution is then implemented employing a quasi-implicit as well as an explicit solution scheme. A novel multidisciplinary analysis is next effected that employs modal and aerodynamic data to yield aerodynamic damping characteristics. Such analyses are performed for a number of flight points to yield a large set of pertinent data that define flight flutter characteristics of the vehicle. This paper outlines the finite-element-based integrated analysis procedures in detail, which is followed by the results of numerical analyses of flight flutter simulation.
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional, supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries, together with discussion, are also presented.
Mei, Liang; Svanberg, Sune
2015-03-20
This work presents a detailed study of the theoretical aspects of the Fourier analysis method, which has been utilized for gas absorption harmonic detection in wavelength modulation spectroscopy (WMS). The lock-in detection of the harmonic signal is accomplished by studying the phase term of the inverse Fourier transform of the Fourier spectrum that corresponds to the harmonic signal. The mathematics and the corresponding simulation results are given for each procedure when applying the Fourier analysis method. The present work provides a detailed view of the WMS technique when applying the Fourier analysis method.
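As a hedged illustration of the method's central step — isolating a harmonic in the Fourier spectrum and inverse-transforming it, in place of a hardware lock-in — the sketch below recovers the 2f component of a toy WMS detector signal. All signal parameters are invented.

```python
import numpy as np

fs, f_mod = 50_000.0, 500.0            # sample rate and modulation freq [Hz]
t = np.arange(0, 0.2, 1 / fs)

# toy detector signal: 1f and 2f components plus noise
rng = np.random.default_rng(1)
sig = (0.5 * np.sin(2 * np.pi * f_mod * t)
       + 0.08 * np.sin(2 * np.pi * 2 * f_mod * t + 0.3)
       + 0.02 * rng.standard_normal(t.size))

spec = np.fft.rfft(sig)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# keep only a narrow band around the 2nd harmonic, zero everything else,
# and inverse-transform to get the time-domain 2f waveform
band = np.abs(freqs - 2 * f_mod) < 50.0
harmonic = np.fft.irfft(np.where(band, spec, 0.0), n=t.size)

print(f"recovered 2f amplitude ~ {np.abs(harmonic).max():.3f} (true 0.08)")
```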
Evaluation of respiratory system mechanics in mice using the forced oscillation technique.
McGovern, Toby K; Robichaud, Annette; Fereydoonzad, Liah; Schuessler, Thomas F; Martin, James G
2013-05-15
The forced oscillation technique (FOT) is a powerful, integrative and translational tool permitting the experimental assessment of lung function in mice in a comprehensive, detailed, precise and reproducible manner. It provides measurements of respiratory system mechanics through the analysis of pressure and volume signals acquired in reaction to predefined, small amplitude, oscillatory airflow waveforms, which are typically applied at the subject's airway opening. The present protocol details the steps required to adequately execute forced oscillation measurements in mice using a computer-controlled piston ventilator (flexiVent; SCIREQ Inc, Montreal, Qc, Canada). The description is divided into four parts: preparatory steps, mechanical ventilation, lung function measurements, and data analysis. It also includes details of how to assess airway responsiveness to inhaled methacholine in anesthetized mice, a common application of this technique which also extends to other outcomes and various lung pathologies. Measurements obtained in naïve mice as well as from an oxidative-stress driven model of airway damage are presented to illustrate how this tool can contribute to a better characterization and understanding of studied physiological changes or disease models as well as to applications in new research areas.
Measurement and control of detailed electronic properties in a single molecule break junction.
Wang, Kun; Hamill, Joseph; Zhou, Jianfeng; Guo, Cunlan; Xu, Bingqian
2014-01-01
The lack of detailed experimental controls has been one of the major obstacles hindering progress in molecular electronics. While large fluctuations occur in the experimental data, specific details, related mechanisms, and data analysis techniques are in high demand to promote our physical understanding at the single-molecule level. A series of modulations we recently developed, based on traditional scanning probe microscopy break junctions (SPMBJs), has helped to discover significant properties in detail which are hidden in the contact interfaces of a single-molecule break junction (SMBJ). For example, in the past we have shown that the correlated force and conductance changes under the saw-tooth modulation and stretch-hold mode of PZT movement revealed inherent differences in the contact geometries of a molecular junction. In this paper, using a bias-modulated SPMBJ and emerging data analysis techniques, we report on the measurement of the altered alignment of the HOMO of benzene molecules as the anchoring group coupling the molecule to the metal electrodes is changed. Further calculations based on Landauer fitting and transition voltage spectroscopy (TVS) demonstrated the effects of modulated bias on the location of the frontier molecular orbitals. Understanding the alignment of the molecular orbitals with the Fermi level of the electrodes is essential for understanding the behaviour of SMBJs and for the future design of more complex devices. With these modulations and analysis techniques, fruitful information has been found about the nature of the metal-molecule junction, providing insightful clues towards the next steps of in-depth study.
Applications of mass spectrometry techniques to autoclave curing of materials
NASA Technical Reports Server (NTRS)
Smith, A. C.
1983-01-01
Mass spectrometer analysis of gases evolved from polymer materials during a cure cycle can provide a wealth of information useful for studying cure properties and procedures. In this paper, data are presented for two materials to support the feasibility of using mass spectrometer gas analysis techniques to enhance the knowledge of autoclave curing of composite materials and provide additional information for process control evaluation. It is expected that this technique will also be useful in working out the details involved in determining the proper cure cycle for new or experimental materials.
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large-scale digital design is not only necessary for, but also pertinent to, effective device testing and enhancement of device reliability. There are at least three major DFT techniques, namely the self-checking, the LSSD, and the partitioning techniques, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. Detailed analyses of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques are also presented.
POLO: a user's guide to Probit Or LOgit analysis.
Jacqueline L. Robertson; Robert M. Russell; N.E. Savin
1980-01-01
This user's guide provides detailed instructions for the use of POLO (Probit Or LOgit), a computer program for the analysis of quantal response data such as that obtained from insecticide bioassays by the techniques of probit or logit analysis. Dosage-response lines may be compared for parallelism or...
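POLO itself is a FORTRAN program, but the probit fit it performs on quantal dose-response data can be sketched with a modern statistics library. The sketch below is a stand-in, not POLO's algorithm, and the dose-mortality numbers are fabricated.

```python
import numpy as np
import statsmodels.api as sm

dose   = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # insecticide dose
n      = np.array([50, 50, 50, 50, 50])         # insects treated
killed = np.array([4, 12, 26, 41, 48])          # responders

# probit regression line: probit(p) = a + b * log10(dose)
X = sm.add_constant(np.log10(dose))
model = sm.GLM(np.column_stack([killed, n - killed]), X,
               family=sm.families.Binomial(link=sm.families.links.Probit()))
fit = model.fit()
a, b = fit.params

# LD50 is where the fitted probit line crosses 50% response
print(f"slope = {b:.2f}, LD50 = {10 ** (-a / b):.2f}")
```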
Utility of fluorescence microscopy in embryonic/fetal topographical analysis.
Zucker, R M; Elstein, K H; Shuey, D L; Ebron-McCoy, M; Rogers, J M
1995-06-01
For topographical analysis of developing embryos, investigators typically rely on scanning electron microscopy (SEM) to provide the surface detail not attainable with light microscopy. SEM is an expensive and time-consuming technique, however, and the preparation procedure may alter morphology and leave the specimen friable. We report that by using a high-resolution compound epifluorescence microscope with inexpensive low-power objectives and the fluorochrome acridine orange, we were able to obtain surface images of fixed or fresh whole rat embryos and fetal palates of considerably greater topographical detail than those obtained using routine light microscopy. Indeed the resulting high-resolution images afford not only superior qualitative documentation of morphological observations, but the capability for detailed morphometry via digitization and computer-assisted image analysis.
NASA Astrophysics Data System (ADS)
Grassi, N.
2005-06-01
In the framework of the extensive study on the wood painting "Madonna dei fusi" attributed to Leonardo da Vinci, Ion Beam Analysis (IBA) techniques were used at the Florence accelerator laboratory to get information about the elemental composition of the paint layers. After a brief description of the basic principle and the general features of IBA techniques, we will illustrate in detail how the analysis allowed us to characterise the pigments of original and restored areas and the substrate composition, and to obtain information about the stratigraphy of the painting, also providing an estimate of the paint layer thickness.
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail which are applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), but also the dependent analysis, will be explained in detail for the first time. These approaches lead to contact angle data and different means of access to specific contact angles which are independent of the skill and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is measured dynamically by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Due to the small distance covered in the dependent analysis (<0.4 mm) and the dominance of counted events with small velocity, the measurements are less influenced by motion dynamics and the procedure can be called "slow moving" analysis. The presented procedures are especially sensitive in the range that reaches from static to "slow moving" dynamic contact angle determination, and are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, specific motion relations for drops on inclined surfaces are presented, together with detailed observations of the reactivity of the freshly cleaned silicon wafer surface, which results in acceleration behaviour (reactive de-wetting). Copyright © 2014 Elsevier Inc. All rights reserved.
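A minimal sketch of the sigmoid-fit idea, assuming a logistic dependence of the downhill contact angle on plate inclination; the functional form, parameter names, and data are illustrative stand-ins for the paper's fit solution.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(alpha, th0, dth, a0, s):
    """Contact angle vs. inclination: lower plateau th0, rise dth,
    midpoint a0, steepness s."""
    return th0 + dth / (1.0 + np.exp(-(alpha - a0) / s))

incl = np.linspace(0, 50, 60)            # plate inclination [deg]
rng = np.random.default_rng(3)
# synthetic downhill angles with measurement noise
theta = sigmoid(incl, 62.0, 18.0, 20.0, 4.0) + rng.normal(0, 0.5, incl.size)

popt, pcov = curve_fit(sigmoid, incl, theta, p0=[60, 20, 25, 5])
th0, dth, a0, s = popt
# the upper plateau gives an operator-independent advancing angle
print(f"advancing angle (upper plateau) ~ {th0 + dth:.1f} deg")
```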
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To realize this, we propose in this paper a novel algorithm based on feature analysis in the shearlet domain. Firstly, as one of the multi-scale geometric analysis (MGA) tools, the theory and advantages of the shearlet transform are introduced. Secondly, after analyzing the defects of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are constructed to improve generalized unsharp masking (GUM). Finally, experimental results on infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.
Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin
2016-05-01
Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and bring a more detailed assignment of bands through the parallel study of erythritol [¹⁵N₄]tetranitrate. In the case of powder diffraction, recently published data were verified, and ¹H, ¹³C, and ¹⁵N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.
Microfluidics for Single-Cell Genetic Analysis
Thompson, A. M.; Paguirigan, A. L.; Kreutz, J. E.; Radich, J. P.; Chiu, D. T.
2014-01-01
The ability to correlate single-cell genetic information to cellular phenotypes will provide the kind of detailed insight into human physiology and disease pathways that is not possible to infer from bulk cell analysis. Microfluidic technologies are attractive for single-cell manipulation due to precise handling and low risk of contamination. Additionally, microfluidic single-cell techniques can allow for high-throughput and detailed genetic analyses that increase accuracy and decrease reagent cost compared to bulk techniques. Incorporating these microfluidic platforms into research and clinical laboratory workflows can fill an unmet need in biology, delivering the highly accurate, highly informative data necessary to develop new therapies and monitor patient outcomes. In this perspective, we describe the current and potential future uses of microfluidics at all stages of single-cell genetic analysis, including cell enrichment and capture, single-cell compartmentalization and manipulation, and detection and analyses. PMID:24789374
Uses of Children's Make-Believe Play in Family Therapy: Theory and Clinical Examples.
ERIC Educational Resources Information Center
Ariel, Shlomo; And Others
1985-01-01
Presents and illustrates by clinical examples a theoretical framework for developing, describing, and analyzing family-therapeutic techniques involving make-believe play. Includes specifications of the therapeutic goals served by the technique and its procedural details and an analysis of its rationale. Draws on a definition of the concept…
Many-body-theory study of lithium photoionization
NASA Technical Reports Server (NTRS)
Chang, T. N.; Poe, R. T.
1975-01-01
A detailed theoretical calculation is carried out for the photoionization of lithium at low energies within the framework of the Brueckner-Goldstone perturbational approach. In this calculation, extensive use is made of the recently developed multiple-basis-set technique. Through this technique, all second-order perturbation terms, plus a number of important classes of terms to infinite order, have been taken into account. Analysis of the results enables one to resolve the discrepancies between two previous works on this subject. The detailed calculation also serves as a test of the convergence of the many-body perturbation-expansion approach.
Identification of cost effective energy conservation measures
NASA Technical Reports Server (NTRS)
Bierenbaum, H. S.; Boggs, W. H.
1978-01-01
In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determination of these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-06-01
The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar) considering technical and economic benefits. The objective of the analysis was to develop guidelines for: reducing energy requirements; reducing conventional fuel use; and identifying economic alternatives for building owners. The analysis was done with a solar system in place. This makes the study unique in that it determines the interaction of energy conservation with a solar system. The study, therefore, established guidelines as to how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. Finally, the lists of energy-conservation techniques and alternative energy sources were reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and installed costs determined. The alternative energy sources reduced to solar. Building construction characteristics were defined for each building for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)
Analysis of Learning Curve Fitting Techniques.
1987-09-01
1986. 15. Neter, John and others. Applied Linear Regression Models. Homewood IL: Irwin, 19-33. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston...lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter et al., Applied
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of SEP thrust system performance for an Encke rendezvous mission. A detailed description of the effects of thrust subsystem hardware tolerances on mission performance is included, together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and the graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
Plume rise study at Colbert Steam Plant--data presentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, T.L.; Coleman, J.H.
1979-05-01
This report makes detailed data on plume rise available for independent analysis by other specialists studying atmospheric dispersion. Techniques of data collection and methods of data reduction are detailed. Data from 24 time-averaged observations of the plume at Colbert Steam Plant, its source, and the meteorological conditions are reported. Most of the data were collected during early to midmorning and are therefore characterized by stable atmospheric conditions. The data are presented in both a summary and a detailed format.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Introduction to Time Series Analysis
NASA Technical Reports Server (NTRS)
Hardin, J. C.
1986-01-01
The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Shock wave viscosity measurements
NASA Astrophysics Data System (ADS)
Celliers, Peter
2013-06-01
Several decades ago a method was proposed and demonstrated to measure the viscosity of fluids at high pressure by observing the oscillatory damping of sinusoidal perturbations on a shock front. A detailed mathematical analysis of the technique carried out subsequently by Miller and Ahrens revealed its potential, as well as a deep level of complexity in the analysis. We revisit the ideas behind this technique in the context of a recent experimental development: two-dimensional imaging velocimetry. The new technique allows one to capture a broad spectrum of perturbations, down to few-micron scale lengths, imposed on a shock front from an initial perturbation. The detailed evolution of the perturbation spectrum is sensitive to the viscosity of the fluid behind the shock front. Initial experiments are aimed at examining the viscosity of shock-compressed SiO2 just above the shock melting transition. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
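The data analysis implied here can be sketched simply: decompose each velocimetry snapshot of the front into spatial Fourier modes and fit the decay of each mode's amplitude, whose rate versus wavenumber encodes the viscosity. The sketch below does this on synthetic profiles with an assumed exponential nu*k^2 damping law — a deliberate simplification of the full Miller-Ahrens analysis, with all values invented.

```python
import numpy as np

L, N = 200e-6, 256                    # field of view [m], samples
x = np.linspace(0.0, L, N, endpoint=False)
nu_true = 5e-5                        # assumed kinematic viscosity [m^2/s]
k5 = 2 * np.pi * 5 / L                # wavenumber of spatial mode 5

times = np.linspace(0.0, 2e-9, 10)    # snapshot times [s]
amps = []
for t in times:
    # synthetic perturbed shock front: one mode, damped at rate nu*k^2
    front = 1e-6 * np.exp(-nu_true * k5**2 * t) * np.sin(5 * 2 * np.pi * x / L)
    amps.append(2.0 * np.abs(np.fft.rfft(front))[5] / N)

# a log-linear fit of the mode-5 amplitude decay recovers nu = rate / k^2
rate = -np.polyfit(times, np.log(amps), 1)[0]
print(f"inferred nu ~ {rate / k5**2:.2e} m^2/s (true {nu_true:.1e})")
```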
Structural Design and Sizing of a Metallic Cryotank Concept
NASA Technical Reports Server (NTRS)
Sleight, David W.; Martin, Robert A.; Johnson, Theodore F.
2013-01-01
This paper presents the structural design and sizing details of a 33-foot (10 m) metallic cryotank concept used as the reference design to compare with the composite cryotank concepts developed by industry as part of NASA's Composite Cryotank Technology Development (CCTD) Project. The structural design methodology and analysis results for the metallic cryotank concept are reported in the paper. The paper describes the details of the metallic cryotank sizing assumptions for the baseline and reference tank designs. In particular, the paper discusses the details of the cryotank weld land design and the analyses performed to obtain a reduced-weight metallic cryotank design using current materials and manufacturing techniques. The paper also discusses advanced manufacturing techniques to spin-form the cryotank domes and compares the potential mass savings to current friction-stir-welded technology.
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.
1976-01-01
Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.
De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert
2012-01-01
To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80, P < .01) was observed between the VGA scores and the manually obtained inverse IQFs. Comparison of the VGA scores and the automated evaluated phantom images showed an even better correlation (r = 0.92, P < .001). The results support the value of contrast-detail phantom analysis for evaluating clinical image quality in chest radiography. © RSNA, 2011.
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
Raman imaging from microscopy to macroscopy: Quality and safety control of biological materials
USDA-ARS?s Scientific Manuscript database
Raman imaging can analyze biological materials by generating detailed chemical images. Over the last decade, tremendous advancements in Raman imaging and data analysis techniques have overcome problems such as long data acquisition and analysis times and poor sensitivity. This review article introdu...
Non-destructive Analysis Reveals Effect of Installation Details on Plywood Siding Performance
Christopher G. Hunt; Gregory T. Schueneman; Steven Lacher; Xiping Wang; R. Sam Williams
2015-01-01
This study evaluated the influence of a variety of construction techniques on the performance of plywood siding and the applied paint, using both ultrasound and conventional visual inspection techniques. The impact of bottom edge contact, flashing vs. caulking board ends, priming the bottom edge, location (Wisconsin vs. Mississippi) and a gap behind the siding to...
ERIC Educational Resources Information Center
Pumfrey, Peter D.
The second edition of this British publication provides details of recent developments in the assessment of reading attainments and the analysis of reading processes. The book begins with a description of various types of reading tests and assessment techniques with consideration given to the purposes for which normative, criterion-referenced, and…
Lagrangian analysis of multiscale particulate flows with the particle finite element method
NASA Astrophysics Data System (ADS)
Oñate, Eugenio; Celigueta, Miguel Angel; Latorre, Salvador; Casas, Guillermo; Rossi, Riccardo; Rojek, Jerzy
2014-05-01
We present a Lagrangian numerical technique for the analysis of flows incorporating physical particles of different sizes. The numerical approach is based on the particle finite element method (PFEM), which blends concepts from particle-based techniques and the FEM. The basis of the Lagrangian formulation for particulate flows and the procedure for modelling the motion of small and large particles that are submerged in the fluid are described in detail. The numerical technique for the analysis of this type of multiscale particulate flow, using a stabilized mixed velocity-pressure formulation and the PFEM, is also presented. Examples of the application of the PFEM to several particulate flow problems are given.
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript provides a review of the current state and the most recent advances, as well as current trends and future prospects, in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.
The use of interactive graphic displays for interpretation of surface design parameters
NASA Technical Reports Server (NTRS)
Talcott, N. A., Jr.
1981-01-01
An interactive computer graphics technique known as the Graphic Data Display method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Data Display method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, which uses data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some additional techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
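For concreteness, a minimal sketch of one technique named above, Kendall's rank correlation applied to hypothetical monthly problem-report counts (note the frequent zeros, which the abstract flags as a complication):

```python
# Sketch of a Kendall rank-correlation trend test on monthly problem-report
# counts; the data are hypothetical, not from the MSFC Problem Assessment System.
import numpy as np
from scipy import stats

months = np.arange(12)
reports = np.array([5, 3, 4, 2, 0, 2, 1, 0, 1, 0, 0, 1])  # note frequent zeros

tau, p = stats.kendalltau(months, reports)
print(f"Kendall tau = {tau:.2f}, P = {p:.3f}")  # tau < 0 suggests a downward trend
```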
Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor
2015-01-01
Interest in cryptic species has increased significantly with current progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomic research. However, some species now considered cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the “cryptic or pseudocryptic” dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analysis of the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms, without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences, leading to the argument that the complex is cryptic. Subsequent detailed morphological analyses, however, allowed a number of valid species to be described. Our study of this species complex, which used in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the former cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, differed significantly between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxa. Despite increasing cryptic species designations, morphological techniques retain great potential in copepod taxonomy. PMID:26120427
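As a hedged illustration of the multivariate workflow the authors describe, the sketch below computes a simple fluctuating-asymmetry index from paired left/right measurements and runs a principal component analysis on a synthetic trait matrix; the data layout and FA formula are assumptions for illustration, not the paper's protocol:

```python
# Sketch of the morphometric workflow: a fluctuating-asymmetry (FA) index from
# paired left/right measurements plus PCA on the trait matrix. Data are
# synthetic; the column layout is an assumption, not the authors' dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
left = rng.normal(10.0, 0.5, size=(50, 4))             # 50 individuals, 4 bilateral traits
right = left + rng.normal(0.0, 0.1, size=left.shape)   # small random left-right deviations

fa_index = np.abs(left - right).mean(axis=1)           # simple per-individual FA measure

traits = np.hstack([left, right])
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(traits))
print(scores[:5], fa_index[:5])                        # ordination plus FA summary
```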
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
Hunt, N C; Attanoos, R; Jasani, B
1996-01-01
The use of high temperature antigen retrieval methods has been of major importance in increasing the diagnostic utility of immunocytochemistry. However, these techniques are not without their problems, and in this report attention is drawn to a loss of nuclear morphological detail, including mitotic figures, following microwave antigen retrieval. This was not seen with an equivalent autoclave technique. The phenomenon was quantified using image analysis in a group of B cell lymphomas stained with the antibody L26. Loss of nuclear morphological detail may lead to difficulty in identifying cells accurately, which is important in the diagnostic setting, for example, when trying to distinguish a malignant lymphoid infiltrate within a mixed cell population. In such cases it would clearly be wise to consider the use of alternative high temperature retrieval methods and accept their slightly lower staining enhancement capability compared with the microwave technique. PMID:9038766
14 CFR 1274.801 - Adjustments to performance costs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... NASA's initial cost share or funding levels, detailed cost analysis techniques may be applied, which... shall continue to maintain the share ratio requirements (normally 50/50) stated in § 1274.204(b). ...
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including its measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the preceding theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
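A minimal sketch of the textbook four-step phase-shifting calculation that underlies PSPFP; the dissertation's absolute-measurement and error-compensation algorithms are more elaborate, so this only shows the baseline phase extraction on synthetic fringes:

```python
# Sketch of the core phase-extraction step in four-step phase-shifting
# profilometry (phase shifts of 0, pi/2, pi, 3*pi/2); treat this as the
# textbook baseline, not the dissertation's refined algorithms.
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Recover the wrapped phase map from four shifted fringe images."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes over a tilted surface (hypothetical test object).
x = np.linspace(0, 4 * np.pi, 256)
phi_true = 0.5 * x
imgs = [1.0 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

phi = np.unwrap(wrapped_phase(*imgs))  # 1-D unwrap; 2-D maps need a 2-D unwrapper
print(np.allclose(phi - phi[0], phi_true - phi_true[0], atol=1e-6))
```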
Synthesis of samarium doped gadolinium oxide nanorods, its spectroscopic and physical properties
NASA Astrophysics Data System (ADS)
Boopathi, G.; Gokul Raj, S.; Ramesh Kumar, G.; Mohan, R.; Mohan, S.
2018-06-01
One-dimensional samarium doped gadolinium oxide [Sm:Gd2O3] nanorods have been synthesized successfully through a co-precipitation technique in aqueous solution. The as-synthesized and calcined products were characterized by powder X-ray diffraction, Fourier transform Raman spectroscopy, thermogravimetric/differential thermal analysis, scanning electron microscopy with energy-dispersive X-ray analysis, transmission electron microscopy, Fourier transform infrared spectroscopy, ultraviolet-visible spectrometry, photoluminescence spectroscopy and X-ray photoelectron spectroscopy. The obtained results are discussed in detail.
Gorzsás, András; Sundberg, Björn
2014-01-01
Fourier transform infrared (FT-IR) spectroscopy is a fast, sensitive, inexpensive, and nondestructive technique for chemical profiling of plant materials. In this chapter we discuss the instrumental setup, the basic principles of analysis, and the possibilities for and limitations of obtaining qualitative and semiquantitative information by FT-IR spectroscopy. We provide detailed protocols for four fully customizable techniques: (1) Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS): a sensitive and high-throughput technique for powders; (2) attenuated total reflectance (ATR) spectroscopy: a technique that requires no sample preparation and can be used for solid samples as well as for cell cultures; (3) microspectroscopy using a single element (SE) detector: a technique used for analyzing sections at low spatial resolution; and (4) microspectroscopy using a focal plane array (FPA) detector: a technique for rapid chemical profiling of plant sections at cellular resolution. Sample preparation, measurement, and data analysis steps are listed for each of the techniques to help the user collect the best quality spectra and prepare them for subsequent multivariate analysis.
Development of a Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow obtained from a heat conduction analysis are incorporated in the modal analysis which in turn affects the unsteady flow arising out of interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in much detail testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large scale computations.
IGA: A Simplified Introduction and Implementation Details for Finite Element Users
NASA Astrophysics Data System (ADS)
Agrawal, Vishal; Gautam, Sachin S.
2018-05-01
Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-Uniform Rational B-Splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The simplified transition of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its inception, research in the field of IGA has accelerated, and the method has been applied to various problems. However, the employment of CAD tools in the area of FEA requires adapting the existing implementation procedures to the framework of IGA. The use of IGA also requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner. Hence, in this paper, a simplified introduction to and implementation details for the incorporation of the NURBS-based IGA technique within an existing FEA code are presented. It is shown that with few modifications the standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under in-plane tension is included.
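To make the core modification concrete, here is a hedged sketch of the Cox-de Boor recursion and the rational (NURBS) basis that an IGA code evaluates in place of Lagrange shape functions; the knot vector and weights are illustrative, not taken from the paper's benchmark:

```python
# Minimal sketch of the main ingredient IGA adds to a standard FE code: NURBS
# basis functions built from B-splines via the Cox-de Boor recursion.
import numpy as np

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_basis(p, u, knots, weights):
    """Rational (NURBS) basis: weighted B-splines normalised by their sum."""
    n = np.array([bspline_basis(i, p, u, knots) for i in range(len(weights))])
    wn = n * weights
    return wn / wn.sum()

knots = [0, 0, 0, 0.5, 1, 1, 1]             # open knot vector, degree 2 (illustrative)
weights = np.array([1.0, 0.8, 0.8, 1.0])    # 4 control points (illustrative)
print(nurbs_basis(2, 0.3, knots, weights))  # values form a partition of unity at u = 0.3
```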
A visual analysis of multi-attribute data using pixel matrix displays
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel; Schreck, Tobias
2007-01-01
Charts and tables are commonly used to visually analyze data. These graphics are simple and easy to understand, but charts show only highly aggregated data and present only a limited number of data values while tables often show too many data values. As a consequence, these graphics may either lose or obscure important information, so different techniques are required to monitor complex datasets. Users need more powerful visualization techniques to digest and compare detailed multi-attribute data to analyze the health of their business. This paper proposes an innovative solution based on the use of pixel-matrix displays to represent transaction-level information. With pixel-matrices, users can visualize areas of importance at a glance, a capability not provided by common charting techniques. We present our solutions to use colored pixel-matrices in (1) charts for visualizing data patterns and discovering exceptions, (2) tables for visualizing correlations and finding root-causes, and (3) time series for visualizing the evolution of long-running transactions. The solutions have been applied with success to product sales, Internet network performance analysis, and service contract applications demonstrating the benefits of our method over conventional graphics. The method is especially useful when detailed information is a key part of the analysis.
NASA Technical Reports Server (NTRS)
Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.
2014-01-01
We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
INFRARED SPECTROSCOPY: A TOOL FOR DETERMINATION OF THE DEGREE OF CONVERSION IN DENTAL COMPOSITES
Moraes, Luciene Gonçalves Palmeira; Rocha, Renata Sanches Ferreira; Menegazzo, Lívia Maluf; de AraÚjo, Eudes Borges; Yukimitu, Keizo; Moraes, João Carlos Silos
2008-01-01
Infrared spectroscopy is one of the most widely used techniques for measurement of conversion degree in dental composites. However, to obtain good quality spectra and quantitative analysis from spectral data, appropriate expertise and knowledge of the technique are mandatory. This paper presents important details to use infrared spectroscopy for determination of the conversion degree. PMID:19089207
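As a hedged worked example of the calculation the paper addresses, the degree of conversion is commonly computed from the ratio of an aliphatic C=C band to an aromatic reference band before and after curing; the band assignments and absorbances below are typical illustrative values, not the authors' measurements:

```python
# Sketch of the standard degree-of-conversion (DC) calculation from FT-IR band
# ratios; band assignments (aliphatic C=C near 1638 1/cm vs aromatic reference
# near 1608 1/cm) and the absorbance values are typical illustrative numbers.
aliphatic_uncured, aromatic_uncured = 0.52, 0.40  # peak absorbances, monomer
aliphatic_cured, aromatic_cured = 0.21, 0.39      # peak absorbances, polymer

ratio_uncured = aliphatic_uncured / aromatic_uncured
ratio_cured = aliphatic_cured / aromatic_cured

dc_percent = (1.0 - ratio_cured / ratio_uncured) * 100.0
print(f"Degree of conversion = {dc_percent:.1f}%")  # roughly 58.6% here
```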
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
The techniques required to produce and validate six detailed task timeline scenarios for crew workload studies are described. Specific emphasis is given to: general aviation single pilot instrument flight rules operations in a high density traffic area; fixed path metering and spacing operations; and comparative workload operation between the forward and aft-flight decks of the NASA terminal control vehicle. The validation efforts also provide a cursory examination of the resultant demand workload based on the operating procedures depicted in the detailed task scenarios.
Mixed Stationary Liquid Phases for Gas-Liquid Chromatography.
ERIC Educational Resources Information Center
Koury, Albert M.; Parcher, Jon F.
1979-01-01
Describes a laboratory technique for use in an undergraduate instrumental analysis course that, using the interpretation of window diagrams, prepares a mixed liquid phase column for gas-liquid chromatography. A detailed procedure is provided. (BT)
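A minimal sketch of the window-diagram idea (in the Laub-Purnell sense), assuming solute partition coefficients vary linearly with the volume fraction of one stationary phase; all K values are hypothetical:

```python
# Sketch of the window-diagram optimisation: partition coefficients on a mixed
# phase are assumed linear in the volume fraction of phase A, and the best
# composition maximises the worst-case relative retention. K values are hypothetical.
import numpy as np

k_pure = {"solute1": (120.0, 310.0),   # (K on pure phase A, K on pure phase B)
          "solute2": (150.0, 260.0),
          "solute3": (200.0, 210.0)}

phi = np.linspace(0.0, 1.0, 201)       # volume fraction of phase A
k_mix = {s: phi * ka + (1 - phi) * kb for s, (ka, kb) in k_pure.items()}

names = list(k_pure)
alphas = [np.maximum(k_mix[a], k_mix[b]) / np.minimum(k_mix[a], k_mix[b])
          for i, a in enumerate(names) for b in names[i + 1:]]
window = np.min(alphas, axis=0)        # worst solute pair at each composition

best = np.argmax(window)
print(f"optimum phi_A = {phi[best]:.2f}, min alpha = {window[best]:.3f}")
```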
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus
2011-11-01
The preventive application of automated latent fingerprint acquisition devices can enhance the Homeland Defence, e.g. by improving the border security. Here, contact-less optical acquisition techniques for the capture of traces are subject to research; chromatic white light sensors allow for multi-mode operation using coarse or detailed scans. The presence of potential fingerprints could be detected using fast coarse scans. Those Regions-of- Interest can be acquired afterwards with high-resolution detailed scans to allow for a verification or identification of individuals. An acquisition and analysis of fingerprint traces on different objects that are imported or pass borders might be a great enhancement for security. Additionally, if suspicious objects require a further investigation, an initial securing of potential fingerprints could be very useful. In this paper we show current research results for the coarse detection of fingerprints to prepare the detailed acquisition from various surface materials that are relevant for preventive applications.
Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study
NASA Astrophysics Data System (ADS)
Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh
2018-03-01
A lack of problem-solving skills and poor cooperation between support groups are two obstacles frequently faced on actual production lines. Inadequate detailed analysis and inappropriate problem-solving techniques can lead to recurring issues that affect organizational performance. This study utilizes a well-structured Six Sigma DMAIC approach in combination with other problem-solving tools to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to scrap and rework. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased substantially and the process capability index improved from 0.75 to 1.67. These results demonstrate that the Six Sigma approach used to tackle the quality problem is substantially effective.
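For reference, a minimal sketch of the process-capability calculation behind the reported improvement from 0.75 to 1.67; the specification limits and measurements below are hypothetical:

```python
# Sketch of the process-capability index (Cpk) calculation; specification
# limits and the simulated measurements are hypothetical, not from the study.
import numpy as np

def cpk(samples, lsl, usl):
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

rng = np.random.default_rng(1)
strip_length = rng.normal(loc=5.00, scale=0.05, size=200)  # mm, post-improvement
print(f"Cpk = {cpk(strip_length, lsl=4.75, usl=5.25):.2f}")  # near 1.67 for this spread
```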
Reliability analysis of the F-8 digital fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goodman, H. A.
1981-01-01
The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems that give aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
Air pollution source identification
NASA Technical Reports Server (NTRS)
Fordyce, J. S.
1975-01-01
Techniques for air pollution source identification are reviewed, and some results obtained with them are evaluated. Described techniques include remote sensing from satellites and aircraft, on-site monitoring, and the use of injected tracers and pollutants themselves as tracers. The use of a large number of trace elements in ambient airborne particulate matter as a practical means of identifying sources is discussed in detail. Sampling and analysis techniques are described, and it is shown that elemental constituents can be related to specific source types such as those found in the earth's crust and those associated with specific industries. Source identification systems are noted which utilize charged particle X-ray fluorescence analysis of original field data.
Aeroservoelastic and Flight Dynamics Analysis Using Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Arena, Andrew S., Jr.
1999-01-01
This document in large part is based on the Masters Thesis of Cole Stephens. The document encompasses a variety of technical and practical issues involved when using the STARS codes for Aeroservoelastic analysis of vehicles. The document covers in great detail a number of technical issues and step-by-step details involved in the simulation of a system where aerodynamics, structures and controls are tightly coupled. Comparisons are made to a benchmark experimental program conducted at NASA Langley. One of the significant advantages of the methodology detailed is that as a result of the technique used to accelerate the CFD-based simulation, a systems model is produced which is very useful for developing the control law strategy, and subsequent high-speed simulations.
2004-01-01
Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
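As a concrete illustration of one statistic reviewed above, a minimal sketch of the D2 word-count statistic, the inner product of the k-mer count vectors of two sequences (toy sequences, k chosen arbitrarily):

```python
# Sketch of the D2 word-count statistic: count k-mers in two sequences and
# take the inner product of the count vectors. Sequences are toy examples.
from collections import Counter

def kmer_counts(seq, k):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_x, seq_y, k=3):
    cx, cy = kmer_counts(seq_x, k), kmer_counts(seq_y, k)
    return sum(cx[w] * cy[w] for w in cx.keys() & cy.keys())

print(d2("ACGTACGTGAC", "ACGTTTACGAC", k=3))  # larger values: more shared words
```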
NASA Technical Reports Server (NTRS)
Faller, K. H.
1976-01-01
A technique for the detection and measurement of surface feature interfaces in remotely acquired data was developed and evaluated. A computer implementation of this technique was effected to automatically process classified data derived from various sources such as the LANDSAT multispectral scanner and other scanning sensors. The basic elements of the operational theory of the technique are described, followed by the details of the procedure. An example of an application of the technique to the analysis of tidal shoreline length is given with a breakdown of manpower requirements.
Further Developments of the Fringe-Imaging Skin Friction Technique
NASA Technical Reports Server (NTRS)
Zilliac, Gregory C.
1996-01-01
Various aspects and extensions of the Fringe-Imaging Skin Friction technique (FISF) have been explored through the use of several benchtop experiments and modeling. The technique has been extended to handle three-dimensional flow fields with mild shear gradients. The optical and imaging system has been refined and a PC-based application has been written that has made it possible to obtain high resolution skin friction field measurements in a reasonable period of time. The improved method was tested on a wingtip and compared with Navier-Stokes computations. Additionally, a general approach to interferogram-fringe spacing analysis has been developed that should have applications in other areas of interferometry. A detailed error analysis of the FISF technique is also included.
A multiple technique approach to the analysis of urinary calculi.
Rodgers, A L; Nassimbeni, L R; Mulder, K J
1982-01-01
Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features that often go undetected owing to limitations of the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.
Computerized traffic data analysis system.
DOT National Transportation Integrated Search
1975-01-01
The techniques of collecting detailed traffic data for a given site are well known. A popular method uses chart recorders in combination with various vehicle sensing devices, such as tape switches, to provide an accurate pictorial display of the traff...
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with values derived dynamically from the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, yield a significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A
2018-01-01
Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps needed to perform ChIP-seq in Arabidopsis thaliana roots; the protocol also works on other A. thaliana tissues and in most non-woody plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation, and library preparation to the final computational analysis, which is based on a combination of publicly available tools.
Estimating the cost of major ongoing cost plus hardware development programs
NASA Technical Reports Server (NTRS)
Bush, J. C.
1990-01-01
Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.
Comparing digital data processing techniques for surface mine and reclamation monitoring
NASA Technical Reports Server (NTRS)
Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.
1982-01-01
The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.
A collection of flow visualization techniques used in the Aerodynamic Research Branch
NASA Technical Reports Server (NTRS)
1984-01-01
Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that the visualization technique is a precursor to thorough quantitative analysis and subsequent physical understanding of these flow fields.
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review
Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef
2014-01-01
Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409
On Dynamics of Spinning Structures
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Ibrahim, A.
2012-01-01
This paper provides details of developments pertaining to the vibration analysis of gyroscopic systems, which involves a finite element structural discretization followed by the solution of the resulting matrix eigenvalue problem by a progressive, accelerated simultaneous iteration technique. Coriolis, centrifugal and geometric stiffness matrices are thus derived for shell and line elements, followed by the eigensolution details as well as the solution of representative problems that demonstrate the efficacy of the numerical procedures and tools developed here.
MeV ion-beam analysis of optical data storage films
NASA Technical Reports Server (NTRS)
Leavitt, J. A.; Mcintyre, L. C., Jr.; Lin, Z.
1993-01-01
Our objectives are threefold: (1) to accurately characterize optical data storage films by MeV ion-beam analysis (IBA) for ODSC collaborators; (2) to develop new and/or improved analysis techniques; and (3) to expand the capabilities of the IBA facility itself. Using H-1(+), He-4(+), and N-15(++) ion beams in the 1.5 MeV to 10 MeV energy range from a 5.5 MV Van de Graaff accelerator, film thickness (in atoms/sq cm), stoichiometry, impurity concentration profiles, and crystalline structure were determined by Rutherford backscattering (RBS), high-energy backscattering, channeling, nuclear reaction analysis (NRA) and proton induced X-ray emission (PIXE). Most of these techniques are discussed in detail in the ODSC Annual Report (February 17, 1987), p. 74. The PIXE technique is briefly discussed in the ODSC Annual Report (March 15, 1991), p. 23.
Characterization of Low-Molecular-Weight Heparins by Strong Anion-Exchange Chromatography.
Sadowski, Radosław; Gadzała-Kopciuch, Renata; Kowalkowski, Tomasz; Widomski, Paweł; Jujeczka, Ludwik; Buszewski, Bogusław
2017-11-01
Currently, detailed structural characterization of low-molecular-weight heparin (LMWH) products is an analytical subject of great interest. In this work, we carried out a comprehensive structural analysis of LMWHs and applied a modified pharmacopeial method, as well as methods developed by other researchers, to the analysis of novel biosimilar LMWH products; and, for the first time, compared the qualitative and quantitative composition of commercially available drugs (enoxaparin, nadroparin, and dalteparin). For this purpose, we used strong anion-exchange (SAX) chromatography with spectrophotometric detection because this method is more helpful, easier, and faster than other separation techniques for the detailed disaccharide analysis of new LMWH drugs. In addition, we subjected the obtained results to statistical analysis (factor analysis, t-test, and Newman-Keuls post hoc test).
Iqbal, Asif; Allan, Andrew; Afroze, Shirina
2017-08-01
The study assessed the level of efficiency (in both emissions and service quality) that can be achieved for the transport system in Dhaka City, Bangladesh. The assessment technique attempted to quantify the extent of eco-efficiency achievable through system modifications due to planning or strategy. The eco-efficiency analysis was facilitated by detailed survey data on the Dhaka City transport system, collected over 9 months in 2012-2013. Line source modelling (CALINE4) was incorporated to estimate the on-road emission concentration. The eco-efficiency of the transport systems was assessed with the multi-criteria analysis (MCA) technique, which enabled the valuation of the systems' qualitative and quantitative parameters. According to the analysis, addressing driving indiscipline on the road alone promises about a 47% reduction in emissions; driving indiscipline, along with the number of private vehicles, was among the important stressors restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system, together with the potential transport system scenarios, can offer policy makers a checklist to identify the actions needed to deliver greater service to city dwellers with lower emissions, which in turn can bring sustainability to the system.
NASA Technical Reports Server (NTRS)
Benepe, D. B.; Cunningham, A. M., Jr.; Traylor, S., Jr.; Dunmyer, W. D.
1978-01-01
Plotted power spectra for all of the flight points examined during the Phase 2 flight data analysis are presented. Detailed descriptions of the aircraft, the flight instrumentation and the analysis techniques are given. Measured and calculated vibration mode frequencies are also presented to assist in further interpretation of the PSD data.
NASA Technical Reports Server (NTRS)
Dimotakis, P. E.; Collins, D. J.; Lang, D. B.
1979-01-01
A description of both the mean and the fluctuating components of the flow, and of the Reynolds stress, as observed using a dual forward scattering laser-Doppler velocimeter is presented. A detailed description of the instrument and of the data analysis techniques is included in order to fully document the data. A detailed comparison was made between the laser-Doppler results and those presented in Part 1, and an assessment was made of the ability of the laser-Doppler velocimeter to measure the details of the flows involved.
Cytological Analysis of Meiosis in Caenorhabditis elegans
Phillips, Carolyn M.; McDonald, Kent L.; Dernburg, Abby F.
2011-01-01
The nematode Caenorhabditis elegans has emerged as an informative experimental system for analysis of meiosis, in large part because of the advantageous physical organization of meiotic nuclei as a gradient of stages within the germline. Here we provide tools for detailed observational studies of cells within the worm gonad, including techniques for light and electron microscopy. PMID:19685325
Fostering multiple repertoires in undergraduate behavior analysis students
Polson, David A. D.
1995-01-01
Eight techniques used by the author in teaching an introductory applied behavior analysis course are described: (a) a detailed study guide, (b) frequent tests, (c) composition of practice test questions, (d) in-class study groups, (e) fluency building with a computerized flash-card program, (f) bonus marks for participation during question-and-answer sessions, (g) student presentations that summarize and analyze recently published research, and (h) in-class behavior analysis of comic strips. Together, these techniques require an extensive amount of work by students. Nevertheless, students overwhelmingly prefer this approach to the traditional lecture-midterm-final format, and most earn an A as their final course grade. PMID:22478226
Semen analysis: a new manual and its application to the understanding of semen and its pathology
Jequier, Anne M.
2010-01-01
This article reviews the latest edition of the World Health Organization's manual on semen analysis, a comprehensive instructional guide. The methodology used in the assessment of the usual variables in semen analysis is described, as are many of the less common, but very valuable, sperm function tests. Seminal fluid preparation techniques for procedures such as in vitro fertilization and intrauterine insemination are also outlined in the manual. In addition, it details many useful techniques for the assessment of seminal fluid. It will be a very useful manual for any laboratory that carries out analyses of seminal fluid. PMID:20111075
Hopkins, F B; Gravett, M R; Self, A J; Wang, M; Chua, Hoe-Chee; Hoe-Chee, C; Lee, H S Nancy; Sim, N Lee Hoi; Jones, J T A; Timperley, C M; Riches, J R
2014-08-01
Detailed chemical analysis of solutions used to decontaminate chemical warfare agents can be used to support verification and forensic attribution. Decontamination solutions are amongst the most difficult matrices for chemical analysis because of their corrosive and potentially emulsion-based nature. Consequently, there are relatively few publications that report their detailed chemical analysis. This paper describes the application of modern analytical techniques to the analysis of decontamination solutions following decontamination of the chemical warfare agent O-ethyl S-2-diisopropylaminoethyl methylphosphonothiolate (VX). We confirm the formation of N,N-diisopropylformamide and N,N-diisopropylamine following decontamination of VX with hypochlorite-based solution, whereas they were not detected in extracts of hydroxide-based decontamination solutions by nuclear magnetic resonance (NMR) spectroscopy or gas chromatography-mass spectrometry. We report the electron ionisation and chemical ionisation mass spectroscopic details, retention indices, and NMR spectra of N,N-diisopropylformamide and N,N-diisopropylamine, as well as analytical methods suitable for their analysis and identification in solvent extracts and decontamination residues.
Developing techniques for cause-responsibility analysis of occupational accidents.
Jabbari, Mousa; Ghorbani, Roghayeh
2016-11-01
The aim of this study was to specify the causes of occupational accidents and to determine social responsibility and the role of the groups involved in work-related accidents. The study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accident tree analysis (OATA) and occupational accident components analysis (OACA), used in parallel to determine the responsible groups and their rates of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation and analysis, especially for determining detailed lists of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
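As a hedged illustration of the simplest case treated in the paper, the sketch below contrasts ML and ReML-style variance estimates for a spherical GLM; the paper's derivations cover the general non-spherical covariance case, which this does not attempt:

```python
# Sketch of ML versus ReML-style variance estimates for the spherical GLM
# y = X b + e, e ~ N(0, s^2 I); only the simplest special case is shown.
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.8, size=n)

beta_hat, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = float(rss[0])

sigma2_ml = rss / n          # ML estimate (biased low)
sigma2_reml = rss / (n - p)  # ReML coincides with the unbiased estimate here
print(beta_hat, sigma2_ml, sigma2_reml)
```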
Experimental analysis of the flow near the boundary of random porous media
NASA Astrophysics Data System (ADS)
Wu, Zhenxing; Mirbod, Parisa
2018-04-01
The aim of this work is to experimentally examine flow over and near random porous media. Different porous materials were chosen to achieve porosities ranging from 0.95 to 0.99. In this study, we report detailed velocity measurements of the flow over and near random porous material inside a rectangular duct using a planar particle image velocimetry (PIV) technique. Two different Reynolds numbers were achieved by controlling the flow rate. We determined the slip velocity at the interface between the porous media and the free flow. Values of the slip velocity, normalized either by the maximum flow velocity or by the shear rate at the interface and the screening distance K^(1/2), were found to depend on porosity. It was also shown that the depth of penetration inside the porous material was larger than the screening length predicted by Brinkman's model. Moreover, we examined a model for the laminar coupled flow over and inside porous media and analyzed the permeability of a random porous medium. This study provides a detailed analysis of flow over and at the interface of various random porous media using the PIV technique, and it has the potential to serve as a first step toward using random porous media as a new passive technique for controlling the flow over smooth surfaces.
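One common way to extract the interfacial quantities mentioned above is to fit the near-interface portion of the PIV profile and extrapolate to the interface; the sketch below does this on synthetic data, and the permeability value is hypothetical:

```python
# Sketch of extracting a slip velocity and interfacial shear rate from a PIV
# velocity profile: fit the near-interface free-flow region and extrapolate to
# the interface at y = 0. Profile data and permeability are synthetic.
import numpy as np

y = np.linspace(0.2, 2.0, 10)             # mm above the porous interface
u = 4.0 * y + 1.2                         # mm/s, synthetic near-wall profile

shear_rate, u_slip = np.polyfit(y, u, 1)  # du/dy and the intercept at y = 0
K = 2.5e-3                                # mm^2, hypothetical permeability

print(f"u_slip = {u_slip:.2f} mm/s, slip normalised by shear and sqrt(K): "
      f"{u_slip / (shear_rate * np.sqrt(K)):.2f}")
```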
Investigation of optical/infrared sensor techniques for application satellites
NASA Technical Reports Server (NTRS)
Kaufman, I.
1972-01-01
A method of scanning an optical sensor array by acoustic surface waves is discussed. The data cover a detailed computer-based analysis of the operation of a multielement acoustic surface-wave-scanned optical sensor; the development of design and operation techniques that were used to show the feasibility of an integrated array and to design several such arrays; and experimental verification of a number of the calculations with discrete sensor devices.
NASA Technical Reports Server (NTRS)
Foss, W. E., Jr.
1981-01-01
A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.
Overview of Sparse Graph for Multiple Access in Future Mobile Networks
NASA Astrophysics Data System (ADS)
Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui
2017-10-01
Multiple access via sparse graphs, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis, and comparisons with existing multiple access techniques. Such techniques enable multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection in the receiver, and the structures of the sparse graphs are illustrated in detail. Outlooks and challenges of this technique are also presented.
Longenecker, R J; Galazyuk, A V
2012-11-16
Recently, prepulse inhibition of the acoustic startle reflex (ASR) has become a popular technique for tinnitus assessment in laboratory animals. This method confers a significant advantage over the previously used time-consuming behavioral approaches utilizing basic mechanisms of conditioning. Although the technique has been used successfully to assess tinnitus in different laboratory animals, many of the finer details of the methodology have not been described in enough detail to be replicated, yet they are critical for tinnitus assessment. Here we provide a detailed description of key procedures and methodological issues to guide newcomers through the process of learning to apply gap detection techniques correctly for tinnitus assessment in laboratory animals. The major categories of these issues include refinement of hardware for best performance, optimization of stimulus parameters, behavioral considerations, and identification of optimal strategies for data analysis. This article is part of a Special Issue entitled: Tinnitus Neuroscience. Copyright © 2012. Published by Elsevier B.V.
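Although the article should be consulted for the recommended analysis strategies, here is a minimal sketch of one common summary of gap-detection data, the ratio of mean startle amplitude on gap trials to no-gap trials (hypothetical amplitudes):

```python
# Sketch of a common gap-detection summary for tinnitus screening: the ratio
# of mean startle amplitude on gap trials to no-gap trials; ratios near 1
# suggest impaired gap detection. Amplitudes are hypothetical.
import numpy as np

no_gap = np.array([0.92, 1.10, 0.85, 1.05, 0.98])  # startle amplitudes, a.u.
gap    = np.array([0.55, 0.62, 0.48, 0.70, 0.58])  # gap presented before startle

ratio = gap.mean() / no_gap.mean()
print(f"gap/no-gap ratio = {ratio:.2f}")  # about 0.60 here: robust gap-induced inhibition
```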
Abstractions for DNA circuit design.
Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew
2012-03-07
DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.
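As a hedged sketch of the kinetic picture such languages compile down to, the code below integrates a single toehold-mediated displacement reaction, input + gate -> output + waste, under mass-action kinetics; the rate constant and concentrations are illustrative, and the actual strand displacement language handles far richer networks:

```python
# Sketch of the kinetic view underlying strand displacement designs: one
# bimolecular displacement reaction under mass action. Rate constant and
# concentrations are illustrative assumptions.
import numpy as np
from scipy.integrate import odeint

k = 1e5                       # 1/(M s), hypothetical displacement rate constant

def rates(state, t):
    inp, gate, out = state
    v = k * inp * gate        # bimolecular displacement flux
    return [-v, -v, v]

t = np.linspace(0.0, 2000.0, 200)            # seconds
traj = odeint(rates, [1e-7, 1e-7, 0.0], t)   # 100 nM input and gate initially
print(f"output after {t[-1]:.0f} s: {traj[-1, 2] * 1e9:.1f} nM")
```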
Li, Yanyun; Chen, Minjian; Liu, Cuiping; Xia, Yankai; Xu, Bo; Hu, Yanhui; Chen, Ting; Shen, Meiping; Tang, Wei
2018-05-01
Papillary thyroid carcinoma (PTC) is the most common thyroid cancer. Nuclear magnetic resonance (NMR)-based metabolomics is the gold standard in metabolite structural elucidation and provides different coverage of information compared with other metabolomic techniques. Here, we conducted the first NMR-based metabolomic study of the detailed metabolic changes, especially metabolic pathway changes, related to PTC pathogenesis. A 1H NMR-based metabolomic technique was adopted in conjunction with multivariate analysis to analyze matched tumor and normal thyroid tissues obtained from 16 patients. The results were further annotated with the Kyoto Encyclopedia of Genes and Genomes (KEGG) and the Human Metabolome Database, and were then analyzed using the pathway analysis and enrichment analysis modules of MetaboAnalyst 3.0. Based on these analytical techniques, we established principal component analysis (PCA), partial least squares-discriminant analysis (PLS-DA), and orthogonal partial least-squares discriminant analysis (OPLS-DA) models that could discriminate PTC from normal thyroid tissue, and found 15 robust differentiated metabolites from two OPLS-DA models. Using pathway analysis and enrichment analysis, respectively, we identified 8 KEGG pathways and 3 pathways from the Small Molecule Pathway Database that were significantly related to PTC, through which we identified metabolisms related to PTC including branched chain amino acid metabolism (leucine and valine), other amino acid metabolism (glycine and taurine), glycolysis (lactate), the tricarboxylic acid cycle (citrate), choline metabolism (choline, ethanolamine and glycerophosphocholine) and lipid metabolism (very-low-density lipoprotein and low-density lipoprotein). In conclusion, PTC was characterized by increased glycolysis and an inhibited tricarboxylic acid cycle, increased oncogenic amino acids, and abnormal choline and lipid metabolism. The findings of this study provide new insights into the detailed metabolic changes of PTC and hold great potential for its treatment.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require comparative analysis for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used, but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are also prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools that stem from alignment-free methods based on statistical analysis of word frequencies. We provide several clear examples to demonstrate applications and interpretations over several different areas of alignment-free analysis, such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide a detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression.
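Since the review centres on word (k-mer) frequency statistics, a minimal sketch of the idea may be useful. The sequences, the choice of k, and the cosine-similarity scoring below are illustrative assumptions, not the procedure of any specific tool covered in the review.

```python
# Alignment-free comparison by word (k-mer) frequency profiles:
# count overlapping words, then score profile similarity without aligning.
from collections import Counter
from itertools import product
import math

def kmer_profile(seq, k):
    """Count overlapping k-mers (words) in a sequence."""
    return Counter(seq[i:i+k] for i in range(len(seq) - k + 1))

def cosine_similarity(p, q, k):
    """Compare two k-mer profiles over the full word space."""
    words = [''.join(w) for w in product('ACGT', repeat=k)]
    u = [p.get(w, 0) for w in words]
    v = [q.get(w, 0) for w in words]
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

s1 = "ACGTACGTGACCTGA"
s2 = "ACGTTCGTGACGTGA"
print(cosine_similarity(kmer_profile(s1, 3), kmer_profile(s2, 3), 3))
```

Two sequences sharing word composition score near 1 even when a rearrangement would break a conventional alignment, which is exactly the robustness to recombination and shuffling noted above.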
NASA Astrophysics Data System (ADS)
Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia
2007-12-01
To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure employs both an a priori systems-view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps in which child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.
A study of data analysis techniques for the multi-needle Langmuir probe
NASA Astrophysics Data System (ADS)
Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.
2018-06-01
In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Although the linear fitting technique compares well against measurements from incoherent scatter radar and other in situ instruments, the probes could be made longer, and cleaned during operation, to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a larger number of probes is deployed.
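As a rough illustration of the linear-fit technique, the sketch below assumes orbital-motion-limited (OML) collection for cylindrical probes, under which the square of the electron saturation current is approximately linear in bias voltage and the electron density scales as the square root of the fitted slope. The bias values, currents, and the omitted geometry constant are illustrative assumptions, not values from the ICI-2 flight.

```python
# m-NLP "linear fit" sketch: fit I^2 against bias voltage across the probes;
# under OML assumptions the electron density is proportional to sqrt(slope).
import numpy as np

V_bias = np.array([2.5, 4.0, 5.5, 7.0])           # probe biases [V] (assumed)
I = np.array([1.10e-6, 1.40e-6, 1.65e-6, 1.85e-6])  # currents [A] (assumed)

slope, intercept = np.polyfit(V_bias, I**2, 1)    # linear fit of I^2 vs V

# The full proportionality constant depends on probe length/radius and
# electron mass; only a relative density estimate is shown here.
n_e_rel = np.sqrt(slope)
print(f"slope = {slope:.3e} A^2/V, relative density = {n_e_rel:.3e}")
```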
Estimation and prediction of origin-destination matrices for I-66.
DOT National Transportation Integrated Search
2011-09-01
This project uses the Box-Jenkins time-series technique to model and forecast the traffic flows and then uses the flow forecasts to predict the origin-destination matrices. First, a detailed analysis was conducted to investigate the best data cor...
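Although the abstract is truncated above, the Box-Jenkins step it names can be sketched. The counts and the ARIMA order below are illustrative assumptions, not the project's data or model.

```python
# Box-Jenkins sketch: fit an ARIMA model to a traffic-count series and
# forecast ahead; the (p, d, q) order is chosen for illustration only.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

counts = np.array([820, 860, 910, 980, 1040, 1010, 950, 900,
                   870, 890, 940, 1000, 1060, 1030, 970, 920], dtype=float)

fit = ARIMA(counts, order=(1, 1, 1)).fit()
print(fit.forecast(steps=4))   # next four interval flow forecasts
```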
Monopulse azimuth measurement in the ATC Radar Beacon System
DOT National Transportation Integrated Search
1971-12-01
A review is made of the application of sum-difference beam techniques to the ATC Radar Beacon System. A detailed error analysis is presented for the case of a monopulse azimuth measurement based on the existing beacon antenna with a modified fe...
ESR Analysis of Polymer Photo-Oxidation
NASA Technical Reports Server (NTRS)
Kim, Soon Sam; Liang, Ranty Hing; Tsay, Fun-Dow; Gupta, Amitave
1987-01-01
Electron-spin resonance identifies polymer-degradation reactions and their kinetics. New technique enables derivation of kinetic model of specific chemical reactions involved in degradation of particular polymer. Detailed information provided by new method enables prediction of aging characteristics long before degradation manifests itself in macroscopic mechanical properties.
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
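A minimal sketch of one numerical indicator named above, the Modal Assurance Criterion (MAC), computed between the mode shapes of two models; the example mode-shape matrices are illustrative assumptions.

```python
# MAC(i, j) = |phi_i^T phi_j|^2 / ((phi_i^T phi_i)(phi_j^T phi_j));
# modes are tracked by pairing each mode of model A with its best MAC
# match in model B.
import numpy as np

def mac_matrix(Phi_a, Phi_b):
    """MAC between columns (mode shapes) of two models at common DOFs."""
    num = np.abs(Phi_a.T @ Phi_b) ** 2
    den = np.outer(np.sum(Phi_a**2, axis=0), np.sum(Phi_b**2, axis=0))
    return num / den

rng = np.random.default_rng(0)
Phi_a = rng.normal(size=(100, 5))                 # 5 modes of model A (assumed)
Phi_b = Phi_a[:, [1, 0, 2, 4, 3]] + 0.05 * rng.normal(size=(100, 5))

mac = mac_matrix(Phi_a, Phi_b)
print("tracked pairing:", mac.argmax(axis=1))     # best-matching mode in B
```

A mass-weighted cross-orthogonality check has the same structure, with the reduced mass matrix inserted between the two mode-shape factors.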
The physical and empirical basis for a specific clear-air turbulence risk index
NASA Technical Reports Server (NTRS)
Keller, J. L.
1985-01-01
An improved operational CAT detection and forecasting technique, the specific clear air turbulence risk (SCATR) index, is developed and detailed. This index shows some promising results. The improvements seen using hand-analyzed data, a result of the more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into even more sophisticated objective analysis schemes.
Precise measurement of the half-life of the Fermi β decay of ²⁶Alᵐ
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Rebecca J.; Thompson, Maxwell N.; Rassool, Roger P.
2011-08-15
State-of-the-art signal digitization and analysis techniques have been used to measure the half-life of the Fermi β decay of ²⁶Alᵐ. The half-life was determined to be 6347.8 ± 2.5 ms. This new datum contributes to the experimental testing of the conserved-vector-current hypothesis and the required unitarity of the Cabibbo-Kobayashi-Maskawa matrix: two essential components of the standard model. Detailed discussion of the experimental techniques and data analysis and a thorough investigation of the statistical and systematic uncertainties are presented.
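A minimal sketch of extracting a half-life from digitized decay data by least squares; the simulated counts, binning, and simple constant background are illustrative assumptions, and the published analysis treats statistical and systematic effects far more thoroughly.

```python
# Fit counts(t) = A * exp(-ln(2) * t / T_half) + B to a decay curve.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, A, T_half, B):
    return A * np.exp(-np.log(2) * t / T_half) + B

t = np.linspace(0, 30000, 300)                    # time bins [ms] (assumed)
rng = np.random.default_rng(1)
y = decay(t, 1e4, 6347.8, 50.0) + rng.normal(0, 20, t.size)  # simulated data

popt, pcov = curve_fit(decay, t, y, p0=(1e4, 6000.0, 0.0))
print(f"T_1/2 = {popt[1]:.1f} +/- {np.sqrt(pcov[1, 1]):.1f} ms")
```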
A combined Bodian-Nissl stain for improved network analysis in neuronal cell culture.
Hightower, M; Gross, G W
1985-11-01
Bodian and Nissl procedures were combined to stain dissociated mouse spinal cord cells cultured on coverslips. The Bodian technique stains fine neuronal processes in great detail as well as an intracellular fibrillar network concentrated around the nucleus and in proximal neurites. The Nissl stain clearly delimits neuronal cytoplasm in somata and in large dendrites. A combination of these techniques allows the simultaneous depiction of neuronal perikarya and all afferent and efferent processes. Costaining with little background staining by either procedure suggests high specificity for neurons. This procedure could be exploited for routine network analysis of cultured neurons.
Computer Simulation For Design Of TWT's
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1992-01-01
A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.
[The value of methods for morphofunctional analysis of cornea in cataract surgery].
Borodina, N B; Kobzova, M V; Musaeva, G M
2011-01-01
The morphofunctional status of the cornea after extracapsular cataract extraction and phacoemulsification with IOL implantation (30 and 58 operations, respectively) was analyzed in detail using up-to-date diagnostic techniques. The results of examinations using the developed algorithm, which includes study of the light transmission, refraction and protective function of the cornea, show the advantage of the microinvasive ultrasound technique of cataract surgery in terms of minimal impact on corneal structure and on optical and biomechanical characteristics.
Zinken, Katarzyna M; Cradock, Sue; Skinner, T Chas
2008-08-01
The paper presents the development of a coding tool for self-efficacy orientated interventions in diabetes self-management programmes (Analysis System for Self-Efficacy Training, ASSET) and explores its construct validity and clinical utility. Based on four sources of self-efficacy (i.e., mastery experience, role modelling, verbal persuasion and physiological and affective states), published self-efficacy based interventions for diabetes care were analysed in order to identify specific verbal behavioural techniques. Video-recorded facilitating behaviours were evaluated using ASSET. The reliability between four coders was high (K=0.71). ASSET enabled assessment of both self-efficacy based techniques and participants' response to those techniques. Individual patterns of delivery and shifts over time across facilitators were found. In the presented intervention we observed that self-efficacy utterances were followed by longer patient verbal responses than non-self-efficacy utterances. These detailed analyses with ASSET provide rich data and give the researcher an insight into the underlying mechanism of the intervention process. By providing a detailed description of self-efficacy strategies ASSET can be used by health care professionals to guide reflective practice and support training programmes.
Li, Guo-Sheng; Wei, Xian-Yong
2017-01-01
Elucidating the chemical composition of biooil is essential for evaluating the process of lignocellulosic biomass (LCBM) conversion and its upgrading, and for suggesting proper value-added utilization such as producing fuel and feedstock for fine chemicals. Although the main components of LCBM are cellulose, hemicelluloses, and lignin, the chemicals derived from LCBM differ significantly owing to the various feedstocks and methods used for the decomposition. Biooil, produced from pyrolysis of LCBM, contains hundreds of organic chemicals of various classes. This review covers the methodologies used for the componential analysis of biooil, including pretreatments and instrumental analysis techniques. The use of chromatographic and spectrometric methods is highlighted, covering conventional techniques such as gas chromatography, high performance liquid chromatography, Fourier transform infrared spectroscopy, nuclear magnetic resonance, and mass spectrometry. The combination of preseparation methods and instrumental technologies is a robust pathway for detailed componential characterization of biooil. The organic species in biooils can be classified into alkanes, alkenes, alkynes, benzene-ring-containing hydrocarbons, ethers, alcohols, phenols, aldehydes, ketones, esters, carboxylic acids, and other heteroatomic organic compounds. The recent development of high resolution mass spectrometry and multidimensional hyphenated chromatographic and spectrometric techniques has considerably elucidated the composition of biooils.
NASA Astrophysics Data System (ADS)
Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.
2004-08-01
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of individual and combinational effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.
Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora
2018-06-15
Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved dependent on cultivation time. Both whole-spectrum-based normalization techniques, together with full-spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or emphasis on below-threshold peak data. The amounts of processed data remain manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms.
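A minimal sketch of the full-spectrum route described above: normalize each spectrum by a whole-spectrum quantity (total ion current is used here as one plausible choice) and apply PCA to the normalized matrix. The random spectra stand in for real MALDI-TOF data; the study's own utilities (ms-alone, multiMS-toolbox) implement much more, including peak extraction and registration.

```python
# Whole-spectrum normalization followed by PCA on the full spectra.
import numpy as np
from sklearn.decomposition import PCA

spectra = np.abs(np.random.default_rng(2).normal(size=(12, 2000)))  # 12 spectra

tic = spectra.sum(axis=1, keepdims=True)   # total ion current per spectrum
normalized = spectra / tic                 # whole-spectrum normalization

scores = PCA(n_components=2).fit_transform(normalized)
print(scores)   # metabolic patterns would separate along leading components
```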
Edge Preserved Speckle Noise Reduction Using Integrated Fuzzy Filters
Dewal, M. L.; Rohit, Manoj Kumar
2014-01-01
Echocardiographic images inherently contain speckle noise, which makes visual reading and analysis quite difficult. The multiplicative speckle noise masks finer details necessary for diagnosis of abnormalities. A novel speckle reduction technique based on the integration of geometric, wiener, and fuzzy filters is proposed and analyzed in this paper. The denoising applications of fuzzy filters are studied and analyzed along with 26 denoising techniques. It is observed that the geometric filter retains noise and, to address this issue, the wiener filter is embedded into the geometric filter during the iteration process. The performance of the geometric-wiener filter is further enhanced using fuzzy filters, and the proposed despeckling techniques are called integrated fuzzy filters. Fuzzy filters based on moving average and median value are employed in the integrated fuzzy filters. The performances of integrated fuzzy filters are tested on echocardiographic images and synthetic images in terms of image quality metrics. It is observed that the performance parameters are highest in the case of integrated fuzzy filters in comparison to fuzzy and geometric-fuzzy filters. The clinical validation reveals that the output images obtained using geometric-wiener, integrated fuzzy, nonlocal means, and details-preserving anisotropic diffusion filters are acceptable. The necessary finer details are retained in the denoised echocardiographic images.
Energy resolution improvement of CdTe detectors by using the principal component analysis technique
NASA Astrophysics Data System (ADS)
Alharbi, T.
2018-02-01
In this paper, we report on the application of the Principal Component Analysis (PCA) technique for improving the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge trapping reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector that gives an energy resolution of 4.5% (FWHM) with the standard pulse processing method.
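A minimal sketch of the idea as described: the leading principal component of the digitized pulse shapes tracks the charge-trapping distortion, so the pulse amplitude can be corrected as a function of the PCA score. The synthetic waveforms and the simple linear correction are illustrative assumptions, not the paper's exact procedure.

```python
# PCA-based pulse-shape correction: regress amplitude on the leading PCA
# score of the waveforms and remove the trapping-correlated trend.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
t = np.arange(200)
trap = rng.uniform(0.0, 0.2, size=500)            # per-event trapping strength
# Synthetic pulses: a fast step plus a slow, trapping-dependent deficit.
pulses = (1.0 - trap[:, None]) * (1 - np.exp(-t / 10.0)) \
         + trap[:, None] * (1 - np.exp(-t / 80.0))
pulses += rng.normal(0, 0.01, pulses.shape)

amplitude = pulses[:, -1]                         # uncorrected energy estimate
score = PCA(n_components=1).fit_transform(pulses).ravel()

k = np.polyfit(score, amplitude, 1)[0]            # amplitude trend vs score
corrected = amplitude - k * (score - score.mean())
print(f"spread before: {amplitude.std():.4f}, after: {corrected.std():.4f}")
```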
Bartlett, Yvonne K; Sheeran, Paschal; Hawley, Mark S
2014-01-01
Purpose: The purpose of this study was to identify the behaviour change techniques (BCTs) that are associated with greater effectiveness in smoking cessation interventions for people with chronic obstructive pulmonary disease (COPD). Methods: A systematic review and meta-analysis was conducted. Web of Knowledge, CINAHL, EMBASE, PsycINFO, and MEDLINE were searched from the earliest date available to December 2012. Data were extracted and weighted average effect sizes calculated; BCTs used were coded according to an existing smoking cessation-specific BCT taxonomy. Results: Seventeen randomized controlled trials (RCTs) were identified that involved a total sample of 7446 people with COPD. The sample-weighted mean quit rate for all RCTs was 13.19%, and the overall sample-weighted effect size was d+ = 0.33. Thirty-seven BCTs were each used in at least three interventions. Four techniques were associated with significantly larger effect sizes: Facilitate action planning/develop treatment plan, Prompt self-recording, Advise on methods of weight control, and Advise on/facilitate use of social support. Three new COPD-specific BCTs were identified, and Linking COPD and smoking was found to result in significantly larger effect sizes. Conclusions: Smoking cessation interventions aimed at people with COPD appear to benefit from using techniques focussed on forming detailed plans and self-monitoring. Additional RCTs that use standardized reporting of intervention components and BCTs would be valuable to corroborate findings from the present meta-analysis. Statement of contribution: What is already known on this subject? Chronic obstructive pulmonary disease (COPD) is responsible for considerable health and economic burden worldwide, and smoking cessation (SC) is the only known treatment that can slow the decline in lung function experienced. Previous reviews of smoking cessation interventions for this population have established that a combination of pharmacological support and behavioural counselling is most effective. While pharmacological support has been detailed, and effectiveness ranked, the content of behavioural counselling varies between interventions, and it is not clear what the most effective components are. What does this study add? A detailed description of the 'behavioural counselling' component of SC interventions for people with COPD. A meta-analysis to identify effective behaviour change techniques tailored for this population. Discussion of these findings in the context of designing tailored SC interventions.
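The sample-weighted average effect size reported above has a simple form, d+ = Σ wᵢdᵢ / Σ wᵢ. The sketch below uses sample sizes as weights; the trial values are illustrative assumptions rather than the review's data.

```python
# Sample-weighted mean effect size across trials.
d = [0.21, 0.45, 0.30, 0.38]   # per-trial effect sizes (assumed)
n = [220, 540, 310, 415]       # per-trial sample sizes used as weights (assumed)

d_plus = sum(w * e for w, e in zip(n, d)) / sum(n)
print(f"d+ = {d_plus:.2f}")
```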
Fuzzy rationality and parameter elicitation in decision analysis
NASA Astrophysics Data System (ADS)
Nikolova, Natalia D.; Tenekedjiev, Kiril I.
2010-07-01
It is widely recognised by decision analysts that real decision-makers always make estimates in an interval form. An overview of techniques to find an optimal alternative among alternatives with imprecise and interval probabilities is presented. Scalarisation methods are outlined as the most appropriate. A proper continuation of such techniques is fuzzy rational (FR) decision analysis. A detailed representation of the elicitation process as influenced by fuzzy rationality is given. The interval character of probabilities leads to the introduction of ribbon functions, whose general form and special cases are compared with p-boxes. As demonstrated, the approximation of utilities in FR decision analysis does not depend on the probabilities, but the approximation of probabilities is dependent on preferences.
Model reduction methods for control design
NASA Technical Reports Server (NTRS)
Dunipace, K. R.
1988-01-01
Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul
2014-01-01
Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years and of the state of the art in power processing design modeling and analysis techniques used to optimize power processing systems. A methodology for modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing system modeling and analysis is presented so that meaningful results can be obtained each year to aid power processing system engineers and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
One way Doppler extractor. Volume 1: Vernier technique
NASA Technical Reports Server (NTRS)
Blasco, R. W.; Klein, S.; Nossen, E. J.; Starner, E. R.; Yanosov, J. A.
1974-01-01
A feasibility analysis, trade-offs, and implementation for a One Way Doppler Extraction system are discussed. A Doppler error analysis shows that quantization error is a primary source of Doppler measurement error. Several competing extraction techniques are compared and a Vernier technique is developed which obtains high Doppler resolution with low speed logic. Parameter trade-offs and sensitivities for the Vernier technique are analyzed, leading to a hardware design configuration. A detailed design, operation, and performance evaluation of the resulting breadboard model is presented which verifies the theoretical performance predictions. Performance tests have verified that the breadboard is capable of extracting Doppler, on an S-band signal, to an accuracy of less than 0.02 Hertz for a one second averaging period. This corresponds to a range rate error of no more than 3 millimeters per second.
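The two quoted accuracy figures are consistent under the one-way Doppler relation Δv = c·Δf/f_c. The S-band carrier frequency below is an assumed value near 2.1 GHz, since the abstract does not state the exact frequency used.

```python
# Quick check: range-rate error implied by 0.02 Hz Doppler accuracy at S-band.
c = 2.998e8    # speed of light [m/s]
f_c = 2.1e9    # assumed S-band carrier [Hz]
df = 0.02      # Doppler extraction accuracy [Hz]
print(f"range-rate error ~ {c * df / f_c * 1e3:.1f} mm/s")   # ~2.9 mm/s
```

This is in line with the stated bound of no more than 3 millimeters per second.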
Optical Analysis And Alignment Applications Using The Infrared Smartt Interferometer
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.; Bolen, P. D.; Liberman, I.; Seery, B. D.
1981-12-01
The possibility of using the infrared Smartt interferometer for optical analysis and alignment of infrared laser systems has been discussed previously. In this paper, optical analysis of the Gigawatt Test Facility at Los Alamos, as well as a deformable mirror manufactured by Rocketdyne, are discussed as examples of the technique. The possibility of optically characterizing, as well as aligning, pulsed high energy laser systems like Helios and Antares is discussed in some detail.
NASA Technical Reports Server (NTRS)
Benepe, D. B.; Cunningham, A. M., Jr.; Traylor, S., Jr.; Dunmyer, W. D.
1978-01-01
Power spectral density (PSD) data for all of the flight points examined during the Phase 2 flight data analysis are presented in tabular form. Detailed descriptions of the aircraft, the flight instrumentation and the analysis techniques are given. Measured and calculated vibration mode frequencies are also presented to assist in further interpretation of the PSD data.
Seitz, Kelsey E; Smith, Cynthia R; Marks, Stanley L; Venn-Watson, Stephanie K; Ivančić, Marina
2016-12-01
The objective of this study was to establish a comprehensive technique for ultrasound examination of the dolphin hepatobiliary system and to apply this technique to 30 dolphins to determine what, if any, sonographic changes are associated with blood-based indicators of metabolic syndrome (insulin greater than 14 μIU/ml or glucose greater than 112 mg/dl) and iron overload (transferrin saturation greater than 65%). A prospective study of individuals in a cross-sectional population with and without elevated postprandial insulin levels was performed. Twenty-nine bottlenose dolphins (Tursiops truncatus) in a managed collection were included in the final data analysis. An in-water ultrasound technique was developed that included detailed analysis of the liver and pancreas. Dolphins with hyperinsulinemia had larger livers compared with dolphins with nonelevated insulin concentrations. Using stepwise, multivariate regression including blood-based indicators of metabolic syndrome in dolphins, glucose was the best predictor of and had a positive linear association with liver size (P = 0.007, R² = 0.24). Bottlenose dolphins are susceptible to metabolic syndrome and associated complications that affect the liver, including fatty liver disease and iron overload. This study facilitated the establishment of a technique for a rapid, diagnostic, and noninvasive ultrasonographic evaluation of the dolphin liver. In addition, the study identified ultrasound-detectable hepatic changes associated primarily with elevated glucose concentration in dolphins. Future investigations will strive to detail the pathophysiological mechanisms for these changes.
Robb, Paul D; Craven, Alan J
2008-12-01
An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated measurement of atomic column intensity ratios in high-resolution HAADF images. It was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique, as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende-structured AlAs/GaAs superlattice using the 1 Å-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.
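A minimal sketch of the mapping step: once atomic column intensities have been located and integrated, each group-III column intensity is divided by a neighbouring group-V column intensity, and statistics are taken over the resulting map. The intensity values below are illustrative assumptions.

```python
# Column ratio mapping sketch: one composition-sensitive ratio per column pair.
import numpy as np

I_III = np.array([[0.42, 0.44, 0.61, 0.63],
                  [0.41, 0.43, 0.62, 0.60]])   # e.g. Al/Ga sublattice columns
I_V   = np.array([[0.70, 0.71, 0.69, 0.70],
                  [0.69, 0.70, 0.71, 0.70]])   # e.g. As sublattice columns

ratio_map = I_III / I_V                        # whole-image ratio map
print(ratio_map)
print(f"mean = {ratio_map.mean():.2f}, std = {ratio_map.std():.2f}")
```

Statistics over the full map, rather than a few line profiles, are what permit the error estimates and interface-sharpness assessment described above.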
NASA Astrophysics Data System (ADS)
Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; McCleary, S. L.
1991-05-01
State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.
Project Sell, Title VII: Final Evaluation 1970-1971.
ERIC Educational Resources Information Center
Condon, Elaine C.; And Others
This evaluative report consists of two parts. The first is a narrative report which represents a summary by the evaluation team and recommendations regarding project activities; the second part provides a statistical analysis of project achievements. Details are provided on evaluation techniques, staff, management, instructional materials,…
Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C
NASA Technical Reports Server (NTRS)
1972-01-01
A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and the digitization are described, along with the technique of joining adjacent framelets by using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed. It was found that spectacular improvements are possible. Program documentation is included.
Determination of Local Densities in Accreted Ice Samples Using X-Rays and Digital Imaging
NASA Technical Reports Server (NTRS)
Broughton, Howard; Sims, James; Vargas, Mario
1996-01-01
At the NASA Lewis Research Center's Icing Research Tunnel, ice shapes similar to those which develop under in-flight icing conditions were formed on an airfoil. Under cold room conditions these experimental samples were carefully removed from the airfoil, sliced into thin sections, and x-rayed. The resulting microradiographs were developed and the film digitized using a high resolution scanner to extract fine detail in the radiographs. A procedure was devised to calibrate the scanner and to maintain repeatability during the experiment. The techniques of image acquisition and analysis provide accurate local density measurements and reveal the internal characteristics of the accreted ice in greater detail. This paper discusses the methodology by which these samples were prepared, with emphasis on the digital imaging techniques.
Comparative study of resist stabilization techniques for metal etch processing
NASA Astrophysics Data System (ADS)
Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.
1999-06-01
This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process, and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.
Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.
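A minimal sketch of the thickness and growth-rate measurement: convert the pixel-space thicknesses from the image analysis to physical units via a calibration scale and fit a line, consistent with the generally linear growth reported. The times, pixel counts, and scale factor are illustrative assumptions, not values from the RATFac tests.

```python
# Linear ice-growth fit from image-derived thickness measurements.
import numpy as np

t = np.array([0, 30, 60, 90, 120])             # elapsed time [s] (assumed)
thickness_px = np.array([0, 14, 30, 43, 58])   # thickness [pixels] (assumed)
mm_per_px = 0.05                               # scale from a calibration target

rate, offset = np.polyfit(t, thickness_px * mm_per_px, 1)
print(f"growth rate ~ {rate:.3f} mm/s")
```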
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed in other related efforts.
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1974-01-01
Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities were developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000-scale printouts that can be directly overlaid on 7.5-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and these emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.
The evaluation of alternate methodologies for land cover classification in an urbanizing area
NASA Technical Reports Server (NTRS)
Smekofski, R. M.
1981-01-01
The usefulness of LANDSAT in classifying land cover and in identifying and classifying land use change was investigated using an urbanizing area as the study area. The question of which technique is best for classification was the primary focus of the study. The many computer-assisted techniques available for analyzing LANDSAT data were evaluated. Techniques of statistical training (polygons from CRT, unsupervised clustering, polygons from digitizer, and binary masks) were tested with minimum-distance-to-the-mean, maximum-likelihood, and canonical-analysis-with-minimum-distance-to-the-mean classifiers. The twelve output images were compared to photointerpreted samples, ground-verified samples, and a current land use data base. Results indicate that, for a reconnaissance inventory, unsupervised training with the canonical analysis-minimum distance classifier is the most efficient. If more detailed ground truth and ground verification are available, polygon-from-digitizer training with the canonical analysis-minimum distance classifier is more accurate.
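A minimal sketch of the minimum-distance-to-the-mean rule used in the best-performing combination: each pixel is assigned to the class whose mean spectral vector is nearest. The four-band values and class means are illustrative assumptions.

```python
# Minimum-distance-to-the-mean classification of multispectral pixels.
import numpy as np

class_means = np.array([[30.0, 55.0, 70.0, 40.0],    # e.g. urban (assumed)
                        [20.0, 35.0, 90.0, 85.0],    # e.g. vegetation
                        [10.0, 15.0, 12.0,  8.0]])   # e.g. water

pixels = np.array([[28.0, 50.0, 68.0, 42.0],
                   [11.0, 14.0, 13.0,  9.0]])

# Distance from every pixel to every class mean; assign the nearest class.
d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
print(d.argmin(axis=1))   # -> [0 2]
```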
Nuclear analytical techniques in medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesareo, R.
1988-01-01
This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF analysis; the ability of the PIXE microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside cells; and the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements, which will undoubtedly see great development in the immediate future.
Assessment of Southern California environment from ERTS-1
NASA Technical Reports Server (NTRS)
Bowden, L. W.; Viellenave, J. H.
1973-01-01
ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery, and our increasing ability to enhance the imagery, has resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail. They include the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun directed toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Theodore W.; Gesh, Christopher J.; Haas, Daniel A.
This report details efforts to develop a technique which is able to detect and quantify the mass of 240Pu in waste storage tanks and other enclosed spaces. If the isotopic ratios of the plutonium contained in the enclosed space are also known, then this technique is capable of estimating the total mass of the plutonium without physical sample retrieval and radiochemical analysis of hazardous material. Results utilizing this technique are reported for a Hanford Site waste tank (TX-118) and a well-characterized plutonium sample in a laboratory environment.
Digital image processing for information extraction.
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1973-01-01
The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
Ar-39-Ar-40 Ages of Two Nakhlites, MIL03346 and Y000593: A Detailed Analysis
NASA Technical Reports Server (NTRS)
Park, Jisun; Garrison, Daniel; Bogard, Donald
2007-01-01
Radiometric dating of martian nakhlites by several techniques has given similar ages of approximately 1.2-1.4 Ga [e.g. 1, 2]. Unlike the case with shergottites, where the presence of martian atmosphere and inherited radiogenic Ar-40 produces apparent Ar-39-Ar-40 ages older than other radiometric ages, Ar-Ar ages of nakhlites are similar to ages derived by other techniques. However, even in some nakhlites the presence of trapped martian Ar produces some uncertainty in the Ar-Ar age. We present here an analysis of such Ar-Ar ages from the MIL03346 and Y000593 nakhlites.
Differential die-away analysis system response modeling and detector design
NASA Astrophysics Data System (ADS)
Jordan, K. A.; Gozani, T.; Vujic, J.
2008-05-01
Differential die-away analysis (DDAA) is a sensitive technique to detect the presence of fissile materials such as 235U and 239Pu. DDAA uses a high-energy (14 MeV) pulsed neutron generator to interrogate a shipping container. The signature is a fast neutron signal hundreds of microseconds after the cessation of the neutron pulse. This fast neutron signal has a decay time identical to the thermal neutron diffusion decay time of the inspected cargo. The theoretical aspects of a cargo inspection system based on the differential die-away technique are explored. A detailed mathematical model of the system is developed, and experimental results validating this model are presented.
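A minimal sketch of the die-away signature: after the 14 MeV pulse ends, the fast-neutron counting rate is fitted to A·exp(-t/τ) + B, where τ matches the thermal-neutron die-away time of the cargo and a fissile-induced fast component riding on that decay is the detection signature. The counts below are simulated under these assumptions.

```python
# Fit the post-pulse fast-neutron rate to an exponential die-away curve.
import numpy as np
from scipy.optimize import curve_fit

def die_away(t, A, tau, B):
    return A * np.exp(-t / tau) + B

t = np.linspace(50, 800, 60)                      # microseconds after pulse
rng = np.random.default_rng(4)
counts = die_away(t, 5000.0, 250.0, 20.0) + rng.normal(0, 25, t.size)

(A, tau, B), _ = curve_fit(die_away, t, counts, p0=(4000.0, 200.0, 0.0))
print(f"die-away time tau ~ {tau:.0f} us")
```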
A Fishy Problem for Advanced Students
ERIC Educational Resources Information Center
Patterson, Richard A.
1977-01-01
While developing a research course for gifted high school students, improvements were made in a local pond. Students worked for a semester learning research techniques, statistical analysis, and limnology. At the end of the course, the three students produced a joint scientific paper detailing their study of the pond. (MA)
Implementation of radiation shielding calculation methods. Volume 2: Seminar/Workshop notes
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
Detailed descriptions are presented of the input data for each of the MSFC computer codes applied to the analysis of a realistic nuclear propelled vehicle. The analytical techniques employed include cross section data, preparation, one and two dimensional discrete ordinates transport, point kernel, and single scatter methods.
Applied Missing Data Analysis. Methodology in the Social Sciences Series
ERIC Educational Resources Information Center
Enders, Craig K.
2010-01-01
Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
An airborne combined radiometric and magnetic survey was performed for the Department of Energy (DOE) over the Durango A, Durango B, Durango C, and Durango D Detail Areas of southwestern Colorado. The Durango A Detail Area is within the coverage of the Needle Mountains and Silverton 15' map sheets, and the Pole Creek Mountain, Rio Grande Pyramid, Emerald Lake, Granite Peak, Vallecito Reservoir, and Lemon Reservoir 7.5' map sheets of the National Topographic Map Series (NTMS). The Durango B Detail Area is within the coverage of the Silverton 15' map sheet and the Wetterhorn Peak, Uncompahgre Peak, Lake City, Redcloud Peak, Lake San Cristobal, Pole Creek Mountain, and Finger Mesa 7.5' map sheets of the NTMS. The Durango C Detail Area is within the coverage of the Platoro and Wolf Creek Pass 15' map sheets of the NTMS. The Durango D Detail Area is within the coverage of the Granite Lake, Cimarrona Peak, Bear Mountain, and Oakbrush Ridge 7.5' map sheets of the NTMS. Radiometric data were corrected for live time, aircraft and equipment background, cosmic background, atmospheric radon, Compton scatter, and altitude dependence. The corrected data were statistically evaluated, gridded, and contoured to produce maps of the radiometric variables, uranium, potassium, and thorium; their ratios; and the residual magnetic field. These maps have been analyzed in order to produce a multi-variant analysis contour map based on the radiometric response of the individual geological units. A geochemical analysis has been performed, using the radiometric and magnetic contour maps, the multi-variant analysis map, and factor analysis techniques, to produce a geochemical analysis map for the area.
A manual for inexpensive methods of analyzing and utilizing remote sensor data
NASA Technical Reports Server (NTRS)
Elifrits, C. D.; Barr, D. J.
1978-01-01
Instructions are provided for inexpensive methods of using remote sensor data to assist in meeting the need to observe the earth's surface. When possible, relative costs are included. Equipment needed for the analysis of remote sensor data is described, and methods of using these equipment items are included, as well as advantages and disadvantages of the use of individual items. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, land cover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, along with simple techniques for creating an enhanced data item. Techniques described include additive and subtractive (Diazo process) color techniques and enlargement of photos or images. Applications of these processes, including mapping of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.
Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images
NASA Technical Reports Server (NTRS)
Sams, Clarence F.
2016-01-01
The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.
Uranium Detection - Technique Validation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.
As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ 'Accelerator Technology' we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrixes. The FY15 work scope incorporated technical development to improve accuracy, specificity, linearity and range, precision and ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 in addressing these technical challenges, as is summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of that presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectrum in 1 M H₂SO₄, at λmax = 419.5 nm.
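A minimal sketch of quantifying uranium by UV-vis under the approach described: build a linear calibration (Beer-Lambert behaviour assumed) from standards measured at the 419.5 nm U(VI) band, then invert it for an unknown. The standard concentrations and absorbances below are illustrative assumptions, not the report's calibration data.

```python
# Linear absorbance calibration and inversion for an unknown sample.
import numpy as np

conc = np.array([5.0, 10.0, 20.0, 40.0])             # U standards [g/L] (assumed)
absorbance = np.array([0.041, 0.080, 0.163, 0.322])  # A at 419.5 nm (assumed)

slope, intercept = np.polyfit(conc, absorbance, 1)   # calibration line

A_unknown = 0.205
c_unknown = (A_unknown - intercept) / slope
print(f"estimated U concentration ~ {c_unknown:.1f} g/L")
```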
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize by using both traditional software approaches and innovative hardware acceleration technologies.
Super-resolution mapping using multi-viewing CHRIS/PROBA data
NASA Astrophysics Data System (ADS)
Dwivedi, Manish; Kumar, Vinay
2016-04-01
High-spatial-resolution Remote Sensing (RS) data provide detailed information that supports high-definition visual analysis of earth surface features. These data sets also support improved information extraction capabilities at a fine scale. In order to improve the spatial resolution of coarser-resolution RS data, the Super Resolution Reconstruction (SRR) technique, which focuses on multi-angular image sequences, has become widely acknowledged. In this study, multi-angle CHRIS/PROBA data of the Kutch area are used for SR image reconstruction to enhance the spatial resolution from 18 m to 6 m, in the hope of obtaining a better land cover classification. Various SR approaches, including Projection onto Convex Sets (POCS), Robust, Iterative Back Projection (IBP), Non-Uniform Interpolation and Structure-Adaptive Normalized Convolution (SANC), were chosen for this study. Subjective assessment through visual interpretation shows substantial improvement in land cover details. Quantitative measures, including peak signal-to-noise ratio and structural similarity, are used for the evaluation of image quality. It was observed that the SANC SR technique, using the Vandewalle algorithm for low-resolution image registration, outperformed the other techniques. An SVM-based classifier was then used to classify both the SRR data and data resampled to 6 m spatial resolution using bi-cubic interpolation. A comparative analysis between the classified bi-cubic-interpolated and SR-derived CHRIS/PROBA images showed a significant improvement of 10-12% in overall accuracy for the SR-derived data. The results demonstrate that SR methods are able to improve the spatial detail of multi-angle images as well as the classification accuracy.
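A minimal sketch of one of the compared methods, Iterative Back Projection (IBP): repeatedly simulate the low-resolution observation from the current high-resolution estimate and back-project the residual. A single registered low-resolution image, a block-average camera model, and nearest-neighbour back projection are simplifying assumptions; multi-angle SRR as applied to CHRIS/PROBA uses several registered observations.

```python
# Iterative Back Projection (IBP) with a block-average observation model.
import numpy as np

def downsample(x, f):
    """Block-average by factor f (assumed imaging model)."""
    h, w = x.shape
    return x.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(x, f):
    """Nearest-neighbour back projection of the residual."""
    return np.kron(x, np.ones((f, f)))

def ibp(lr, f=3, iters=25, step=1.0):
    hr = upsample(lr, f)                    # initial high-resolution guess
    for _ in range(iters):
        residual = lr - downsample(hr, f)   # mismatch in observation space
        hr = hr + step * upsample(residual, f)
    return hr

lr = np.random.default_rng(5).uniform(size=(6, 6))
hr = ibp(lr)
print(hr.shape, np.abs(downsample(hr, 3) - lr).max())  # residual shrinks
```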
Operationalizing strategic marketing.
Chambers, S B
1989-05-01
The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.
User modeling techniques for enhanced usability of OPSMODEL operations simulation software
NASA Technical Reports Server (NTRS)
Davis, William T.
1991-01-01
The PC based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to being commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000 level priority structure. The simulation on OPSMODEL, then, consists of the following: user defined, user scheduled operations executing within an environment of user defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
NASA Technical Reports Server (NTRS)
Maskew, B.
1979-01-01
The description of the modified code includes details of a doublet subpanel technique in which panels that are close to a velocity calculation point are replaced by a subpanel set. This treatment gives the effect of a higher panel density without increasing the number of unknowns. In particular, the technique removes the close approach problem of the earlier singularity model in which distortions occur in the detailed pressure calculation near panel corners. Removal of this problem allowed a complete wake relaxation and roll-up iterative procedure to be installed in the code. The geometry package developed for the new technique and also for the more general configurations is based on a multiple patch scheme. Each patch has a regular array of panels, but arbitrary relationships are allowed between neighboring panels at the edges of adjacent patches. This provides great versatility for treating general configurations.
Formal methods for modeling and analysis of hybrid systems
NASA Technical Reports Server (NTRS)
Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)
2009-01-01
A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
High efficiency processing for reduced amplitude zones detection in the HRECG signal
NASA Astrophysics Data System (ADS)
Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.
2016-04-01
Summary - This article presents part of a more detailed research effort, proposed for the medium to long term, intended to establish a new philosophy of surface electrocardiogram analysis. The research aims to find indicators of cardiovascular disease at an early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: efficient processing for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving the 12 leads.
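A highly simplified QRS detector in the spirit of the processing chain described (derivative, squaring, moving-window integration, thresholding) might look like the sketch below. The window lengths and threshold are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def detect_qrs(ecg, fs):
    """Return sample indices of candidate QRS complexes; fs in Hz."""
    deriv = np.diff(ecg)                         # emphasize steep QRS slopes
    energy = deriv ** 2                          # rectify by squaring
    win = max(1, int(0.15 * fs))                 # ~150 ms integration window
    integ = np.convolve(energy, np.ones(win) / win, mode="same")
    thresh = 0.5 * integ.max()                   # crude fixed threshold
    refractory = int(0.2 * fs)                   # ~200 ms refractory period
    peaks, last = [], -refractory
    for i in range(1, len(integ) - 1):
        is_peak = integ[i] > thresh and integ[i] >= integ[i - 1] and integ[i] >= integ[i + 1]
        if is_peak and i - last > refractory:
            peaks.append(i)
            last = i
    return peaks
```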
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
Characterisation of the PXIE Allison-type emittance scanner
D'Arcy, R.; Alvarez, M.; Gaynier, J.; ...
2016-01-26
An Allison-type emittance scanner has been designed for PXIE at FNAL with the goal of providing fast and accurate phase space reconstruction. The device has been modified from previous LBNL/SNS designs to operate in both pulsed and DC modes with the addition of water-cooled front slits. Extensive calibration techniques and error analysis allowed confinement of uncertainty to the <5% level (with known caveats). With a 16-bit, 1 MHz electronics scheme the device is able to analyse a pulse with a resolution of 1 μs, allowing for analysis of neutralisation effects. This paper also describes a detailed breakdown of the R&D, as well as post-run analysis techniques.
Statistical analysis of RHIC beam position monitors performance
NASA Astrophysics Data System (ADS)
Calaga, R.; Tomás, R.
2004-04-01
A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods, which can hence be used to improve BPM functioning at RHIC and possibly other accelerators.
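One common way to apply SVD to such screening is to reconstruct the BPM data matrix from the few dominant orbit modes and flag monitors with anomalously large residuals. The sketch below illustrates this idea; the mode count and outlier threshold are assumptions, not RHIC operating values.

```python
import numpy as np

def flag_bad_bpms(data, n_modes=4, z_cut=3.0):
    """data: (turns x BPMs) matrix of turn-by-turn readings."""
    X = data - data.mean(axis=0)                    # remove closed-orbit offset
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    model = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]   # rank-k reconstruction
    resid = np.sqrt(((X - model) ** 2).mean(axis=0))        # per-BPM residual rms
    z = (resid - resid.mean()) / resid.std()
    return np.where(z > z_cut)[0]                   # indices of suspect BPMs
```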
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
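The alternating structure of such a tare-load iteration can be sketched as below. This is a schematic of the general fixed-point idea only, with hypothetical function names; it is not Galway's algorithm or its NASA implementation.

```python
import numpy as np

def iterate_tares(fit_calibration, applied_loads, outputs, zero_outputs,
                  tol=1e-8, max_iter=100):
    """Alternate between refitting the calibration with tare-corrected loads
    and re-estimating the tares from the zero-load gage outputs.
    fit_calibration(loads, outputs) -> predict, a map from outputs to loads."""
    tare = np.zeros(applied_loads.shape[1])
    for _ in range(max_iter):
        predict = fit_calibration(applied_loads + tare, outputs)
        new_tare = predict(zero_outputs)     # load implied by the 'zero' readings
        if np.max(np.abs(new_tare - tare)) < tol:
            return new_tare                  # converged tare-load estimate
        tare = new_tare
    raise RuntimeError("tare iteration did not converge")
```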
NASA Astrophysics Data System (ADS)
Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.
2016-02-01
We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for a rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized for both extraction of organelle content and precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique as it enabled longer mass analysis time, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS capabilities of localized lipid content in comparison to NSI-DOMS. This method provides selective organellar resolution, which complements current biochemical analyses and prompts for subsequent subcellular studies to be performed where limited samples and analyte volume are of concern.
Bigler, Erin D
2015-09-01
Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.
Software Safety Analysis of a Flight Guidance System
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
The analysis of latent fingermarks on polymer banknotes using MALDI-MS.
Scotcher, K; Bradshaw, R
2018-06-08
In September 2016, the UK adopted a new Bank of England (BoE) £5 polymer banknote, followed by the £10 polymer banknote in September 2017. They are designed to be cleaner, stronger and have increased counterfeit resilience; however, fingermark development can be problematic from the polymer material as various security features and coloured/textured areas have been found to alter the effectiveness of conventional fingermark enhancement techniques (FETs). As fingermarks are one of the most widely used forms of identification in forensic cases, it is important that maximum ridge detail be obtained in order to allow for comparison. This research explores the use of matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) profiling and imaging for the analysis of fingermarks deposited on polymer banknotes. The proposed methodology was able to obtain both physical and chemical information from fingermarks deposited in a range of scenarios including; different note areas, depletion series, aged samples and following conventional FETs. The analysis of forensically important molecular targets within these fingermarks was also explored, focussing specifically on cocaine. The ability of MALDI-MS to provide ridge detail and chemical information highlights the forensic applicability of this technique and potential for the analysis of fingermarks deposited onto this problematic surface.
Improved numerical solutions for chaotic-cancer-model
NASA Astrophysics Data System (ADS)
Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair
2017-01-01
In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaoticity. The present work provides a detailed computational study of the cancer model by counterbalancing its sensitive dependence on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive-Over-Relaxation (SOR) method with proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time-history maps, and phase portraits with detailed analysis.
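For reference, a generic SOR sweep on a system written in fixed-point form x = G(x) looks like the following. The relaxation factor and convergence test are illustrative, and the authors' actual discretization of the cancer model is not reproduced here.

```python
import numpy as np

def sor_solve(G, x0, omega=1.2, tol=1e-10, max_iter=10000):
    """Gauss-Seidel-style successive over-relaxation on x = G(x)."""
    x = np.asarray(x0, dtype=float).copy()
    for sweep in range(max_iter):
        delta = 0.0
        for i in range(x.size):
            # Re-evaluate with the freshest values (written for clarity,
            # not efficiency), then relax the update by omega.
            step = omega * (G(x)[i] - x[i])
            x[i] += step
            delta = max(delta, abs(step))
        if delta < tol:
            return x, sweep      # solution and number of sweeps used
    raise RuntimeError("SOR did not converge")
```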
Advances in the application of holography for NDE
NASA Astrophysics Data System (ADS)
Sciammarella, C. A.
1985-01-01
The basic methodology of holographic interferometry in nondestructive testing (NDT) applications are described. Applications to crack detection in ceramic materials, including a crack 50 microns deep in a turbine blade, are discussed in detail. The theoretical principles of holographic interferometry are explained, and a general description of a holographic interferometric recording system is given. A nondestructive interferometric technique for measuring the gradual erosion of calcareous stones exposed to acid rain is also presented. Detailed line drawings illustrating the hologram recording and interferometric fringe pattern analysis elements in an interferometric holographic NDT device are provided.
Received optical power calculations for optical communications link performance analysis
NASA Technical Reports Server (NTRS)
Marshall, W. K.; Burk, B. D.
1986-01-01
The factors affecting optical communication link performance differ substantially from those at microwave frequencies, due to the drastically differing technologies, modulation formats, and effects of quantum noise in optical communications. In addition, detailed design control table calculations for optical systems are less well developed than corresponding microwave system techniques, reflecting the relatively less mature state of development of optical communications. Described below are detailed calculations of received optical signal and background power in optical communication systems, with emphasis on analytic models for accurately predicting transmitter and receiver system losses.
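A minimal received-power calculation of the kind described, using the standard free-space range equation, might be sketched as follows. All numeric values in the usage comment are placeholders, not design-table entries from the report.

```python
import math

def received_power_dbw(p_tx_w, tx_gain_db, rx_gain_db, wavelength_m, range_m,
                       losses_db=0.0):
    """Received power in dBW: transmit power plus gains, minus space and
    system losses (pointing, optics, atmosphere lumped into losses_db)."""
    space_loss_db = 20.0 * math.log10(wavelength_m / (4.0 * math.pi * range_m))
    return (10.0 * math.log10(p_tx_w) + tx_gain_db + space_loss_db
            + rx_gain_db - losses_db)

# Illustrative only: 1 W laser at 1064 nm over 4e8 m with assumed gains/losses
# p_rx = received_power_dbw(1.0, 114.0, 110.0, 1.064e-6, 4.0e8, losses_db=6.0)
```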
A review of light-scattering techniques for the study of colloids in natural waters
Rees, T.F.
1987-01-01
In order to understand the movement of colloidal materials in natural waters, we first need a means of quantifying their physical characteristics. This paper reviews three techniques which utilize light-scattering phenomena to measure the translational diffusion coefficient, the rotational diffusion coefficient, and the electrophoretic mobility of colloids suspended in water. Primary emphasis is on providing sufficient theoretical detail so that hydrologists can evaluate the utility of photon correlation spectrometry, electrophoretic light scattering, and electric birefringence analysis.
2014-01-01
Current musculoskeletal imaging techniques usually target the macro-morphology of articular cartilage or use histological analysis. These techniques are able to reveal advanced osteoarthritic changes in articular cartilage but fail to give detailed information to distinguish early osteoarthritis from healthy cartilage, and this necessitates high-resolution imaging techniques measuring cells and the extracellular matrix within the multilayer structure of articular cartilage. This review provides a comprehensive exploration of the cellular components and extracellular matrix of articular cartilage as well as high-resolution imaging techniques, including magnetic resonance image, electron microscopy, confocal laser scanning microscopy, second harmonic generation microscopy, and laser scanning confocal arthroscopy, in the measurement of multilayer ultra-structures of articular cartilage. This review also provides an overview for micro-structural analysis of the main components of normal or osteoarthritic cartilage and discusses the potential and challenges associated with developing non-invasive high-resolution imaging techniques for both research and clinical diagnosis of early to late osteoarthritis. PMID:24946278
Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage
NASA Technical Reports Server (NTRS)
Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob
2012-01-01
The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity and included the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction were considered. Significant details were included in analysis models of each concept, including penetrations for human access, joint connections, as well as secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry provided cost data for similar launch vehicle components. The results indicated that significant mass as well as cost savings are attainable for the chosen composite concept as compared with a metallic option.
Algorithms and architecture for multiprocessor based circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deutsch, J.T.
Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
NASA Astrophysics Data System (ADS)
Alves de Mesquita, Jayme; Lopes de Melo, Pedro
2004-03-01
Thermally sensitive devices (thermistors) have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) describe the development of a flexible NPR device and (2) evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p < 0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device's response was as good as that of the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirm the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improved diagnosis of sleep-breathing disorders.
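The impact of the two time constants quoted above can be illustrated with a first-order lag model of each sensor. This is a generic model assumed for illustration, not the authors' instrumentation code.

```python
import numpy as np

def first_order_step(t, tau):
    """Unit-step response of a first-order sensor: y(t) = 1 - exp(-t / tau)."""
    return 1.0 - np.exp(-t / tau)

t = np.linspace(0.0, 0.5, 1000)            # 0-500 ms after an airflow step
npr = first_order_step(t, 0.0053)          # tau = 5.3 ms: settles within ~25 ms
thermistor = first_order_step(t, 0.512)    # tau = 512 ms: still rising at 500 ms
```

With these parameters the NPR channel reaches 99% of the step within about five time constants (~27 ms), while the thermistor smears any event shorter than a few seconds, which is why fast hypopneas are poorly characterized.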
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
The Ground Operations Evaluation describes the breadth and depth of the various study elements selected as a result of an operational analysis conducted during the early part of the study. Analysis techniques used for the evaluation are described in detail. Elements selected for further evaluation are identified; the results of the analysis are documented; and a follow-on course of action is recommended. The background and rationale for developing recommendations for the current Shuttle or for future programs is presented.
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
NASA Astrophysics Data System (ADS)
Morandi, V.; Galli, M.; Marabelli, F.; Comoretto, D.
2010-04-01
In this work, we combined an experimental technique and a detailed data analysis to investigate the influence of applied pressure on the anisotropic dielectric functions of highly oriented poly(p-phenylene vinylene) (PPV). The dielectric constants were derived from polarized reflectance spectra recorded through a diamond anvil cell up to 50 kbar. The presence of the diamond anvils strongly affects the measured spectra, requiring the development of an optical model able to take all spurious effects into account. A parametric procedure was then applied to derive the complex dielectric constants for both polarizations as a function of pressure. A detailed analysis of their pressure dependence allows addressing the role of intermolecular interactions and electron-phonon coupling in highly oriented PPV.
Optical method for measuring the surface area of a threaded fastener
Douglas Rammer; Samuel Zelinka
2010-01-01
This article highlights major aspects of a new optical technique to determine the surface area of a threaded fastener; the theoretical framework has been reported elsewhere. Specifically, this article describes general surface area expressions used in the analysis, details of image acquisition system, and major image processing steps contained within the measurement...
Samuel A. Cushman; Michael Chase; Curtice Griffin
2005-01-01
Autocorrelation in animal movements can be both a serious nuisance to analysis and a source of valuable information about the scale and patterns of animal behavior, depending on the question and the techniques employed. In this paper we present an approach to analyzing the patterns of autocorrelation in animal movements that provides a detailed picture of seasonal...
Ryazantsev, S. N.; Skobelev, I. Yu.; Faenov, A. Ya.; ...
2016-12-08
Here, in this paper, we detail the diagnostic technique used to infer the spatially resolved electron temperatures and densities in experiments dedicated to investigating the generation of magnetically collimated plasma jets. It is shown that the relative intensities of the resonance transitions in emitting He-like ions can be used to measure the temperature in such recombining plasmas. The intensities of these transitions are sensitive to the plasma density in the range of 10^16-10^20 cm^-3 and to plasma temperatures from 10 to 100 eV for ions with a nuclear charge Z_n ~ 10. We show how detailed calculations of the emissivity of F VIII ions allow the parameters of the plasma jets created using the ELFIE ns laser facility (Ecole Polytechnique, France) to be determined. Lastly, the diagnostic and analysis technique detailed here can be applied in a broader context than this study, i.e., to diagnose any recombining plasma containing He-like fluorine ions.
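Schematically, such line-ratio thermometry reduces to inverting a modeled ratio-versus-temperature curve. The sketch below uses a placeholder table; the real inversion relies on the detailed F VIII emissivity calculations described in the paper.

```python
import numpy as np

# Assumed, illustrative model table: intensity ratio of two He-like resonance
# transitions as a function of electron temperature (not the paper's values).
te_grid_ev = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0])
ratio_grid = np.array([0.10, 0.22, 0.45, 0.66, 0.83, 1.00])

def infer_te(measured_ratio):
    """Interpolate the model curve to invert ratio -> electron temperature (eV)."""
    return np.interp(measured_ratio, ratio_grid, te_grid_ev)

print(infer_te(0.55))  # temperature implied by a measured ratio of 0.55
```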
NASA Astrophysics Data System (ADS)
Bernard, D.; Serot, O.; Simon, E.; Boucher, L.; Plumeri, S.
2018-01-01
Photon interrogation analysis is a nondestructive technique that allows fissile materials in nuclear waste packages to be identified and quantified. This paper details an automatic procedure that has been developed to simulate the delayed γ-ray spectra from the photofission of several actinides. This calculation tool will be helpful for the detailed design (collimation, shielding, background-noise optimization, etc.) and for the on-line analysis of such a facility.
Hydrogen-fueled scramjets: Potential for detailed combustor analysis
NASA Technical Reports Server (NTRS)
Beach, H. L., Jr.
1976-01-01
Combustion research related to hypersonic scramjet (supersonic combustion ramjet) propulsion is discussed from the analytical point of view. Because the fuel is gaseous hydrogen, mixing is single phase and the chemical kinetics are well known; therefore, the potential for analysis is good relative to hydrocarbon-fueled engines. Recent progress in applying two- and three-dimensional analytical techniques to mixing and reacting flows indicates cause for optimism, and identifies several areas for continuing effort.
Develop advanced nonlinear signal analysis topographical mapping system
NASA Technical Reports Server (NTRS)
1994-01-01
The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis; performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data, and these techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB); this ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information, and a high compression ratio can be achieved to allow minimal storage requirements while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with compatible TOPO input data format; such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.
Diffraction Techniques in Structural Biology
Egli, Martin
2016-01-01
A detailed understanding of chemical and biological function and the mechanisms underlying the molecular activities ultimately requires atomic-resolution structural data. Diffraction-based techniques such as single-crystal X-ray crystallography, electron microscopy, and neutron diffraction are well established and they have paved the road to the stunning successes of modern-day structural biology. The major advances achieved in the last 20 years in all aspects of structural research, including sample preparation, crystallization, the construction of synchrotron and spallation sources, phasing approaches, and high-speed computing and visualization, now provide specialists and nonspecialists alike with a steady flow of molecular images of unprecedented detail. The present unit combines a general overview of diffraction methods with a detailed description of the process of a single-crystal X-ray structure determination experiment, from chemical synthesis or expression to phasing and refinement, analysis, and quality control. For novices it may serve as a stepping-stone to more in-depth treatises of the individual topics. Readers relying on structural information for interpreting functional data may find it a useful consumer guide. PMID:27248784
Evaluation of Factors that Influence Residential Solar Panel Installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M.; Omitaomu, Olufemi A.; Kotikot, Susan M.
Though rooftop photovoltaic (PV) systems are the fastest growing source of distributed generation, detailed information about where they are located and who their owners are is often known only to installers and utility companies. This lack of detailed information is a barrier to policy and financial assessment of solar energy generation and use. To bridge the described data gap, Oak Ridge National Laboratory (ORNL) was sponsored by the Department of Energy (DOE) Office of Energy Policy and Systems Analysis (EPSA) to create an automated approach for detecting and characterizing buildings with installed solar panels using high-resolution overhead imagery. Additionally, ORNL was tasked with using machine learning techniques to classify parcels on which solar panels were automatically detected in the Washington, DC, and Boston areas as commercial or residential, and then providing a list of recommended variables and modeling techniques that could be combined with these results to identify attributes that motivate the installation of residential solar panels. This technical report describes the methodology, results, and recommendations in greater detail, including lessons learned and future work.
Direct solar energy conversion for large scale terrestrial use
NASA Technical Reports Server (NTRS)
Boeer, K. W.; Meakin, J. D.
1975-01-01
Various techniques to increase the open circuit voltage are being explored. It had been previously observed that cells made on CdS deposited from a single source gave a consistently higher V_oc. Further tests have now shown that this effect may in fact relate to differences in source and substrate temperatures. The resulting differences in CdS structure and crystallinity are being documented. Deposits of mixed CdS and ZnS are being produced and will initially be made into cells using the conventional barriering technique. Analysis of I-V characteristics at temperatures between 25 and 110 °C is being perfected to provide nondestructive analysis of the Cu2S. Changes due to vacuum heat treatments and exposure to oxygen are also being monitored by the same technique. Detailed spectral response measurements are being made.
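Nondestructive I-V analysis of this kind usually begins by extracting the open-circuit voltage, short-circuit current, and fill factor from the measured curve. A minimal sketch follows, assuming voltage-ascending data with current decreasing monotonically across zero (an assumption of this illustration).

```python
import numpy as np

def iv_parameters(v, i):
    """v: ascending voltages; i: cell current, positive when generating and
    monotonically decreasing with v. Returns (V_oc, I_sc, fill factor)."""
    v, i = np.asarray(v, dtype=float), np.asarray(i, dtype=float)
    i_sc = np.interp(0.0, v, i)      # short-circuit current at V = 0
    v_oc = np.interp(0.0, -i, v)     # voltage where the current crosses zero
    p_max = np.max(v * i)            # maximum power point
    return v_oc, i_sc, p_max / (v_oc * i_sc)
```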
Redígolo, M M; Sato, I M; Metairon, S; Zamboni, C B
2016-04-01
Several diseases can be diagnosed by observing variations in the concentrations of specific elements in body fluids. In this study, the concentrations of inorganic elements in blood samples of dystrophic (Dmd(mdx)/J) and C57BL/6J (control group) mouse strains were determined. The results obtained from Energy Dispersive X-ray Fluorescence (EDXRF) were compared with the Neutron Activation Analysis (NAA) technique. Both analytical techniques proved appropriate and complementary, offering a new contribution to veterinary medicine as well as detailed knowledge of this pathology.
NASA Technical Reports Server (NTRS)
OBrien, T. Kevin (Technical Monitor); Krueger, Ronald; Minguet, Pierre J.
2004-01-01
The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to tension and three-point bending was studied. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front was used to model the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlation of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherents. In addition, the application of the submodeling technique for the simulation of skin/stringer debond was also studied. Global models made of shell elements and solid elements were studied. Solid elements were used for local submodels, which extended between three and six specimen thicknesses on either side of the delamination front to model the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from the simulations using the submodeling technique were not in agreement with results obtained from full solid models.
In vivo testing for gold nanoparticle toxicity.
Simpson, Carrie A; Huffman, Brian J; Cliffel, David E
2013-01-01
A technique for measuring the toxicity of nanomaterials using a murine model is described. Blood samples are collected via submandibular bleeding while urine samples are collected on cellophane sheets. Both biosamples are then analyzed by inductively coupled plasma optical emission spectroscopy (ICP-OES) for nanotoxicity. Blood samples are further tested for immunological response using a standard Coulter counter. The major organs of interest for filtration are also digested and analyzed via ICP-OES, producing useful information regarding target specificity of the nanomaterial of interest. Collection of the biosamples and analysis afterward is detailed, and the operation of the technique is described and illustrated by analysis of the nanotoxicity of an injection of a modified tiopronin monolayer-protected cluster.
Second-order shaped pulses for solid-state quantum computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Pinaki
2008-01-01
We present the construction and detailed analysis of highly optimized self-refocusing pulse shapes for several rotation angles. We characterize the constructed pulses by the coefficients appearing in the Magnus expansion up to second order. This allows a semianalytical analysis of the performance of the constructed shapes in sequences and composite pulses by computing the corresponding leading-order error operators. Higher orders can be analyzed with the numerical technique suggested by us previously. We illustrate the technique by analyzing several composite pulses designed to protect against pulse amplitude errors, and decoupling sequences for potentially long chains of qubits with on-site and nearest-neighbor couplings.
Application of sensitivity-analysis techniques to the calculation of topological quantities
NASA Astrophysics Data System (ADS)
Gilchrist, Stuart
2017-08-01
Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient of the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all of the topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement it.
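For the most familiar of these quantities, the squashing factor, the final step from the mapping Jacobian is a one-line formula. The sketch below shows it for a 2x2 footpoint Jacobian; the libraries' hard part, computing the Jacobian itself by sensitivity analysis, is not reproduced here.

```python
import numpy as np

def squashing_factor(a, b, c, d):
    """Q = (a^2 + b^2 + c^2 + d^2) / |ad - bc| for the field-line mapping
    Jacobian [[a, b], [c, d]]; large Q marks quasi-separatrix layers where
    the gradient of the connectivity is large."""
    return (a ** 2 + b ** 2 + c ** 2 + d ** 2) / np.abs(a * d - b * c)
```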
LIBS: a potential tool for industrial/agricultural waste water analysis
NASA Astrophysics Data System (ADS)
Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.
2016-04-01
Laser Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with several advantages, including the ability to detect any element in real time. The technique holds potential for environmental monitoring, and analyses of soil, glass, paint, water, plastic, etc. have confirmed its robustness for such applications. Compared to currently available water quality monitoring methods and techniques, LIBS has several advantages: no sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma, which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of the LIBS technique is that it can perform in standoff mode for real-time analysis. Water samples from industrial and agricultural strata tend to contain many pollutants, making the water harmful for consumption. The emphasis of this project is on determining such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is incident on the sample, a plasma is generated which gives multielemental emission spectra. LIBS analysis has shown outstanding success for solid samples. For liquid samples the analysis is challenging, as the liquid can splash due to the high energy of the laser, making it difficult to generate a plasma. This project also deals with determining the most efficient method of testing water samples for qualitative as well as quantitative analysis using LIBS.
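Post-processing of a LIBS spectrum typically starts with peak detection and matching against known emission lines. A rough sketch follows, with SciPy as an assumed dependency; the line positions in the example are illustrative reference values, not measurements from this project.

```python
import numpy as np
from scipy.signal import find_peaks  # assumed dependency

def identify_lines(wavelength_nm, intensity, known_lines, tol_nm=0.1):
    """Match detected peaks to a dictionary of {element: line position (nm)}."""
    peaks, _ = find_peaks(intensity, prominence=intensity.max() * 0.05)
    hits = {}
    for element, line in known_lines.items():
        near = [wavelength_nm[p] for p in peaks
                if abs(wavelength_nm[p] - line) < tol_nm]
        if near:
            hits[element] = near
    return hits

# known_lines = {"Pb": 405.78, "Cd": 228.80}   # illustrative line positions
```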
Practical semen analysis: from A to Z
Brazil, Charlene
2010-01-01
Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076
Modular techniques for dynamic fault-tree analysis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Dugan, Joanne B.
1992-01-01
It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
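The combinatorial side of such a hybrid solution reduces, for static gates, to simple probability algebra over independent basic events; the toy sketch below shows the idea (dynamic modules are the part that requires the Markov solution, which is not reproduced here).

```python
def and_gate(probs):
    """Failure probability of an AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Failure probability of an OR gate over independent basic events."""
    p_ok = 1.0
    for q in probs:
        p_ok *= (1.0 - q)
    return 1.0 - p_ok

# Example: OR of two redundant pairs, each pair failing only if both halves fail
# or_gate([and_gate([1e-3, 1e-3]), and_gate([1e-3, 1e-3])])
```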
Using object-oriented analysis to design a multi-mission ground data system
NASA Technical Reports Server (NTRS)
Shames, Peter
1995-01-01
This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
Analyzing women's roles through graphic representation of narratives.
Hall, Joanne M
2003-08-01
A 1992 triangulated international nursing study of women's health was reported. The researchers used the perspectives of feminism and symbolic interactionism, specifically role theory. A narrative analysis was done to clarify the concept of role integration. The narrative analysis was reported in 1992, but graphic/visual techniques used in the team dialogue process of narrative analysis were not reported due to space limitations. These techniques have not been reported elsewhere and thus remain innovative. Specific steps in the method are outlined here in detail as an audit trail. The process would be useful to other qualitative researchers as an exemplar of one novel way that verbal data can be abstracted visually/graphically. Suggestions are included for aspects of narrative, in addition to roles, that could be depicted graphically in qualitative research.
NASA Astrophysics Data System (ADS)
Uysal, Selcuk Can
In this research, MATLAB Simulink® was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of an uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate engine performance parameters using operating conditions, polytropic efficiencies, material information, and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness, and maximum allowable blade temperature on the main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other external cooling techniques were used in a performance comparison study.
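One common parameterization behind such cooled-engine models is the overall cooling effectiveness; the sketch below states the standard definition and its rearrangement against a maximum allowable blade temperature. This is a generic textbook relation and may differ from the thesis' exact formulation.

```python
def cooling_effectiveness(t_gas, t_metal, t_coolant):
    """Overall effectiveness: phi = (T_gas - T_metal) / (T_gas - T_coolant)."""
    return (t_gas - t_metal) / (t_gas - t_coolant)

def max_gas_temperature(t_metal_max, t_coolant, phi):
    """Rearranged: highest gas temperature a blade at T_metal_max can tolerate
    for a given effectiveness phi and coolant temperature."""
    return (t_metal_max - phi * t_coolant) / (1.0 - phi)

# Illustrative numbers only: a 1250 K blade limit, 900 K coolant, phi = 0.6
# allows a gas temperature of about (1250 - 0.6 * 900) / 0.4 = 1775 K.
```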
Zhang, Zhaowei; Li, Peiwu; Hu, Xiaofeng; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen
2012-01-01
Chemical contaminants in food have caused serious health issues in both humans and animals. Microarray technology is an advanced technique suitable for the analysis of chemical contaminants. In particular, the immuno-microarray approach is one of the most promising methods for chemical contaminant analysis. The use of microarrays for the analysis of chemical contaminants is the subject of this review. Fabrication strategies and detection methods for chemical contaminants are discussed in detail. Application to the analysis of mycotoxins, biotoxins, pesticide residues, and pharmaceutical residues is also described. Finally, future challenges and opportunities are discussed.
Cardiovascular magnetic resonance physics for clinicians: part II
2012-01-01
This is the second of two reviews intended to cover the essential aspects of cardiovascular magnetic resonance (CMR) physics in a way that is understandable and relevant to clinicians using CMR in their daily practice. Starting with the basic pulse sequences and contrast mechanisms described in part I, it briefly discusses further approaches to accelerate image acquisition. It then continues by showing in detail how the contrast behaviour of black blood fast spin echo and bright blood cine gradient echo techniques can be modified by adding rf preparation pulses to derive a number of more specialised pulse sequences. The simplest examples described include T2-weighted oedema imaging, fat suppression and myocardial tagging cine pulse sequences. Two further important derivatives of the gradient echo pulse sequence, obtained by adding preparation pulses, are used in combination with the administration of a gadolinium-based contrast agent for myocardial perfusion imaging and the assessment of myocardial tissue viability using a late gadolinium enhancement (LGE) technique. These two imaging techniques are discussed in more detail, outlining the basic principles of each pulse sequence, the practical steps required to achieve the best results in a clinical setting and, in the case of perfusion, explaining some of the factors that influence current approaches to perfusion image analysis. The key principles of contrast-enhanced magnetic resonance angiography (CE-MRA) are also explained in detail, especially focusing on the timing of the acquisition following contrast agent bolus administration, and current approaches to achieving time-resolved MRA. Alternative MRA techniques that do not require the use of an exogenous contrast agent are summarised, and the specialised pulse sequence used to image the coronary arteries, using respiratory navigator gating, is described in detail. The article concludes by explaining the principle behind phase contrast imaging techniques, which create images that represent the phase of the MR signal rather than the magnitude. It is shown how this principle can be used to generate velocity maps by designing gradient waveforms that give rise to a relative phase change that is proportional to velocity. The choice of velocity encoding range and key pitfalls in the use of this technique are discussed. PMID:22995744
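The phase-velocity relationship at the heart of velocity mapping can be stated compactly. The sketch below assumes the usual convention that a pixel phase of ±π corresponds to ±VENC, the chosen velocity encoding range.

```python
import numpy as np

def phase_to_velocity(phase_rad, venc_cm_s):
    """v = (phase / pi) * VENC: pixel phase in radians maps linearly to
    velocity, with a phase of +/- pi corresponding to +/- VENC."""
    return (phase_rad / np.pi) * venc_cm_s

# Pitfall illustrated: if VENC is set below the true peak velocity, the phase
# wraps past +pi to -pi, so a fast jet appears to jump from +VENC to -VENC.
```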
Mayrand, Dominique; Fradette, Julie
2018-01-01
Optimal imaging methods are necessary in order to perform a detailed characterization of thick tissue samples from either native or engineered tissues. Tissue-engineered substitutes are featuring increasing complexity including multiple cell types and capillary-like networks. Therefore, technical approaches allowing the visualization of the inner structural organization and cellular composition of tissues are needed. This chapter describes an optical clearing technique which facilitates the detailed characterization of whole-mount samples from skin and adipose tissues (ex vivo tissues and in vitro tissue-engineered substitutes) when combined with spectral confocal microscopy and quantitative analysis on image renderings.
Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Schultz, Marc R.
2012-01-01
Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
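The gap the paper identifies, i.e. that neglecting transverse-shear deformation inflates global buckling predictions, can be seen in the classical shear-corrected column formula. The sketch below is a textbook illustration under that analogy, not the authors' fluted-core cylinder analysis:

```python
def buckling_load_with_shear(p_euler, kGA):
    """Engesser-type shear correction for a compression member.

    p_euler -- classical buckling load with shear neglected
    kGA     -- effective transverse-shear stiffness
    As kGA -> infinity the result recovers p_euler, showing why a
    shear-free analytical model overpredicts the buckling load.
    """
    return p_euler / (1.0 + p_euler / kGA)
```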
Auricular reconstruction for microtia: Part II. Surgical techniques.
Walton, Robert L; Beahm, Elisabeth K
2002-07-01
Reconstruction of the microtic ear represents one of the most demanding challenges in reconstructive surgery. In this review the two most commonly used techniques for ear reconstruction, the Brent and Nagata techniques, are addressed in detail. Unique to this endeavor, the originator of each technique has been allowed to submit representative case material and to address the pros and cons of the other's technique. What follows is a detailed, insightful, state-of-the-art overview of microtia reconstruction. The review then details commonly encountered problems in ear reconstruction and pertinent technical points. Finally, a glimpse into the future is offered with an accounting of the advances made in tissue engineering as this technology applies to auricular reconstruction.
Use of IBA Techniques to Characterize High Velocity Thermal Spray Coatings
NASA Astrophysics Data System (ADS)
Trompetter, W.; Markwitz, A.; Hyland, M.
Spray coatings are being used in an increasingly wide range of industries to improve the abrasive, erosive and sliding wear resistance of machine components. Over the past decade industries have moved to the application of supersonic high velocity thermal spray techniques. These coating techniques produce superior coating quality in comparison to traditional techniques such as plasma spraying. To date, knowledge of the bonding processes and the structure of the particles within thermal spray coatings has been very subjective. The aim of this research is to improve our understanding of these materials through the use of IBA (ion beam analysis) techniques in conjunction with other materials analysis techniques. Samples were prepared by spraying a widely used commercial NiCr powder onto substrates using an HVAF (high velocity air fuel) thermal spraying technique. Detailed analysis of the composition and structure of the powder particles revealed two distinct types of particles: the majority were NiCr, with a significant minority composed of SiO2/CrO3. When the particles were investigated both as raw powder and in the sprayed coating, it was surprising to find that the composition of the coating material remained unchanged during the coating process despite the high velocity application.
Atomic characterization of Si nanoclusters embedded in SiO2 by atom probe tomography
2011-01-01
Silicon nanoclusters are of prime interest for a new generation of optoelectronic and microelectronic components. Physical properties (light emission, carrier storage...) of systems using such nanoclusters are strongly dependent on nanostructural characteristics. These characteristics (size, composition, distribution, and interface nature) have until now been obtained using conventional high-resolution analytic methods, such as high-resolution transmission electron microscopy, EFTEM, or EELS. In this article, a complementary technique, atom probe tomography, was used to study a multilayer (ML) system containing silicon clusters. This technique and its analysis give information on the structure at the atomic level and yield information complementary to that from other techniques. The different steps of such an analysis (sample preparation, atom probe analysis, and data treatment) are detailed, and an atomic-scale description of the Si nanocluster/SiO2 ML system is given. This system is composed of 3.8-nm-thick SiO layers and 4-nm-thick SiO2 layers annealed 1 h at 900°C. PMID:21711666
Transmission ultrasonography. [time delay spectrometry for soft tissue transmission imaging
NASA Technical Reports Server (NTRS)
Heyser, R. C.; Le Croissette, D. H.
1973-01-01
Review of the results of the application of an advanced signal-processing technique, called time delay spectrometry, in obtaining soft tissue transmission images by transmission ultrasonography, both in vivo and in vitro. The presented results include amplitude ultrasound pictures and phase ultrasound pictures obtained by this technique. While amplitude ultrasonographs of tissue are closely analogous to X-ray pictures in that differential absorption is imaged, phase ultrasonographs represent an entirely new source of information based on differential time of propagation. Thus, a new source of information is made available for detailed analysis.
Flood damage assessment using computer-assisted analysis of color infrared photography
Anderson, William H.
1978-01-01
Use of digitized aerial photographs for flood damage assessment in agriculture is new and largely untested. However, under flooding circumstances similar to the 1975 Red River Valley flood, computer-assisted techniques can be extremely useful, especially if detailed crop damage estimates are needed within a relatively short period of time. Airphoto interpretation techniques, manual or computer-assisted, are not intended to replace conventional ground survey and sampling procedures. But their use should be considered a valuable addition to the tools currently available for assessing agricultural flood damage.
Sager, Monica; Yeat, Nai Chien; Pajaro-Van der Stadt, Stefan; Lin, Charlotte; Ren, Qiuyin; Lin, Jimmy
2015-01-01
Transcriptomic technologies are evolving to diagnose cancer earlier and more accurately to provide greater predictive and prognostic utility to oncologists and patients. Digital techniques such as RNA sequencing are replacing still-imaging techniques to provide more detailed analysis of the transcriptome and aberrant expression that causes oncogenesis, while companion diagnostics are developing to determine the likely effectiveness of targeted treatments. This article examines recent advancements in molecular profiling research and technology as applied to cancer diagnosis, clinical applications and predictions for the future of personalized medicine in oncology.
A Computational Observer For Performing Contrast-Detail Analysis Of Ultrasound Images
NASA Astrophysics Data System (ADS)
Lopez, H.; Loew, M. H.
1988-06-01
Contrast-Detail (C/D) analysis allows the quantitative determination of an imaging system's ability to display a range of varying-size targets as a function of contrast. Using this technique, a contrast-detail plot is obtained which can, in theory, be used to compare image quality from one imaging system to another. The C/D plot, however, is usually obtained using data from human observer readings. We have shown earlier [7] that the performance of human observers in the task of threshold detection of simulated lesions embedded in random ultrasound noise is highly inaccurate and non-reproducible for untrained observers. We present an objective, computational method for the determination of the C/D curve for ultrasound images. This method utilizes digital images of the C/D phantom developed at CDRH, and lesion-detection algorithms that simulate the Bayesian approach using the likelihood function for an ideal observer. We present the results of this method, and discuss the relationship to the human observer and to the comparability of image quality between systems.
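For a known lesion template in additive white Gaussian noise, the ideal-observer likelihood ratio collapses to a matched-filter statistic. The sketch below illustrates that standard result; it is not the CDRH phantom pipeline, and the threshold is a free parameter:

```python
import numpy as np

def ideal_observer_detect(image, template, noise_var, threshold):
    """Signal-known-exactly detection in white Gaussian noise.

    The log-likelihood ratio is monotone in the matched-filter output
    t = <image, template> / noise_var, so comparing t to a threshold
    implements the ideal (Bayesian) observer for this simple task.
    """
    t = float(np.sum(image * template) / noise_var)
    return t, t > threshold
```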
Mesenchymal-epithelial interaction techniques
Baskin, Lawrence
2016-01-01
This paper reviews the importance of mesenchymal-epithelial interactions in development and gives detailed technical protocols for investigating these interactions. Successful analysis of mesenchymal-epithelial interactions requires knowing the ages at which embryonic, neonatal and adult organs can be separated into mesenchymal and epithelial tissues. Methods for the separation of mesenchymal and epithelial tissues and for the preparation of tissue recombinants are described. PMID:26610327
USDA-ARS?s Scientific Manuscript database
Proteins exist in every plant cell wall. Certain protein residues interfere with lignin characterization and quantification. The current solution-state 2D-NMR technique (gel-NMR) for whole plant cell wall structural profiling provides detailed information regarding cell walls and proteins. However, ...
Center of Excellence for Hypersonics Research
2012-01-25
detailed simulations of actual combustor configurations, and ultimately for the optimization of hypersonic air-breathing propulsion system flow paths... vehicle development programs. The Center engaged leading experts in experimental and computational analysis of hypersonic flows to provide research... advanced hypersonic vehicles and space access systems will require significant advances in the design methods and ground testing techniques to ensure
Adverbials of Result: Phraseology and Functions in the Problem-Solution Pattern
ERIC Educational Resources Information Center
Charles, Maggie
2011-01-01
This paper combines the use of corpus techniques with discourse analysis in order to investigate adverbials of result in the writing of advanced academic student writers. It focuses in detail on the phraseology and functions of "thus," "therefore," "then," "hence," "so" and "consequently." Two corpora of native-speaker theses are examined: 190,000…
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 1 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in radiation. (BT)
ERIC Educational Resources Information Center
Gardner, C. H.; And Others
The classroom behaviors recorded during three second grade reading lessons provide suitable evidence for comparing the relative merits of using narrative observations versus videotapes as data collection techniques. The comparative analysis illustrates the detail and precision of videotape. Primarily, videotape gives a true picture of linear time,…
techniques is presented. Two methods for linearizing the data are given. An expression for the specular-to-scattered power ratio is derived, and the inverse ... transform of the autocorrelation function is discussed. The surface roughness of the reflector, the discrete fading rates, and fading frequencies
The Influence of Video Technology in Adolescence. Media Panel Report No. 27.
ERIC Educational Resources Information Center
Roe, Keith
This report provides a detailed analysis of the video use and preferences of Swedish adolescents based on data drawn from the Media Panel project, a three-wave, longitudinal research program on video use conducted at the Department of Sociology, The University of Lund, and the Department for Information Techniques, the University College of Vaxjo,…
Surface degradation of uranium tetrafluoride
Tobin, J. G.; Duffin, A. M.; Yu, S. -W.; ...
2017-05-01
A detailed analysis of a single crystal of uranium tetrafluoride has been carried out. The techniques include x-ray absorption spectroscopy, as well as x-ray photoelectron spectroscopy and x-ray emission spectroscopy. Evidence will be presented for the presence of a uranyl species, possibly UO2F2, as a product of, or participant in, the surface degradation.
Bhatia, Tripta
2018-07-01
Accurate quantitative analysis of image data requires that we distinguish between fluorescence intensity (true signal) and the noise inherent to its measurement to the extent possible. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio of features of interest without smearing their structural details. A high SNR yields the positional accuracy needed to resolve features of interest whose width is below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected in this paper are of width [Formula: see text] and [Formula: see text] nm, respectively. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in the raster mode.
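The paper's "optimum smoothening" is its own procedure; as a loose, generic stand-in, the sketch below sweeps a Gaussian kernel width and keeps the setting that maximizes a simple SNR estimate, trading noise suppression against smearing of structural detail (the SNR definition and sigma grid are assumptions):

```python
import numpy as np
from scipy import ndimage

def best_gaussian_smoothing(img, signal_mask, sigmas=(0.5, 1.0, 1.5, 2.0, 3.0)):
    """Pick the Gaussian width that maximizes mean signal over
    background standard deviation on the smoothed image; a crude
    SNR proxy, not the paper's criterion."""
    best_img, best_sigma, best_snr = img, 0.0, -np.inf
    for sigma in sigmas:
        smoothed = ndimage.gaussian_filter(img.astype(float), sigma)
        snr = smoothed[signal_mask].mean() / (smoothed[~signal_mask].std() + 1e-12)
        if snr > best_snr:
            best_img, best_sigma, best_snr = smoothed, sigma, snr
    return best_img, best_sigma, best_snr
```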
Thin film processing of photorefractive BaTiO3
NASA Technical Reports Server (NTRS)
Schuster, Paul R.
1993-01-01
During the period covered by this report, October 11, 1991 through October 10, 1992, the research progressed in a number of different areas. The sol-gel technique was initially studied and experimentally evaluated for depositing films of BaTiO3. The difficulties with the precursors and the poor quality of the deposited films led to the investigation of pulsed laser deposition as an alternative approach. The development of the pulsed laser deposition technique has resulted in continuous improvements to the quality of deposited BaTiO3 films. The initial depositions of BaTiO3 resulted in amorphous films; however, as the pulsed laser deposition technique continued to evolve, films were deposited in the polycrystalline state, then the textured polycrystalline state, and most recently heteroepitaxial films have been successfully deposited on cubic (100) oriented SrTiO3 substrates. A technique for poling samples at room temperature and in air is also under development, with some very preliminary but positive results. The analytical techniques, which include x-ray diffraction, ferroelectric analysis, UV-Vis spectrophotometry, scanning electron microscopy with x-ray compositional analysis, optical and polarized light microscopy, and surface profilometry, have been enhanced to allow for more detailed evaluation of the samples. In the area of optical characterization, a pulsed Nd:YAG laser has been incorporated into the experimental configuration, so data can now be acquired within various temporal domains, resulting in more detailed information on the optical response of the samples and on their photorefractive sensitivity. The recent establishment of collaborative efforts with two departments at Johns Hopkins University and the Army Research Lab at Fort Belvoir has also produced preliminary results using the metallo-organic decomposition technique as an alternative method for thin film processing of BaTiO3. RF and DC sputtering is another film deposition approach that should be initiated in the near future. Other techniques for optical characterization, which may even allow for intragranular (within single grains) investigations, are also being considered.
Physicians and Drug Representatives: Exploring the Dynamics of the Relationship
Chimonas, Susan; Brennan, Troyen A.
2007-01-01
Background Interactions between physicians and drug representatives are common, even though research shows that physicians understand the conflict of interest between marketing and patient care. Little is known about how physicians resolve this contradiction. Objective To determine physicians’ techniques for managing cognitive inconsistencies within their relationships with drug representatives. Design, Setting, and Participants Six focus groups were conducted with 32 academic and community physicians in San Diego, Atlanta, and Chicago. Measurements Qualitative analysis of focus group transcripts to determine physicians’ attitudes towards conflict of interest and detailing, their beliefs about the quality of information conveyed and the impact on prescribing, and their resolution of the conflict between detailers’ desire to sell product and patient care. Results Physicians understood the concept of conflict of interest and applied it to relationships with detailers. However, they maintained favorable views of physician–detailer exchanges. Holding these mutually contradictory attitudes, physicians were in a position of cognitive dissonance. To resolve the dissonance, they used a variety of denials and rationalizations: They avoided thinking about the conflict of interest, they disagreed that industry relationships affected physician behavior, they denied responsibility for the problem, they enumerated techniques for remaining impartial, and they reasoned that meetings with detailers were educational and benefited patients. Conclusions Although physicians understood the concept of conflict of interest, relationships with detailers set up psychological dynamics that influenced their reasoning. Our findings suggest that voluntary guidelines, like those proposed by most major medical societies, are inadequate. It may be that only the prohibition of physician–detailer interactions will be effective. PMID:17356984
Data analysis of the COMPTEL instrument on the NASA gamma ray observatory
NASA Technical Reports Server (NTRS)
Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.
1992-01-01
The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide field of view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods. Straightforward event projection into the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications on the gamma ray source results for this background limited telescope. The COMPTEL collaboration applies a software system of analysis utilities, organized around a database management system. The use of this system for the assistance of guest investigators at the various collaboration sites and external sites is foreseen and allows different detail levels of cooperation with the COMPTEL institutes, dependent on the type of data to be studied.
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.
Chromosome rearrangements in canine fibrosarcomas.
Sargan, D R; Milne, B S; Hernandez, J Aguirre; O'Brien, P C M; Ferguson-Smith, M A; Hoather, T; Dobson, J M
2005-01-01
We have previously reported the use of six- and seven-color paint sets in the analysis of canine soft tissue sarcomas. Here we combine this technique with flow sorting of translocation chromosomes, reverse painting, and polymerase chain reaction (PCR) analysis of the gene content of the reverse paint in order to provide a more detailed analysis of cytogenetic abnormalities in canine tumors. We examine two fibrosarcomas, both from female Labrador retrievers, and show abnormalities in chromosomes 11 and 30 in both cases. Evidence of involvement of TGFBR1 is presented for one tumor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.; Selvin, S.; Close, E.R.
In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.
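On a density-equalized map, cases are uniform under the null hypothesis, so the count falling in any fixed region follows a binomial law with success probability equal to the region's share of the transformed area. A minimal sketch of that significance test (region choice and inputs are up to the analyst):

```python
from scipy.stats import binom

def cluster_p_value(cases_in_region, total_cases, region_area, total_area):
    """P(observing >= cases_in_region cases in the region by chance),
    given that each case lands in the region with probability equal to
    its fraction of the density-equalized map area."""
    p = region_area / total_area
    return float(binom.sf(cases_in_region - 1, total_cases, p))
```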
Design optimization of a prescribed vibration system using conjoint value analysis
NASA Astrophysics Data System (ADS)
Malinga, Bongani; Buckner, Gregory D.
2016-12-01
This article details a novel design optimization strategy for a prescribed vibration system (PVS) used to mechanically filter solids from fluids in oil and gas drilling operations. A dynamic model of the PVS is developed, and the effects of disturbance torques are detailed. This model is used to predict the effects of design parameters on system performance and efficiency, as quantified by system attributes. Conjoint value analysis, a statistical technique commonly used in marketing science, is utilized to incorporate designer preferences. This approach effectively quantifies and optimizes preference-based trade-offs in the design process. The effects of designer preferences on system performance and efficiency are simulated. This novel optimization strategy yields improvements in all system attributes across all simulated vibration profiles, and is applicable to other industrial electromechanical systems.
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket-based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and the subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
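Stream thrust averaging reduces a non-uniform inlet-exit flow to the uniform one-dimensional state that conserves the integrated mass, momentum (stream thrust) and energy fluxes. A sketch of the flux-integration step, assuming discretized exit-plane data (solving for the equivalent 1-D state from these integrals is a separate root-finding step):

```python
import numpy as np

def integrated_fluxes(rho, u, p, h0, dA):
    """Integrate the three conserved fluxes over an exit plane.

    rho, u, p, h0, dA -- arrays of density, axial velocity, static
    pressure, total enthalpy and cell area over the survey grid.
    """
    mdot = np.sum(rho * u * dA)        # mass flow
    F = np.sum((rho * u**2 + p) * dA)  # stream thrust
    H = np.sum(rho * u * h0 * dA)      # total-enthalpy flux
    return mdot, F, H
```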
Process mining techniques: an application to time management
NASA Astrophysics Data System (ADS)
Khowaja, Ali Raza
2018-04-01
People must ensure that all of their work is completed within a given time and to the required quality. To apply process mining meaningfully, one first needs to understand these everyday processes in detail. Personal information and communication tools, covering daily schedules, location analysis, environmental analysis and, more generally, social media applications, generate event logs that make data available both for data analysis and for process analysis combining environmental and location information. Process mining can exploit these real-life processes through the event logs already present in such datasets, whether as user-censored or user-labeled data. The resulting models can be used to redesign a user's workflow and to understand daily processes in more detail: by looking closely at each process, analyzing it, and making changes, one can obtain better results. In this work, we applied process mining techniques to a dataset collected in Korea covering seven different subjects. The paper comments on the efficiency of the processes in these event logs as they relate to time management.
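Most process-discovery algorithms start from the directly-follows relation extracted from an event log. A minimal sketch of that extraction step (the log format, a mapping from case IDs to time-ordered activity lists, is an assumption):

```python
from collections import Counter

def directly_follows(event_log):
    """Count how often activity b directly follows activity a.

    event_log -- dict mapping case_id to a time-ordered list of
    activity names; the resulting pair counts are the raw material
    for discovery algorithms and a directly-follows graph.
    """
    pairs = Counter()
    for trace in event_log.values():
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

# Example: two short daily-routine traces (illustrative data only)
log = {"day1": ["wake", "commute", "work", "gym"],
       "day2": ["wake", "commute", "work", "dinner"]}
print(directly_follows(log)[("commute", "work")])  # -> 2
```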
The analysis of composite laminated beams using a 2D interpolating meshless technique
NASA Astrophysics Data System (ADS)
Sadek, S. H. M.; Belinha, J.; Parente, M. P. L.; Natal Jorge, R. M.; de Sá, J. M. A. César; Ferreira, A. J. M.
2018-02-01
Laminated composite materials are widely implemented in several engineering constructions. Owing to their relatively light weight, these materials are suitable for aerospace, military, marine, and automotive structural applications. To obtain safe and economical structures, the accuracy of the modelling analysis is highly relevant. Since meshless methods have achieved remarkable progress in computational mechanics in recent years, the present work uses one of the most flexible and stable interpolating meshless techniques available in the literature, the Radial Point Interpolation Method (RPIM). Here, a 2D approach is considered to numerically analyse composite laminated beams. Both the meshless formulation and the equilibrium equations ruling the studied physical phenomenon are presented in detail. Several benchmark beam examples are studied and the results are compared with exact solutions available in the literature and with results obtained from a commercial finite element software package. The results show the efficiency and accuracy of the proposed numerical technique.
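At the core of the RPIM is the construction of shape functions from radial basis functions centred on the nodes of a local support domain. The 1D sketch below uses a pure multiquadric basis and omits the polynomial augmentation that full RPIM formulations usually add; it is illustrative, not the paper's formulation:

```python
import numpy as np

def rpim_shape_functions(nodes, x, c=1.0):
    """Shape-function values at point x from nodes in its support.

    Multiquadric radial basis R(r) = sqrt(r^2 + c^2); interpolation
    u(x) = phi(x) . u_nodes with phi(x) = R_Q^{-1} r(x), which passes
    exactly through the nodal values (delta property).
    """
    def mq(r):
        return np.sqrt(r**2 + c**2)
    R_Q = mq(np.abs(nodes[:, None] - nodes[None, :]))  # moment matrix
    r_x = mq(np.abs(x - nodes))
    return np.linalg.solve(R_Q, r_x)  # R_Q is symmetric

nodes = np.array([0.0, 0.5, 1.0])
print(rpim_shape_functions(nodes, 0.5).round(6))  # ~ [0, 1, 0]
```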
Air-to-air radar flight testing
NASA Astrophysics Data System (ADS)
Scott, Randall E.
1988-06-01
This volume in the AGARD Flight Test Techniques Series describes flight test techniques, flight test instrumentation, ground simulation, data reduction and analysis methods used to determine the performance characteristics of a modern air-to-air (a/a) radar system. Following a general coverage of specification requirements, test plans, support requirements, development and operational testing, and management information systems, the report goes into more detailed flight test techniques covering a/a radar capabilities of: detection, manual acquisition, automatic acquisition, tracking a single target, and detection and tracking of multiple targets. There follows a section on additional flight test considerations such as electromagnetic compatibility, electronic countermeasures, displays and controls, degraded and backup modes, radome effects, environmental considerations, and use of testbeds. Other sections cover ground simulation, flight test instrumentation, and data reduction and analysis. The final sections deal with reporting and a discussion of considerations for the future and how they may affect radar flight testing.
Liao, Hongjing; Hitchcock, John
2018-06-01
This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details, and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched, continuous respiratory acoustic sounds produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction, and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer, and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
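Spectral screening for wheezes typically looks for frames whose dominant frequency sits well above the broadband breath-sound floor and persists over time. The sketch below is a generic spectrogram-based first pass, with thresholds and window sizes chosen purely for illustration; the reviewed studies differ on all of these values:

```python
import numpy as np
from scipy import signal

def wheeze_candidate_frames(x, fs, fmin=100.0, rel_thresh_db=-30.0):
    """Flag spectrogram frames whose spectral peak is above fmin Hz
    and within rel_thresh_db of the loudest frame; a crude screen to
    run before duration checks and feature-based classification."""
    f, t, S = signal.spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
    S_db = 10.0 * np.log10(S + 1e-12)
    frame_peak = S_db.max(axis=0)
    peak_freq = f[np.argmax(S_db, axis=0)]
    mask = (peak_freq > fmin) & (frame_peak > frame_peak.max() + rel_thresh_db)
    return t, mask
```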
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
Prosa, T J; Alvis, R; Tsakalakos, L; Smentkowski, V S
2010-08-01
Three-dimensional quantitative compositional analysis of nanowires is a challenge for standard techniques such as secondary ion mass spectrometry because of specimen size and geometry considerations; however, it is precisely the size and geometry of nanowires that make them attractive candidates for analysis via atom probe tomography. The boron compositions of various trimethylboron-doped, vapour-liquid-solid-grown silicon nanowires were measured both with time-of-flight secondary ion mass spectrometry and with pulsed-laser atom probe tomography. Both characterization techniques yielded similar results for relative composition. Specialized specimen preparation for pulsed-laser atom probe tomography was utilized and is described in detail, whereby individual silicon nanowires are first protected, then lifted out, trimmed, and finally wet etched to remove the protective layer for subsequent three-dimensional analysis.
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The results of the ERTS-1 investigations conducted by the Earth Observations Division at the NASA Lyndon B. Johnson Space Center are summarized in this report, which is an overview of documents detailing individual investigations. Conventional image interpretation and computer-aided classification procedures were the two basic techniques used in analyzing the data for detecting, identifying, locating, and measuring surface features related to earth resources. Data from the ERTS-1 multispectral scanner system were useful for all applications studied, which included agriculture, coastal and estuarine analysis, forestry, range, land use and urban land use, and signature extension. Percentage classification accuracies are cited for the conventional and computer-aided techniques.
Comparison of two target classification techniques
NASA Astrophysics Data System (ADS)
Chen, J. S.; Walton, E. K.
1986-01-01
Radar target classification techniques based on backscatter measurements in the resonance region (1.0-20.0 MHz) are discussed. Attention is given to two novel methods currently being tested at the radar range of Ohio State University: (1) the nearest neighbor (NN) algorithm applied to the radar cross section (RCS) magnitude and range-corrected phase at various operating frequencies; and (2) an inverse Fourier transformation of the complex multifrequency radar returns into the time domain, followed by cross-correlation analysis. Comparisons are made of the performance of the two techniques as a function of signal-to-error-noise ratio for different types of processing. The results of the comparison are discussed in detail.
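The second method amounts to transforming each catalog target's complex frequency response to the time domain and picking the catalog entry with the strongest peak cross-correlation against the unknown return. A minimal sketch under that reading (the catalog format and normalization are assumptions):

```python
import numpy as np

def classify_by_time_domain_correlation(unknown_freq_resp, catalog):
    """catalog -- dict mapping target name to its complex
    multifrequency response sampled on the same frequency grid."""
    u = np.fft.ifft(unknown_freq_resp)  # time-domain response

    def peak_correlation(ref_freq_resp):
        r = np.fft.ifft(ref_freq_resp)
        # np.correlate conjugates its second argument for complex input
        return np.max(np.abs(np.correlate(u, r, mode="full")))

    return max(catalog, key=lambda name: peak_correlation(catalog[name]))
```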
Chao, T.T.; Sanzolone, R.F.
1992-01-01
Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor in sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose the sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, the technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.
Development of Low-cost, High Energy-per-unit-area Solar Cell Modules
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.
1978-01-01
The development of two hexagonal solar cell process sequences, a laser-scribing process technique for scribing hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large scale production is reported. Experimental analysis was performed on automated spin-on anti-reflective coating equipment and high pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of the optimum silicon utilization by modified hexagonal solar cells.
Vulnerability-attention analysis for space-related activities
NASA Technical Reports Server (NTRS)
Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John
1988-01-01
Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.
Determination of Phenols and Trimethylamine in Industrial Effluents
NASA Technical Reports Server (NTRS)
Levaggi, D. A.; Feldstein, M.
1971-01-01
For regulatory control of certain odorous compounds, the analysis of phenols and trimethylamine in industrial effluents is necessary. The Bay Area Air Pollution Control District laboratory has been determining these gases by gas chromatographic techniques. The procedures for sample collection, preparation for analysis, and determination are described in detail. Typical data from various sources showing the effect of proposed regulations are presented. Extensive sampling and use of these procedures have shown them to be accurate, reliable, and suitable for all types of source effluents.
Reverse genetics: Its origins and prospects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, P.
1991-04-01
The nucleotide sequence of a gene and its flanking segments alone will not tell us how its expression is regulated during development and differentiation, or in response to environmental changes. To comprehend the physiological significance of the molecular details requires biological analysis. Recombinant DNA techniques provide a powerful experimental approach. A strategy termed 'reverse genetics' utilizes the analysis of the activities of mutant and normal genes and experimentally constructed mutants to explore the relationship between gene structure and function, thereby helping elucidate the relationship between genotype and phenotype.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
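As a toy illustration of metamodeling, the sketch below fits a quadratic response surface to a handful of runs of an "expensive" analysis code and then evaluates the cheap surrogate instead; the surrogate form, sample points, and stand-in code are arbitrary choices, and kriging or neural networks (also reviewed above) would slot into the same pattern:

```python
import numpy as np

def fit_quadratic_metamodel(x, y):
    """Least-squares fit of y ~ b0 + b1*x + b2*x^2 for one design
    variable; returns coefficients (b0, b1, b2)."""
    A = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def expensive_code(x):
    # stand-in for a detailed analysis code (illustrative only)
    return np.sin(x) + 0.1 * x**2

x_train = np.linspace(0.0, 2.0, 6)
b0, b1, b2 = fit_quadratic_metamodel(x_train, expensive_code(x_train))
x_new = 1.3
print(b0 + b1 * x_new + b2 * x_new**2)  # cheap surrogate prediction
```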
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g. SRR, PDR, CDR) there are both system and software versions; (3) analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
ERIC Educational Resources Information Center
Pinkerton, Steven D.; Benotsch, Eric G.; Mikytuck, John
2007-01-01
The "gold standard" for evaluating human immunodeficiency virus (HIV) prevention programs is a partner-by-partner sexual behavior assessment that elicits information about each sex partner and the activities engaged in with that partner. When collection of detailed partner-by-partner data is not feasible, aggregate data (e.g., total…
Evaluation of scheduling techniques for payload activity planning
NASA Technical Reports Server (NTRS)
Bullington, Stanley F.
1991-01-01
Two tasks related to payload activity planning and scheduling were performed. The first task involved making a comparison of space mission activity scheduling problems with production scheduling problems. The second task consisted of a statistical analysis of the output of runs of the Experiment Scheduling Program (ESP). Details of the work which was performed on these two tasks are presented.
C-band radar pulse Doppler error: Its discovery, modeling, and elimination
NASA Technical Reports Server (NTRS)
Krabill, W. B.; Dempsey, D. J.
1978-01-01
The discovery of a C-band radar pulse Doppler error is discussed, and the use of the GEOS 3 satellite's coherent transponder to isolate the error source is described. An analysis of the pulse Doppler tracking loop is presented, and a mathematical model for the error was developed. Error correction techniques were developed and are described, including implementation details.
Land use analysis of US urban areas using high-resolution imagery from Skylab
NASA Technical Reports Server (NTRS)
Gallagher, D. B. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The S-190B imagery from Skylab 3 permitted the detection of higher levels of land use detail than any satellite imagery previously evaluated using manual interpretation techniques. Resolution approaches that of 1:100,000 scale infrared aircraft photography, especially regarding urban areas. Nonurban areas are less distinct.
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)
Activating the Worker in Elderly Care: A Technique and Tactics of Invitation
ERIC Educational Resources Information Center
Fejes, A.; Nicoll, K.
2011-01-01
Relatively little attention has been paid to questions of how language acts in and through the interactions of language in situations where people are encouraged to learn to be active in contexts of work. This paper argues that detailed analysis is needed to understand how activation through language acts in the shaping and governing of workers.…
The mechanisms of renal tubule electrolyte and water absorption, 100 years after Carl Ludwig.
Greger, R
1996-01-01
Some 154 years after Carl Ludwig's Habilitationsschrift "Contributions to the theory of the mechanism of urine secretion", renal physiology has come a long way. The mechanisms of urine formation are now understood as the result of glomerular filtration and tubule absorption of most of the filtrate. The detailed understanding of tubule transport processes has become possible with the invention of several refined techniques such as the micropuncture techniques; the microchemical analysis of nanolitre tubule fluid samples; the in vitro perfusion of isolated tubule segments of defined origin; electrophysiological analysis of electrolyte transport including micropuncture and patch-clamp techniques; transport studies in membrane vesicle preparations; recordings of intracellular electrolyte concentrations; and cloning techniques for the individual membrane transport proteins. With this wealth of information we are now starting to build an integrative understanding of the function of the individual nephron segments, the regulatory processes, the integrated function of the nephron and hence the formation of the final urine. Like anatomists of previous centuries we still state that the kidney is an "organum mirabile", and we recognize that basic research in this area has fertilized the analysis of the function of a large number of other organs and cells.
NASA Astrophysics Data System (ADS)
Moore, Jeffrey R.; Pankow, Kristine L.; Ford, Sean R.; Koper, Keith D.; Hale, J. Mark; Aaron, Jordan; Larsen, Chris F.
2017-03-01
The 2013 Bingham Canyon Mine rock avalanches represent one of the largest cumulative landslide events in recorded U.S. history and provide a unique opportunity to test remote analysis techniques for landslide characterization. Here we combine aerial photogrammetry surveying, topographic reconstruction, numerical runout modeling, and analysis of broadband seismic and infrasound data to extract salient details of the dynamics and evolution of the multiphase landslide event. Our results reveal a cumulative intact rock source volume of 52 Mm^3, which mobilized in two main rock avalanche phases separated by 1.5 h. We estimate that the first rock avalanche had 1.5-2 times greater volume than the second. Each failure initiated by sliding along a gently dipping (21°), highly persistent basal fault before transitioning to a rock avalanche and spilling into the inner pit. The trajectory and duration of the two rock avalanches were reconstructed using runout modeling and independent force history inversion of intermediate-period (10-50 s) seismic data. Intermediate- and shorter-period (1-50 s) seismic data were sensitive to intervals of mass redirection and constrained finer details of the individual slide dynamics. Back projecting short-period (0.2-1 s) seismic energy, we located the two rock avalanches within 2 and 4 km of the mine. Further analysis of infrasound and seismic data revealed that the cumulative event included an additional 11 smaller landslides (volumes ~10^4-10^5 m^3) and that a trailing signal following the second rock avalanche may result from an air-coupled Rayleigh wave. Our results demonstrate new and refined techniques for detailed remote characterization of the dynamics and evolution of large landslides.
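The short-period back projection used to locate the avalanches is, at heart, a delay-and-stack over a grid of candidate source nodes. A schematic sketch of that idea, assuming precomputed station travel times and envelope waveforms (the circular shift via np.roll is a simplification at trace edges, and none of this reproduces the authors' processing):

```python
import numpy as np

def back_project(envelopes, travel_times, dt):
    """Delay-and-stack back projection.

    envelopes    -- (n_stations, n_samples) short-period envelopes
    travel_times -- (n_nodes, n_stations) predicted travel times [s]
    Returns one stack amplitude per candidate node; the maximum marks
    the inferred source location.
    """
    n_nodes, n_stations = travel_times.shape
    stack_amp = np.zeros(n_nodes)
    for k in range(n_nodes):
        # shift each station's trace back by its predicted travel time
        aligned = [np.roll(envelopes[s], -int(round(travel_times[k, s] / dt)))
                   for s in range(n_stations)]
        stack_amp[k] = np.sum(aligned, axis=0).max()
    return stack_amp
```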
Reduced modeling of signal transduction – a modular approach
Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter
2007-01-01
Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which allows to dissect the model into smaller modules that are called layers and can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable. Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494
Multiscale morphological filtering for analysis of noisy and complex images
NASA Astrophysics Data System (ADS)
Kher, A.; Mitra, S.
Images acquired with passive sensing techniques suffer from illumination variations and poor local contrasts that create major difficulties in interpretation and identification tasks. On the other hand, images acquired with active sensing techniques based on monochromatic illumination are degraded with speckle noise. Mathematical morphology offers elegant techniques to handle a wide range of image degradation problems. Unlike linear filters, morphological filters do not blur the edges and hence maintain higher image resolution. Their rich mathematical framework facilitates the design and analysis of these filters as well as their hardware implementation. Morphological filters are easier to implement and are more cost-effective and efficient than several conventional linear filters. Morphological filters that remove speckle noise while maintaining high resolution and preserving thin image regions, which are particularly vulnerable to speckle noise, were developed and applied to SAR imagery. These filters used a combination of linear (one-dimensional) structuring elements in different (typically four) orientations. Although this approach preserves more details than simple morphological filters using two-dimensional structuring elements, the limited orientations of the one-dimensional elements only approximate the fine details of the region boundaries. A more robust filter designed recently overcomes the limitation of the fixed orientations. This filter uses a combination of concave and convex structuring elements. Morphological operators are also useful in extracting features from visible and infrared imagery. A multiresolution image pyramid obtained with successive filtering and a subsampling process aids in the removal of the illumination variations and enhances local contrasts. A morphology-based interpolation scheme was also introduced to reduce intensity discontinuities created in any morphological filtering task. The generality of morphological filtering techniques in extracting information from a wide variety of images obtained with active and passive sensing techniques is discussed. Such techniques are particularly useful in obtaining more information from fusion of complex images by different sensors such as SAR, visible, and infrared.
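A minimal sketch of the oriented-structuring-element idea (not the authors' implementation; the element length and the max-combination rule are assumptions): opening with one-dimensional lines in four orientations and keeping the pixelwise maximum suppresses isolated speckle while sparing thin features aligned with any of the four directions.

```python
import numpy as np
from scipy.ndimage import grey_opening

def oriented_opening(img, length=7):
    """Max of grey openings with 1-D structuring elements at 0, 45, 90
    and 135 degrees; thin bright features aligned with any direction
    survive, isolated speckle does not."""
    h = np.ones((1, length), bool)                  # 0 degrees
    v = np.ones((length, 1), bool)                  # 90 degrees
    d1 = np.eye(length, dtype=bool)                 # 45 degrees
    d2 = np.fliplr(np.eye(length, dtype=bool))      # 135 degrees
    openings = [grey_opening(img, footprint=f) for f in (h, v, d1, d2)]
    return np.max(openings, axis=0)

speckled = np.random.rayleigh(1.0, (128, 128))      # stand-in for SAR speckle
filtered = oriented_opening(speckled)
```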
Complications in proximal humeral fractures.
Calori, Giorgio Maria; Colombo, Massimiliano; Bucci, Miguel Simon; Fadigati, Piero; Colombo, Alessandra Ines Maria; Mazzola, Simone; Cefalo, Vittorio; Mazza, Emilio
2016-10-01
Necrosis of the humeral head, infections and non-unions are among the most dangerous and difficult-to-treat complications of proximal humeral fractures. The aim of this work was to analyse in detail non-unions and post-traumatic bone defects and to suggest an algorithm of care. Treatment options are based not only on the radiological picture, but also on a detailed analysis of the patient, who is classified using a risk factor analysis. This method enables the surgeon to choose the most suitable treatment for the patient, thereby facilitating return of function in the shortest possible time. The treatment of such serious complications requires the surgeon to be knowledgeable about the following possible solutions: increased mechanical stability; biological stimulation; and reconstructive techniques in two steps, with application of biotechnologies and prosthetic substitution. Copyright © 2016 Elsevier Ltd. All rights reserved.
Molecular tools for carotenogenesis analysis in the zygomycete Mucor circinelloides.
Torres-Martínez, Santiago; Ruiz-Vázquez, Rosa M; Garre, Victoriano; López-García, Sergio; Navarro, Eusebio; Vila, Ana
2012-01-01
The carotene-producing fungus Mucor circinelloides is the zygomycete most amenable to genetic manipulation using molecular tools. Since the initial development of an effective genetic transformation procedure more than two decades ago, the availability of new molecular approaches such as gene replacement techniques and gene expression inactivation by RNA silencing, in addition to the sequencing of its genome, has made Mucor a valuable organism for the study of a number of processes. Here we describe in detail the main techniques and methods currently used to manipulate M. circinelloides, including transformation, gene replacement, gene silencing, RNAi, and immunoprecipitation.
Spacecraft Charging Calculations: NASCAP-2K and SEE Spacecraft Charging Handbook
NASA Technical Reports Server (NTRS)
Davis, V. A.; Neergaard, L. F.; Mandell, M. J.; Katz, I.; Gardner, B. M.; Hilton, J. M.; Minor, J.
2002-01-01
For fifteen years the NASA/Air Force Charging Analyzer Program for Geosynchronous Orbits (NASCAP/GEO) has been the workhorse of spacecraft charging calculations. Two new tools, the Space Environment and Effects (SEE) Spacecraft Charging Handbook (recently released) and Nascap-2K (under development), use improved numeric techniques and modern user interfaces to tackle the same problem. The SEE Spacecraft Charging Handbook provides first-order, lower-resolution solutions while Nascap-2K provides higher resolution results appropriate for detailed analysis. This paper illustrates how the improvements in the numeric techniques affect the results.
Virtopsy: An integration of forensic science and imageology
Joseph, T. Isaac; Girish, K. L.; Sathyan, Pradeesh; Kiran, M. Shashi; Vidya, S.
2017-01-01
In an era where noninvasive and minimally invasive techniques are heralding medical innovations and health science technology, necrological analysis is not bereft of this wave. Virtopsy is virtual autopsy. It is a new-age complementary documentation approach to identify and analyze the details of demise. Utilizing virtual autopsy for orofacial forensic examination is an emerging specialty which holds a plethora of potential for future trends in forensic science. Being a noninvasive technique, it is a rapid method which facilitates the medicolegal process and aids in the delivery of justice. The present article is an overview of this emerging methodology. PMID:29657485
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is ongoing. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques were found at the substep level of the hierarchy and in the instruments being used. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize hybrid methods, i.e., combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods to perform cholecystectomy, with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks, leading to an increase in surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811
Simple lock-in detection technique utilizing multiple harmonics for digital PGC demodulators.
Duan, Fajie; Huang, Tingting; Jiang, Jiajia; Fu, Xiao; Ma, Ling
2017-06-01
A simple lock-in detection technique especially suited for digital phase-generated carrier (PGC) demodulators is proposed in this paper. It mixes the interference signal with rectangular waves whose Fourier expansions contain multiple odd or multiple even harmonics of the carrier to recover the quadrature components needed for interference phase demodulation. In this way, the use of a multiplier is avoided and the efficiency of the algorithm is improved. Noise performance with regard to light intensity variation and circuit noise is analyzed theoretically for both the proposed technique and the traditional lock-in technique, and results show that the former provides a better signal-to-noise ratio than the latter with proper modulation depth and average interference phase. Detailed simulations were conducted and the theoretical analysis was verified. A fiber-optic Michelson interferometer was constructed and the feasibility of the proposed technique is demonstrated.
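As a rough illustration of the square-wave mixing idea (a synthetic sketch, not the authors' implementation; carrier frequency, modulation depth and filter length are assumed values), the following simulates a PGC interference signal and recovers the two quadratures by sign flips and averaging rather than hardware multiplication:

```python
import numpy as np

fs, f0, C = 200_000, 10_000, 2.37          # sample rate, carrier, mod depth
t = np.arange(0, 0.02, 1/fs)
phi = 1.5 * np.sin(2*np.pi*50*t)           # slow interference phase to recover
s = 1.0 + 0.5*np.cos(C*np.cos(2*np.pi*f0*t) + phi)   # PGC interference signal

# Rectangular references: sign(cos w0 t) contains only odd carrier harmonics,
# sign(cos 2 w0 t) only even ones -- mixing reduces to sign flips.
ref_odd  = np.sign(np.cos(2*np.pi*f0*t))
ref_even = np.sign(np.cos(2*np.pi*2*f0*t))

def lowpass(x, n):                          # crude moving-average low-pass
    return np.convolve(x, np.ones(n)/n, mode="same")

n = int(fs/f0) * 20                         # average over 20 carrier periods
P = lowpass(s*ref_odd, n)                   # proportional to sin(phi)
Q = lowpass(s*ref_even, n)                  # proportional to cos(phi)
# The two amplitudes carry different Bessel-function-weighted factors, so a
# one-off calibration is needed before atan2(P, Q) yields phi directly.
```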
A comparison of solute-transport solution techniques based on inverse modelling results
Mehl, S.; Hill, M.C.
2000-01-01
Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results (simulated breakthrough curves, sensitivity analysis, and calibrated parameter values) change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
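To make the numerical-dispersion point concrete, here is a minimal, hypothetical one-dimensional demonstration (not the study's model): a first-order upwind advection scheme smears an initially sharp concentration front, the kind of scheme-dependent artifact that altered simulated peak concentrations above.

```python
import numpy as np

nx = 200
dx, dt, v = 1.0, 0.5, 1.0           # Courant number v*dt/dx = 0.5
c = np.zeros(nx); c[:20] = 1.0      # initially sharp concentration front
for _ in range(200):                # first-order upwind time stepping
    c[1:] -= v*dt/dx * (c[1:] - c[:-1])
# The exact solution is still a sharp front, now centred near cell 120;
# the upwind scheme has smeared it over tens of cells instead.
print(c[110:131].round(2))
```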
NASA Astrophysics Data System (ADS)
Mora, R.; Barahona, A.; Aguilar, H.
2015-04-01
This paper presents a method for using high-detail volumetric information, captured with a land-based photogrammetric survey, to obtain information from individual trees. Applying LIDAR analysis techniques, it is possible to measure diameter at breast height, height at first branch (commercial height), basal area and volume of an individual tree. Given this information it is possible to calculate how much of that tree can be exploited as wood. The main objective is to develop a methodology for successfully surveying one individual tree, capturing every side of the stem using a high-resolution digital camera and reference marks with GPS coordinates. The process is executed for several individuals of two species present in the metropolitan area of San Jose, Costa Rica, Delonix regia (Bojer) Raf. and Tabebuia rosea (Bertol.) DC., each one with different height, stem shape and crown area. Using a photogrammetry suite, all the pictures are aligned and geo-referenced, and a dense point cloud is generated with enough detail to perform the required measurements, as well as a solid tridimensional model for volume measurement. This research will open the way to developing a capture methodology with an airborne camera using close-range UAVs. An airborne platform will make it possible to capture every individual in a forest plantation; furthermore, if the analysis techniques applied in this research are automated, it will be possible to calculate with high precision the exploitation potential of a forest plantation and improve its management.
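A sketch of one of the measurements named above, diameter at breast height, as it might be extracted from the dense point cloud (the slice height, tolerance and Kasa circle fit are assumptions, not the paper's algorithm):

```python
import numpy as np

def dbh_from_cloud(points, breast_height=1.3, tol=0.05):
    """Estimate diameter at breast height from an (N,3) stem point cloud.
    A thin slice around 1.3 m is flattened to 2-D and a circle is fitted
    algebraically (Kasa fit). Assumes z is height above ground."""
    z = points[:, 2]
    slab = points[np.abs(z - breast_height) < tol]
    x, y = slab[:, 0], slab[:, 1]
    A = np.column_stack([2*x, 2*y, np.ones(len(x))])
    b = x**2 + y**2
    cx, cy, k = np.linalg.lstsq(A, b, rcond=None)[0]
    r = np.sqrt(k + cx**2 + cy**2)      # fitted stem radius
    return 2*r                          # diameter, in the cloud's units
```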
Methods for the design and analysis of power optimized finite-state machines using clock gating
NASA Astrophysics Data System (ADS)
Chodorowski, Piotr
2017-11-01
The paper discusses two methods for designing power-optimized FSMs. Both methods use clock gating techniques. The main objective of the research was to write a program capable of automatically generating hardware descriptions of finite-state machines in VHDL, together with testbenches to support power analysis. The creation of the relevant output files is detailed step by step. The program was tested using the LGSynth91 FSM benchmark package. An analysis of the generated circuits shows that the second method presented in this paper leads to a significant reduction in power consumption.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Spatial/Spectral Identification of Endmembers from AVIRIS Data using Mathematical Morphology
NASA Technical Reports Server (NTRS)
Plaza, Antonio; Martinez, Pablo; Gualtieri, J. Anthony; Perez, Rosa M.
2001-01-01
During the last several years, a number of airborne and satellite hyperspectral sensors have been developed or improved for remote sensing applications. Imaging spectrometry allows the detection of materials, objects and regions in a particular scene with a high degree of accuracy. Hyperspectral data typically consist of hundreds of thousands of spectra, so the analysis of this information is a key issue. Mathematical morphology theory is a widely used nonlinear technique for image analysis and pattern recognition. Although it is especially well suited to segment binary or grayscale images with irregular and complex shapes, its application in the classification/segmentation of multispectral or hyperspectral images has been quite rare. In this paper, we discuss a new completely automated methodology to find endmembers in the hyperspectral data cube using mathematical morphology. The extension of classic morphology to the hyperspectral domain allows us to integrate spectral and spatial information in the analysis process. In Section 3, some basic concepts about mathematical morphology and the technical details of our algorithm are provided. In Section 4, the accuracy of the proposed method is tested by its application to real hyperspectral data obtained from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) imaging spectrometer. Some details about these data and reference results, obtained by well-known endmember extraction techniques, are provided in Section 2. Finally, in Section 5 we expose the main conclusions at which we have arrived.
NASA Astrophysics Data System (ADS)
Sui, Xiao-lin; Zhou, Shou-huan
2013-05-01
The design and performance of an optical frequency-modulated continuous-wave (OFMCW) coherent laser radar is presented. By employing a combination of optical heterodyne and linear frequency modulation techniques and utilizing fiber optic technologies, a highly efficient, compact and reliable laser radar suitable for operation in a space environment is being developed. We also give the hardware structure of the OFMCW coherent laser radar. We made a detailed analysis of the measurement error; over the speed range it is less than 0.5%. Measurement results for a moving carrier were also assessed in detail, and they show good adaptability to the carrier's acceleration vector. A detailed design of the circuit structure is also given. At the end of the article, we present the validation method and experimental results.
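For readers unfamiliar with FMCW demodulation, this hedged worked example (all numbers invented) shows how the up- and down-sweep beat frequencies of a triangular sweep separate range from line-of-sight velocity:

```python
# Triangular-FMCW demodulation, back-of-envelope (hypothetical values).
c, lam = 3.0e8, 1.55e-6        # speed of light [m/s], laser wavelength [m]
B, T = 1.0e9, 1.0e-3           # sweep bandwidth [Hz], sweep duration [s]
f_up, f_down = 0.96e6, 1.04e6  # measured beat frequencies [Hz] (assumed)

f_range   = 0.5*(f_up + f_down)    # range-induced beat frequency
f_doppler = 0.5*(f_down - f_up)    # Doppler shift
R = c*T*f_range/(2*B)              # target range [m]       -> 150 m
v = lam*f_doppler/2                # line-of-sight velocity -> 0.031 m/s
print(R, v)
```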
Linear prediction and single-channel recording.
Carter, A A; Oswald, R E
1995-08-01
The measurement of individual single-channel events arising from the gating of ion channels provides a detailed data set from which the kinetic mechanism of a channel can be deduced. In many cases, the pattern of dwells in the open and closed states is very complex, and the kinetic mechanism and parameters are not easily determined. Assuming a Markov model for channel kinetics, the probability density function for open and closed time dwells should consist of a sum of decaying exponentials. One method of approaching the kinetic analysis of such a system is to determine the number of exponentials and the corresponding parameters which comprise the open and closed dwell time distributions. These can then be compared to the relaxations predicted from the kinetic model to determine, where possible, the kinetic constants. We report here the use of a linear technique, linear prediction/singular value decomposition, to determine the number of exponentials and the exponential parameters. Using simulated distributions and comparing with standard maximum-likelihood analysis, the singular value decomposition techniques provide advantages in some situations and are a useful adjunct to other single-channel analysis techniques.
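The following minimal sketch (synthetic, noise-free data; illustrative only, not the authors' code) shows the linear prediction/singular value decomposition idea applied to a two-exponential decay: the number of significant singular values counts the exponentials, and a low-order, Prony-style linear prediction recovers the time constants.

```python
import numpy as np

dt = 0.1
n = np.arange(200)
y = 0.7*np.exp(-n*dt/0.5) + 0.3*np.exp(-n*dt/3.0)   # synthetic 2-exp decay

# 1) count components: rank of a Hankel-type data matrix via SVD
L = 50
H = np.stack([y[i:i+L] for i in range(len(y) - L)])
s = np.linalg.svd(H, compute_uv=False)
k = int(np.sum(s > 1e-8*s[0]))         # significant singular values -> 2 here

# 2) recover time constants with an order-k linear prediction
Hk = np.stack([y[i:i+k] for i in range(len(y) - k)])
coef, *_ = np.linalg.lstsq(Hk, y[k:], rcond=None)
z = np.roots(np.r_[1.0, -coef[::-1]])  # roots of the prediction polynomial
tau = -dt/np.log(z.real)
print(k, np.sort(tau))                 # ~ [0.5, 3.0]
```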
Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.
Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun
2018-05-08
Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline of how to achieve maximal tissue clearance with minimal disruption of structural integrity via psPACT and mPACT.
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Measurement Protocols for In situ Analysis of Organic Compounds at Mars and Comets
NASA Technical Reports Server (NTRS)
Mahaffy, P. R.; Brinckerhuff, W. B.; Buch, A.; Cabane, M.; Coll, P.; Demick, J.; Glavin, D. P.; Navarro-Gonzalez, R.
2005-01-01
The determination of the abundance and chemical and isotopic composition of organic molecules in comets and those that might be found in protected environments at Mars is a first step toward understanding prebiotic chemistries on these solar system bodies. While future sample return missions from Mars and comets will enable detailed chemical and isotopic analysis with a wide range of analytical techniques, precursor in situ investigations can complement these missions and facilitate the identification of optimal sites for sample return. Robust automated experiments that make efficient use of limited spacecraft power, mass, and data volume resources are required for use by in situ missions. Within these constraints we continue to explore a range of instrument techniques and measurement protocols that can maximize the return from such in situ investigations.
Method for Determination of Less Than 5 ppm Oxygen in Sodium Samples
NASA Technical Reports Server (NTRS)
Reid, R. S.; Martin, J. J.; Schmidt, G. L.
2005-01-01
Alkali metals used in pumped loops or heat pipes must be sufficiently free of nonmetallic impurities to ensure long heat rejection system life. Life issues are well established for alkali metal systems. Impurities can form ternary compounds between the container and working fluid, leading to corrosion. This Technical Memorandum discusses the consequences of impurities and candidate measurement techniques to determine whether impurities have been reduced to sufficiently low levels within a single-phase liquid metal loop or a closed two-phase heat transfer system, such as a heat pipe. These techniques include vanadium wire equilibration, neutron activation analysis, plug traps, distillation, and chemical analysis. Conceptual procedures for performing vanadium wire equilibration purity measurements on sodium contained in a heat pipe are discussed in detail.
NASA Astrophysics Data System (ADS)
Davis, Benjamin L.; Berrier, Joel C.; Shields, Douglas W.; Kennefick, Julia; Kennefick, Daniel; Seigar, Marc S.; Lacy, Claud H. S.; Puerari, Ivânio
2012-04-01
A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing two-dimensional fast Fourier transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow comparison of spiral galaxy pitch angle to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques.
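A condensed sketch of the log-spiral decomposition idea (grid sizes, radius limits and nearest-neighbour resampling are simplifying assumptions; the published pipeline is more careful): resample the image onto a (ln r, θ) grid, take a two-dimensional FFT, and convert the dominant radial frequency of a chosen harmonic mode m into a pitch angle.

```python
import numpy as np

def pitch_angle(img, m=2, n_r=128, n_t=256, r_min=5, r_max=100):
    """Estimate spiral pitch angle (degrees, magnitude only) via a 2-D FFT
    in (ln r, theta) space. Assumes a deprojected, centred galaxy image."""
    cy, cx = np.array(img.shape) / 2
    u = np.linspace(np.log(r_min), np.log(r_max), n_r)    # u = ln r
    th = np.linspace(0, 2*np.pi, n_t, endpoint=False)
    rr = np.exp(u)[:, None]
    ys = (cy + rr*np.sin(th)).astype(int) % img.shape[0]  # crude log-polar
    xs = (cx + rr*np.cos(th)).astype(int) % img.shape[1]  # resampling
    polar = img[ys, xs]
    A = np.fft.fft2(polar)                                # axes: (p over u, m)
    p_freqs = np.fft.fftfreq(n_r, d=u[1]-u[0]) * 2*np.pi
    p_max = p_freqs[np.argmax(np.abs(A[:, m]))]           # dominant p for mode m
    return np.degrees(np.arctan2(m, abs(p_max)))          # tan(P) ~ m/|p|
```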
De Luca, Michele; Ragno, Gaetano; Ioele, Giuseppina; Tauler, Romà
2014-07-21
An advanced and powerful chemometric approach is proposed for the analysis of incomplete multiset data obtained by fusion of hyphenated liquid chromatographic DAD/MS data with UV spectrophotometric data from acid-base titration and kinetic degradation experiments. Column- and row-wise augmented data blocks were combined and simultaneously processed by means of a new version of the multivariate curve resolution-alternating least squares (MCR-ALS) technique, including the simultaneous analysis of incomplete multiset data from different instrumental techniques. The proposed procedure was applied to the detailed study of the kinetic photodegradation process of the amiloride (AML) drug. All chemical species involved in the degradation and equilibrium reactions were resolved and the pH dependent kinetic pathway described. Copyright © 2014 Elsevier B.V. All rights reserved.
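The core bilinear alternating-least-squares iteration underlying MCR-ALS can be sketched in a few lines (synthetic data; crude non-negativity by clipping; the published method additionally handles incomplete multiset blocks, e.g. by masking missing entries, and applies proper constraints):

```python
import numpy as np

# Resolve concentration profiles C and spectra S from D ~ C @ S.T.
rng = np.random.default_rng(0)
C_true = np.abs(rng.normal(size=(60, 2)))      # 2 hypothetical species
S_true = np.abs(rng.normal(size=(40, 2)))
D = C_true @ S_true.T

S = np.abs(rng.normal(size=S_true.shape))      # random initial spectra
for _ in range(200):                           # alternating least squares
    C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)
    S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)
print(np.linalg.norm(D - C @ S.T))             # residual after the iterations
```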
Jiang, Zhi-quan; Hu, Ke-liang
2016-03-01
In the field of forensic science, conventional infrared spectral analysis techniques are often unable to meet detection requirements, because only a very small amount of trace evidence, with diverse shapes and complex compositions, can be extracted from a crime scene. Infrared microscopic technique was developed as a combination of Fourier-transform infrared spectroscopy and microscopy. It has many advantages over conventional infrared spectroscopy, such as high detection sensitivity, micro-area analysis, and nondestructive examination, and it has effectively solved the problem of authentication of trace material evidence in forensic science. Additionally, almost no external interference is introduced during measurements by infrared microscopy, so it satisfies the special requirement that trace material evidence be preserved for presentation in court. Real case analyses from this experimental center illustrate in detail the advantages of infrared microscopic technique in the authentication of trace material evidence. In this paper, the vibrational features in infrared spectra of material evidence, including paints, plastics, rubbers, fibers, drugs and toxicants, are comparatively analyzed by means of infrared microscopic technique, in an attempt to provide powerful spectroscopic evidence for the qualitative diagnosis of various criminal and traffic accident cases. The experimental results clearly suggest that infrared microscopic technique has an incomparable advantage and has become an effective method for the authentication of trace material evidence in the field of forensic science.
Shining light on neurons--elucidation of neuronal functions by photostimulation.
Eder, Matthias; Zieglgänsberger, Walter; Dodt, Hans-Ulrich
2004-01-01
Many neuronal functions can be elucidated by techniques that allow for precise stimulation of defined regions of a neuron and its afferents. Photolytic release of neurotransmitters from 'caged' derivatives in the vicinity of visualized neurons in living brain slices meets this requirement. This technique allows the study of the subcellular distribution and properties of functional native neurotransmitter receptors. These are prerequisites for a detailed analysis of the expression and spatial specificity of synaptic plasticity. Photostimulation can further be used to rapidly map the synaptic connectivity between nearby and, more importantly, distant cells in a neuronal network. Here we give a personal review of some of the technical aspects of photostimulation and recent findings which illustrate the advantages of this technique.
Soft errors in commercial off-the-shelf static random access memories
NASA Astrophysics Data System (ADS)
Dilillo, L.; Tsiligiannis, G.; Gupta, V.; Bosser, A.; Saigne, F.; Wrobel, F.
2017-01-01
This article reviews state-of-the-art techniques for the evaluation of the effect of radiation on static random access memory (SRAM). We detail irradiation test techniques and results from irradiation experiments with several types of particles. Two commercial SRAMs, in 90 and 65 nm technology nodes, were considered as case studies. Besides the basic static and dynamic test modes, advanced stimuli for the irradiation tests were introduced, as well as statistical post-processing techniques allowing for deeper analysis of the correlations between bit-flip cross-sections and design/architectural characteristics of the memory device. Further insight is provided on the response of irradiated stacked-layer devices and on the use of characterized SRAM devices as particle detectors.
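The central quantity in such irradiation tests, the bit-flip cross-section, is simple to compute; a back-of-envelope example with invented numbers:

```python
# SEU cross-section from an irradiation run (hypothetical numbers).
n_flips   = 137          # bit flips observed during the run
fluence   = 1.0e10       # particles/cm^2 delivered
n_bits    = 8 * 1024**2  # device size: 8 Mbit

sigma_dev = n_flips / fluence       # device cross-section [cm^2]
sigma_bit = sigma_dev / n_bits      # per-bit cross-section [cm^2/bit]
print(f"{sigma_bit:.3e} cm^2/bit")  # ~1.6e-15 cm^2/bit
```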
Preparing Colorful Astronomical Images II
NASA Astrophysics Data System (ADS)
Levay, Z. G.; Frattare, L. M.
2002-12-01
We present additional techniques for using mainstream graphics software (Adobe Photoshop and Illustrator) to produce composite color images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope to produce photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to present more detail and additional techniques, taking advantage of new or improved features available in the latest software versions. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels.
The study of the thermally expanded core technique in end-pumped (N+1)×1 type combiner
NASA Astrophysics Data System (ADS)
Wu, Juan; Sun, Yinhong; Wang, Yanshan; Li, Tenglong; Feng, Yujun; Ma, Yi
2015-02-01
Tapering raises the signal loss in an end-pumped (N+1)×1 type combiner. In this paper, the Thermally Expanded Core (TEC) technique is used in a signal loss optimization experiment with a pump-combiner tapering ratio of 0.6. The experimental results indicate that the coupling efficiency of the 1.55 μm signal light increases from 81.1% to 86.6% after the homo-waist region of the tapered signal fiber is heated for 10 minutes with an 8 mm wide hydroxygen flame. Detailed analysis shows that the TEC technique can reduce the loss of both the LP01 and LP11 modes in the signal fiber.
Application of the Shell/3D Modeling Technique for the Analysis of Skin-Stiffener Debond Specimens
NASA Technical Reports Server (NTRS)
Krueger, Ronald; O'Brien, T. Kevin; Minguet, Pierre J.
2002-01-01
The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to three-point bending is demonstrated. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front, was used to capture the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlation of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherents.
Identifying Solution Paths in Cognitive Diagnosis.
1985-03-01
the clinical tradition. Psychoanalysis gets its distinctive flavor to a large extent from the various diagnostic techniques pioneered by Freud and...his disciples: free association, dream analysis, and projective tests. These methods are content-oriented, and aim at a detailed description of the...Also, Psychoanalysis shares with Test Psychology the ambition to find out about stable properties of the individual, properties of his mental life
Microlensing observations rapid search for exoplanets: MORSE code for GPUs
NASA Astrophysics Data System (ADS)
McDougall, Alistair; Albrow, Michael D.
2016-02-01
The rapid analysis of ongoing gravitational microlensing events has been integral to the successful detection and characterization of cool planets orbiting low-mass stars in the Galaxy. In this paper, we present an implementation of search and fit techniques on graphical processing unit (GPU) hardware. The method allows for the rapid identification of candidate planetary microlensing events and their subsequent follow-up for detailed characterization.
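The search step parallelizes naturally because each parameter-grid node is evaluated independently, which is what makes GPU hardware attractive; below is a CPU-bound sketch of the idea (standard point-source point-lens magnification; synthetic data and grid ranges invented). On a GPU, each grid node would map to one thread.

```python
import numpy as np

def pspl_mag(t, t0, tE, u0):
    """Standard point-source point-lens magnification."""
    u = np.sqrt(u0**2 + ((t - t0)/tE)**2)
    return (u**2 + 2)/(u*np.sqrt(u**2 + 4))

t = np.linspace(0, 100, 400)                 # synthetic light curve (days)
flux = pspl_mag(t, 50, 12, 0.3) + np.random.normal(0, 0.02, t.size)

grid = [(t0, tE, u0)
        for t0 in np.linspace(30, 70, 41)
        for tE in np.linspace(5, 30, 26)
        for u0 in np.linspace(0.05, 1.0, 20)]
chi2 = [np.sum((flux - pspl_mag(t, *p))**2) for p in grid]
print(grid[int(np.argmin(chi2))])            # best-fit (t0, tE, u0)
```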
A technique for mapping urban ash trees using object-based image analysis
Dacia M. Meneguzzo; Susan J. Crocker; Greg C. Liknes
2010-01-01
Ash trees are an important resource in the State of Minnesota and a common fixture lining the streets of the Twin Cities metropolitan area. In 2009, the emerald ash borer (EAB), an invasive pest of ash, was discovered in the city of St. Paul. To properly respond to the new-found threat, decisionmakers would benefit from detailed, spatially explicit information on the...
NASA Astrophysics Data System (ADS)
Sureshkumar, B.; Mary, Y. Sheena; Resmi, K. S.; Panicker, C. Yohannan; Armaković, Stevan; Armaković, Sanja J.; Van Alsenoy, C.; Narayana, B.; Suma, S.
2018-03-01
Two 8-hydroxyquinoline derivatives, 5,7-dichloro-8-hydroxyquinoline (57DC8HQ) and 5-chloro-7-iodo-8-hydroxyquinoline (5CL7I8HQ), have been investigated in detail by means of spectroscopic characterization and computational molecular modelling techniques. FT-IR and FT-Raman experimental spectroscopic approaches have been utilized in order to obtain detailed spectroscopic signatures of the title compounds, while DFT calculations have been used to visualize and assign vibrations. The computed values of dipole moment, polarizability and hyperpolarizability indicate that the title molecules exhibit NLO properties. The evaluated HOMO and LUMO energies demonstrate the chemical stability of the molecules. NBO analysis was performed to study the stability of the molecules arising from hyperconjugative interactions and charge delocalization. DFT calculations have also been used jointly with MD simulations in order to investigate in detail the global and local reactivity properties of the title compounds. Molecular docking has also been used to investigate the affinity of the title compounds as decarboxylase inhibitors, and quinoline derivatives could serve as lead compounds for developing new antiparkinsonian drugs.
NASA Astrophysics Data System (ADS)
Adamson, Kathryn; Candy, Ian; Whitfield, Liz
2015-04-01
Pedogenic calcretes are abundant in arid and semi-arid regions, and they are widely used as proxy records of palaeoclimatic change. Calcrete oxygen (δ18O) and carbon (δ13C) isotopic signatures are indicative of temperature, aridity, or vegetation at the time of calcrete formation. Their microfabrics also reflect carbonate formation mechanisms in response to the prevailing environmental conditions. Many studies have explored calcrete micromorphology or stable isotope composition, but these techniques have not yet been applied simultaneously. This co-analysis is important as it allows us to establish whether calcrete morphology directly reflects environmental change. This study tests the potential of combining these analyses to examine the relationships between calcrete microfabrics, their isotopic signals, and Quaternary climate change. Calcretes from four river terraces of the Rio Alias in southeast Spain have been analysed in detail. On the basis of morphostratigraphic correlation (Maher et al., 2007) and Uranium-series ages (Candy et al., 2005), these span the period from 304 ± 26 ka (MIS 9) to the Holocene. The oldest profiles have therefore been exposed to multiple glacial-interglacial cycles. A total of 37 micromorphological profiles have been used to extract stable oxygen and carbon isotopic indicators from 77 microfacies. The morphological and isotopic complexity of the calcrete profiles increases with progressive age. The oldest samples display multiple calcretisation phases, and their microfabrics have a larger isotopic range than the younger samples. Alpha (non-biogenic) fabrics have higher δ13C and δ18O values than beta (biogenic) fabrics. Strong positive covariance between δ13C and δ18O within all profiles suggests that both isotopes are responding to the same environmental parameter. We suggest that this is relative aridity. The study demonstrates that the detailed co-analysis of calcrete micromorphology and stable isotope signatures allows calcrete formation patterns to be placed into a wider palaeoclimatic context. Importantly, this technique provides a level of detail that is not possible through bulk isotope sampling alone. It demonstrates the potential of this technique to more reliably constrain the palaeoenvironmental significance of secondary carbonates in dryland settings where other proxy records may be poorly preserved.
An interactive local flattening operator to support digital investigations on artwork surfaces.
Pietroni, Nico; Massimiliano, Corsini; Cignoni, Paolo; Scopigno, Roberto
2011-12-01
Analyzing either high-frequency shape detail or any other 2D field (scalar or vector) embedded over a 3D geometry is a complex task, since detaching the detail from the overall shape can be tricky. An alternative approach is to move to the 2D space, reducing shape reasoning to simpler image processing techniques. In this paper we propose a novel framework for the analysis of 2D information distributed over 3D geometry, based on a locally smooth parametrization technique that allows us to treat local 3D data in terms of image content. The proposed approach has been implemented as a sketch-based system that allows the user to design, with a few gestures, a set of (possibly overlapping) parameterizations of rectangular portions of the surface. We demonstrate that, due to the locality of the parametrization, the distortion is under an acceptable threshold, while discontinuities can be avoided since the parametrized geometry is always homeomorphic to a disk. We show the effectiveness of the proposed technique in solving specific Cultural Heritage (CH) tasks: the analysis of chisel marks over the surface of an unfinished sculpture and the local comparison of multiple photographs mapped over the surface of an artwork. For this very difficult task, we believe that our framework and the corresponding tool are the first steps toward a computer-based shape reasoning system, able to support CH scholars with a medium they are more used to. © 2011 IEEE
A comparison of measured and theoretical predictions for STS ascent and entry sonic booms
NASA Technical Reports Server (NTRS)
Garcia, F., Jr.; Jones, J. H.; Henderson, H. R.
1983-01-01
Sonic boom measurements have been obtained during the flights of STS-1 through 5. During STS-1, 2, and 4, entry sonic boom measurements were obtained, and ascent measurements were made on STS-5. The objectives of this measurement program were (1) to define the sonic boom characteristics of the Space Transportation System (STS), (2) to provide a realistic assessment of the validity of existing theoretical prediction techniques, and (3) to establish a level of confidence for predicting future STS configuration sonic boom environments. Detailed evaluation and reporting of the results of this program are in progress. This paper addresses only the significant results, mainly the data obtained during the entry of STS-1 at Edwards Air Force Base (EAFB) and the ascent of STS-5 from Kennedy Space Center (KSC). The theoretical prediction technique employed in this analysis is the so-called Thomas program. This prediction technique is a semi-empirical method that requires definition of the near-field signatures, detailed trajectory characteristics, and the prevailing meteorological characteristics as input. The analytical procedure then extrapolates the near-field signatures from the flight altitude to an altitude consistent with each measurement location.
Performance evaluation of a digital mammography unit using a contrast-detail phantom
NASA Astrophysics Data System (ADS)
Elizalde-Cabrera, J.; Brandan, M.-E.
2015-01-01
The relation between image quality and mean glandular dose (MGD) has been studied for a Senographe 2000D mammographic unit used for research in our laboratory. These quantities were evaluated for a clinically relevant range of acrylic thicknesses and radiological techniques. The CDMAM phantom was used to determine the contrast-detail curve. An alternative method based on the analysis of signal-to-noise (SNR) and contrast-to-noise (CNR) ratios from the CDMAM image was also proposed and applied. A simple numerical model was utilized to successfully interpret the results. Optimum radiological techniques were determined using the figures of merit FOM_SNR = SNR²/MGD and FOM_CNR = CNR²/MGD. The main results were: the evaluation of the detector response flattening process (it reduces by about one half the spatial non-homogeneities due to the X-ray field), MGD measurements (the values comply with standards), and verification of the automatic exposure control performance (it is sensitive to fluence attenuation, not to contrast). For 4-5 cm phantom thicknesses, the optimum radiological techniques were Rh/Rh 34 kV to optimize SNR, and Rh/Rh 28 kV to optimize CNR.
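A toy illustration of how the figures of merit rank techniques (ROI statistics, CNR and MGD values are all invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
disc = rng.normal(1.05, 0.02, 500)   # ROI inside a CDMAM gold disc (made up)
bg   = rng.normal(1.00, 0.02, 500)   # background ROI
cnr = (disc.mean() - bg.mean()) / bg.std()

techniques = {"Rh/Rh 28 kV": (cnr, 1.2),        # (CNR, MGD in mGy) -- invented
              "Rh/Rh 34 kV": (1.1*cnr, 1.9)}
for name, (c, mgd) in techniques.items():
    print(name, "FOM_CNR =", round(c**2/mgd, 2))  # higher is better per dose
```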
Active Neutron and Gamma-Ray Instrumentation for In Situ Planetary Science Applications
NASA Technical Reports Server (NTRS)
Parsons, A.; Bodnarik, J.; Evans, L.; Floyd, A.; Lim, L.; McClanahan, T.; Namkung, M.; Nowicki, S.; Schweitzer, J.; Starr, R.;
2011-01-01
We describe the development of an instrument capable of detailed in situ bulk geochemical analysis of the surface of planets, moons, asteroids, and comets. This instrument technology uses a pulsed neutron generator to excite the solid materials of a planet and measures the resulting neutron and gamma-ray emission with its detector system. These time-resolved neutron and gamma-ray data provide detailed information about the bulk elemental composition, chemical context, and density distribution of the soil within 50 cm of the surface. While active neutron scattering and neutron-induced gamma-ray techniques have been used extensively for terrestrial nuclear well logging applications, our goal is to apply these techniques to surface instruments for use on any solid solar system body. As described, experiments at NASA Goddard Space Flight Center use a prototype neutron-induced gamma-ray instrument and the resulting data presented show the promise of this technique for becoming a versatile, robust, workhorse technology for planetary science, and exploration of any of the solid bodies in the solar system. The detection of neutrons at the surface also provides useful information about the material. This paper focuses on the data provided by the gamma-ray detector.
NASA Technical Reports Server (NTRS)
Whiteman, David N.
2003-01-01
The intent of this paper and its companion is to compile together the essential information required for the analysis of Raman lidar water vapor and aerosol data acquired using a single laser wavelength. In this first paper several details concerning the evaluation of the lidar equation when measuring Raman scattering are considered. These details include the influence of the temperature dependence of both pure rotational and vibrational-rotational Raman scattering on the lidar profile. These are evaluated for the first time using a new form of the lidar equation. The results indicate that, for the range of temperatures encountered in the troposphere, the magnitude of the temperature dependent effect can reach 10% or more for narrowband Raman water vapor measurements. Also the calculation of atmospheric transmission is examined carefully including the effects of depolarization. Different formulations of Rayleigh cross section determination commonly used in the lidar field are compared revealing differences up to 5% among the formulations. The influence of multiple scattering on the measurement of aerosol extinction using the Raman lidar technique is considered as are several photon pulse-pileup correction techniques.
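As a schematic reminder of the measurement this analysis supports (the standard form used in the Raman lidar literature, not a formula quoted from the abstract), the water vapor mixing ratio is formed from the ratio of the Raman water vapor and nitrogen signals, a differential-transmission correction, and a temperature-dependent factor of the kind quantified above:

```latex
w(z) \;=\; \Delta_w\,
\frac{P_{\mathrm{H_2O}}(z)}{P_{\mathrm{N_2}}(z)}\,
\exp\!\left[-\int_0^z \big(\alpha(\lambda_{\mathrm{N_2}},z') -
\alpha(\lambda_{\mathrm{H_2O}},z')\big)\,dz'\right]
F\big(T(z)\big)
```

Here \(P\) denotes the Raman backscatter signals, \(\Delta_w\) the calibration constant, \(\alpha\) the atmospheric extinction at the two Raman-shifted wavelengths, and \(F(T)\) the temperature-dependent correction of the Raman cross sections, which for narrowband systems can reach the ~10% level discussed in the abstract.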
Making a big thing of a small cell--recent advances in single cell analysis.
Galler, Kerstin; Bräutigam, Katharina; Große, Christina; Popp, Jürgen; Neugebauer, Ute
2014-03-21
Single cell analysis is an emerging field requiring a high level interdisciplinary collaboration to provide detailed insights into the complex organisation, function and heterogeneity of life. This review is addressed to life science researchers as well as researchers developing novel technologies. It covers all aspects of the characterisation of single cells (with a special focus on mammalian cells) from morphology to genetics and different omics-techniques to physiological, mechanical and electrical methods. In recent years, tremendous advances have been achieved in all fields of single cell analysis: (1) improved spatial and temporal resolution of imaging techniques to enable the tracking of single molecule dynamics within single cells; (2) increased throughput to reveal unexpected heterogeneity between different individual cells raising the question what characterizes a cell type and what is just natural biological variation; and (3) emerging multimodal approaches trying to bring together information from complementary techniques paving the way for a deeper understanding of the complexity of biological processes. This review also covers the first successful translations of single cell analysis methods to diagnostic applications in the field of tumour research (especially circulating tumour cells), regenerative medicine, drug discovery and immunology.
A finite difference-time domain technique for modeling narrow apertures in conducting scatterers
NASA Technical Reports Server (NTRS)
Demarest, Kenneth R.
1987-01-01
The finite difference-time domain (FDTD) technique has proven to be a valuable tool for the calculation of the transient and steady state scattering characteristics of relatively complex scatterer and source configurations. In spite of its usefulness, it exhibits serious deficiencies when used to analyze geometries that contain fine detail. An FDTD technique is described that utilizes Babinet's principle to decouple the regions on both sides of the aperture. The result is an FDTD technique that is capable of modeling apertures that are much smaller than the spatial grid used in the analysis and yet is not perturbed by numerical noise when used in the 'scattered field' mode. Numerical results are presented that show the field penetration through cavity-backed apertures that are much smaller than the spatial grid used during the solution.
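For orientation, the leapfrog update at the heart of any FDTD code can be sketched in a few lines (one-dimensional free space, hypothetical grid; the paper's aperture treatment via Babinet's principle is not reproduced here):

```python
import numpy as np

nz, nt = 400, 600
c0, dz = 3e8, 1e-3
dt = dz/(2*c0)                     # Courant-stable time step (Sc = 0.5)
imp0 = 376.73                      # free-space impedance
Ex = np.zeros(nz); Hy = np.zeros(nz - 1)

for n in range(nt):                # leapfrog E/H time stepping
    Hy += (Ex[1:] - Ex[:-1]) * dt*c0/(dz*imp0)       # H update (half step)
    Ex[1:-1] += (Hy[1:] - Hy[:-1]) * dt*c0*imp0/dz   # E update
    Ex[50] += np.exp(-((n - 80)/20.0)**2)            # soft Gaussian source
```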
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area has been enhanced using Fast Fourier Transformation (FFT) based filtering approach, and also using Frost filtering technique. Both the enhanced SAR imagery have been then separately fused with histogram equalized IRS-1C LISS III image using Principal Component Analysis (PCA) technique. Later, Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis in the lithological interpretation between the published geological map and the fusion based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, few areas could be identified which need attention for further detailed ground-based geological studies.
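A sketch of the PCA fusion step named above (moment matching is used here in place of full histogram matching; array shapes are hypothetical): the first principal component of the optical bands, which carries most of the spatial variance, is replaced by the matched SAR band before inverting the transform.

```python
import numpy as np

def pca_fuse(optical, sar):
    """optical: (rows, cols, bands) float array; sar: (rows, cols) float."""
    r, c, b = optical.shape
    X = optical.reshape(-1, b)
    mean = X.mean(axis=0)
    Xc = X - mean
    w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
    V = V[:, ::-1]                      # eigenvectors, largest variance first
    pcs = Xc @ V
    s = sar.ravel().astype(float)       # match SAR moments to PC1
    s = (s - s.mean())/s.std() * pcs[:, 0].std() + pcs[:, 0].mean()
    pcs[:, 0] = s                       # swap in the SAR detail
    return (pcs @ V.T + mean).reshape(r, c, b)
```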
On the Power of Abstract Interpretation
NASA Technical Reports Server (NTRS)
Reddy, Uday S.; Kamin, Samuel N.
1991-01-01
Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code is altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Sekar et al.; hence, our proofs are much simpler. They should be useful for future improvements.
Puppe, B; Schön, P C; Wendland, K
1999-07-01
The paper presents a new system for the automatic monitoring of open field activity and choice behaviour of medium-sized animals. Passive infrared motion detectors (PID) were linked on-line via a digital I/O interface to a personal computer provided with self-developed analysis software based on LabVIEW (PID technique). The setup was used for testing 18 one-week-old piglets (Sus scrofa) for their approach to their mother's nursing vocalization replayed through loudspeakers. The results were validated by comparison with a conventional Observer technique, a computer-aided direct observation. In most cases, no differences were seen between the Observer and PID techniques regarding the percentage of stay in previously defined open field segments, the locomotor open field activity, and the choice behaviour. The results revealed that piglets are clearly attracted by their mother's nursing vocalization. The monitoring system presented in this study is thus suitable for detailed behavioural investigations of individual acoustic recognition. In general, the PID technique is a useful tool for research into the behaviour of individual animals in a restricted open field which does not rely on subjective analysis by a human observer.
Advanced technology development multi-color holography
NASA Technical Reports Server (NTRS)
Vikram, Chandra S.
1993-01-01
This is the final report of the Multi-color Holography project. The comprehensive study considers some strategic aspects of multi-color holography. First, the available techniques for accurate fringe counting are reviewed: heterodyne interferometry, quasi-heterodyne interferometry, and phase-shifting interferometry. Phase-shifting interferometry was found to be the most suitable for multi-color holography. Details of experimentation with a sugar solution are also reported, where a measurement capability of better than 1/200 of a fringe order was established. A rotating glass-plate phase shifter was used for the experimentation. The report then describes the possible role of using more than two wavelengths, with special attention to reference-to-object beam intensity ratio requirements in multi-color holography. Some specific two- and three-color cases are also described in detail. Then some new analysis methods for the reconstructed wavefront are considered: deflectometry, speckle metrology, confocal optical signal processing, and applications related to the phase-shifting technique. Finally, design aspects of an experimental breadboard are presented.
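For reference, the classic four-bucket phase recovery that underlies phase-shifting interferometry of the kind described can be sketched as follows; the fringe data here are synthetic, and the experimental configuration above is not reproduced.

```python
# Four-step phase-shifting sketch: frames at shifts of 0, 90, 180, 270 degrees
# give the wrapped phase via phi = atan2(I4 - I2, I1 - I3).
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    return np.arctan2(i4 - i2, i1 - i3)   # wrapped to (-pi, pi]

# Synthetic check against a known phase ramp (hypothetical fringe data).
phi_true = np.linspace(-np.pi, np.pi, 100)
frames = [1 + np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi = four_step_phase(*frames)            # recovers phi_true up to wrapping
```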
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The first part of the study resulted in photographic procedures for making multispectral positive images which greatly enhance the color differences in land detail using an additive color viewer. An additive color analysis of the geologic features near Willcox, Arizona using enhanced black and white multispectral positives allowed compilation of a significant number of unmapped geologic units which do not appear on geologic maps of the area. The second part demonstrated the feasibility of utilizing Skylab remote sensor data to monitor and manage the coastal environment by relating physical, chemical, and biological ship sampled data to S190A, S190B, and S192 image characteristics. Photographic reprocessing techniques were developed which greatly enhanced subtle low brightness water detail. Using these photographic contrast-stretch techniques, two water masses having an extinction coefficient difference of only 0.07 measured simultaneously with the acquisition of S190A data were readily differentiated.
Sarkar, Debasish; Mandal, Kalyan; Mandal, Madhuri
2014-03-01
Here, a solvothermal technique has been used to synthesize hollow nanospheres of magnetite. We have shown that PVP plays an important role in controlling the particle size and also helps the particles take the shape of hollow spheres. Structural analysis was done by XRD measurement, and morphological measurements such as SEM and TEM were performed to confirm the formation of hollow spherical particles and to investigate their shape and size. Detailed ac/dc magnetic measurements indicate the suitability of these nanospheres for hyperthermia therapy, and the spontaneous dye adsorption properties (Gibbs free energy ΔG° = -0.526 kJ/mol for Eosin and -1.832 kJ/mol for MB) indicate potential use in the dye industry. Being hollow in structure and magnetic in nature, such materials will also be useful in other fields of application such as drug delivery, arsenic and heavy metal removal by adsorption, and magnetic separation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Matthew Christopher, E-mail: wardm3@ccf.org; Pham, Yvonne D.; Kotecha, Rupesh
2016-04-01
Conventional parallel-opposed radiotherapy (PORT) is the established standard technique for early-stage glottic carcinoma. However, case reports have described the utility of intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT), with or without image guidance (image-guided radiotherapy, IGRT), in select patients. The proposed advantages of IMRT/VMAT include sparing of the carotid artery, thyroid gland, and the remaining functional larynx, although these benefits remain unclear. The following case study presents a patient with multiple vascular comorbidities treated with VMAT for early-stage glottic carcinoma. A detailed explanation of the corresponding treatment details, dose-volume histogram (DVH) analysis, and a review of the relevant literature are provided. Conventional PORT remains the standard of care for early-stage glottic carcinoma. IMRT or VMAT may be beneficial for select patients, although great care is necessary to avoid a geographical miss. Clinical data supporting the benefit of CRT are lacking; therefore, these techniques should be used with caution and only in selected patients.
Umbilical Connect Techniques Improvement-Technology Study
NASA Technical Reports Server (NTRS)
Valkema, Donald C.
1972-01-01
The objective of this study was to develop concepts, specifications, designs, techniques, and procedures capable of significantly reducing the time required to connect and verify umbilicals for ground services to the space shuttle. The desired goal was to reduce the current time requirement of several shifts for the Saturn 5/Apollo to an elapsed time of less than one hour to connect and verify all of the space shuttle ground service umbilicals. The study was conducted in four phases: (1) literature and hardware examination, (2) concept development, (3) concept evaluation and tradeoff analysis, and (4) selected concept design. The final product of this study was a detail design of a rise-off disconnect panel prototype test specimen for a LO2/LH2 booster (or an external oxygen/hydrogen tank for an orbiter), a detail design of a swing-arm mounted preflight umbilical carrier prototype test specimen, and a part 1 specification for the umbilical connect and verification design for the vehicles as defined in the space shuttle program.
Modelling, design and stability analysis of an improved SEPIC converter for renewable energy systems
NASA Astrophysics Data System (ADS)
G, Dileep; Singh, S. N.; Singh, G. K.
2017-09-01
In this paper, a detailed modelling and analysis of a switched inductor (SI)-based improved single-ended primary inductor converter (SEPIC) is presented. To increase the gain of the conventional SEPIC converter, the input and output side inductors are replaced with SI structures. Design and stability analysis for continuous conduction mode operation of the proposed SI-SEPIC converter are also presented. The state-space averaging technique is used to model the converter and carry out the stability analysis. The performance and stability of the closed-loop configuration are predicted by observing the open-loop behaviour using the Nyquist diagram and Nichols chart. The system was found to be stable and critically damped.
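A minimal sketch of the kind of open-loop frequency-response check described, using a hypothetical third-order transfer function rather than the paper's actual SI-SEPIC small-signal model:

```python
# Nyquist-style stability check on an assumed open-loop model G(s).
import numpy as np
from scipy import signal

G = signal.TransferFunction([2.0e4], [1.0, 1110.0, 111000.0, 1.0e6])

w = np.logspace(0, 5, 2000)            # rad/s
w, H = signal.freqresp(G, w)           # complex open-loop response

# Distance of the Nyquist locus from the critical point -1 + 0j.
print("closest approach to -1:", np.abs(H + 1.0).min())

# Gain margin from the -180 degree phase crossover, if one exists.
phase = np.unwrap(np.angle(H))
idx = np.nonzero(np.diff(np.sign(phase + np.pi)))[0]
if idx.size:
    print("approx. gain margin:", 1.0 / np.abs(H[idx[0]]))
```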
Eckhard, Ulrich; Huesgen, Pitter F; Schilling, Oliver; Bellac, Caroline L; Butler, Georgina S; Cox, Jennifer H; Dufour, Antoine; Goebeler, Verena; Kappelhoff, Reinhild; Auf dem Keller, Ulrich; Klein, Theo; Lange, Philipp F; Marino, Giada; Morrison, Charlotte J; Prudova, Anna; Rodriguez, David; Starr, Amanda E; Wang, Yili; Overall, Christopher M
2016-06-01
The data described provide a comprehensive resource for the family-wide active site specificity portrayal of the human matrix metalloproteinase family. We used the high-throughput proteomic technique PICS (Proteomic Identification of protease Cleavage Sites) to comprehensively assay 9 different MMPs. We identified more than 4300 peptide cleavage sites, spanning both the prime and non-prime sides of the scissile peptide bond allowing detailed subsite cooperativity analysis. The proteomic cleavage data were expanded by kinetic analysis using a set of 6 quenched-fluorescent peptide substrates designed using these results. These datasets represent one of the largest specificity profiling efforts with subsequent structural follow up for any protease family and put the spotlight on the specificity similarities and differences of the MMP family. A detailed analysis of this data may be found in Eckhard et al. (2015) [1]. The raw mass spectrometry data and the corresponding metadata have been deposited in PRIDE/ProteomeXchange with the accession number PXD002265.
Analyzing Visibility Configurations.
Dachsbacher, C
2011-04-01
Many algorithms, such as level-of-detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how it can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the extracted feature vectors. Our method enables perceptually motivated level-of-detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
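A toy version of the co-occurrence idea on a binary visibility map, with a pixel grid standing in for the paper's clusters of triangular surfaces (the data and the contrast feature are illustrative):

```python
# Co-occurrence statistics of a synthetic binary visibility map.
import numpy as np

vis = (np.random.default_rng(0).random((64, 64)) > 0.5).astype(int)

def cooccurrence(a, levels=2):
    """Count horizontally adjacent (a[y, x], a[y, x+1]) level pairs."""
    m = np.zeros((levels, levels), dtype=int)
    np.add.at(m, (a[:, :-1].ravel(), a[:, 1:].ravel()), 1)
    return m

m = cooccurrence(vis).astype(float)
m /= m.sum()                                  # normalize to probabilities
contrast = sum(m[i, j] * (i - j) ** 2 for i in range(2) for j in range(2))
print(m, contrast)                            # one entry of a feature vector
```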
NASA Technical Reports Server (NTRS)
Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.
2006-01-01
In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well-established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposited layer formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed for a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but it also provides a potential guide for investigating future high-altitude and high-energy accidents.
An advanced software suite for the processing and analysis of silicon luminescence images
NASA Astrophysics Data System (ADS)
Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.
2017-06-01
Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly in the field of photovoltaics, where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in the extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data into more meaningful results. In this work, a custom-built, Matlab-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration, and iron impurity concentration mapping.
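As an example of the PSF-deconvolution step such a suite might automate, here is a plain Richardson-Lucy iteration built on numpy/scipy; the Gaussian PSF and test scene are assumptions, and this is a sketch rather than the suite's actual code.

```python
# Richardson-Lucy deconvolution sketch with an assumed Gaussian PSF.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=25):
    est = np.full(image.shape, image.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        conv = np.clip(fftconvolve(est, psf, mode="same"), 1e-12, None)
        est *= fftconvolve(image / conv, psf_mirror, mode="same")
    return est

x = np.arange(-7, 8)
g = np.exp(-x**2 / (2 * 2.0**2))
psf = np.outer(g, g)
psf /= psf.sum()                         # normalized 2-D Gaussian PSF

scene = np.zeros((64, 64)); scene[30:34, 30:34] = 1.0
blurred = fftconvolve(scene, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```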
Zimmerman, Heather A; Meizel-Lambert, Cayli J; Schultz, John J; Sigman, Michael E
2015-03-01
Forensic anthropologists are generally able to identify skeletal materials (bone and tooth) using gross anatomical features; however, highly fragmented or taphonomically altered materials may be problematic to identify. Several chemical analysis techniques have been shown to be reliable laboratory methods that can be used to determine if questionable fragments are osseous, dental, or non-skeletal in nature. The purpose of this review is to provide a detailed background of chemical analysis techniques focusing on elemental compositions that have been assessed for use in differentiating osseous, dental, and non-skeletal materials. More recently, chemical analysis studies have also focused on using the elemental composition of osseous/dental materials to evaluate species and provide individual discrimination, but have generally been successful only in small, closed groups, limiting their use forensically. Despite significant advances incorporating a variety of instruments, including handheld devices, further research is necessary to address issues in standardization, error rates, and sample size/diversity. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
The view from the tip of the iceberg.
Josephs, L
1997-01-01
In recent years there has been a growing interest in refining the technique of ego defense analysis. All of these approaches share in common an attempt to work closely with the patient's free associations, to interpret at a level that is accessible to the patient's consciously observing ego, and to avoid bypassing the analysis of the patient's most surface-level resistances in an effort to understand unconscious conflict. These innovations reflect a commendable effort to work in a way that is rigorously empirical, that respects the patient's autonomy, and that minimizes the pressure of the analyst's transferential authority in the patient's acceptance of the analyst's interpretations. Despite the undeniable value of these technical innovations, such approaches to ego defense analysis may inadvertently result in certain overemphases in technique that may unnecessarily constrain the analytic process. They may result in a sort of obsessive tunnel vision that is overly focused on small details to the exclusion of the larger picture. An approach that counterbalances the microscopic and the macroscopic analysis of ego defense is recommended.
Using multiple group modeling to test moderators in meta-analysis.
Schoemann, Alexander M
2016-12-01
Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis where both the mean and the between studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
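A compact sketch of the underlying idea: fit a random-effects model (here DerSimonian-Laird) separately to each moderator level and compare the pooled means. The effect sizes and variances are made up, and SEM/MLM software would add the cross-group constraint machinery described above.

```python
# Two-group random-effects comparison on hypothetical study-level effects.
import numpy as np

def random_effects(y, v):
    """DerSimonian-Laird pooled effect, tau^2, and variance of the mean."""
    w = 1.0 / v
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)
    return mu, tau2, 1.0 / np.sum(w_star)

g1 = random_effects(np.array([0.30, 0.45, 0.25]), np.array([0.020, 0.030, 0.015]))
g2 = random_effects(np.array([0.10, 0.05, 0.20]), np.array([0.020, 0.025, 0.030]))
z = (g1[0] - g2[0]) / np.sqrt(g1[2] + g2[2])   # Wald test of group difference
print(g1[0], g2[0], z)
```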
Reliability studies of Integrated Modular Engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
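The binomial-approximation part of such a comparison can be sketched very simply: the reliability of a k-of-n redundant network versus an all-must-work series system. The component reliability below is an assumed illustrative value, not a figure from the study.

```python
# Binomial redundancy reliability for a k-of-n system of identical components.
from math import comb

def k_of_n_reliability(n, k, r):
    """P(at least k of n components survive), component reliability r."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

r = 0.98                               # hypothetical single-engine reliability
print(k_of_n_reliability(8, 7, r))     # 7-of-8 networked system
print(r**8)                            # 8-of-8 non-redundant series system
```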
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1991-01-01
The behavior of stochastic processes whose power spectra are described by power-law behavior was studied. The details of the analysis and the conclusions reached are presented. This analysis was extended to compare the detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises is also investigated.
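A common way to synthesize such power-law noises for time-domain study is spectral shaping of white noise, sketched below; the exponent alpha is a free parameter, and this is illustrative rather than the study's code.

```python
# Generate noise with spectrum S(f) ~ f^(-alpha) by shaping white noise.
import numpy as np

def power_law_noise(n, alpha, rng=np.random.default_rng(0)):
    white = rng.standard_normal(n)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                                  # avoid division by zero at DC
    shaped = np.fft.rfft(white) * f**(-alpha / 2.0)
    return np.fft.irfft(shaped, n)

x = power_law_noise(4096, alpha=2.0)             # random-walk-like spectrum
```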
2006-04-21
C. M., and Prendergast, J. P., 2002, "Thermal Analysis of Hypersonic Inlet Flow with Exergy-Based Design Methods," International Journal of Applied... A parametric study of the PS and its components is first presented in order to show the type of detailed information on internal system losses which an exergy... "Thermoeconomic Isolation Applied to the Optimal Synthesis/Design of an Advanced Fighter Aircraft System," International Journal of Thermodynamics, ICAT
NASA Technical Reports Server (NTRS)
Kindle, E. C.; Bandy, E. C.; Copeland, G.; Blais, R.; Levy, G.; Sonenshine, D.
1975-01-01
Past research projects for the year 1974-1975 are listed along with future research programs in the area of air pollution control, remote sensor analysis of smoke plumes, the biosphere component, and field experiments. A detailed budget analysis is presented. Attachments are included on the following topics: mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques, and use of LARS system for the quantitative determination of smoke plume lateral diffusion coefficients from ERTS images of Virginia.
Troy, Karen L; Edwards, W Brent
2018-05-01
Quantitative CT (QCT) analysis involves the calculation of specific parameters such as bone volume and density from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can be of poor to unusable quality. Good quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high quality QCT data, and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include: standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of the recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results. Copyright © 2018 Elsevier Inc. All rights reserved.
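One standardization step in QCT, density calibration against a phantom, reduces to a linear fit of known rod densities versus measured CT numbers; the phantom values below are made up for illustration.

```python
# QCT density calibration sketch with hypothetical phantom data.
import numpy as np

rod_density_mg_cm3 = np.array([0.0, 75.0, 150.0])   # known phantom densities
rod_hu = np.array([2.1, 81.5, 157.9])               # measured mean HU (made up)

slope, intercept = np.polyfit(rod_hu, rod_density_mg_cm3, 1)

def hu_to_bmd(hu):
    """Map CT numbers (HU) to equivalent bone mineral density (mg/cm^3)."""
    return slope * hu + intercept

print(hu_to_bmd(120.0))
```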
The Evolution of 3D Microimaging Techniques in Geosciences
NASA Astrophysics Data System (ADS)
Sahagian, D.; Proussevitch, A.
2009-05-01
In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices, a technique that typically destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (the greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray, to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" X-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and a theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additionally, new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in X-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
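The stereological point about spheres can be checked in a few lines: for uniformly random planar cuts of a sphere of radius R, the observed circle radius r = sqrt(R^2 - z^2) has density f(r) = r / (R sqrt(R^2 - r^2)), which peaks at near-equatorial cuts. A Monte Carlo sketch (not from the paper):

```python
# Monte Carlo check that random cuts of a sphere favour near-equatorial circles.
import numpy as np

R = 1.0
z = np.random.default_rng(1).uniform(-R, R, 100_000)  # random cut heights
r = np.sqrt(R**2 - z**2)                              # observed circle radii

# Analytic density f(r) = r / (R * sqrt(R^2 - r^2)) diverges as r -> R,
# so a large fraction of cross-sections look nearly equatorial.
print(np.mean(r > 0.9 * R))   # ~0.44: within 10% of the maximum radius
```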
Automated diagnosis of fetal alcohol syndrome using 3D facial image analysis
Fang, Shiaofen; McLaughlin, Jason; Fang, Jiandong; Huang, Jeffrey; Autti-Rämö, Ilona; Fagerlund, Åse; Jacobson, Sandra W.; Robinson, Luther K.; Hoyme, H. Eugene; Mattson, Sarah N.; Riley, Edward; Zhou, Feng; Ward, Richard; Moore, Elizabeth S.; Foroud, Tatiana
2012-01-01
Objectives: Use three-dimensional (3D) facial laser scanned images from children with fetal alcohol syndrome (FAS) and controls to develop an automated diagnosis technique that can reliably and accurately identify individuals prenatally exposed to alcohol. Methods: A detailed dysmorphology evaluation, history of prenatal alcohol exposure, and 3D facial laser scans were obtained from 149 individuals (86 FAS; 63 control) recruited from two study sites (Cape Town, South Africa and Helsinki, Finland). Computer graphics, machine learning, and pattern recognition techniques were used to automatically identify a set of facial features that best discriminated individuals with FAS from controls in each sample. Results: An automated feature detection and analysis technique was developed and applied to the two study populations. A unique set of facial regions and features was identified for each population that accurately discriminated FAS and control faces without any human intervention. Conclusion: Our results demonstrate that computer algorithms can be used to automatically detect facial features that can discriminate FAS and control faces. PMID:18713153
Nanoscale infrared spectroscopy as a non-destructive probe of extraterrestrial samples.
Dominguez, Gerardo; Mcleod, A S; Gainsforth, Zack; Kelly, P; Bechtel, Hans A; Keilmann, Fritz; Westphal, Andrew; Thiemens, Mark; Basov, D N
2014-12-09
Advances in the spatial resolution of modern analytical techniques have tremendously augmented the scientific insight gained from the analysis of natural samples. Yet, while techniques for the elemental and structural characterization of samples have achieved sub-nanometre spatial resolution, infrared spectral mapping of geochemical samples at vibrational 'fingerprint' wavelengths has remained restricted to spatial scales >10 μm. Nevertheless, infrared spectroscopy remains an invaluable contactless probe of chemical structure, details of which offer clues to the formation history of minerals. Here we report on the successful implementation of infrared near-field imaging, spectroscopy and analysis techniques capable of sub-micron scale mineral identification within natural samples, including a chondrule from the Murchison meteorite and a cometary dust grain (Iris) from NASA's Stardust mission. Complementary to scanning electron microscopy, energy-dispersive X-ray spectroscopy and transmission electron microscopy probes, this work evidences a similarity between chondritic and cometary materials, and inaugurates a new era of infrared nano-spectroscopy applied to small and invaluable extraterrestrial samples.
Filtering of high noise breast thermal images using fast non-local means.
Suganthi, S S; Ramakrishnan, S
2014-01-01
Analysis of breast thermograms is still a challenging task, primarily due to limitations such as low contrast, low signal-to-noise ratio, and the absence of clear edges. Therefore, there is always a requirement for preprocessing techniques before performing any quantitative analysis. In this work, a noise removal framework using the fast non-local means algorithm, method noise, and a median filter was used to denoise breast thermograms. The images considered were subjected to the Anscombe transformation to convert the noise distribution from Poisson to Gaussian. The pre-denoised image was obtained by subjecting the transformed image to fast non-local means filtering. The method noise, which is the difference between the original and pre-denoised images, was observed to contain the noise component merged with a few structures and fine details of the image. The image details present in the method noise were extracted by smoothing the noise part using the median filter. The retrieved image part was added to the pre-denoised image to obtain the final denoised image. The performance of this technique was compared with that of Wiener and SUSAN filters. The results show that all the filters considered are able to remove the noise component. The performance of the proposed denoising framework is found to be good in preserving detail and removing noise, and its method noise contains negligible image detail. The Wiener filter produced a denoised image with no noise but smoothed edges, and its method noise contained a few structures and image details. The SUSAN filter produced a blurred denoised image with a little residual noise, and its method noise contained extensive structure and image detail. Hence, the proposed denoising framework appears able to preserve edge information and generate clear images that could help enhance the diagnostic relevance of breast thermograms.
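The described chain (Anscombe transform, non-local means, method-noise median recovery) can be sketched as follows, with scikit-image's NLM standing in for the fast non-local means implementation and a synthetic Poisson image in place of a thermogram:

```python
# Sketch of the Anscombe -> NLM -> method-noise recovery pipeline.
import numpy as np
from scipy.ndimage import median_filter
from skimage.restoration import denoise_nl_means

img = np.random.poisson(lam=30, size=(128, 128)).astype(float)  # stand-in

a = 2.0 * np.sqrt(img + 3.0 / 8.0)             # Anscombe: Poisson -> ~Gaussian
den = denoise_nl_means(a, patch_size=5, patch_distance=6, h=1.0, fast_mode=True)
method_noise = a - den                         # residual holding lost detail
den_final = den + median_filter(method_noise, size=3)

restored = (den_final / 2.0) ** 2 - 3.0 / 8.0  # simple inverse Anscombe
```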
Baù, Marco; Ferrari, Marco; Ferrari, Vittorio
2017-01-01
A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors. PMID:28574459
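Reading the series resonant frequency and quality factor from the decaying transient amounts to fitting an exponentially damped sinusoid; below is a sketch on synthetic data, where the sample rate, decay time, and frequency are illustrative and Q is taken as pi*f*tau.

```python
# Ring-down fit: A*exp(-t/tau)*sin(2*pi*f*t + phi) on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def ringdown(t, a, tau, f, phi):
    return a * np.exp(-t / tau) * np.sin(2 * np.pi * f * t + phi)

fs = 1.0e6                                   # sample rate (illustrative)
t = np.arange(0, 2e-3, 1 / fs)
rng = np.random.default_rng(2)
y = ringdown(t, 1.0, 4e-4, 50e3, 0.3) + 0.01 * rng.standard_normal(t.size)

# Seed the frequency guess from the FFT peak, then refine by least squares.
f0 = np.fft.rfftfreq(t.size, 1 / fs)[np.argmax(np.abs(np.fft.rfft(y)))]
(a, tau, f, phi), _ = curve_fit(ringdown, t, y, p0=[1.0, 1e-4, f0, 0.0])
print(f, np.pi * f * tau)                    # resonant frequency and Q
```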
Change analysis in the United Arab Emirates: An investigation of techniques
Sohl, Terry L.
1999-01-01
Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
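Of the techniques listed, change-vector analysis is easy to sketch: the per-pixel magnitude and direction of the band-space difference between the two dates. Synthetic two-band arrays stand in for the 1987 and 1996 TM data, and the threshold is an arbitrary illustration.

```python
# Change-vector analysis sketch on synthetic two-band imagery.
import numpy as np

rng = np.random.default_rng(3)
bands_1987 = rng.random((2, 128, 128))   # e.g. red and NIR
bands_1996 = rng.random((2, 128, 128))

dv = bands_1996 - bands_1987
magnitude = np.linalg.norm(dv, axis=0)   # how much change
direction = np.arctan2(dv[1], dv[0])     # what kind of change

changed = magnitude > magnitude.mean() + 2 * magnitude.std()  # crude threshold
```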
Detailed Modeling and Analysis of the CPFM Dataset
NASA Technical Reports Server (NTRS)
Swartz, William H.; Lloyd, Steven A.; DeMajistre, Robert
2004-01-01
A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The principal objective of this study is to cross-validate j-values from the Composition and Photodissociative Flux Measurement (CPFM) instrument during the Photochemistry of Ozone Loss in the Arctic Region In Summer (POLARIS) and SAGE III Ozone Loss and Validation Experiment (SOLVE) field campaigns with model calculations and other measurements, and to use this detailed analysis to improve our ability to determine j-values. Another objective is to analyze the spectral flux from the CPFM (not just the j-values) and, using a multi-wavelength/multi-species spectral fitting technique, determine atmospheric composition.
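For orientation, a j-value is the wavelength integral of absorption cross-section, quantum yield, and actinic flux, j = ∫ σ(λ) φ(λ) F(λ) dλ; the sketch below evaluates it numerically on made-up spectra with arbitrary units.

```python
# Numerical j-value integral on illustrative spectra (arbitrary units).
import numpy as np

wl = np.linspace(290.0, 420.0, 400)                   # wavelength grid (nm)
sigma = np.exp(-((wl - 310.0) / 15.0) ** 2)           # cross-section (arb.)
phi = np.clip(1.0 - (wl - 290.0) / 130.0, 0.0, 1.0)   # quantum yield
flux = np.maximum(0.0, (wl - 295.0) / 125.0)          # actinic flux (arb.)

j = np.trapz(sigma * phi * flux, wl)                  # photolysis rate coeff.
print(j)
```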
Fulop, Sean A; Fitz, Kelly
2006-01-01
A modification of the spectrogram (log magnitude of the short-time Fourier transform) to more accurately show the instantaneous frequencies of signal components was first proposed in 1976 [Kodera et al., Phys. Earth Planet. Inter. 12, 142-150 (1976)], and has been considered or reinvented a few times since but never widely adopted. This paper presents a unified theoretical picture of this time-frequency analysis method, the time-corrected instantaneous frequency spectrogram, together with detailed implementable algorithms comparing three published techniques for its computation. The new representation is evaluated against the conventional spectrogram for its superior ability to track signal components. The lack of a uniform framework for either mathematics or implementation details which has characterized the disparate literature on the schemes has been remedied here. Fruitful application of the method is shown in the realms of speech phonation analysis, whale song pitch tracking, and additive sound modeling.
Using Social Network Measures in Wildlife Disease Ecology, Epidemiology, and Management
Silk, Matthew J.; Croft, Darren P.; Delahay, Richard J.; Hodgson, David J.; Boots, Mike; Weber, Nicola; McDonald, Robbie A.
2017-01-01
Contact networks, behavioral interactions, and shared use of space can all have important implications for the spread of disease in animals. Social networks enable the quantification of complex patterns of interactions; therefore, network analysis is becoming increasingly widespread in the study of infectious disease in animals, including wildlife. We present an introductory guide to using social-network-analytical approaches in wildlife disease ecology, epidemiology, and management. We focus on providing detailed practical guidance for the use of basic descriptive network measures by suggesting the research questions to which each technique is best suited and detailing the software available for each. We also discuss how using network approaches can be used beyond the study of social contacts and across a range of spatial and temporal scales. Finally, we integrate these approaches to examine how network analysis can be used to inform the implementation and monitoring of effective disease management strategies. PMID:28596616
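The basic descriptive measures discussed can be computed directly with networkx on a toy contact network; the animal IDs and contacts below are hypothetical.

```python
# Descriptive network measures on a toy wildlife contact network.
import networkx as nx

g = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")])

print(nx.degree_centrality(g))        # contact heterogeneity between hosts
print(nx.betweenness_centrality(g))   # potential transmission bridges
print(nx.clustering(g))               # local triadic closure
```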
Three-Dimensional Integrated Survey for Building Investigations.
Costantino, Domenica; Angelini, Maria Giuseppa
2015-11-01
The study shows the results of a survey aimed at representing a building collapse and the feasibility of modelling as a support for structural analysis. An integrated survey using topographic, photogrammetric, and terrestrial laser techniques was carried out to obtain a three-dimensional (3D) model of the building, plans and elevations, and the particulars of the collapsed area. The authors acquired, by photogrammetric survey, information about the regular parts of the structure, while using laser scanner data they reconstructed a set of the more interesting architectural details and areas with higher surface curvature. Specifically, the texturing process provided a detailed 3D structure of the areas under investigation. The analysis of the acquired data proved very useful both in identifying the causes of the disaster and in helping the reconstruction of the collapsed corner, showing the contribution that integrated surveys can give to preserving architectural and historic heritage. © 2015 American Academy of Forensic Sciences.
DANSS Neutrino Spectrometer: Detector Calibration, Response Stability, and Light Yield
NASA Astrophysics Data System (ADS)
Alekseev, I. G.; Belov, V. V.; Danilov, M. V.; Zhitnikov, I. V.; Kobyakin, A. S.; Kuznetsov, A. S.; Machikhiliyan, I. V.; Medvedev, D. V.; Rusinov, V. Yu.; Svirida, D. N.; Skrobova, N. A.; Starostin, A. S.; Tarkovsky, E. I.; Fomina, M. V.; Shevchik, E. A.; Shirchenko, M. V.
2018-05-01
Apart from monitoring nuclear reactor parameters, the DANSS neutrino experiment is aimed at searching for sterile neutrinos through a detailed analysis of the ratio of reactor antineutrino spectra measured at different distances from the reactor core. The light collection system of the detector is dual, comprising both vacuum photomultiplier tubes (PMTs) and silicon photomultipliers (SiPMs). In this paper, the techniques developed to calibrate the responses of these photodetectors are discussed in detail. The long-term stability of the key parameters of the detector and their dependences on the ambient temperature are investigated. The results of detector light yield measurements, performed independently with PMTs and SiPMs, are reported.
Kobayashi, Yuta; Kawaguchi, Yoshikuni; Kobayashi, Kosuke; Mori, Kazuhiro; Arita, Junichi; Sakamoto, Yoshihiro; Hasegawa, Kiyoshi; Kokudo, Norihiro
2017-12-01
Portal vein (PV) territory identification during liver resection may be performed using the indocyanine green (ICG) fluorescence imaging technique. However, the technical details of the fluorescence staining technique have not been fully elucidated. This study was performed to demonstrate the technical details of PV territory identification using fluorescence imaging and to evaluate the short-term outcomes. From 2011 to 2015, 105 patients underwent liver resection at the University of Tokyo Hospital with one of the following fluorescence staining techniques by transhepatic PV injection or intravenous injection of ICG: single staining (n = 36), multiple staining (n = 31), counterstaining (n = 22), negative staining (n = 13), or paradoxical negative staining (n = 3). The PV territory was identified as a region with fluorescence or a defect of fluorescence using one of the five staining techniques. ICG was administered by transhepatic PV injection in all but the negative staining technique, which employed intravenous injection. No adverse events associated with the ICG administration occurred. The mortality, postoperative total morbidity, and major complication (Clavien-Dindo grade ≥III) rates were 0.0%, 14.3%, and 7.6%, respectively. We have demonstrated the technical details of five types of fluorescence staining techniques. These techniques are safe to perform and facilitate clear visualization of the PV territory in real time, enhancing the efficacy of anatomical removal of such territories. © 2017 Wiley Periodicals, Inc.
Clinical process cost analysis.
Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B
1997-09-01
New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
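The linking step, attaching a cost to each activity in the flow diagram and rolling up the totals, is essentially the following; the activity names, times, and rates are invented for illustration.

```python
# Activity-based cost rollup sketch with hypothetical clinical activities.
activities = [
    ("pre-op workup",   90, 1.2),   # (name, personnel minutes, $ per minute)
    ("operating room", 240, 6.0),
    ("ICU day 1",      120, 2.5),
    ("ward care",      300, 0.8),
]

costs = {name: minutes * rate for name, minutes, rate in activities}
total = sum(costs.values())
for name, c in costs.items():
    print(f"{name:15s} ${c:8.2f}  ({100 * c / total:4.1f}%)")
```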
Forty years of temporal analysis of products
Morgan, K.; Maguire, N.; Fushimi, R.; ...
2017-05-16
Detailed understanding of mechanisms and reaction kinetics is required in order to develop and optimize catalysts and catalytic processes. While steady state investigations are known to give a global view of the catalytic system, transient studies are invaluable since they can provide more detailed insight into elementary steps. For almost thirty years, temporal analysis of products (TAP) has been successfully utilized for transient studies of gas phase heterogeneous catalysis, and there have been a number of advances in instrumentation and numerical modeling methods in that time. In the current work, the range of available TAP apparatus is discussed, while detailed explanations of the types of TAP experiment, the information that can be determined from these experiments, and the analysis methods are also included. TAP is a complex methodology and is often viewed as a niche specialty. Here, part of the intention of this work is to highlight the significant contributions TAP can make to catalytic research, while also discussing the issues which will make TAP more relevant and approachable to a wider segment of the catalytic research community. With this in mind, an outlook is also disclosed for the technique in terms of what is needed to revitalize the field and make it more applicable to recent advances in catalyst characterization (e.g., operando modes).
An observational model for biomechanical assessment of sprint kayaking technique.
McDonnell, Lisa K; Hume, Patria A; Nolte, Volker
2012-11-01
Sprint kayaking stroke phase descriptions for biomechanical analysis of technique vary across the kayaking literature, with inconsistencies that are not conducive to the advancement of applied biomechanics service or research. We aimed to provide a consistent basis for the categorisation and analysis of sprint kayak technique by proposing a clear observational model. Electronic databases were searched using the key words kayak, sprint, technique, and biomechanics, with 20 sources reviewed. Nine phase-defining positions were identified within the kayak literature and were divided into three distinct types based on how the positions were defined: water-contact-defined positions, paddle-shaft-defined positions, and body-defined positions. Videos of elite paddlers from multiple camera views were reviewed to determine the visibility of the positions used to define phases. The water-contact-defined positions of catch, immersion, extraction, and release were visible from multiple camera views and therefore suitable for practical use by coaches and researchers. Using these positions, phases and sub-phases were created for a new observational model. We recommend that kayaking data be reported using single strokes and described using two phases: water and aerial. For more detailed analysis without disrupting the basic two-phase model, a four-sub-phase model consisting of entry, pull, exit, and aerial sub-phases should be used.
Guevara, María Ángeles; de María, Nuria; Sáez-Laguna, Enrique; Vélez, María Dolores; Cervera, María Teresa; Cabezas, José Antonio
2017-01-01
Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is the methylation-sensitive amplified polymorphism (MSAP) technique, which is a modification of amplified fragment length polymorphism (AFLP). It has been used to study methylation of anonymous CCGG sequences in different fungi, plants, and animal species. The main variation of this technique resides in the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as the frequent-cutter restriction enzyme. For each sample, MSAP analysis is performed using both EcoRI/HpaII- and EcoRI/MspI-digested samples. A comparative analysis of the EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms, as sketched below: (1) methylation-insensitive polymorphisms, which show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples, and (2) methylation-sensitive polymorphisms, which are associated with amplified fragments that differ in their presence or absence, or in their intensity, between the EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses the modifications that can be applied to adapt the technology to different species of interest.
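The comparative HpaII/MspI scoring logic can be made explicit in a few lines; the sketch below encodes one common interpretation of the four band patterns, noting that laboratory scoring conventions vary.

```python
# MSAP band-pattern scoring sketch (one common interpretation).
def score_msap(hpaii: bool, mspi: bool) -> str:
    """Classify a CCGG site from band presence in the two isoschizomer lanes."""
    if hpaii and mspi:
        return "unmethylated CCGG"
    if hpaii and not mspi:
        return "hemimethylated external cytosine (mCCGG)"
    if not hpaii and mspi:
        return "internal cytosine methylation (CmCGG)"
    return "full methylation, or absence of the target site"

for pattern in [(True, True), (True, False), (False, True), (False, False)]:
    print(pattern, "->", score_msap(*pattern))
```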
OPERATIONS RESEARCH IN THE DESIGN OF MANAGEMENT INFORMATION SYSTEMS
The design of management information systems is concerned with the identification and detailed specification of the information and data processing... of advanced data processing techniques in management information systems today, the close coordination of operations research and data systems activities has become a practical necessity for the modern business firm... information systems in which mathematical models are employed as the basis for analysis and systems design. Operations research provides a...
Computerized series solution of relativistic equations of motion.
NASA Technical Reports Server (NTRS)
Broucke, R.
1971-01-01
A method of solution of the equations of planetary motion is described. It consists of the use of numerical general perturbations in orbital elements and in rectangular coordinates. The solution is expanded in Fourier series in the mean anomaly with the aid of harmonic analysis and computerized series manipulation techniques. A detailed application to the relativistic motion of the planet Mercury is described both for Schwarzschild and isotropic coordinates.
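The harmonic-analysis step, recovering Fourier coefficients in the mean anomaly from sampled values, can be sketched with an FFT; the sampled series here is synthetic, and the relativistic force model is not reproduced.

```python
# Recover Fourier coefficients in mean anomaly M from uniform samples.
import numpy as np

n = 256
M = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
x = 0.3 * np.cos(M) + 0.05 * np.cos(2 * M) + 0.01 * np.sin(3 * M)  # stand-in

c = np.fft.rfft(x) / n
a = 2 * c.real        # cosine coefficients a_k
b = -2 * c.imag       # sine coefficients b_k
print(a[1], a[2], b[3])   # ~0.3, ~0.05, ~0.01
```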
Dynamic characteristics of a vibrating beam with periodic variation in bending stiffness
NASA Technical Reports Server (NTRS)
Townsend, John S.
1987-01-01
A detailed dynamic analysis is performed of a vibrating beam with bending stiffness periodic in the spatial coordinate. The effects of system parameters on beam response are explored with a perturbation expansion technique. It is found that periodic stiffness acts to modulate the modal displacements from the characteristic shape of a simple sine wave. The results are verified by a finite element solution and through experimental testing.
How to fracture formations (in Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
del Risco V.M.
1971-01-01
Government-owned Petroleos del Peru has found the limited-entry fracturing technique to be the most suitable under the conditions prevailing in its northwestern Peruvian oil fields. There, most formations available for stimulation are low-permeability, highly compact sands interbedded with thin and thick layers of clay. After experimenting with 8 different commercially available methods, a detailed analysis of the results showed the Shoot-Frac system to be the most effective.
Molecular Beam Studies of Volatile Liquids and Fuel Surrogates Using Liquid Microjets
2014-12-18
themselves. Detailed discussions of the microjet technique are carried out in the following publications. ...heating and evaporation occur within 1 ms of fuel leaving the fuel injector. This atomization process is often the limiting process in combustion... This analysis leads to criteria for selecting the temperature and nozzle radius for producing stable jets in vacuum.
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 4 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties for clinical physician assistants. (BT)
Intercomparison between ozone profiles measured above Spitsbergen by lidar and sonde techniques
NASA Technical Reports Server (NTRS)
Fabian, Rolf; Vondergathen, Peter; Ehlers, J.; Krueger, Bernd C.; Neuber, Roland; Beyerle, Georg
1994-01-01
This paper compares coincident ozone profile measurements by electrochemical sondes and lidar performed at Ny-Alesund/Spitsbergen. A detailed height dependent statistical analysis of the differences between these complementary methods was performed for the overlapping altitude region (13-35 km). The data set comprises ozone profile measurements conducted between Jan. 1989 and Jan. 1991. Differences of up to 25 percent were found above 30 km altitude.
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 5 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in cardio-pulmonary, EEG, EKG, and inhalation therapy. (BT)
Operational Based Vision Assessment Cone Contrast Test: Description and Operation
2016-06-02
The work detailed in this report was conducted by the Operational Based Vision Assessment (OBVA)... currently used by the Air Force for aircrew color vision screening. The new OBVA CCT is differentiated from the Rabin device primarily by hardware..., test procedures, and analysis techniques. Like the Rabin CCT, the OBVA CCT uses colors that selectively stimulate the cone photoreceptors of the...
Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae.
Toma, Milan; Bloodworth, Charles H; Pierce, Eric L; Einstein, Daniel R; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S
2017-03-01
The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations.
Direct numerical simulation of the flow around an aerofoil in ramp-up motion
NASA Astrophysics Data System (ADS)
Rosti, Marco E.; Omidyeganeh, Mohammad; Pinelli, Alfredo
2016-02-01
A detailed analysis of the flow around a NACA0020 aerofoil at Re_c = 2 × 10^4 undergoing a ramp-up motion has been carried out by means of direct numerical simulations. During the manoeuvre, the angle of attack is varied linearly in time between 0° and 20° with a constant rate of change of α̇ = 0.12 U∞/c rad. When the angle of incidence reaches the final value, the lift experiences a first overshoot and then suddenly decreases towards the static-stall asymptotic value. The transient instantaneous flow is dominated by the generation and detachment of the dynamic stall vortex, a large-scale structure formed by the merging of smaller-scale vortices generated by an instability originating at the trailing edge. New insights into the vorticity dynamics leading to the lift overshoot, the lift crisis, and the damped oscillatory cycle that gradually matches the steady condition are discussed using a number of post-processing techniques. These include a detailed analysis of the flow ensemble-average statistics and coherent-structure identification carried out using the Q-criterion and the finite-time Lyapunov exponent technique. The results are compared with those obtained in a companion simulation of a static stall condition at the final angle of incidence α = 20°.
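The Q-criterion used above for coherent-structure identification is a standard definition, stated here for reference rather than taken from the paper: with the velocity gradient split into the strain-rate tensor S and the rotation tensor Ω,

\[
Q = \tfrac{1}{2}\left(\lVert\Omega\rVert^{2} - \lVert S\rVert^{2}\right),
\qquad S = \tfrac{1}{2}\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{T}\right),
\quad \Omega = \tfrac{1}{2}\left(\nabla\mathbf{u} - \nabla\mathbf{u}^{T}\right),
\]

vortical structures are visualized as iso-surfaces of Q > 0, the regions where rotation dominates strain.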
The new ATLAS Fast Calorimeter Simulation
NASA Astrophysics Data System (ADS)
Schaarschmidt, J.; ATLAS Collaboration
2017-10-01
Current and future needs for large-scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory footprint compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.
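As an illustration of the kind of dimensionality reduction the abstract mentions, a minimal PCA sketch in Python follows; the shower representation (energy fractions per calorimeter layer), sample sizes, and array shapes are assumptions for illustration, not the ATLAS implementation.

import numpy as np

# Toy stand-in for parameterized shower data: n_showers x n_layers
# energy fractions (rows sum to 1). Real FastCaloSim inputs differ.
rng = np.random.default_rng(0)
raw = rng.gamma(shape=2.0, scale=1.0, size=(5000, 24))
frac = raw / raw.sum(axis=1, keepdims=True)

# PCA via SVD of the mean-centred data.
mean = frac.mean(axis=0)
u, s, vt = np.linalg.svd(frac - mean, full_matrices=False)

# Keep the few leading components that carry most of the variance;
# storing (mean, vt[:k], coefficients) is far smaller than raw showers.
k = 5
coeff = (frac - mean) @ vt[:k].T          # compressed representation
recon = coeff @ vt[:k] + mean             # decompressed shower shapes

explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"{k} components explain {explained:.1%} of the variance")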
NASA Astrophysics Data System (ADS)
Kanawade, Rajesh; Stelzle, Florian; Schmidt, Michael
This paper presents a novel methodology for early detection of clinical shock by monitoring hemodynamic changes using a diffuse reflectance measurement technique. A detailed prototype of the reflectance measurement system and a data analysis technique for hemodynamic monitoring were developed in our laboratory. Real-time in-vivo measurements were taken from the index finger. This study demonstrates preliminary results of real-time monitoring of reduced/oxyhemoglobin changes during clogging and unclogging of blood flow in the fingertip. The obtained results were verified against the values of a pulse oximeter connected to the tip of the same index finger.
Liquid-propellant rocket engines health-monitoring—a survey
NASA Astrophysics Data System (ADS)
Wu, Jianjun
2005-02-01
This paper is intended to give a summary of health-monitoring technology, which is one of the key technologies both for improving and enhancing the reliability and safety of current rocket engines and for developing new-generation, highly reliable reusable rocket engines. The meaning of health-monitoring and the fundamental principles underlying fault detection and diagnostics are elucidated. The main aspects of health-monitoring, such as system frameworks, failure modes analysis, algorithms for fault detection and diagnosis, control means, and advanced sensor techniques, are illustrated in some detail. Finally, the evolution trend of health-monitoring techniques for liquid-propellant rocket engines is set out.
Scalable nuclear density functional theory with Sky3D
NASA Astrophysics Data System (ADS)
Afibuzzaman, Md; Schuetrumpf, Bastian; Aktulga, Hasan Metin
2018-02-01
In nuclear astrophysics, quantum simulations of large inhomogeneous dense systems as they appear in the crusts of neutron stars pose major challenges. The number of particles in a simulation with periodic boundary conditions is strongly limited by the immense computational cost of the quantum methods. In this paper, we describe techniques for an efficient and scalable parallel implementation of Sky3D, a nuclear density functional theory solver that operates on an equidistant grid. The presented techniques allow Sky3D to achieve good scaling and high performance on a large number of cores, as demonstrated through detailed performance analysis on a Cray XC40 supercomputer.
Application of full field optical studies for pulsatile flow in a carotid artery phantom
Nemati, M.; Loozen, G. B.; van der Wekken, N.; van de Belt, G.; Urbach, H. P.; Bhattacharya, N.; Kenjeres, S.
2015-01-01
A preliminary comparative measurement between particle imaging velocimetry (PIV) and laser speckle contrast analysis (LASCA) to study pulsatile flow using a ventricular assist device in a patient-specific carotid artery phantom is reported. These full-field optical techniques have both been used to study flow and extract complementary parameters. We use the high spatial resolution of PIV to generate a full velocity map of the flow field and the high temporal resolution of LASCA to extract the detailed frequency spectrum of the fluid pulses. This combination of techniques enables a complete study of complex pulsatile flow in an intricate flow network. PMID:26504652
Flight test trajectory control analysis
NASA Technical Reports Server (NTRS)
Walker, R.; Gupta, N.
1983-01-01
Recent extensions to optimal control theory applied to meaningful linear models with sufficiently flexible software tools provide powerful techniques for designing flight test trajectory controllers (FTTCs). This report describes the principal steps for systematic development of flight trajectory controllers, which can be summarized as planning, modeling, designing, and validating a trajectory controller. The techniques have been kept as general as possible and should apply to a wide range of problems where quantities must be computed and displayed to a pilot to improve pilot effectiveness and to reduce workload and fatigue. To illustrate the approach, a detailed trajectory guidance law is developed and demonstrated for the F-15 aircraft flying the zoom-and-pushover maneuver.
'From the moment of conception...': the Vatican instruction on artificial procreation techniques.
Coughlan, Michael J
1988-10-01
An analysis is presented of Instruction on Respect for Human Life in Its Origins and on the Dignity of Procreation, the official response of the Catholic Church to moral questions raised by the new reproductive technologies, which sets down ethical guidelines for the treatment to be accorded human embryos and for procreative techniques from artificial insemination to surrogate motherhood. The document is viewed in the perspective of earlier Church pronouncements, such as The Declaration on Procured Abortion, and its definition of a person as an individual animated by a rational soul is explored in detail for its implications for discussions on the personhood of the human embryo.
Surface emitting ring quantum cascade lasers for chemical sensing
NASA Astrophysics Data System (ADS)
Szedlak, Rolf; Hayden, Jakob; Martín-Mateos, Pedro; Holzbauer, Martin; Harrer, Andreas; Schwarz, Benedikt; Hinkov, Borislav; MacFarland, Donald; Zederbauer, Tobias; Detz, Hermann; Andrews, Aaron Maxwell; Schrenk, Werner; Acedo, Pablo; Lendl, Bernhard; Strasser, Gottfried
2018-01-01
We review recent advances in chemical sensing applications based on surface emitting ring quantum cascade lasers (QCLs). Such lasers can be implemented in monolithically integrated on-chip laser/detector devices forming compact gas sensors, which are based on direct absorption spectroscopy according to the Beer-Lambert law. Furthermore, we present experimental results on radio frequency modulation up to 150 MHz of surface emitting ring QCLs. This technique provides detailed insight into the modulation characteristics of such lasers. The gained knowledge facilitates the utilization of ring QCLs in combination with spectroscopic techniques, such as heterodyne phase-sensitive dispersion spectroscopy for gas detection and analysis.
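The direct-absorption principle the abstract invokes is the Beer-Lambert law; a minimal sketch follows, in which all numbers (path length, absorption cross-section, intensities) are made-up placeholders, not values from the paper.

import math

I0 = 1.000          # detected intensity with no absorber (arbitrary units)
I = 0.912           # detected intensity with gas in the path
L = 0.05            # optical path length in metres (assumed on-chip path)
sigma = 1.2e-22     # absorption cross-section in m^2 (placeholder value)

# Beer-Lambert: I = I0 * exp(-sigma * N * L)  =>  solve for number density N
absorbance = -math.log(I / I0)
N = absorbance / (sigma * L)     # molecules per m^3
print(f"number density ~ {N:.3e} m^-3")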
Study of inelastic e-Cd and e-Zn collisions
NASA Astrophysics Data System (ADS)
Piwinski, Mariusz; Klosowski, Lukasz; Dziczek, Darek; Chwirot, Stanislaw
2016-09-01
Electron-photon coincidence experiments are well known for providing more detailed information about electron-atom collisions than any other technique. The Electron Impact Coherence Parameters (EICP) values obtained in such studies deliver the most complete characterization of the inelastic collision and allow for a verification of proposed theoretical models. We present the results of Stokes and EICP parameters characterising the electronic excitation of the lowest singlet P-state of cadmium and zinc atoms for various collision energies. The experiments were performed using the electron-photon coincidence technique in the coherence analysis version. The obtained data are presented and compared with existing CCC and RDWA theoretical predictions.
Alikhasi, Marzieh; Siadat, Hakimeh; Kharazifard, Mohammad Javad
2015-01-01
Objectives: The purpose of this study was to compare the accuracy of implant position transfer and surface detail reproduction using two impression techniques and materials. Materials and Methods: A metal model with two implants and three grooves of 0.25, 0.50 and 0.75 mm in depth on the flat superior surface of a die was fabricated. Ten regular-body polyether (PE) and 10 regular-body polyvinyl siloxane (PVS) impressions with square and conical transfer copings using open-tray and closed-tray techniques were made for each group. Impressions were poured with type IV stone, and linear and angular displacements of the replica heads were evaluated using a coordinate measuring machine (CMM). Accurate reproduction of the grooves was also evaluated by a video measuring machine (VMM). These measurements were compared with the measurements calculated on the reference model that served as control, and the data were analyzed with two-way ANOVA and t-test at P = 0.05. Results: There was less linear displacement for PVS and less angular displacement for PE with the closed-tray technique, and less linear displacement for PE with the open-tray technique (P<0.001). The open-tray technique also showed less angular displacement with the use of PVS impression material. Detail reproduction accuracy was the same in all the groups (P>0.05). Conclusion: The open-tray technique was more accurate using PE, and both closed-tray and open-tray techniques had acceptable results with the use of PVS. The choice of impression material and technique made no significant difference in surface detail reproduction. PMID:27252761
Lee, Ki-Wook; Kim, Yeun; Perinpanayagam, Hiran; Lee, Jong-Ki; Yoo, Yeon-Jee; Lim, Sang-Min; Chang, Seok Woo; Ha, Byung-Hyun; Zhu, Qiang; Kum, Kee-Yeon
2014-03-01
Micro-computed tomography (MCT) shows detailed root canal morphology that is not seen with traditional tooth clearing. However, alternative image reformatting techniques in MCT involving 2-dimensional (2D) minimum intensity projection (MinIP) and 3-dimensional (3D) volume-rendering reconstruction have not been directly compared with clearing. The aim was to compare alternative image reformatting techniques in MCT with tooth clearing on the mesiobuccal (MB) root of maxillary first molars. Eighteen maxillary first molar MB roots were scanned, and 2D MinIP and 3D volume-rendered images were reconstructed. Subsequently, the same MB roots were processed by traditional tooth clearing. Images from 2D, 3D, 2D + 3D, and clearing techniques were assessed by 4 endodontists to classify canal configuration and to identify fine anatomic structures such as accessory canals, intercanal communications, and loops. All image reformatting techniques in MCT showed detailed configurations and numerous fine structures, such that none were classified as simple type I or II canals; several were classified as types III and IV according to Weine classification or types IV, V, and VI according to Vertucci; and most were nonclassifiable because of their complexity. The clearing images showed less detail, few fine structures, and numerous type I canals. Classification of canal configuration was in 100% intraobserver agreement for all 18 roots visualized by any of the image reformatting techniques in MCT but for only 4 roots (22.2%) classified according to Weine and 6 (33.3%) classified according to Vertucci, when using the clearing technique. The combination of 2D MinIP and 3D volume-rendered images showed the most detailed canal morphology and fine anatomic structures. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Raman Plus X: Biomedical Applications of Multimodal Raman Spectroscopy.
Das, Nandan K; Dai, Yichuan; Liu, Peng; Hu, Chuanzhen; Tong, Lieshu; Chen, Xiaoya; Smith, Zachary J
2017-07-07
Raman spectroscopy is a label-free method of obtaining detailed chemical information about samples. Its compatibility with living tissue makes it an attractive choice for biomedical analysis, yet its translation from a research tool to a clinical tool has been slow, hampered by fundamental Raman scattering issues such as long integration times and limited penetration depth. In this review we detail how combining Raman spectroscopy with other techniques yields multimodal instruments that can help to surmount the translational barriers faced by Raman alone. We review Raman combined with several optical and non-optical methods, including fluorescence, elastic scattering, OCT, phase imaging, and mass spectrometry. In each section we highlight the power of each combination along with a brief history and presentation of representative results. Finally, we conclude with a perspective detailing both benefits and challenges for multimodal Raman measurements, and give thoughts on future directions in the field.
Transverse mucoperiosteal flap inset by rotation for cleft palate repair: technique and outcomes.
Black, Jonathan S; Gampper, Thomas J
2014-01-01
Cleft palate is a relatively common deformity with various techniques described for its repair. Most techniques address the hard palate portion of the cleft with bilateral mucoperiosteal flaps transposed to the midline. This results in superimposed, linear closure layers directly over the cleft and may predispose the repair to oronasal fistula formation. This report details an alternative technique of flap rotation with an outcome analysis. A retrospective chart analysis was performed of all patients having undergone primary palatoplasty for cleft palate. Demographics and cleft Veau type were recorded. Postoperative speech outcomes were assessed by standardized speech evaluation performed by 2 speech language pathologists. The presence and location of oronasal fistulae was assessed and recorded by the surgeon and speech language pathologists in follow-up evaluations. The study revealed an overall incidence of velopharyngeal insufficiency of 5.7% using this surgical technique. It also revealed a fistula rate of 8.6%. Secondary surgery has been successful in those patients in which it was indicated. Eleven (31%) patients were diagnosed with Robin sequence. This technique demonstrates excellent early outcomes in a difficult subset of cleft patients including a high proportion of those with Pierre Robin sequence. The technique addresses the inherent disadvantages to a linear closure over the bony cleft. The variability in its design provides the surgeon another option for correction of this deformity.
PBF Cubicle 13. Shield wall details illustrate shielding technique of ...
PBF Cubicle 13. Shield wall details illustrate shielding technique of stepped penetrations and brick layout scheme for valve stem extension sleeve. Aerojet Nuclear Company. Date: May 1976. INEEL index no. 761-0620-00-400-195280 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID
Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis
NASA Technical Reports Server (NTRS)
Sexstone, Matthew G.
1998-01-01
This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
Analysis of the regulation of viral transcription.
Gloss, Bernd; Kalantari, Mina; Bernard, Hans-Ulrich
2005-01-01
Despite the small genomes and number of genes of papillomaviruses, regulation of their transcription is very complex and governed by numerous transcription factors, cis-responsive elements, and epigenetic phenomena. This chapter describes the strategies of how one can approach a systematic analysis of these factors, elements, and mechanisms. From the numerous different techniques useful for studying transcription, we describe in detail three selected protocols of approaches that have been relevant in shaping our knowledge of human papillomavirus transcription. These are DNAse I protection ("footprinting") for location of transcription-factor binding sites, electrophoretic mobility shifts ("gelshifts") for analysis of bound transcription factors, and bisulfite sequencing for analysis of DNA methylation as a prerequisite for epigenetic transcriptional regulation.
NASA Astrophysics Data System (ADS)
Cazzani, Antonio; Malagù, Marcello; Turco, Emilio
2016-03-01
We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems
Haghighi, Maryam; Rezaei, Karamatollah
2012-01-01
Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques, including experimental setups for small and large deformation rheological measurements and microstructural image analysis, were presented in more detail. PMID:22645484
Electron/proton separation and analysis techniques used in the AMS-02 (e+ + e-) flux measurement
NASA Astrophysics Data System (ADS)
Graziani, Maura; AMS-02 Collaboration
2016-04-01
AMS-02 is a large-acceptance cosmic ray detector which was installed on the International Space Station (ISS) in May 2011, where it is collecting cosmic rays up to TeV energies. The search for Dark Matter indirect signatures in the rare components of the cosmic ray fluxes is among the main objectives of the experiment. AMS-02 is providing cosmic electron and positron data with an unprecedented precision. This is achieved by means of the excellent hadron/electron separation power obtained by combining the independent measurements from the Transition Radiation Detector, the electromagnetic Calorimeter, and the Tracker detectors. In this contribution we detail the analysis techniques used to distinguish electrons from the hadronic background and show the in-flight performance of these detectors relevant for the electron/positron measurements.
NASA Technical Reports Server (NTRS)
Puttkamer, J. V.
1973-01-01
An analysis has been conducted to find out whether the management techniques developed in connection with the Apollo project could be used for dealing with such urgent problems of modern society as the crisis of the cities, the increasing environmental pollution, and the steadily growing traffic. Basic concepts and definitions of program and system management are discussed together with details regarding the employment of these concepts in connection with the solution of the problems of the Apollo program. Principles and significance of a systems approach are considered, giving attention to planning, system analysis, system integration, and project management. An application of the methods of project management to the problems of the civil sector is possible if the special characteristics of each particular case are taken into account.
Noise analysis for CCD-based ultraviolet and visible spectrophotometry.
Davenport, John J; Hodgkinson, Jane; Saffell, John R; Tatam, Ralph P
2015-09-20
We present the results of a detailed analysis of the noise behavior of two CCD spectrometers in common use, an AvaSpec-3648 CCD UV spectrometer and an Ocean Optics S2000 Vis spectrometer. Light sources used include a deuterium UV/Vis lamp and UV and visible LEDs. Common noise phenomena include source fluctuation noise, photoresponse nonuniformity, dark current noise, fixed pattern noise, and read noise. These were identified and characterized by varying the light source, spectrometer settings, or temperature. A number of noise-limiting techniques are proposed, demonstrating a best-case spectroscopic noise-equivalent absorbance of 3.5×10^-4 AU for the AvaSpec-3648 and 5.6×10^-4 AU for the Ocean Optics S2000 over a 30 s integration period. These techniques can be used on other CCD spectrometers to optimize performance.
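As an illustration of how a noise-equivalent absorbance figure like those quoted above can be estimated, here is a generic procedure sketched in Python (assumed for illustration, not the paper's protocol): record repeated blank spectra, convert them to absorbance against the mean blank, and take the spread.

import numpy as np

rng = np.random.default_rng(1)

# Simulated repeated blank spectra: a constant lamp level plus the noise
# sources discussed above (source fluctuation + read/dark noise).
n_repeats, n_pixels = 100, 2048
lamp = 30000.0
spectra = lamp * (1 + 0.001 * rng.standard_normal((n_repeats, 1))) \
          + 25.0 * rng.standard_normal((n_repeats, n_pixels))

# Absorbance of each spectrum against the mean blank; with no analyte
# present, the spread of A is the noise-equivalent absorbance (NEA).
reference = spectra.mean(axis=0)
A = -np.log10(spectra / reference)
nea = A.std(axis=0).mean()
print(f"noise-equivalent absorbance ~ {nea:.1e} AU")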
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1983-01-01
The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.
Simulation of wind turbine wakes using the actuator line technique
Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.
2015-01-01
The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today widely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison with experimental results on the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862
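The core of the actuator line technique is the projection of blade forces onto the flow as a regularized body force; the Gaussian smearing kernel commonly used in actuator line implementations (stated here for reference, not reproduced from the paper) reads

\[
\mathbf{f}_{\varepsilon}(\mathbf{x}) = \sum_{i} \mathbf{F}_{i}\,\eta_{\varepsilon}\!\left(\lvert\mathbf{x}-\mathbf{x}_{i}\rvert\right),
\qquad
\eta_{\varepsilon}(d) = \frac{1}{\varepsilon^{3}\pi^{3/2}}\exp\!\left[-\left(\frac{d}{\varepsilon}\right)^{2}\right],
\]

where F_i are the aerodynamic forces evaluated from tabulated lift and drag coefficients at actuator points x_i along each blade, and ε controls the smearing width.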
Flat-plate solar array project process development area: Process research of non-CZ silicon material
NASA Technical Reports Server (NTRS)
Campbell, R. B.
1986-01-01
Several different techniques to simultaneously diffuse the front and back junctions in dendritic web silicon were investigated. A successful simultaneous diffusion reduces the cost of the solar cell by reducing the number of processing steps, the amount of capital equipment, and the labor cost. The three techniques studied were: (1) simultaneous diffusion at standard temperatures and times using a tube-type diffusion furnace or a belt furnace; (2) diffusion using excimer laser drive-in; and (3) simultaneous diffusion at high temperature and short times using a pulse of high-intensity light as the heat source. The excimer laser drive-in and the high-temperature, short-time diffusion experiments were both more successful than diffusion at standard temperatures and times. The three techniques are described in detail and a cost analysis of the more successful techniques is provided.
A fast efficient implicit scheme for the gasdynamic equations using a matrix reduction technique
NASA Technical Reports Server (NTRS)
Barth, T. J.; Steger, J. L.
1985-01-01
An efficient implicit finite-difference algorithm for the gasdynamic equations utilizing matrix reduction techniques is presented. A significant reduction in arithmetic operations is achieved without loss of the stability characteristics and generality found in the Beam and Warming approximate factorization algorithm. Steady-state solutions to the conservative Euler equations in generalized coordinates are obtained for transonic flows and used to show that the method offers computational advantages over the conventional Beam and Warming scheme. Existing Beam and Warming codes can be retrofitted with minimal effort. The theoretical extension of the matrix reduction technique to the full Navier-Stokes equations in Cartesian coordinates is presented in detail. Linear stability, using a Fourier stability analysis, is demonstrated and discussed for the one-dimensional Euler equations.
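For context, the approximate factorization that the matrix-reduction scheme starts from can be written in the generic Beam-Warming delta form (stated here for orientation; the paper's reduced operators are not reproduced):

\[
\left(I + h\,\delta_{\xi} A^{n}\right)\left(I + h\,\delta_{\eta} B^{n}\right)\Delta q^{n} = -\,h\,R^{n},
\qquad \Delta q^{n} = q^{n+1} - q^{n},
\]

where A and B are the flux Jacobians, δ_ξ and δ_η are spatial difference operators, and R^n is the residual of the steady equations. Each factor requires block system inversions along one coordinate direction; the matrix reduction technique cuts the arithmetic cost of these inversions.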
Cismesia, Adam P.; Bailey, Laura S.; Bell, Matthew R.; Tesler, Larry F.; Polfer, Nicolas C.
2016-01-01
The detailed chemical information contained in the vibrational spectrum of a cryogenically cooled analyte would, in principle, make infrared (IR) ion spectroscopy a gold standard technique for molecular identification in mass spectrometry. Despite this immense potential, there are considerable challenges in both instrumentation and methodology to overcome before the technique is analytically useful. Here, we discuss the promise of IR ion spectroscopy for small molecule analysis in the context of metabolite identification. Experimental strategies to address sensitivity constraints, poor overall duty cycle, and speed of the experiment are intimately tied to the development of a mass-selective cryogenic trap. Therefore, the most likely avenues for success, in the authors' opinion, are presented here, alongside alternative approaches and some thoughts on data interpretation. PMID:26975370
Ree, Moonhor
2014-05-01
For advanced functional polymers such as biopolymers, biomimetic polymers, brush polymers, star polymers, dendritic polymers, and block copolymers, information about their surface structures, morphologies, and atomic structures is essential for understanding their properties and investigating their potential applications. Grazing incidence X-ray scattering (GIXS) has been established over the last 15 years as the most powerful, versatile, and nondestructive tool for determining these structural details when performed with the aid of an advanced third-generation synchrotron radiation source with high flux, high energy resolution, energy tunability, and small beam size. One particular merit of this technique is that GIXS data can be obtained easily for material specimens of any size, type, or shape. However, GIXS data analysis requires an understanding of GIXS theory and of refraction and reflection effects, and for any given material specimen, the best methods for extracting the form factor and the structure factor from the data need to be established. GIXS theory is reviewed here from the perspective of practical GIXS measurements and quantitative data analysis. In addition, schemes are discussed for the detailed analysis of GIXS data for the various self-assembled nanostructures of functional homopolymers, brush, star, and dendritic polymers, and block copolymers. Moreover, enhancements to the GIXS technique are discussed that can significantly improve its structure analysis by using new synchrotron radiation sources such as third-generation X-ray sources with picosecond pulses and partial coherence and fourth-generation X-ray laser sources with femtosecond pulses and full coherence. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Alduino, C.; Alfonso, K.; Artusa, D. R.; ...
2016-04-25
Here, we describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high-level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T^{0ν}_{1/2} > 2.7×10^24 yr, and in combination with the Cuoricino limit, T^{0ν}_{1/2} > 4.0×10^24 yr.
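For orientation, the counting relation behind a half-life limit of this kind, in its generic zero-background form (an illustration, not the CUORE-0 fitting procedure), together with the standard link to the effective Majorana mass:

\[
T^{0\nu}_{1/2} > \ln 2\,\frac{N_{\beta\beta}\,\varepsilon\,t}{N_{\mathrm{limit}}},
\qquad
\left(T^{0\nu}_{1/2}\right)^{-1} = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,\frac{\langle m_{\beta\beta}\rangle^{2}}{m_{e}^{2}},
\]

where N_ββ is the number of candidate nuclei, ε the signal efficiency, t the live time, N_limit the upper limit on signal counts, G^{0ν} the phase-space factor, and M^{0ν} the nuclear matrix element.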
Developments in remote sensing technology enable more detailed urban flood risk analysis.
NASA Astrophysics Data System (ADS)
Denniss, A.; Tewkesbury, A.
2009-04-01
Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher-resolution imagery, the ability to derive better elevation information. The combination of this geospatial data to create land cover and usage maps all helps inform catastrophe modelling systems. From Landsat 30 m resolution to 2.44 m QuickBird multispectral imagery, and from 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly have a twin satellite launched enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn and affected over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model able to simulate such an event requires much more accurate source data than can be provided from satellite or radar. Because these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to better model them. Detailed terrain data for England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest digital airborne sensors, both optical and lidar, to produce the input layer for surface water flood modelling. A national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphics processing power. This product will prove particularly valuable for risk assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high-resolution aerial photography and colour infrared imagery, a high-definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surfaces. This allows even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. For some risks, however, an even more detailed survey may be justified. In order to achieve this, Infoterra is testing new 360° mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, allowing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold level to be captured quickly and accurately.
These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new resolution of detail to aid urban flood modelling.
The contribution of particle swarm optimization to three-dimensional slope stability analysis.
Kalatehjari, Roohollah; Rashid, Ahmad Safuan A; Ali, Nazri; Hajihassani, Mohsen
2014-01-01
Over the last few years, particle swarm optimization (PSO) has been extensively applied to various geotechnical engineering problems, including slope stability analysis. However, this contribution has been limited to two-dimensional (2D) slope stability analysis. This paper applies PSO to the three-dimensional (3D) slope stability problem to determine the critical slip surface (CSS) of soil slopes. A detailed description of the adopted PSO is presented to provide a good basis for further contribution of this technique to the field of 3D slope stability problems. A general rotating ellipsoid shape is introduced as the specific particle for 3D slope stability analysis. A detailed sensitivity analysis was designed and performed to find the optimum values of the parameters of PSO. Example problems were used to evaluate the applicability of PSO in determining the CSS of 3D slopes. The first example presents a comparison between the results of PSO and the PLAXIS 3D finite element software, and the second example compares the ability of PSO to determine the CSS of 3D slopes with other optimization methods from the literature. The results demonstrate the efficiency and effectiveness of PSO in determining the CSS of 3D soil slopes.
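A minimal sketch of the PSO update at the heart of the method follows. The objective below is a stand-in scalar test function, not a factor-of-safety computation over ellipsoidal slip surfaces, and the inertia and acceleration coefficients are typical textbook values rather than those tuned in the paper's sensitivity analysis.

import numpy as np

def pso_minimize(objective, bounds, n_particles=30, n_iters=200, seed=0):
    """Basic particle swarm optimization over a box-bounded search space."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))       # positions
    v = np.zeros_like(x)                                   # velocities
    pbest = x.copy()                                       # personal bests
    pbest_f = np.apply_along_axis(objective, 1, x)
    g = pbest[pbest_f.argmin()].copy()                     # global best

    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Stand-in objective: in the paper this would be the factor of safety of
# a candidate slip surface; here, a simple quadratic test function.
bounds = np.array([[-5.0, 5.0]] * 3)
best_x, best_f = pso_minimize(lambda p: np.sum(p ** 2), bounds)
print(best_x, best_f)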
Accurate 3d Scanning of Damaged Ancient Greek Inscriptions for Revealing Weathered Letters
NASA Astrophysics Data System (ADS)
Papadaki, A. I.; Agrafiotis, P.; Georgopoulos, A.; Prignitz, S.
2015-02-01
In this paper two non-invasive, non-destructive alternatives to the traditional and invasive technique of squeezes are presented, alongside specially developed processing methods, aiming to help epigraphists reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model serves as a detailed basis for the epigraphists to try to decipher the inscription. The data were collected using a structured light scanner. The creation of the final accurate three-dimensional model is a complicated procedure requiring large computation cost and human effort. It includes the collection of geometric data in limited space and time, the creation of the surface, noise filtering and the merging of individual surfaces. The use of structured light scanners is time consuming and requires costly hardware and software. Therefore, an alternative methodology for collecting 3D data of the inscriptions was also implemented for reasons of comparison. Hence, image sequences from varying distances were collected using a calibrated DSLR camera, aiming to reconstruct the 3D scene through SfM techniques in order to evaluate the efficiency and the level of precision and detail of the reconstructed inscriptions. Problems in the acquisition processes as well as difficulties in the alignment step and mesh optimization are also discussed. A meta-processing framework is proposed and analysed. Finally, the results of processing and analysis and the different 3D models are critically inspected and evaluated by a specialist in terms of accuracy, quality and detail of the model and the capability of revealing damaged and "hidden" letters.
A methodology for commonality analysis, with applications to selected space station systems
NASA Technical Reports Server (NTRS)
Thomas, Lawrence Dale
1989-01-01
The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.
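A toy sketch of commonality analysis cast as partitioning: the cost model below (a fixed development cost per unique design plus an oversizing penalty when requirements are merged) and the greedy agglomeration are assumptions for illustration, not the dissertation's objective function or clustering technique.

from itertools import combinations

# Hypothetical components, each with a single sizing requirement.
requirements = {"pump_A": 3.0, "pump_B": 3.4, "pump_C": 7.9}
DEV_COST = 10.0        # development cost per unique design (assumed)
OVERSIZE_COST = 2.0    # penalty per unit of oversizing (assumed)

def partition_cost(groups):
    """Objective: one development cost per group, plus a penalty for
    every component carrying a design sized to the group maximum."""
    cost = 0.0
    for group in groups:
        size = max(requirements[c] for c in group)
        cost += DEV_COST + OVERSIZE_COST * sum(size - requirements[c] for c in group)
    return cost

# Greedy agglomeration: repeatedly merge the pair of groups that most
# reduces the objective -- a simple stand-in for the clustering step.
groups = [{c} for c in requirements]
while True:
    best = None
    for a, b in combinations(range(len(groups)), 2):
        trial = [g for i, g in enumerate(groups) if i not in (a, b)]
        trial.append(groups[a] | groups[b])
        delta = partition_cost(trial) - partition_cost(groups)
        if delta < 0 and (best is None or delta < best[0]):
            best = (delta, trial)
    if best is None:
        break
    groups = best[1]
print(groups, partition_cost(groups))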
Detailed analysis and test correlation of a stiffened composite wing panel
NASA Technical Reports Server (NTRS)
Davis, D. Dale, Jr.
1991-01-01
Nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there are experimental data for comparison of results. From blueprints and drawings supplied by the Bell Helicopter Textron Corporation, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain (ANS) elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis. Strain predictions from both the linear and nonlinear stress analyses are shown to compare well with experimental data up through the Design Ultimate Load (DUL) of the panel. However, due to the extremely nonlinear response of the panel, the linear analysis was not accurate at loads above the DUL. The nonlinear analysis more accurately predicted the strain at high values of applied load, and even predicted complicated nonlinear response characteristics, such as load reversals, at the observed failure load of the test panel. In order to understand the failure mechanism of the panel, buckling and first-ply failure analyses were performed. The buckling load was 17 percent above the observed failure load, while first-ply failure analyses indicated significant material damage at and below the observed failure load.
Empirical transfer functions for stations in the Central California seismological network
Bakun, W.H.; Dratler, Jay
1976-01-01
A sequence of calibration signals composed of a station identification code, a transient from the release of the seismometer mass at rest from a known displacement from the equilibrium position, and a transient from a known step in voltage applied to the amplifier input is generated by the automatic daily calibration system (ADCS) now operational in the U.S. Geological Survey central California seismographic network. Documentation is presented of a sequence of interactive programs to compute, from the calibration data, the complex transfer functions for the seismographic system (ground motion through digitizer), the electronics (amplifier through digitizer), and the seismometer alone. The analysis utilizes the Fourier transform technique originally suggested by Espinosa et al. (1962). Section I is a general description of seismographic calibration. Section II contrasts the 'Fourier transform' and the 'least-squares' techniques for analyzing transient calibration signals. Theoretical considerations for the Fourier transform technique used here are described in Section III. Section IV is a detailed description of the sequence of calibration signals generated by the ADCS. Section V is a brief 'cookbook description' of the calibration programs; Section VI contains a detailed sample program execution. Section VII suggests uses of the resultant empirical transfer functions. Supplemental interactive programs that produce smooth response functions, suitable for reducing seismic data to ground motion, are also documented in Section VII. Appendices A and B contain complete listings of the Fortran source codes, while Appendix C is an update containing preliminary results obtained from an analysis of some of the calibration signals from stations in the seismographic network near Oroville, California.
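A minimal sketch of the Fourier-transform technique the report documents: divide the spectrum of the recorded transient by the spectrum of the known input step. The signals, sampling rate, and oscillator parameters below are synthetic placeholders standing in for the network's calibration data, and a water-level term regularizes the spectral ratio.

import numpy as np

fs = 100.0                        # sampling rate, Hz (assumed)
t = np.arange(0, 60.0, 1 / fs)

# Known input: a voltage step applied to the amplifier input.
step_in = np.where(t >= 1.0, 1.0, 0.0)

# Recorded output transient -- synthesized here by a damped oscillator
# standing in for the real seismograph response to the step.
f0, zeta = 1.0, 0.7
w0 = 2 * np.pi * f0
out = np.zeros_like(t)
m = t >= 1.0
tau = t[m] - 1.0
wd = w0 * np.sqrt(1 - zeta ** 2)
out[m] = 1 - np.exp(-zeta * w0 * tau) * (
    np.cos(wd * tau) + zeta / np.sqrt(1 - zeta ** 2) * np.sin(wd * tau)
)

# Empirical transfer function: water-level regularized spectral ratio,
# which avoids dividing by the zeros of the step's spectrum.
freqs = np.fft.rfftfreq(len(t), 1 / fs)
X, Y = np.fft.rfft(step_in), np.fft.rfft(out)
H = Y * np.conj(X) / (np.abs(X) ** 2 + 1e-6)
print(freqs[1:6], np.abs(H[1:6]))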
Development and verification of local/global analysis techniques for laminated composites
NASA Technical Reports Server (NTRS)
Griffin, O. Hayden, Jr.
1989-01-01
Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, from either a resource or a time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, the use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well-known behavior when used for the analysis of laminated composites.
Status of Thermal NDT of Space Shuttle Materials at NASA
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Winfree, William P.; Hodges, Kenneth; Koshti, Ajay; Ryan, Daniel; Reinhardt, Walter W.
2006-01-01
Since the Space Shuttle Columbia accident, NASA has focused on improving advanced nondestructive evaluation (NDE) techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge and nose cap. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method, in contrast to ultrasonic techniques, which typically require a coupling medium between the transducer and the material. Like radiographic techniques, thermography can inspect large areas, but it has the advantages of minimal safety concerns and single-sided measurement. Details of the analysis technique that has been developed to allow in situ inspection of the majority of shuttle RCC components are discussed. Additionally, validation testing, performed to quantify the performance of the system, is discussed. Finally, the results of applying this technology to the Space Shuttle Discovery after its return from the STS-114 mission in July 2005 are presented.
Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.
Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal
2018-06-01
Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric or symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for use in identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies have differentiated patients with psychiatric diseases from healthy controls using combined datasets. The common conclusion among these studies is that the prediction of diseases improves when data are combined via MN techniques; however, there remain a few challenges associated with MN, such as sample size. Perhaps in the future N-way fusion can be used to combine multiple neuroimaging techniques or nonimaging predictors (e.g., cognitive ability) to overcome the limitations of MN.
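A minimal sketch of the symmetric-fusion-plus-classification pipeline the review surveys, assuming scikit-learn and synthetic EEG/fMRI feature matrices (joint PCA stands in here for the data-driven fusion methods discussed):

```python
# Illustrative only: symmetric data fusion via a joint decomposition of
# concatenated EEG and fMRI features, followed by SVM classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects = 40
eeg = rng.normal(size=(n_subjects, 64))     # placeholder EEG features
fmri = rng.normal(size=(n_subjects, 200))   # placeholder fMRI features
y = (eeg[:, 0] + fmri[:, 0] > 0).astype(int)  # toy patient/control labels

X = np.hstack([eeg, fmri])                  # symmetric fusion: one joint matrix
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```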
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. The PDF also contains details of the workshop's committees and sponsors. Liliana Teodorescu, Brunel University.
Idealized simulation of the Colorado hailstorm case: comparison of bulk and detailed microphysics
NASA Astrophysics Data System (ADS)
Geresdi, I.
One of the purposes of the Fourth Cloud Modeling Workshop was to compare different microphysical treatments. In this paper, the results of a widely used bulk treatment and five versions of a detailed microphysical model are presented. A sensitivity analysis was made to investigate the effects of the bulk parametrization, the ice initiation technique, the CCN concentration, and the collision efficiency of rimed ice crystal-drop collisions. The results show that: (i) the mixing ratios of the different species of hydrometeors calculated by the bulk model and one of the detailed models show some similarity; however, the processes of hail/graupel formation are different in the bulk and detailed models. (ii) When different ice initiation techniques were used in the detailed models, different processes became important in hail and graupel formation. (iii) In the case of higher CCN concentration, the mixing ratios of liquid water, hail and graupel were more sensitive to the value of the collision efficiency of rimed ice crystal-drop collisions. (iv) The Bergeron-Findeisen process does not work in the updraft core of a convective cloud; the vapor content was always over water saturation, and the supersaturation gradually increased after the appearance of precipitation ice particles.
Development of a CFD Code for Analysis of Fluid Dynamic Forces in Seals
NASA Technical Reports Server (NTRS)
Athavale, Mahesh M.; Przekwas, Andrzej J.; Singhal, Ashok K.
1991-01-01
The aim is to develop a 3-D computational fluid dynamics (CFD) code for the analysis of fluid flow in cylindrical seals and evaluation of the dynamic forces on the seals. This code is expected to serve as a scientific tool for detailed flow analysis as well as a check for the accuracy of the 2D industrial codes. The features necessary in the CFD code are outlined. The initial focus was to develop or modify and implement new techniques and physical models. These include collocated grid formulation, rotating coordinate frames and moving grid formulation. Other advanced numerical techniques include higher order spatial and temporal differencing and an efficient linear equation solver. These techniques were implemented in a 2D flow solver for initial testing. Several benchmark test cases were computed using the 2D code, and the results of these were compared to analytical solutions or experimental data to check the accuracy. Tests presented here include planar wedge flow, flow due to an enclosed rotor, and flow in a 2D seal with a whirling rotor. Comparisons between numerical and experimental results for an annular seal and a 7-cavity labyrinth seal are also included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long-term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of earth tempering as a practice and of specific earth-sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Also contained in the report are reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and a detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure, which has been applied in a computer analysis of 30 years of weather data for each of 29 locations in the United States.
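The kind of analytical ground-temperature model the report describes can be sketched with the standard damped, phase-lagged annual sinusoid; the parameter values below are illustrative, not the report's synthesized data base.

```python
# Minimal sketch of an analytical ground-temperature model: a surface
# sinusoid attenuated and lagged with depth. All parameters are
# illustrative placeholders, not values from the report.
import numpy as np

def ground_temperature(depth_m, day, t_mean=12.0, t_amp=11.0,
                       alpha=0.05, coldest_day=35):
    """Temperature (deg C) at depth (m) on a given day of year.
    alpha is the soil thermal diffusivity in m^2/day."""
    damp = np.exp(-depth_m * np.sqrt(np.pi / (365.0 * alpha)))
    lag = (depth_m / 2.0) * np.sqrt(365.0 / (np.pi * alpha))
    return t_mean - t_amp * damp * np.cos(2 * np.pi * (day - coldest_day - lag) / 365.0)

# The annual swing is strongly attenuated a few metres down:
for z in (0.0, 2.0, 4.0):
    temps = ground_temperature(z, np.arange(365))
    print(f"depth {z} m: min {temps.min():.1f} C, max {temps.max():.1f} C")
```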
Low energy analysis techniques for CUORE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alduino, C.; Alfonso, K.; Artusa, D. R.
2017-12-12
CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searches for low energy rare events, such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. Conducting such sensitive searches, however, requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain in detail the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.
Brain tumor classification using AFM in combination with data mining techniques.
Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt
2013-01-01
Although classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in technical histopathological procedures such as staining protocols, and the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM) derived images made from histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. By utilizing modern image analysis techniques, AFM may become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, who have a lower risk of malignant transformation, can be identified unambiguously and would benefit from early adjuvant therapies.
Head movement compensation in real-time magnetoencephalographic recordings.
Little, Graham; Boe, Shaun; Bardouille, Timothy
2014-01-01
Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement. Effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real-time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed for accurate estimation of current source activity at the source level in real-time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps:
• Data acquisition
• Head position estimation
• Source localization
• Real-time source estimation
This work explains the technical details and validates each of these steps.
Objective Analysis and Prediction Techniques.
1986-11-30
contract work performance period extended from November 25, 1981 to November 24, 1986. This report consists of two parts: Part One details the results and...be added to the ELAN to make it a truly effective research tool. Also, much more testing and streamlining should be performed to ensure that its...before performing some kind of matching. Classification of the data in this manner reduces the number of data points with which we need to work from
Zero leakage separable and semipermanent ducting joints
NASA Technical Reports Server (NTRS)
Mischel, H. T.
1973-01-01
A study program has been conducted to explore new methods of achieving zero-leakage, separable and semipermanent ducting joints for space flight vehicles. The study consisted of a search of the literature on existing zero-leakage methods, the generation of concepts for new methods of achieving the desired zero-leakage criteria, and the development of a detailed analysis and design of a selected concept. Other techniques of leak detection were explored with a view toward improving this area.
Molecular Beam Studies of Volatile Liquids and Fuel Surrogates Using Liquid MICR
2014-12-23
Detailed discussions of the microjet technique are carried out in the following publications. [Figure: nozzle, liquid jet, chopper wheel, cold collector] ...process is shown in the picture below; heating and evaporation occur within 1 ms of fuel leaving the fuel injector. This atomization process is often... liquid jet. This analysis leads to criteria for selecting the temperature and nozzle radius for producing stable jets in vacuum. Figure 4 depicts the
Pfeiffer, Jonathan B; Wagner, Kelvin H; Kaufman, Yaniv; Ledbetter, Hassel; Soos, Jolanta; Diestler, Mark
2016-10-01
Both Schaefer-Bergmann diffraction and resonant ultrasound spectroscopy were used to measure the six independent elastic-stiffness coefficients of the trigonal, non-piezoelectric crystal α-BaB2O4. The two measurement sets resulted in a root-mean-square variance of 1.2%. This paper provides a detailed analysis of the two different measurement techniques and discusses the similarities and differences.
Membrane Transport Phenomena (MTP)
NASA Technical Reports Server (NTRS)
Mason, Larry W.
1997-01-01
The third semi-annual period of the MTP project has been involved with performing experiments using the Membrane Transport Apparatus (MTA), development of analysis techniques for the experiment results, analytical modeling of the osmotic transport phenomena, and completion of a DC-9 microgravity flight to test candidate fluid cell geometries. Preparations were also made for the MTP Science Concept Review (SCR), held on 13 June 1997 at Lockheed Martin Astronautics in Denver. These activities are detailed in the report.
A big measurement of a small moment
NASA Astrophysics Data System (ADS)
E Sauer, B.; Devlin, J. A.; Rabey, I. M.
2017-07-01
A beam of ThO molecules has been used to make the most precise measurement of the electron’s electric dipole moment (EDM) to date. In their recent paper, the ACME collaboration set out in detail their experimental and data analysis techniques. In a tour-de-force, they explain the many ways in which their apparatus can produce a signal which mimics the EDM and show how these systematic effects are measured and controlled.
NASA Astrophysics Data System (ADS)
Marshall, H. E.; Ruegg, R. T.; Wilson, F.
1980-01-01
Economic analysis techniques for evaluating alternative energy conservation investments in buildings are presented. Life cycle cost, benefit cost, savings to investment, payback, and rate of return analyses are explained and illustrated. The procedure for discounting is described for a heat pump investment. Formulas, tables of discount factors, and detailed instructions are provided to give all information required to make economic evaluations of energy conserving building designs.
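A worked sketch of the discounting arithmetic described above, with hypothetical numbers (not those of the report): the uniform-series present-worth factor converts an annual energy saving into a present value for comparison with first cost.

```python
# Illustrative discounting arithmetic for an energy-conservation investment;
# all dollar figures, rates and lifetimes below are hypothetical.
def present_value(annual_saving, rate, years):
    # Uniform-series present-worth factor: ((1+r)^n - 1) / (r (1+r)^n)
    upw = ((1 + rate) ** years - 1) / (rate * (1 + rate) ** years)
    return annual_saving * upw

first_cost = 3000.0      # hypothetical installed cost of a heat pump ($)
saving = 450.0           # hypothetical annual energy saving ($/yr)
rate, life = 0.07, 15    # discount rate and study period (yr)

pv = present_value(saving, rate, life)
print("net savings over life:", round(pv - first_cost, 2))
print("simple payback (yr):", round(first_cost / saving, 1))
```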
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results, which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
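A minimal sketch of the classification step, with an off-the-shelf decision-tree learner standing in for GID3*/O-BTree (which are not publicly packaged) and synthetic object features:

```python
# Illustrative only: learn a star/galaxy decision tree from example features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 8))                   # 8 image-derived object features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)    # 0 = star, 1 = galaxy (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print("holdout accuracy:", tree.score(X_te, y_te))
```

The learned tree can be inspected directly (for example with sklearn.tree.export_text), which mirrors the paper's point that the derived rules form an objective, examinable basis for classification.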
Zhang, Q; Liu, Z; Xie, H; Ma, K; Wu, L
2016-12-01
Grating fabrication techniques are crucial to the success of grating-based deformation measurement methods because the quality of the grating directly affects the measurement results. Deformation measurements at high temperatures entail heating and, perhaps, oxidizing the grating, and the contrast of the grating lines may change during the heating process. Thus, the thermal-resistant capability of the grating is a point of great concern before taking measurements. This study proposes a method that combines a laser-engraving technique with the processes of particle spraying and sintering for fabricating thermal-resistant gratings. The grating fabrication technique is introduced and discussed in detail. A numerical simulation with a geometric phase analysis (GPA) is performed for a homogeneous deformation case, and a selection scheme for the grating pitch is suggested. The validity of the proposed technique is verified by fabricating a thermal-resistant grating on a ZrO2 specimen and measuring its thermal strain at high temperatures (up to 1300 °C). Images of the grating before and after deformation are used to obtain the thermal-strain field by GPA and to compare the results with well-established reference data. The experimental results indicate that the proposed technique is feasible and offers good prospects for further applications.
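The GPA step can be sketched in one dimension: isolate the grating's carrier frequency in Fourier space, take the local phase, and differentiate to obtain strain. The grating and strain below are synthetic, and the real method operates on 2-D grating images.

```python
# Simplified 1-D sketch of geometric phase analysis (GPA); synthetic data.
import numpy as np

n, pitch = 2048, 16.0
x = np.arange(n)
eps = 0.002                                   # uniform strain to recover
g = 2 * np.pi / pitch                         # carrier wavenumber (rad/sample)
img = np.cos(g * x / (1 + eps))               # deformed grating intensity

spec = np.fft.fft(img)
freqs = np.fft.fftfreq(n)                     # cycles per sample
mask = np.abs(freqs - 1.0 / pitch) < 0.02     # window around the carrier peak
analytic = np.fft.ifft(spec * mask)           # complex image near the carrier

phase = np.unwrap(np.angle(analytic)) - g * x # geometric phase P(x)
strain = -np.gradient(phase) / g              # du/dx = -(1/g) dP/dx
print("recovered strain:", strain[n // 4: 3 * n // 4].mean())   # ~0.002
```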
Correcting the lobule in otoplasty using the fillet technique.
Sadick, Haneen; Artinger, Verena M; Haubner, Frank; Gassner, Holger G
2014-01-01
Correction of the protruded lobule in otoplasty continues to represent an important challenge. The lack of skeletal elements within the lobule makes controlled lobule repositioning less predictable. The objective is to present a new surgical technique for lobule correction in otoplasty. Human cadaver studies were performed for detailed anatomical analysis of lobule deformities. In addition, we evaluated a novel algorithmic approach to correction of the lobule in 12 consecutive patients. The intervention was otoplasty with surgical correction of the lobule using the fillet technique. The surgical outcome in the 12 most recent consecutive patients with at least 3 months of follow-up was assessed retrospectively, and the postsurgical results were independently reviewed by a panel of noninvolved experts. The 3 major anatomic components of lobular deformities are the axial angular protrusion, the coronal angular protrusion, and the inherent shape. The fillet technique described in the present report addresses all 3 aspects in an effective way. Clinical data analysis revealed no immediate or long-term complications associated with this new surgical method. The patients' subjective ratings and the panel's objective ratings revealed "good" to "very good" postoperative results. This newly described fillet technique represents a safe and efficient method to correct protruded ear lobules in otoplasty. It allows precise and predictable positioning of the lobule with an excellent safety profile. Level of evidence: 4.
NASA Astrophysics Data System (ADS)
Hassan, S.; Yusof, M. S.; Embong, Z.; Ding, S.; Maksud, M. I.
2018-01-01
Micro-flexographic printing is a combination of the flexography and micro-contact printing techniques. It is a new printing method for printing fine solid lines. Graphene has been used as a depositing agent, or printing ink, in other printing techniques such as inkjet printing. Here, graphene ink is printed on biaxially oriented polypropylene (BOPP) using the micro-flexographic printing technique. The choice of graphene as a printing ink is due to its wide application in producing electronic and micro-electronic devices such as radio-frequency identification (RFID) tags and printed circuit boards. The graphene printed on the surface of the BOPP substrate was analyzed using X-ray photoelectron spectroscopy (XPS). The position of each synthetic component in the narrow scan is referenced to the electron binding energy (eV). This research focuses on two narrow scan regions, C 1s and O 1s, and the narrow scan spectra are discussed in detail. From the narrow scan analysis, it is proposed that, given the surface adhesion properties of graphene, it is suitable as an alternative printing ink for the micro-flexographic printing technique for printing multiple fine solid lines at micro- to nano-scale feature sizes.
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Giglio, Louis
1994-01-01
This paper describes a multichannel physical approach for retrieving rainfall and vertical structure information from satellite-based passive microwave observations. The algorithm makes use of statistical inversion techniques based upon theoretically calculated relations between rainfall rates and brightness temperatures. Potential errors introduced into the theoretical calculations by the unknown vertical distribution of hydrometeors are overcome by explicitly accounting for diverse hydrometeor profiles. This is accomplished by allowing for a number of different vertical distributions in the theoretical brightness temperature calculations and requiring consistency between the observed and calculated brightness temperatures. This paper focuses primarily on the theoretical aspects of the retrieval algorithm, including a procedure used to account for inhomogeneities of the rainfall within the satellite field of view as well as a detailed description of the algorithm as it is applied over both ocean and land surfaces. The residual error between observed and calculated brightness temperatures is found to be an important quantity in assessing the uniqueness of the solution. It is further found that the residual error is a meaningful quantity that can be used to derive expected accuracies for this retrieval technique. Examples comparing the retrieved results, as well as a detailed analysis of the algorithm performance under various circumstances, are the subject of a companion paper.
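The profile-database matching at the heart of such an algorithm can be sketched as follows (synthetic database and observation; the operational algorithm adds beam-filling and surface treatments): choose the candidate profile whose computed brightness temperatures minimize the residual against the observation, and keep the residual as a quality measure.

```python
# Toy sketch of residual-based profile matching; all arrays are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_profiles, n_channels = 500, 9
tb_database = 200 + 80 * rng.random((n_profiles, n_channels))  # simulated Tb (K)
rain_rates = 50 * rng.random(n_profiles)                       # mm/h per profile

tb_observed = tb_database[123] + rng.normal(0, 1.5, n_channels)  # noisy observation

# RMS residual between observed and calculated brightness temperatures
residuals = np.sqrt(((tb_database - tb_observed) ** 2).mean(axis=1))
best = residuals.argmin()
print("retrieved rain rate (mm/h):", round(rain_rates[best], 2))
print("residual (K), a uniqueness/accuracy diagnostic:", round(residuals[best], 2))
```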
Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study
NASA Astrophysics Data System (ADS)
Rehany, N.; Barsi, A.; Lovas, T.
2017-11-01
Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies: arm scanner, structured light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object and was surveyed by all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as reference. In addition to general statistics, the results were evaluated based both on 3D deviation maps and on 2D deviation graphs; the latter allow even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, self-developed local-minimum maps were created that nicely visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the discussed techniques. Our results prove that even amateur sensors operated by amateur users can provide high-quality datasets that enable engineering analysis. Based on the results, the paper contains an outlook to potential future investigations in this field.
Ivleva, Natalia P; Kubryk, Patrick; Niessner, Reinhard
2017-07-01
Biofilms represent the predominant form of microbial life on our planet. These aggregates of microorganisms, which are embedded in a matrix formed by extracellular polymeric substances, may colonize nearly all interfaces. Detailed knowledge of microorganisms enclosed in biofilms as well as of the chemical composition, structure, and functions of the complex biofilm matrix and their changes at different stages of the biofilm formation and under various physical and chemical conditions is relevant in different fields. Important research topics include the development and improvement of antibiotics and medical devices and the optimization of biocides, antifouling strategies, and biological wastewater treatment. Raman microspectroscopy is a capable and nondestructive tool that can provide detailed two-dimensional and three-dimensional chemical information about biofilm constituents with the spatial resolution of an optical microscope and without interference from water. However, the sensitivity of Raman microspectroscopy is rather limited, which hampers the applicability of Raman microspectroscopy especially at low biomass concentrations. Fortunately, the resonance Raman effect as well as surface-enhanced Raman scattering can help to overcome this drawback. Furthermore, the combination of Raman microspectroscopy with other microscopic techniques, mass spectrometry techniques, or particularly with stable-isotope techniques can provide comprehensive information on monospecies and multispecies biofilms. Here, an overview of different Raman microspectroscopic techniques, including resonance Raman microspectroscopy and surface-enhanced Raman scattering microspectroscopy, for in situ detection, visualization, identification, and chemical characterization of biofilms is given, and the main feasibilities and limitations of these techniques in biofilm research are presented. Future possibilities of and challenges for Raman microspectroscopy alone and in combination with other analytical techniques for characterization of complex biofilm matrices are discussed in a critical review. Graphical Abstract Applicability of Raman microspectroscopy for biofilm analysis.
Implementation plans included in World Health Organisation guidelines.
Wang, Zhicheng; Norris, Susan L; Bero, Lisa
2016-05-20
The implementation of high-quality guidelines is essential to improve clinical practice and public health. The World Health Organisation (WHO) develops evidence-based public health and other guidelines that are used or adapted by countries around the world. Detailed implementation plans are often necessary for local policymakers to properly use the guidelines developed by WHO. This paper describes the plans for guideline implementation reported in WHO guidelines and indicates which of these plans are evidence-based. We conducted a content analysis of the implementation sections of WHO guidelines approved by the WHO guideline review committee between December 2007 and May 2015. The implementation techniques reported in each guideline were coded according to the Cochrane Collaboration's Effective Practice and Organisation of Care (EPOC) taxonomy and classified as passive, active or policy strategies. The frequencies of implementation techniques are reported. The WHO guidelines (n = 123) analysed mentioned implementation techniques 800 times, although most mentioned implementation techniques very briefly, if at all. Passive strategies (21 %, 167/800) and general policy strategies (62 %, 496/800) occurred most often. Evidence-based active implementation methods were generally neglected with no guideline mentioning reminders (computerised or paper) and only one mentioning a multifaceted approach. Many guidelines contained implementation sections that were identical to those used in older guidelines produced by the same WHO technical unit. The prevalence of passive and policy-based implementation techniques as opposed to evidence-based active techniques suggests that WHO guidelines should contain stronger guidance for implementation. This could include structured and increased detail on implementation considerations, accompanying or linked documents that provide information on what is needed to contextualise or adapt a guideline and specific options from among evidence-based implementation strategies.
In-Depth View of the Structure and Growth of SnO2 Nanowires and Nanobrushes.
Stuckert, Erin P; Geiss, Roy H; Miller, Christopher J; Fisher, Ellen R
2016-08-31
Strategic application of an array of complementary imaging and diffraction techniques is critical to determine accurate structural information on nanomaterials, especially when also seeking to elucidate structure-property relationships and their effects on gas sensors. In this work, SnO2 nanowires and nanobrushes grown via chemical vapor deposition (CVD) displayed the same tetragonal SnO2 structure as revealed via powder X-ray diffraction bulk crystallinity data. Additional characterization using a range of electron microscopy imaging and diffraction techniques, however, revealed important structure and morphology distinctions between the nanomaterials. Tailoring scanning transmission electron microscopy (STEM) modes combined with transmission electron backscatter diffraction (t-EBSD) techniques afforded a more detailed view of the SnO2 nanostructures. Indeed, upon deeper analysis of individual wires and brushes, we discovered that, despite a similar bulk structure, wires and brushes grew with different crystal faces and lattice spacings. Had we not utilized multiple STEM diffraction modes in conjunction with t-EBSD, differences in orientation related to bristle density would have been overlooked. Thus, it is only through a methodical combination of several structural analysis techniques that precise structural information can be reliably obtained.
BOOK REVIEW: Vortex Methods: Theory and Practice
NASA Astrophysics Data System (ADS)
Cottet, G.-H.; Koumoutsakos, P. D.
2001-03-01
The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez
NASA Astrophysics Data System (ADS)
Ahmadian, A.; Ismail, F.; Salahshour, S.; Baleanu, D.; Ghaemi, F.
2017-12-01
The analysis of the behaviors of physical phenomena is important for discovering significant features of the character and structure of mathematical models. Frequently the unknown parameters involved in the models are assumed to be unvarying over time. In reality, some of them are uncertain and implicitly depend on several factors. In this study, to account for such uncertainty in the variables of the models, they are characterized based on the fuzzy notion. We propose here a new model based on fractional calculus to deal with the Kelvin-Voigt (KV) equation and a non-Newtonian fluid behavior model with fuzzy parameters. A new and accurate numerical algorithm using a spectral tau technique based on the generalized fractional Legendre polynomials (GFLPs) is developed to solve those problems under uncertainty. Numerical simulations are carried out and the analysis of the results highlights the significant features of the new technique in comparison with previous findings. A detailed error analysis is also carried out and discussed.
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
Visual mining business service using pixel bar charts
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Casati, Fabio
2004-06-01
Basic bar charts have long been commonly available, but they only show highly aggregated data. Finding the valuable information hidden in the data is essential to the success of business. We describe a new visualization technique called pixel bar charts, which are derived from regular bar charts. The basic idea of a pixel bar chart is to present all data values directly instead of aggregating them into a few data values; pixel bar charts thus show the data distribution and exceptions in addition to aggregated values. The approach is to represent each data item (e.g. a business transaction) by a single pixel in the bar chart. An attribute of each data item is encoded into the pixel color and can be accessed and drilled down to detailed information as needed. Different color mappings are used to represent multiple attributes. This technique has been prototyped in three business service applications at Hewlett Packard Laboratories: Business Operation Analysis, Sales Analysis, and Service Level Agreement Analysis. Our applications show the wide applicability and usefulness of this new idea.
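A toy rendering of the pixel bar chart idea, assuming matplotlib (our sketch, not the HP prototype): each transaction becomes one pixel, bars are partitioned by a category, and pixel colour encodes a second attribute.

```python
# Illustrative pixel bar chart: one pixel per data item, colour = attribute.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
category = rng.integers(0, 4, 2000)          # hypothetical service class per transaction
amount = rng.gamma(2.0, 50.0, 2000)          # hypothetical transaction amount

fig, ax = plt.subplots()
width, max_rows = 40, 0                      # pixels per bar column
for c in range(4):
    v = np.sort(amount[category == c])       # ordering makes the distribution visible
    rows = int(np.ceil(len(v) / width))
    img = np.full(rows * width, np.nan)      # NaN pixels render as empty
    img[:len(v)] = v                         # one data item -> one pixel
    ax.imshow(img.reshape(rows, width), extent=(c + 0.05, c + 0.95, 0, rows),
              origin="lower", aspect="auto", cmap="viridis")
    max_rows = max(max_rows, rows)
ax.set_xlim(0, 4); ax.set_ylim(0, max_rows)
ax.set_xlabel("category"); ax.set_ylabel("transactions (one pixel each)")
fig.savefig("pixel_bar_chart.png")
```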
^10B analysis using Charged Particle Activation Analysis
NASA Astrophysics Data System (ADS)
Guo, B. N.; Jin, J. Y.; Duggan, J. D.; McDaniel, F. D.
1997-10-01
Charged Particle Activation Analysis (CPAA) is an analytic technique used to determine trace quantities of an element, usually on the surface of a substrate. The beam from the accelerator is used to induce the required nuclear reaction, which leaves residual activity with a measurable half-life. Gamma rays from the residual activity are measured to determine the trace quantities of the elements being studied. We have used this technique to study re-entry cloth coatings for space and aircraft vehicles. The cloths, made of 20 μm SiC fibers, are coated with boron nitride. CPAA was used to determine the relative thicknesses of the boron coatings. In particular, the ^10B(p,γ)^11C reaction was used. A fast coincidence setup was used to measure the 0.511 MeV annihilation radiation from the 20.38 minute ^11C activity. Rutherford Backscattering (RBS) results will be presented as a comparison. Details of the process and the experiment will be discussed.
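The decay-curve arithmetic behind such a measurement can be sketched as follows (synthetic count rates, not the experiment's data): fit the 0.511 MeV coincidence rate to the 20.38 min half-life of ^11C and extrapolate back to the end of bombardment, giving an initial activity proportional to the ^10B content.

```python
# Illustrative decay-curve fit for 11C activity; counts below are synthetic.
import numpy as np

half_life_min = 20.38
lam = np.log(2) / half_life_min

t = np.arange(0.0, 120.0, 5.0)                   # minutes after bombardment
true_a0 = 850.0                                  # hypothetical initial rate (cps)
rng = np.random.default_rng(4)
rate = rng.poisson(true_a0 * np.exp(-lam * t))   # measured coincidence rates
rate = np.maximum(rate, 1)                       # guard against log(0)

# A line fit to log(rate) gives the decay constant and the t = 0 intercept
slope, intercept = np.polyfit(t, np.log(rate), 1)
print("fitted half-life (min):", round(np.log(2) / -slope, 2))
print("A0 (cps), proportional to 10B content:", round(np.exp(intercept), 1))
```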
NASA Astrophysics Data System (ADS)
Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke
2016-05-01
Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the application of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However the extracted molecular information is insufficient in classifying GSCs and paving the pathway to an improved therapeutics of the heterogeneous glioma.
Structure of Nano-sized CeO2 Materials: Combined Scattering and Spectroscopic Investigations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchbank, Huw R.; Clark, Adam H.; Hyde, Timothy I.
2016-08-29
Here, the nature of nano-sized ceria, CeO2, systems was investigated using neutron and X-ray diffraction and X-ray absorption spectroscopy. Whilst both diffraction and total pair distribution functions (PDFs) revealed that in all the samples the occupancies of both Ce4+ and O2- are very close to the ideal stoichiometry, analysis using the reverse Monte Carlo technique revealed significant disorder around the oxygen atoms in the nano-sized ceria samples in comparison with the highly crystalline NIST standard. In addition, the analysis revealed that the main differences observed in the pair correlations from the various X-ray and neutron diffraction techniques were attributable to the particle size of the CeO2 prepared by the three reported methods. Furthermore, detailed analysis of the Ce L3- and K-edge EXAFS data supports this finding; in particular, the decrease in higher-shell coordination numbers with respect to the NIST standard is attributed to differences in particle size.
Van’t Hoff global analyses of variable temperature isothermal titration calorimetry data
Freiburger, Lee A.; Auclair, Karine; Mittermaier, Anthony K.
2016-01-01
Isothermal titration calorimetry (ITC) can provide detailed information on the thermodynamics of biomolecular interactions in the form of equilibrium constants, KA, and enthalpy changes, ΔHA. A powerful application of this technique involves analyzing the temperature dependences of ITC-derived KA and ΔHA values to gain insight into thermodynamic linkage between binding and additional equilibria, such as protein folding. We recently developed a general method for global analysis of variable temperature ITC data that significantly improves the accuracy of extracted thermodynamic parameters and requires no prior knowledge of the coupled equilibria. Here we report detailed validation of this method using Monte Carlo simulations and an application to study coupled folding and binding in an aminoglycoside acetyltransferase enzyme. PMID:28018008
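A minimal sketch of a van't Hoff global fit of this kind, assuming SciPy and synthetic lnK(T) data: the integrated van't Hoff equation with a constant heat-capacity change is fit for lnK at a reference temperature, ΔH0, and ΔCp.

```python
# Illustrative van't Hoff fit to synthetic variable-temperature data.
import numpy as np
from scipy.optimize import curve_fit

R, T0 = 8.314, 298.15  # gas constant (J/(mol K)) and reference temperature (K)

def ln_K(T, lnK0, dH0, dCp):
    # Integrated van't Hoff equation with constant dCp:
    # ln K(T) = lnK0 - (dH0/R)(1/T - 1/T0) + (dCp/R)(ln(T/T0) + T0/T - 1)
    return (lnK0 - dH0 / R * (1 / T - 1 / T0)
            + dCp / R * (np.log(T / T0) + T0 / T - 1))

T = np.linspace(278.0, 318.0, 9)               # experimental temperatures (K)
lnK_obs = ln_K(T, 13.8, -40e3, -1.5e3) \
    + np.random.default_rng(5).normal(0, 0.05, T.size)   # synthetic "data"

popt, pcov = curve_fit(ln_K, T, lnK_obs, p0=(14.0, -30e3, 0.0))
print("lnK(T0):", round(popt[0], 2))
print("dH0 (kJ/mol):", round(popt[1] / 1e3, 1))
print("dCp (kJ/(mol K)):", round(popt[2] / 1e3, 2))
```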
Further considerations of engine emissions from subsonic aircraft at cruise altitude
NASA Astrophysics Data System (ADS)
Lee, S. H.; Le Dilosquer, M.; Singh, R.; Rycroft, M. J.
The most significant man-made sources of pollution of the upper troposphere and lower stratosphere are exhaust emissions from civil subsonic aircraft at cruise altitude (8-12 km). This paper examines such issues by computational modelling of Boeing 747-400 flights during their cruise phase between selected city pairs, for example London to Tokyo. Engine performance, exhaust pollutant predictions, and a detailed flight history analysis are presented; the effects of different Mach numbers, and of increasing the cruise altitude from 9.8 to 12.1 km during the flight rather than maintaining a constant cruise altitude of 10.5 km, are studied in detail. To minimise the overall effects of atmospheric pollution, a Mach number of 0.85 and an increasing cruise altitude is the favoured cruise technique.
Efficient Computation Of Behavior Of Aircraft Tires
NASA Technical Reports Server (NTRS)
Tanner, John A.; Noor, Ahmed K.; Andersen, Carl M.
1989-01-01
NASA technical paper discusses challenging application of computational structural mechanics to numerical simulation of responses of aircraft tires during taxiing, takeoff, and landing. Presents details of three main elements of computational strategy: use of special three-field, mixed-finite-element models; use of operator splitting; and application of technique substantially reducing number of degrees of freedom. Proposed computational strategy applied to two quasi-symmetric problems: linear analysis of anisotropic tires through use of two-dimensional-shell finite elements and nonlinear analysis of orthotropic tires subjected to unsymmetric loading. Three basic types of symmetry, and their combinations, exhibited by response of tire identified.
NASA Technical Reports Server (NTRS)
McQuigg, Thomas D.
2011-01-01
A better understanding of the effect of impact damage on composite structures is necessary to give the engineer the ability to design safe, efficient structures. Current composite structures suffer severe strength reduction under compressive loading conditions due to even light damage, such as that from low velocity impact. A review is undertaken to assess the current state of development in the areas of experimental testing and analysis methods. A set of experiments on honeycomb core sandwich panels with thin woven fiberglass cloth facesheets is described, which includes detailed instrumentation and unique observation techniques.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
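The multigrid concept can be illustrated with a toy two-grid V-cycle for the 1-D Poisson equation (our sketch, unrelated to the Proteus implementation): smooth on the fine grid, solve the residual equation on a coarser grid, and correct.

```python
# Toy two-grid cycle for -u'' = f on [0,1] with zero boundary values.
import numpy as np

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi smoothing sweeps."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[2:] + u[:-2] + h * h * f[1:-1])
    return u

def coarse_solve(rc, hc):
    """Direct solve of the coarse-grid correction equation A e = r."""
    m = len(rc)
    A = (np.diag(np.full(m - 2, 2.0)) - np.diag(np.ones(m - 3), 1)
         - np.diag(np.ones(m - 3), -1)) / hc**2
    e = np.zeros(m)
    e[1:-1] = np.linalg.solve(A, rc[1:-1])
    return e

def two_grid(u, f, h):
    u = jacobi(u, f, h, 3)                                       # pre-smooth
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[2:] - u[:-2]) / h**2    # fine-grid residual
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2),
                  coarse_solve(r[::2].copy(), 2 * h))            # restrict/solve/prolong
    return jacobi(u + e, f, h, 3)                                # post-smooth

n = 129; h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)      # exact solution is sin(pi x)
u = np.zeros(n)
for _ in range(8):
    u = two_grid(u, f, h)
print("max error vs exact solution:", np.abs(u - np.sin(np.pi * x)).max())
```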
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J.S.; Gordon, R.L.; Lessor, D.L.
1981-08-01
Alternate measurement and data analysis procedures are discussed and compared for the application of reflective Nomarski differential interference contrast microscopy for the determination of surface slopes. The discussion includes the interpretation of a previously reported iterative procedure using the results of a detailed optical model and the presentation of a new procedure based on measured image intensity extrema. Surface slope determinations from these procedures are presented and compared with results from a previously reported curve fit analysis of image intensity data. The accuracy and advantages of the different procedures are discussed.
Antennas in matter: Fundamentals, theory, and applications
NASA Technical Reports Server (NTRS)
King, R. W. P.; Smith, G. S.; Owens, M.; Wu, T. T.
1981-01-01
The volume provides an introduction to antennas and probes embedded within or near material bodies such as the earth, the ocean, or a living organism. After a fundamental analysis of insulated and bare antennas, an advanced treatment of antennas in various media is presented, including a detailed study of the electromagnetic equations in homogeneous isotropic media, the complete theory of the bare dipole in a general medium, and a rigorous analysis of the insulated antenna as well as bare and insulated loop antennas. Finally, experimental models and measuring techniques related to antennas and probes in a general dissipative or dielectric medium are examined.
Omics Profiling in Precision Oncology*
Yu, Kun-Hsing; Snyder, Michael
2016-01-01
Cancer causes significant morbidity and mortality worldwide, and is the area most targeted in precision medicine. Recent development of high-throughput methods enables detailed omics analysis of the molecular mechanisms underpinning tumor biology. These studies have identified clinically actionable mutations and gene and protein expression patterns associated with prognosis, and provided further insights into the molecular mechanisms indicative of cancer biology and into new therapeutic strategies such as immunotherapy. In this review, we summarize the techniques used for tumor omics analysis, recapitulate the key findings in cancer omics studies, and point to areas requiring further research in precision oncology. PMID:27099341
NASA Technical Reports Server (NTRS)
Flamant, Cyrille N.; Schwemmer, Geary K.; Korb, C. Laurence; Evans, Keith D.; Palm, Stephen P.
1999-01-01
Remote airborne measurements of the vertical and horizontal structure of the atmospheric pressure field in the lower troposphere are made with an oxygen differential absorption lidar (DIAL). A detailed analysis of this measurement technique is provided, which includes corrections for imprecise knowledge of the detector background level, the oxygen absorption line parameters, and variations in the laser output energy. In addition, we analyze other possible sources of systematic error, including spectral effects related to aerosol and molecular scattering, interference by rotational Raman scattering, and interference by isotopic oxygen lines.
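The core DIAL arithmetic can be sketched as follows (synthetic signals, not the flight data): the two-wavelength, two-range log-ratio of backscattered powers isolates the differential oxygen absorption in each range bin.

```python
# Schematic DIAL retrieval of absorber number density; synthetic profiles.
import numpy as np

dr = 150.0                                  # range-bin width (m)
r = np.arange(500.0, 8000.0, dr)            # range gates (m)
sigma_on, sigma_off = 5e-28, 1e-30          # illustrative O2 cross sections (m^2)

n_true = 2.5e24 * np.exp(-r / 8000.0)       # toy absorber density (m^-3)
tau = np.cumsum(n_true * dr)                # one-way optical depth to each gate
p_on = np.exp(-2 * sigma_on * tau) / r**2   # backscattered powers (arbitrary units)
p_off = np.exp(-2 * sigma_off * tau) / r**2

# DIAL equation: n = ln[P_on(r) P_off(r+dr) / (P_on(r+dr) P_off(r))] / (2 dsigma dr)
dsigma = sigma_on - sigma_off
n_ret = np.log(p_on[:-1] * p_off[1:] / (p_on[1:] * p_off[:-1])) / (2 * dsigma * dr)
print("max relative error:", np.abs((n_ret - n_true[1:]) / n_true[1:]).max())
```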
Investigation of the jet-wake flow of a highly loaded centrifugal compressor impeller
NASA Technical Reports Server (NTRS)
Eckardt, D.
1978-01-01
Investigations aimed at developing a better understanding of the complex flow field in high-performance centrifugal compressors were performed. Newly developed measuring techniques for unsteady static and total pressures, as well as flow directions, and a digital data analysis system for fluctuating signals were thoroughly tested. The loss-affected mixing process of the distorted impeller discharge flow was investigated in detail, in both the absolute and relative systems, at impeller tip speeds up to 380 m/s. A theoretical analysis showed good agreement of the test results with the DEAN-SENOO theory, which was extended to compressible flows.
2014-01-01
Background Ultrasonography is an important diagnostic tool in the investigation of abdominal disease in the horse. Several factors may affect the ability to image different structures within the abdomen. The aim of the study was to describe the repeatability of identification of abdominal structures in normal horses using a detailed ultrasonographic examination technique and using a focused, limited preparation technique. Methods A detailed abdominal ultrasound examination was performed in five normal horses, repeated on five occasions (total of 25 examinations). The abdomen was divided into ten different imaging sites, and structures identified in each site were recorded. Five imaging sites were then selected for a single focused ultrasound examination in 20 normal horses. Limited patient preparation was performed. Structures were recorded as ‘identified’ if ultrasonographic features could be distinguished. The location of organs and their frequency of identification were recorded. Data from both phases were analysed to determine repeatability of identification of structures in each examination (irrespective of imaging site), and for each imaging site. Results Caecum, colon, spleen, liver and right kidney were repeatably identified using the detailed technique, and had defined locations. Large colon and right kidney were identified in 100% of examinations with both techniques. Liver, spleen, caecum, duodenum and other small intestine were identified more frequently with the detailed examination. Small intestine was most frequently identified in the ventral abdomen, its identification varied markedly within and between horses, and required repeated examinations in some horses. Left kidney could not be identified in every horse using either technique. Sacculated colon was identified in all ventral sites, and was infrequently identified in dorsal sites. Conclusions Caecum, sacculated large intestine, spleen, liver and right kidney were consistently identified with both techniques. There were some normal variations which should be considered when interpreting ultrasonographic findings in clinical cases: left kidney was not always identified, sacculated colon was occasionally identified in dorsal flank sites. Multiple imaging sites and repeated examinations may be required to identify small intestine. A focused examination identified most key structures, but has some limitations compared to a detailed examination. PMID:25238559
NASA Astrophysics Data System (ADS)
Sudra, Gunther; Speidel, Stefanie; Fritz, Dominik; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger
2007-03-01
Minimally invasive surgery is a highly complex medical discipline with various risks for surgeon and patient, but it also has numerous advantages on the patient's side. The surgeon has to adopt special operating techniques and deal with difficulties like complex hand-eye coordination, a limited field of view, and restricted mobility. To alleviate these problems, we propose to support the surgeon's spatial cognition by using augmented reality (AR) techniques to directly visualize virtual objects in the surgical site. In order to generate intelligent support, it is necessary to have an intraoperative assistance system that recognizes the surgical skills during the intervention and provides context-aware assistance to the surgeon using AR techniques. With MEDIASSIST we bundle our research activities in the field of intraoperative intelligent support and visualization. Our experimental setup consists of a stereo endoscope, an optical tracking system and a head-mounted display for 3D visualization. The framework will be used as a platform for the development and evaluation of our research in the field of skill recognition and context-aware assistance generation. This includes methods for surgical skill analysis, skill classification, context interpretation as well as assistive visualization and interaction techniques. In this paper we present the objectives of MEDIASSIST and first results in the fields of skill analysis, visualization and multi-modal interaction. In detail we present a markerless instrument tracking for surgical skill analysis as well as visualization techniques and recognition of interaction gestures in an AR environment.
Mehl, S.; Hill, M.C.
2001-01-01
Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
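To make the numerical-dispersion point concrete, the following is a minimal illustrative sketch, not one of the codes compared in the study: an explicit finite-difference solver for the 1D advection-dispersion equation, in which the first-order upwind treatment of advection itself contributes artificial spreading of roughly v*dx/2 on top of the assigned dispersion coefficient. All grid and parameter values are hypothetical.

    import numpy as np

    def advect_disperse(c0, v, D, dx, dt, nsteps):
        """Explicit upwind advection + central-difference dispersion."""
        c = c0.copy()
        for _ in range(nsteps):
            cn = c.copy()
            # first-order upwind advection (numerical dispersion ~ v*dx/2)
            # plus second-order central dispersion on interior nodes
            c[1:-1] = (cn[1:-1]
                       - v * dt / dx * (cn[1:-1] - cn[:-2])
                       + D * dt / dx**2 * (cn[2:] - 2.0 * cn[1:-1] + cn[:-2]))
            c[0], c[-1] = c[1], c[-2]  # simple zero-gradient boundaries
        return c

    # hypothetical example: step input migrating through a 1 m column
    x = np.linspace(0.0, 1.0, 201)
    c0 = np.where(x < 0.1, 1.0, 0.0)
    c = advect_disperse(c0, v=1e-4, D=1e-8, dx=x[1] - x[0], dt=10.0, nsteps=500)

Comparing the simulated front against an analytical solution (or a much finer grid) exposes how much of the apparent spreading is numerical rather than physical, which is the effect the study quantifies across solution techniques.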
Wienke, B R; O'Leary, T R
2008-05-01
Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), its dynamical principles, and its correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, and helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, and Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, the USS Perry deep rebreather (RB) exploration dive, a world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, in both recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and an additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application.
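As a rough illustration of the fitting step named above, the sketch below runs a Levenberg-Marquardt least-squares fit (L2 error norm) of a simple two-parameter risk function to incidence data. The functional form, parameter names, and data points are hypothetical placeholders, not the LANL risk functions or Data Bank values.

    import numpy as np
    from scipy.optimize import least_squares

    def risk(params, t):
        kappa, alpha = params
        return 1.0 - np.exp(-kappa * t**alpha)   # toy cumulative risk form

    t_obs = np.array([10.0, 20.0, 40.0, 80.0, 160.0])      # exposure times
    r_obs = np.array([0.001, 0.002, 0.005, 0.010, 0.022])  # observed incidence

    # Levenberg-Marquardt minimization of the L2 residual norm
    fit = least_squares(lambda p: risk(p, t_obs) - r_obs,
                        x0=[1e-4, 1.0], method='lm')
    kappa, alpha = fit.x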
In vivo stationary flux analysis by 13C labeling experiments.
Wiechert, W; de Graaf, A A
1996-01-01
Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented and, finally, the method is illustrated with an example from C. glutamicum. The main emphasis is on the investigation of the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing, and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much higher numerical effort to solve the balance equations.
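For readers unfamiliar with the metabolite balancing technique the paper builds on, here is a minimal sketch under simplifying assumptions: at metabolic steady state the stoichiometric matrix S and flux vector v satisfy S v = 0, and with some extracellular fluxes measured, the remaining fluxes follow from least squares. The three-reaction network below is invented for illustration; it is not the C. glutamicum model.

    import numpy as np

    # rows = metabolites A, B; columns = reactions v1, v2, v3
    S = np.array([[1.0, -1.0, -1.0],    # A: made by v1, used by v2 and v3
                  [0.0,  1.0, -1.0]])   # B: made by v2, used by v3

    measured = {0: 2.0}                 # extracellular uptake v1 = 2.0
    free = [j for j in range(S.shape[1]) if j not in measured]

    # steady state S @ v = 0  =>  S_free @ v_free = -S_meas @ v_meas
    b = -sum(S[:, j] * val for j, val in measured.items())
    v_free, *_ = np.linalg.lstsq(S[:, free], b, rcond=None)   # v2 = v3 = 1.0

The 13C labeling data discussed in the paper add constraints beyond this purely stoichiometric system, which is what allows bidirectional fluxes to be resolved.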
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in this development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of the structure, material, and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation-type computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.
The Technique of the Sound Studio: Radio, Record Production, Television, and Film. Revised Edition.
ERIC Educational Resources Information Center
Nisbett, Alec
Detailed explanations of the studio techniques used in radio, record, television, and film sound production are presented in as non-technical language as possible. An introductory chapter discusses the physics and physiology of sound. Subsequent chapters detail standards for sound control in the studio; explain the planning and routine of a sound…
Nuclear Weak Rates and Detailed Balance in Stellar Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Misch, G. Wendell, E-mail: wendell@sjtu.edu, E-mail: wendell.misch@gmail.com
Detailed balance is often invoked in discussions of nuclear weak transitions in astrophysical environments. Satisfaction of detailed balance is rightly touted as a virtue of some methods of computing nuclear transition strengths, but I argue that it need not necessarily be strictly obeyed in astrophysical environments, especially when the environment is far from weak equilibrium. I present the results of shell model calculations of nuclear weak strengths in both charged-current and neutral-current channels at astrophysical temperatures, finding some violation of detailed balance. I show that a slight modification of the technique to strictly obey detailed balance has little effect on the reaction rates associated with these strengths under most conditions, though at high temperature the modified technique in fact misses some important strength. I comment on the relationship between detailed balance and weak equilibrium in astrophysical conditions.
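For orientation, a standard statement of detailed balance in a thermal ensemble (our notation, not necessarily the paper's): for transitions between states i and f with degeneracies g = 2J + 1 and energy difference Delta E, equilibrium populations n force the forward and reverse rates to satisfy

    \lambda_{i \to f}\, n_i = \lambda_{f \to i}\, n_f ,
    \qquad
    \frac{n_f}{n_i} = \frac{g_f}{g_i}\, e^{-\Delta E / kT}
    \quad \Longrightarrow \quad
    \frac{\lambda_{i \to f}}{\lambda_{f \to i}} = \frac{g_f}{g_i}\, e^{-\Delta E / kT} .

A violation of detailed balance in computed strengths thus shows up as a departure of the rate ratio from this Boltzmann factor, which matters most when the astrophysical environment is itself near weak equilibrium.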
NASA Astrophysics Data System (ADS)
Kumar, Vaibhav; Ng, Ivan; Sheard, Gregory J.; Brocher, Eric; Hourigan, Kerry; Fouras, Andreas
2011-08-01
This paper examines the shock cell structure, vorticity and velocity field at the exit of an underexpanded jet nozzle using a hydraulic analogy and the Reference Image Topography technique. Understanding the flow in this region is important for the mitigation of screech, an aeroacoustic problem harmful to aircraft structures. Experiments are conducted on a water table, allowing detailed quantitative investigation of this important flow regime at a greatly reduced expense. Conventional Particle Image Velocimetry is employed to determine the velocity and vorticity fields of the nozzle exit region. Applying Reference Image Topography, the wavy water surface is reconstructed and when combined with the hydraulic analogy, provides a pressure map of the region. With this approach subtraction of surfaces is used to highlight the unsteady regions of the flow, which is not as convenient or quantitative with conventional Schlieren techniques. This allows a detailed analysis of the shock cell structures and their interaction with flow instabilities in the shear layer that are the underlying cause of jet screech.
Ultrafast X-ray Imaging of Fuel Sprays
NASA Astrophysics Data System (ADS)
Wang, Jin
2007-01-01
Detailed analysis of fuel sprays has been well recognized as an important step for optimizing the operation of internal combustion engines to improve efficiency and reduce emissions. Ultrafast radiographic and tomographic techniques have been developed for probing the fuel distribution close to the nozzles of direct-injection diesel and gasoline injectors. The measurement was made using x-ray absorption of monochromatic synchrotron-generated radiation, allowing quantitative determination of the fuel distribution in this optically impenetrable region with a time resolution on the order of 1 μs. Furthermore, an accurate 3-dimensional fuel-density distribution, in the form of fuel volume fraction, was obtained by the time-resolved computed tomography. These quantitative measurements constitute the most detailed near-nozzle study of a fuel spray to date. With high-energy and high-brilliance x-ray beams available at the Advanced Photon Source, propagation-based phase-enhanced imaging was developed as a unique metrology technique to visualize the interior of an injection nozzle through a 3-mm-thick steel with a 10-μs temporal resolution, which is virtually impossible by any other means.
NASA Astrophysics Data System (ADS)
Pachaiappan, Rekha; Prakasarao, Aruna; Manoharan, Yuvaraj; Dornadula, Koteeswaran; Singaravelu, Ganesan
2017-02-01
During metabolism, metabolites such as hormones, proteins and enzymes are released into the blood stream by cells. These metabolites reflect any change that occurs due to disturbances in the normal metabolic function of the human system, and such changes are well observed as altered spectral signatures in fluorescence spectroscopic techniques. Many groups have previously reported on the significance of native fluorescence spectroscopic methods in the diagnosis of cancer. Fluorescence spectroscopy is sensitive and simple, and offers complementary techniques such as excitation-emission matrix, synchronous, and polarization measurements. The fluorescence polarization measurement provides details about association or binding reactions and denaturing effects that occur due to changes in the micro-environment of cells and tissues. In this study, we have made an attempt at the diagnosis of oral cancer at 405 nm excitation using fluorescence polarization measurements. The fluorescence anisotropy values calculated from polarized fluorescence spectral data of normal and oral cancer subjects yielded a good accuracy when analyzed with a linear discriminant analysis based artificial neural network. The results will be discussed in detail.
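For reference, steady-state fluorescence anisotropy r (and polarization P) are conventionally computed from the parallel and perpendicular polarized intensity components, with G the instrumental correction factor; the study's exact processing may differ:

    r = \frac{I_{\parallel} - G\, I_{\perp}}{I_{\parallel} + 2\, G\, I_{\perp}} ,
    \qquad
    P = \frac{I_{\parallel} - G\, I_{\perp}}{I_{\parallel} + G\, I_{\perp}} ,
    \qquad
    G = \frac{I_{HV}}{I_{HH}} .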
Lawrence, J R; Swerhone, G D W; Leppard, G G; Araki, T; Zhang, X; West, M M; Hitchcock, A P
2003-09-01
Confocal laser scanning microscopy (CLSM), transmission electron microscopy (TEM), and soft X-ray scanning transmission X-ray microscopy (STXM) were used to map the distribution of macromolecular subcomponents (e.g., polysaccharides, proteins, lipids, and nucleic acids) of biofilm cells and matrix. The biofilms were developed from river water supplemented with methanol, and although they comprised a complex microbial community, the biofilms were dominated by heterotrophic bacteria. TEM provided the highest-resolution structural imaging, CLSM provided detailed compositional information when used in conjunction with molecular probes, and STXM provided compositional mapping of macromolecule distributions without the addition of probes. By examining exactly the same region of a sample with combinations of these techniques (STXM with CLSM and STXM with TEM), we demonstrate that this combination of multimicroscopy analysis can be used to create a detailed correlative map of biofilm structure and composition. We are using these correlative techniques to improve our understanding of the biochemical basis for biofilm organization and to assist studies intended to investigate and optimize biofilms for environmental remediation applications.
Diffraction Techniques in Structural Biology
Egli, Martin
2010-01-01
A detailed understanding of chemical and biological function and the mechanisms underlying the activities ultimately requires atomic-resolution structural data. Diffraction-based techniques such as single-crystal X-ray crystallography, electron microscopy and neutron diffraction are well established and have paved the road to the stunning successes of modern-day structural biology. The major advances achieved in the last 20 years in all aspects of structural research, including sample preparation, crystallization, the construction of synchrotron and spallation sources, phasing approaches and high-speed computing and visualization, now provide specialists and non-specialists alike with a steady flow of molecular images of unprecedented detail. The present chapter combines a general overview of diffraction methods with a step-by-step description of the process of a single-crystal X-ray structure determination experiment, from chemical synthesis or expression to phasing and refinement, analysis and quality control. For novices it may serve as a stepping-stone to more in-depth treatises of the individual topics. Readers relying on structural information for interpreting functional data may find it a useful consumer guide. PMID:20517991
Chemometric Data Analysis for Deconvolution of Overlapped Ion Mobility Profiles
NASA Astrophysics Data System (ADS)
Zekavat, Behrooz; Solouki, Touradj
2012-11-01
We present the details of a data analysis approach for deconvolution of overlapped or unresolved ion mobility (IM) species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution.
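A minimal sketch of the PCA screening step, on synthetic data rather than the authors' post-IM/CID MS measurements: arrival-time-resolved fragment spectra are stacked into a matrix X (rows = IM arrival-time bins, columns = m/z channels), and more than one significant principal component suggests mobility-overlapped species. The profiles, spectra, and the 5% significance cut below are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 60)                  # IM arrival-time axis
    a = np.exp(-0.5 * ((t - 0.45) / 0.08)**2)      # species 1 IM profile
    b = np.exp(-0.5 * ((t - 0.55) / 0.08)**2)      # species 2 IM profile
    s1, s2 = rng.random(400), rng.random(400)      # their CID spectra
    X = np.outer(a, s1) + np.outer(b, s2) + 0.01 * rng.random((60, 400))

    Xc = X - X.mean(axis=0)                        # mean-center channels
    U, sv, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = sv**2 / np.sum(sv**2)              # variance per component
    n_components = int(np.sum(explained > 0.05))   # ~2 -> overlap present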
Liu, Yang; Wilson, W David
2010-01-01
Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described, with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis, and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
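As a concrete example of the steady-state analysis mentioned above, a common approach (a generic sketch, not the protocol from this guide) is to fit a 1:1 Langmuir isotherm, R_eq = R_max * C / (K_D + C), to the equilibrium responses measured at several analyte concentrations. The concentrations and response values below are illustrative only.

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(C, Rmax, KD):
        """1:1 steady-state binding isotherm."""
        return Rmax * C / (KD + C)

    C = np.array([1e-8, 3e-8, 1e-7, 3e-7, 1e-6])    # analyte conc. (M)
    Req = np.array([8.0, 19.0, 38.0, 55.0, 66.0])   # equilibrium RU

    (Rmax, KD), _ = curve_fit(langmuir, C, Req, p0=[80.0, 1e-7])

Kinetic SPR analysis instead fits the association and dissociation phases of the sensorgrams for k_on and k_off, with K_D = k_off / k_on providing a cross-check on the steady-state fit.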
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Madhavi Z; Labbe, Nicole; Wagner, Rebekah J.
2013-01-01
This chapter details the application of LIBS in a number of environmental areas of research, such as carbon sequestration and climate change. LIBS has also been shown to be useful in other high-resolution environmental applications, for example, elemental mapping and detection of metals in plant materials, and in phytoremediation applications. Other biological research involves a detailed understanding of wood chemistry response to precipitation variations and also to forest fires. A cross-section of mountain pine (Pinus pungens Lamb., Pinaceae) was scanned using a translational stage to determine the differences in the chemical features both before and after a fire event. Consequently, by monitoring the elemental composition pattern of a tree and by looking for abrupt changes, one can reconstruct the disturbance history of a tree and a forest. Lastly, we have shown that multivariate analysis of the LIBS data is necessary to standardize the analysis and correlate it to other standard laboratory techniques. Combined with multivariate statistical analysis, LIBS is a very powerful technology that can be transferred from laboratory to field applications with ease.
Sensor image prediction techniques
NASA Astrophysics Data System (ADS)
Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.
1981-02-01
The preparation of prediction imagery is a complex, costly, and time consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks performed during navigation are identical to those performed for the targeting mission function, and that the results of analyzing the navigator's performance with a particular sensor can be extended to the analysis of the mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.
An efficient liner cooling scheme for advanced small gas turbine combustors
NASA Technical Reports Server (NTRS)
Paskin, Marc D.; Mongia, Hukam C.; Acosta, Waldo A.
1993-01-01
A joint Army/NASA program was conducted to design, fabricate, and test an advanced, small gas turbine, reverse-flow combustor utilizing a compliant metal/ceramic (CMC) wall cooling concept. The objectives of this effort were to develop a design method (basic design data base and analysis) for the CMC cooling technique and then demonstrate its application to an advanced cycle, small, reverse-flow combustor with 3000 F burner outlet temperature. The CMC concept offers significant improvements in wall cooling effectiveness resulting in a large reduction in cooling air requirements. Therefore, more air is available for control of burner outlet temperature pattern in addition to the benefits of improved efficiency, reduced emissions, and lower smoke levels. The program was divided into four tasks. Task 1 defined component materials and localized design of the composite wall structure in conjunction with development of basic design models for the analysis of flow and heat transfer through the wall. Task 2 included implementation of the selected materials and validated design models during combustor preliminary design. Detail design of the selected combustor concept and its refinement with 3D aerothermal analysis were completed in Task 3. Task 4 covered detail drawings, process development and fabrication, and a series of burner rig tests. The purpose of this paper is to provide details of the investigation into the fundamental flow and heat transfer characteristics of the CMC wall structure as well as implementation of the fundamental analysis method for full-scale combustor design.
Metallic glass coating on metals plate by adjusted explosive welding technique
NASA Astrophysics Data System (ADS)
Liu, W. D.; Liu, K. X.; Chen, Q. Y.; Wang, J. T.; Yan, H. H.; Li, X. J.
2009-09-01
Using an adjusted explosive welding technique, an aluminum plate has been coated with a Fe-based metallic glass foil in this work. Scanning electron micrographs reveal a defect-free metallurgical bonding between the Fe-based metallic glass foil and the aluminum plate. Experimental evidence indicates that the Fe-based metallic glass foil almost retains its amorphous state and mechanical properties after the explosive welding process. Additionally, the detailed explosive welding process has been simulated by a self-developed hydro-code and the bonding mechanism has been investigated by numerical analysis. The successful welding between the Fe-based metallic glass foil and the aluminum plate provides a new way to obtain amorphous coatings on general metal substrates.
Digital processing of satellite imagery application to jungle areas of Peru
NASA Technical Reports Server (NTRS)
Pomalaza, J. C. (Principal Investigator); Pomalaza, C. A.; Espinoza, J.
1976-01-01
The author has identified the following significant results. The use of clustering methods permits the development of relatively fast classification algorithms that could be implemented in an inexpensive computer system with a limited amount of memory. Analysis of CCTs using these techniques can provide a great deal of detail, permitting use of the maximum resolution of LANDSAT imagery. Cases were detected in which classification techniques based on a Gaussian approximation of the distribution functions could be used to advantage. For jungle areas, channels 5 and 7 can provide enough information to delineate drainage patterns, swamp and wet areas, and make a reasonably broad classification of forest types.
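A minimal sketch of an unsupervised clustering classifier of the kind the report describes: plain k-means over two spectral bands (e.g., MSS channels 5 and 7), which needs only the cluster centers in memory. This is a generic illustration, not the author's algorithm; the pixel values and cluster count are synthetic placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    pixels = rng.random((10000, 2))      # synthetic (band 5, band 7) pairs

    k = 4
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(20):                  # Lloyd iterations
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([pixels[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])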
Fixed gain and adaptive techniques for rotorcraft vibration control
NASA Technical Reports Server (NTRS)
Roy, R. H.; Saberi, H. A.; Walker, R. A.
1985-01-01
The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed gain and adaptive control designs based on linear second-order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.
Analyses of amphibole asbestiform fibers in municipal water supplies
Nicholson, William J.
1974-01-01
Details are given of the techniques used in the analysis of asbestiform fibers in the water systems of Duluth, Minnesota and other cities. Photographic electron diffraction and electron microprobe analyses indicated that the concentration of verified amphibole mineral fibers ranged from 20 × 10⁶ to 75 × 10⁶ fibers/l. Approximately 50–60% of the fibers were in the cummingtonite-grunerite series and 20% were in the actinolite-tremolite series. About 5% were chemically identical with amosite. A wide variety of analytical techniques must be employed for unique identification of the mineral species present in water systems. PMID:4470931
Hybrid 3D reconstruction and image-based rendering techniques for reality modeling
NASA Astrophysics Data System (ADS)
Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.
2000-12-01
This paper presents a component approach that combines in a seamless way the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) Panoramic images for distant background scenery where parallax is insignificant; (ii) Photogrammetry for background buildings and (iii) High detailed laser based models for the primary environment, structure of exteriors of buildings and interiors of rooms. These techniques have a wide range of applications in visualization, virtual reality, cost effective as-built analysis of architectural and industrial environments, building facilities management, real-estate, E-commerce, remote inspection of hazardous environments, TV production and many others.
Searching for Extant Life on Mars - The ATP-Firefly Luciferin/Luciferase Technique
NASA Astrophysics Data System (ADS)
Obousy, R. K.; Tziolas, A. C.; Kaltsas, K.; Sims, M. R.; Grant, W. D.
We have investigated the use of the ATP-Firefly Luciferin/Luciferase (FFL) enzymic photoluminescent reaction as a possible means of detecting extant life in the Martian environment. Experiments carried out by the authors illustrate the capacity of the method to successfully detect extant forms of life on Mars, assuming ATP is an intrinsic part of the biochemistry of such life-forms. A photodiode-based apparatus, built to test the assumptions and applicability of the ATP-Firefly Luciferase/Luciferin technique for an exobiologically inclined mission to Mars, demonstrated adequate resolution and reproducibility of the methodology, as well as areas for improvement. Also detailed are extraction, delivery, and analysis system concepts proposed for future Mars missions.
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code, using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
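To make the architecture-independent category concrete, here is a toy source-level pass (an illustration, not one of the surveyed optimizers): constant folding over a Python abstract syntax tree, which replaces operations on literal constants with their computed values regardless of the target machine. Requires Python 3.9+ for ast.unparse.

    import ast
    import operator

    OPS = {ast.Add: operator.add, ast.Sub: operator.sub, ast.Mult: operator.mul}

    class FoldConstants(ast.NodeTransformer):
        """Fold binary operations whose operands are literal constants."""
        def visit_BinOp(self, node):
            self.generic_visit(node)           # fold children first
            if (isinstance(node.left, ast.Constant)
                    and isinstance(node.right, ast.Constant)
                    and type(node.op) in OPS):
                value = OPS[type(node.op)](node.left.value, node.right.value)
                return ast.copy_location(ast.Constant(value), node)
            return node

    tree = ast.parse("y = 2 * 3 + x")
    tree = ast.fix_missing_locations(FoldConstants().visit(tree))
    print(ast.unparse(tree))                   # -> y = 6 + x

The same transformation expressed on a flow graph, together with common-subexpression elimination and dead-code removal, is representative of the global analyses the survey places in this category.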
An Effective Technique for Enhancing an Intrauterine Catheter Fetal Electrocardiogram
NASA Astrophysics Data System (ADS)
Horner, Steven L.; Holls, William M.
2003-12-01
Physicians can obtain fetal heart rate, electrophysiological information, and uterine contraction activity for determining fetal status from an intrauterine catheter's electrocardiogram with the maternal electrocardiogram canceled. In addition, the intrauterine catheter would allow physicians to acquire fetal status with a single biosensor that is non-invasive to the fetus, as compared to the fetus-invasive scalp electrode and the intrauterine pressure catheter used currently. A real-time technique for canceling the maternal electrocardiogram from the intrauterine catheter's electrocardiogram is discussed, along with an analysis of the method's effectiveness on synthesized and clinical data. The positive results from an original, detailed subjective and objective analysis of synthesized and clinical data clearly indicate that the maternal electrocardiogram cancellation method is effective. The resulting intrauterine catheter electrocardiogram, with the maternal electrocardiogram effectively canceled, could be used for determining fetal heart rate, fetal electrocardiogram electrophysiological information, and uterine contraction activity.
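A standard way to cancel a correlated interfering signal, sketched below under the assumption that a separate maternal reference channel (e.g., a chest lead) is available, is an LMS adaptive noise canceller. This is a generic sketch, not the authors' algorithm; the tap count and step size are hypothetical.

    import numpy as np

    def lms_cancel(primary, reference, n_taps=16, mu=0.01):
        """Subtract the reference-correlated part of `primary` (LMS).

        `mu` must be small relative to the reference signal power for
        the weight update to remain stable.
        """
        w = np.zeros(n_taps)
        out = np.zeros(len(primary))
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]  # latest reference samples
            y = w @ x                          # maternal ECG estimate
            e = primary[n] - y                 # residual ~ fetal ECG
            w += 2.0 * mu * e * x              # LMS weight update
            out[n] = e
        return out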
Light water reactor lower head failure analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rempe, J.L.; Chavez, S.A.; Thinnes, G.L.
1993-10-01
This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and for examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response.
Radiation Induced Degradation of the White Thermal Control Paints Z-93 and Z-93P
NASA Technical Reports Server (NTRS)
Edwards, D. L.; Zwiener, J. M.; Wertz, G. E.; Vaughn, J. A.; Kamenetzky, R. R.; Finckenor, M. M.; Meshishnek, M. J.
1996-01-01
This paper details a comparison analysis of the zinc oxide pigmented white thermal control paints Z-93 and Z-93P. Both paints were simultaneously exposed to combined space environmental effects and analyzed using an in-vacuo reflectance technique. The dose applied to the paints was approximately equivalent to 5 years in a geosynchronous orbit. This comparison analysis showed that Z-93P is an acceptable substitute for Z-93. Irradiated samples of Z-93 and Z-93P were subjected to additional exposures of ultraviolet (UV) radiation and analyzed using the in-vacuo reflectance technique to investigate UV activated reflectance recovery. Both samples showed minimal UV activated reflectance recovery after an additional 190 equivalent sun hour (ESH) exposure. Reflectance response utilizing nitrogen as a repressurizing gas instead of air was also investigated. This investigation found the rates of reflectance recovery when repressurized with nitrogen are slower than when repressurized with air.
Radiation Induced Degradation of White Thermal Control Paint
NASA Technical Reports Server (NTRS)
Edwards, D. L.; Zwiener, J. M.; Wertz, G. E.; Vaughn, Jason A.; Kamenetzky, Rachel R.; Finckenor, M. M.; Meshishnek, M. J.
1999-01-01
This paper details a comparison analysis of the zinc-oxide pigmented white thermal control paints Z-93 and Z-93P. Both paints were simultaneously exposed to combined space environmental effects and analyzed using an in-vacuo reflectance technique. The dose applied to the paints was approximately equivalent to 5 yr in a geosynchronous orbit. This comparison analysis showed that Z-93P is an acceptable substitute for Z-93. Irradiated samples of Z-93 and Z-93P were subjected to additional exposures of ultraviolet (UV) radiation and analyzed using the in-vacuo reflectance technique to investigate UV activated reflectance recovery. Both samples showed minimal UV activated reflectance recovery after an additional 190 equivalent Sun hour (ESH) exposure. Reflectance response utilizing nitrogen as a repressurizing gas instead of air was also investigated. This investigation found the rates of reflectance recovery when repressurized with nitrogen are slower than when repressurized with air.
Proton irradiation studies on Al and Al5083 alloy
NASA Astrophysics Data System (ADS)
Bhattacharyya, P.; Gayathri, N.; Bhattacharya, M.; Gupta, A. Dutta; Sarkar, Apu; Dhar, S.; Mitra, M. K.; Mukherjee, P.
2017-10-01
The changes in microstructural parameters and microhardness values in 6.5 MeV proton irradiated pure Al and Al5083 alloy samples have been evaluated using model-based X-ray diffraction line profile analysis (XRD) and microindentation techniques. The detailed line profile analysis of the XRD data showed that the domain size increases and saturates with irradiation dose in both Al and Al5083 alloy. The corresponding microstrain values did not show any change with irradiation dose in the case of pure Al but showed an increase at higher irradiation doses in the case of Al5083 alloy. The microindentation results showed that unirradiated Al5083 alloy has a higher hardness value than unirradiated pure Al. The hardness increased marginally with irradiation dose in the case of Al5083, whereas for pure Al there was no significant change with dose.
NASA Astrophysics Data System (ADS)
Davis, Benjamin L.; Berrier, J. C.; Shields, D. W.; Kennefick, J.; Kennefick, D.; Seigar, M. S.; Lacy, C. H. S.; Puerari, I.
2012-01-01
A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing Two-Dimensional Fast Fourier Transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow the precise comparison of spiral galaxy evolution to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques. The authors gratefully acknowledge support for this work from NASA Grant NNX08AW03A.
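A toy version of the 2D FFT pitch-angle idea, not the authors' pipeline: sample a deprojected galaxy image on a log-polar grid (u = ln r, theta), take a 2D FFT, and for angular harmonic m read off the peak spatial frequency p along u; the pitch angle follows from tan(phi) = m / |p|. The test image below is a synthetic two-armed logarithmic spiral, and the frequency resolution of this small grid is deliberately coarse.

    import numpy as np

    m, phi_true = 2, np.radians(20.0)
    u = np.linspace(0.0, 3.0, 256)                    # u = ln(r/r0)
    theta = np.linspace(0.0, 2*np.pi, 256, endpoint=False)
    U, T = np.meshgrid(u, theta, indexing='ij')
    img = np.cos(m*T + (m/np.tan(phi_true))*U)        # log-spiral pattern

    F = np.fft.fft2(img)                              # axes: (u, theta)
    p = 2*np.pi*np.fft.fftfreq(len(u), d=u[1]-u[0])   # rad per unit u
    p_peak = p[np.argmax(np.abs(F[:, m]))]            # peak at harmonic m
    phi_est = np.degrees(np.arctan2(m, abs(p_peak)))  # ~20 deg, to within
                                                      # the coarse bin width

In practice the peak would be refined (e.g., by zero-padding in u) and the measurement repeated over a range of radial annuli, which is the kind of robustness analysis the abstract refers to.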
Fujihara, Yuki; Saito, Taichi; Huetteman, Helen E; Sterbenz, Jennifer M; Chung, Kevin C
2018-04-01
A well-organized, thoughtful study design is essential for creating an impactful study. However, pressures promoting high output from researchers can lead to rushed study proposals that overlook critical weaknesses in the study design that can affect the validity of the conclusions. Researchers can benefit from thorough review of past failed proposals when crafting new research ideas. Conceptual frameworks and root cause analysis are two innovative techniques that can be used during study development to identify flaws and prevent study failures. In addition, conceptual frameworks and root cause analysis can be combined to complement each other to provide both a big picture and detailed view of a study proposal. This article describes these two common analytical methods and provides an example of how they can be used to evaluate and improve a study design by critically examining a previous failed research idea.
3D FISH to analyse gene domain-specific chromatin re-modeling in human cancer cell lines.
Kocanova, Silvia; Goiffon, Isabelle; Bystricky, Kerstin
2018-06-01
Fluorescence in situ hybridization (FISH) is a common technique used to label DNA and/or RNA for detection of a genomic region of interest. However, the technique can be challenging, in particular when applied to single genes in human cancer cells. Here, we provide a step-by-step protocol for analysis of short (35 kb-300 kb) genomic regions in three dimensions (3D). We discuss the experimental design and provide practical considerations for 3D imaging and data analysis to determine chromatin folding. We demonstrate that 3D FISH using BACs (Bacterial Artificial Chromosomes) or fosmids can provide detailed information of the architecture of gene domains. More specifically, we show that mapping of specific chromatin landscapes informs on changes associated with estrogen stimulated gene activity in human breast cancer cell lines.
NASA Astrophysics Data System (ADS)
Naqwi, Amir A.; Durst, Franz
1993-07-01
Dual-beam laser measuring techniques are now being used, not only for velocimetry, but also for simultaneous measurements of particle size and velocity in particulate two-phase flows. However, certain details of these optical techniques, such as the effect of Gaussian beam profiles on the accuracy of the measurements, need to be further explored. To implement innovative improvements, a general analytic framework is needed in which performances of various dual-beam instruments could be quantitatively studied and compared. For this purpose, the analysis of light scattering in a generalized dual-wave system is presented in this paper. The present simulation model provides a basis for studying effects of nonplanar beam structures of incident waves, taking into account arbitrary modes of polarization. A polarizer is included in the receiving optics as well. The peculiar aspects of numerical integration of scattered light over circular, rectangular, and truncated circular apertures are also considered.
NASA Technical Reports Server (NTRS)
Pierzga, M. J.
1981-01-01
The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created — but currently existing — analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oßwald, Patrick; Köhler, Markus
A new high-temperature flow reactor experiment utilizing the powerful molecular beam mass spectrometry (MBMS) technique for detailed observation of gas phase kinetics in reacting flows is presented. The reactor design provides a consequent extension of the experimental portfolio of validation experiments for combustion reaction kinetics. Temperatures up to 1800 K are applicable by three individually controlled temperature zones with this atmospheric pressure flow reactor. Detailed speciation data are obtained using the sensitive MBMS technique, providing in situ access to almost all chemical species involved in the combustion process, including highly reactive species such as radicals. Strategies for quantifying the experimental data are presented alongside a careful analysis of the characterization of the experimental boundary conditions to enable precise numeric reproduction of the experimental results. The general capabilities of this new analytical tool for the investigation of reacting flows are demonstrated for a selected range of conditions, fuels, and applications. A detailed dataset for the well-known gaseous fuels, methane and ethylene, is provided and used to verify the experimental approach. Furthermore, application to liquid fuels and fuel components important for technical combustors like gas turbines and engines is demonstrated. Besides the detailed investigation of novel fuels and fuel components, the wide range of operating conditions gives access to extended combustion topics, such as super-rich conditions at high temperature, important for gasification processes, or the peroxy chemistry governing the low temperature oxidation regime. These demonstrations are accompanied by a first kinetic modeling approach, examining the opportunities for model validation purposes.
NASA Technical Reports Server (NTRS)
Tawfik, Hazem
1991-01-01
A relatively simple, inexpensive, and generic technique that could be used in both laboratories and some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the setup procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate Industrial Robot, PUMA 500. The data were statistically analyzed to provide an insight into the performance of the systems and components of the robot. The same technique was also used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots, such as the ASEA, the high-speed controller, and the Automated Radiator Inspection Device (ARID). Recommendations were also made to develop and establish other calibration techniques more suitable for on-site calibration environments and robot certification.
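The forward-versus-inverse kinematics consistency check mentioned above can be illustrated with a planar two-link arm (a simplified stand-in; the link lengths below are made up and are not PUMA 500 parameters): run inverse kinematics on a reachable pose and confirm that forward kinematics reproduces it to within numerical tolerance.

    import numpy as np

    L1, L2 = 0.4318, 0.4331          # hypothetical link lengths (m)

    def fk(q1, q2):
        """Forward kinematics: joint angles -> end-effector position."""
        x = L1*np.cos(q1) + L2*np.cos(q1 + q2)
        y = L1*np.sin(q1) + L2*np.sin(q1 + q2)
        return x, y

    def ik(x, y, elbow=+1):
        """Inverse kinematics: position -> joint angles (elbow branch)."""
        c2 = (x*x + y*y - L1*L1 - L2*L2) / (2*L1*L2)
        q2 = elbow * np.arccos(np.clip(c2, -1.0, 1.0))
        q1 = np.arctan2(y, x) - np.arctan2(L2*np.sin(q2), L1 + L2*np.cos(q2))
        return q1, q2

    x, y = fk(0.3, -0.8)             # pick a reachable pose
    q1, q2 = ik(x, y, elbow=-1)
    assert np.allclose(fk(q1, q2), (x, y), atol=1e-9)  # round trip closes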
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
VIRTIS on Venus Express: retrieval of real surface emissivity on global scales
NASA Astrophysics Data System (ADS)
Arnold, Gabriele E.; Kappel, David; Haus, Rainer; Telléz Pedroza, Laura; Piccioni, Giuseppe; Drossart, Pierre
2015-09-01
The extraction of surface emissivity data provides the basis for surface composition analyses and enables evaluation of Venus' geology. The Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS) aboard ESA's Venus Express mission measured, inter alia, the nightside thermal emission of Venus in the near-infrared atmospheric windows between 1.0 and 1.2 μm. These data can be used to determine information about surface properties on global scales. This requires a sophisticated approach to understand and consider the effects and interferences of the different atmospheric and surface parameters influencing the retrieved values. In the present work, results of a new technique for retrieval of the 1.0-1.2 μm surface emissivity are summarized. It includes a Multi-Window Retrieval Technique (MWT), a Multi-Spectrum Retrieval technique (MSR), and a detailed reliability analysis. The MWT is based on a detailed radiative transfer model making simultaneous use of information from different atmospheric windows of an individual spectrum. MSR regularizes the retrieval by incorporating available a priori mean values, standard deviations, and spatial-temporal correlations of the parameters to be retrieved. The capability of this method is shown for a selected surface target area. Implications for geologic investigations are discussed. Based on these results, the work draws conclusions for future Venus surface composition analyses on global scales using spectral remote sensing techniques. In that context, requirements for observational scenarios and instrumental performance are investigated, and recommendations are derived to optimize spectral measurements for Venus' surface studies.
NASA Astrophysics Data System (ADS)
Kondo, Yoshiyuki; Suga, Keishi; Hibi, Koki; Okazaki, Toshihiko; Komeno, Toshihiro; Kunugi, Tomoaki; Serizawa, Akimi; Yoneda, Kimitoshi; Arai, Takahiro
2009-02-01
An advanced experimental technique has been developed to simulate two-phase flow behavior in a light water reactor (LWR). The technique applies four methods: (1) use of sulfur-hexafluoride (SF6) gas and ethanol (C2H5OH) liquid at atmospheric temperature and a pressure less than 1.0 MPa, where the fluid properties are similar to those of steam-water in the LWR; (2) generation of bubbles with a sintered tube, which simulates bubble generation on a heated surface in the LWR; (3) measurement of detailed bubble distribution data with a bi-optical probe (BOP); and (4) measurement of liquid velocities with a tracer liquid. This experimental technique provides easy visualization of flows by using a large-scale experimental apparatus, which gives three-dimensional flows, and measurement of detailed spatial distributions of two-phase flow. With this technique, we have carried out experiments simulating two-phase flow behavior in a single-channel geometry, a multi-rod-bundle geometry, and a horizontal-tube-bundle geometry on a typical natural circulation reactor system. Those experiments have clarified (a) a flow regime map in a rod bundle in the transition region between bubbly and churn flow, (b) three-dimensional flow behavior in rod bundles where inter-subassembly cross-flow occurs, and (c) bubble-separation behavior with consideration of reactor internal structures. The data have provided analysis models for natural circulation reactor design with good extrapolation.
Optically and non-optically excited thermography for composites: A review
NASA Astrophysics Data System (ADS)
Yang, Ruizhen; He, Yunze
2016-03-01
Composites, such as glass fiber reinforced polymer (GFRP) and carbon fiber reinforced polymer (CFRP), and adhesive bonding are being increasingly used in aerospace, renewable energy, civil engineering and architecture, and other industries. Flaws and damage are inevitable during either fabrication or the lifetime of composite structures or components. Thus, nondestructive testing (NDT) is essential to prevent failures and to increase the reliability of composite structures or components in both manufacture and in-service inspection. Infrared thermography techniques, including pulsed thermography, pulsed phase thermography, and lock-in thermography, have shown great potential and advantages. Besides conventional optical thermography, other sources such as laser, eddy current, microwave, and ultrasound excited thermography are drawing increasing attention for composites. In this work, a full, in-depth and comprehensive review of thermography NDT techniques for composites inspection was conducted, based on an orderly and concise literature survey and detailed analysis. First, basic concepts for thermography NDT are defined and introduced, such as volume heating thermography. Next, the developments of conventional optical, laser, eddy current, microwave, and ultrasound thermography for composite inspection are reviewed. Then, some case studies for scanning thermography are also reviewed. After that, the strengths and limitations of the thermography techniques are summarized through comparison studies. Finally, some research trends are predicted. This work, containing a critical overview, detailed comparisons and an extensive list of references, will disseminate knowledge between users, manufacturers, designers and researchers involved in the inspection of composite structures or components by means of thermography NDT techniques.