Sample records for allowed detailed analysis

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.
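    The binned scatterplot mentioned in this record is a standard large-data technique: points are aggregated into a fixed grid of counts so millions of points can be drawn as a heatmap. A minimal sketch, with the function name and bin count chosen for illustration (not taken from EDENx):

```python
# Aggregate (x, y) points into a bins-by-bins grid of counts.
def binned_scatter(xs, ys, bins=4):
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    grid = [[0] * bins for _ in range(bins)]
    for x, y in zip(xs, ys):
        # map each point to a bin index, clamping the maximum onto the last bin
        i = min(int((x - x_min) / (x_max - x_min) * bins), bins - 1)
        j = min(int((y - y_min) / (y_max - y_min) * bins), bins - 1)
        grid[j][i] += 1
    return grid

counts = binned_scatter([0.0, 0.1, 0.9, 1.0], [0.0, 0.2, 0.8, 1.0], bins=2)
# each cell holds the number of points that fell into it
```

    Rendering the grid instead of the raw points keeps drawing cost constant regardless of data size, which is what makes such plots usable for the "larger-scale data sets" the abstract refers to.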

  2. Coupled reactors analysis: New needs and advances using Monte Carlo methodology

    DOE PAGES

    Aufiero, M.; Palmiotti, G.; Salvatores, M.; ...

    2016-08-20

    Coupled reactors and the coupling features of large or heterogeneous core reactors can be investigated with the Avery theory, which allows a physics understanding of the main features of these systems. However, the complex geometries often encountered in association with coupled reactors require a detailed geometry description that can be easily provided by modern Monte Carlo (MC) codes. This implies a MC calculation of the coupling parameters defined by Avery and of the sensitivity coefficients that allow further detailed physics analysis. The results presented in this paper show that the MC code SERPENT has been successfully modified to meet the required capabilities.

  3. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interface with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. This system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.

  4. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow for advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
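    The geometric ground-track idea described above can be sketched for the circular-orbit case: a satellite on a circular inclined orbit around a rotating sphere (the spheroid and elliptical treatments are omitted). This is a generic textbook construction, not GOATS code (GOATS is MATLAB-based), and all names and values are illustrative:

```python
import math

# Sub-satellite latitude/longitude for a circular inclined orbit around a
# rotating sphere, with the ascending node at longitude 0 at t = 0.
def ground_track(inclination_deg, period_s, t_s, earth_rot_rate=7.2921159e-5):
    inc = math.radians(inclination_deg)
    # angular position along the orbit at time t
    nu = 2.0 * math.pi * t_s / period_s
    lat = math.degrees(math.asin(math.sin(inc) * math.sin(nu)))
    # inertial longitude of the sub-satellite point, minus Earth rotation
    lon_inertial = math.atan2(math.cos(inc) * math.sin(nu), math.cos(nu))
    lon = math.degrees(lon_inertial - earth_rot_rate * t_s)
    # wrap longitude into [-180, 180)
    lon = (lon + 180.0) % 360.0 - 180.0
    return lat, lon

lat, lon = ground_track(51.6, 5580.0, 1395.0)  # one quarter period, ISS-like values
```

    At a quarter period the satellite sits at its highest latitude (equal to the inclination), while the longitude lags the inertial value by the Earth's rotation during that time; time series of such (lat, lon) pairs are exactly the ground tracks the abstract describes.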

  5. An Application of Activity Theory

    ERIC Educational Resources Information Center

    Marken, James A.

    2006-01-01

    Activity Theory has often been used in workplace settings to gain new theoretical understandings about work and the humans who engage in work, but rarely has there been sufficient detail in the literature to allow HPT practitioners to do their own activity analysis. The detail presented in this case is sufficient for HPT practitioners to begin to…

  6. Assessing efficiency of software production for NASA-SEL data

    NASA Technical Reports Server (NTRS)

    Vonmayrhauser, Anneliese; Roeseler, Armin

    1993-01-01

    This paper uses production models to identify and quantify efficient allocation of resources and key drivers of software productivity for project data in the NASA-SEL database. While the analysis allows efficient projects to be identified, many of the metrics that could have provided a more detailed analysis are not measured at a level that allows production model analysis. Production models must be used with proper parameterization to be successful. This may mean a new look at which metrics are helpful for efficiency assessment.

  7. Abstractions for DNA circuit design.

    PubMed

    Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew

    2012-03-07

    DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.

  8. Finite element based micro-mechanics modeling of textile composites

    NASA Technical Reports Server (NTRS)

    Glaessgen, E. H.; Griffin, O. H., Jr.

    1995-01-01

    Textile composites have the advantage over laminated composites of a significantly greater damage tolerance and resistance to delamination. Currently, a disadvantage of textile composites is the inability to examine the details of the internal response of these materials under load. Traditional approaches to the study of textile-based composite materials neglect many of the geometric details that affect the performance of the material. The present three-dimensional analysis, based on the representative volume element (RVE) of a plain weave, allows prediction of the internal details of displacement, strain, stress, and failure quantities. Through this analysis, the effects of geometric and material parameters on the aforementioned quantities are studied.

  9. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

    Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.
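    The "multiple levels of detail" idea in this record is commonly implemented by precomputing downsampled versions of a long series; keeping per-bucket min and max (rather than a plain average) preserves spikes at coarse zoom levels. A minimal sketch under that assumption, with all names illustrative and not taken from Falcon:

```python
# Build a pyramid of (min, max) buckets; each level halves the resolution.
def build_levels(series, n_levels=3):
    levels = [[(v, v) for v in series]]   # level 0: full resolution
    for _ in range(n_levels - 1):
        prev, nxt = levels[-1], []
        for k in range(0, len(prev) - 1, 2):
            lo = min(prev[k][0], prev[k + 1][0])
            hi = max(prev[k][1], prev[k + 1][1])
            nxt.append((lo, hi))
        levels.append(nxt)
    return levels

levels = build_levels([3, 1, 4, 1, 5, 9, 2, 6], n_levels=3)
```

    An overview pane draws a coarse level cheaply, while zooming in switches to finer levels; the min/max envelope guarantees that no transient disappears at any zoom level.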

  10. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Lee Kenneth

    2017-03-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined, starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.

  11. Computer-Assisted Analysis of Spontaneous Speech: Quantification of Basic Parameters in Aphasic and Unimpaired Language

    ERIC Educational Resources Information Center

    Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter

    2012-01-01

    Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…

  12. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  13. Analysis of the impact of simulation model simplifications on the quality of low-energy buildings simulation results

    NASA Astrophysics Data System (ADS)

    Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr

    2017-11-01

    The requirements concerning energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing, traditional calculation methods following from the static heat exchange model are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, and also an analysis of the performance of HVAC systems within a building. However, these systems are usually complex and difficult to use. In addition, developing a simulation model that is sufficiently faithful to the real building demands considerable involvement from the designer and is time-consuming and laborious. A simplification of the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of the simulation model developed in Design Builder on the quality of the final results obtained. The objective of this analysis is to find simplifications that yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.

  14. Accelerated Insertion of Materials - Composites

    DTIC Science & Technology

    2001-08-28

    Details • Damage Tolerance • Repair • Validation of Analysis Methodology • Fatigue • Static • Acoustic • Configuration Details • Damage Tolerance … Sensitivity – Fatigue – Adhesion – Damage Tolerance – All critical modes and environments. Products: Material Specifications, B-Basis Design Allowables … Demonstrate damage tolerance. AIM-C DARPA. DARPA Workshop, Annapolis, August 27-28, 2001. Requalification of Polymer/Composite Parts • Material Changes – Raw …

  15. The Impact of Background Music on Adult Listeners: A Meta-Analysis

    ERIC Educational Resources Information Center

    Kampfe, Juliane; Sedlmeier, Peter; Renkewitz, Frank

    2011-01-01

    Background music has been found to have beneficial, detrimental, or no effect on a variety of behavioral and psychological outcome measures. This article reports a meta-analysis that attempts to summarize the impact of background music. A global analysis shows a null effect, but a detailed examination of the studies that allow the calculation of…

  16. Generating mouse lines for lineage tracing and knockout studies.

    PubMed

    Kraus, Petra; Sivakamasundari, V; Xing, Xing; Lufkin, Thomas

    2014-01-01

    In 2007, Capecchi, Evans, and Smithies received the Nobel Prize in recognition of discovering the principles for introducing specific gene modifications in mice via embryonic stem cells, a technology that has revolutionized the field of biomedical science by allowing the generation of genetically engineered animals. Here we describe detailed protocols based on and developed from these ground-breaking discoveries, allowing genes to be modified not only to create mutations for studying gene function but also to carry fluorescent markers, thus permitting the isolation of specific rare wild-type and mutant cell types for further detailed analysis at the biochemical, pathological, and genomic levels.

  17. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  18. Optical techniques: using coarse and detailed scans for the preventive acquisition of fingerprints with chromatic white-light sensors

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    The preventive application of automated latent fingerprint acquisition devices can enhance homeland defence, e.g., by improving border security. Here, contact-less optical acquisition techniques for the capture of traces are subject to research; chromatic white-light sensors allow for multi-mode operation using coarse or detailed scans. The presence of potential fingerprints can be detected using fast coarse scans. These regions of interest can afterwards be acquired with high-resolution detailed scans to allow for a verification or identification of individuals. An acquisition and analysis of fingerprint traces on different objects that are imported or pass borders might be a great enhancement for security. Additionally, if suspicious objects require a further investigation, an initial securing of potential fingerprints could be very useful. In this paper we show current research results for the coarse detection of fingerprints to prepare the detailed acquisition from various surface materials that are relevant for preventive applications.

  19. Laser scanning cytometry (LSC) allows detailed analysis of the cell cycle in PI stained human fibroblasts (TIG-7).

    PubMed

    Kawasaki, M; Sasaki, K; Satoh, T; Kurose, A; Kamada, T; Furuya, T; Murakami, T; Todoroki, T

    1997-01-01

    We have demonstrated a method for the in situ determination of the cell cycle phases of TIG-7 fibroblasts using a laser scanning cytometer (LSC), which not only has functionality equivalent to flow cytometry (FCM) but also capabilities unique to itself. LSC allows a more detailed analysis of the cell cycle in cells stained with propidium iodide (PI) than FCM. With LSC it is possible to discriminate between mitotic cells and G2 cells, between post-mitotic cells and G1 cells, and between quiescent cells and cycling cells in a PI fluorescence peak (chromatin condensation) vs. fluorescence value (DNA content) cytogram for cells stained with PI. These were amply confirmed by experiments using colcemid and adriamycin. We were able to identify at least six cell subpopulations for PI stained cells using LSC; namely G1, S, G2, M, postmitotic and quiescent cell populations. LSC analysis facilitates the monitoring of effects of drugs on the cell cycle.
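    The two-parameter gating described in this record can be sketched as a simple classifier on the cytogram axes: integrated PI fluorescence (DNA content) separates G1/S/G2-M, and peak fluorescence (chromatin condensation) separates M from G2 and post-mitotic from G1. All threshold values below are hypothetical placeholders; real gates would be drawn on the measured cytogram:

```python
# Classify one cell from (DNA content, peak fluorescence); thresholds are
# illustrative, in arbitrary units normalized so G1 DNA content is 1.0.
def classify_cell(dna_content, peak, g1_level=1.0, g2_level=2.0,
                  condensed_peak=0.8, tol=0.15):
    if abs(dna_content - g2_level) < tol * g2_level:
        # 4C DNA: condensed chromatin separates mitotic cells from G2
        return "M" if peak > condensed_peak else "G2"
    if abs(dna_content - g1_level) < tol * g1_level:
        # 2C DNA: condensed chromatin separates post-mitotic cells from G1
        return "post-mitotic" if peak > condensed_peak else "G1"
    return "S"

phase = classify_cell(2.0, 0.9)  # high DNA content with condensed chromatin
```

    This is exactly the discrimination FCM alone cannot make: DNA content by itself cannot tell M from G2 or post-mitotic from G1, while the added peak (condensation) axis can.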

  20. Mineralogical characterization of rendering mortars from decorative details of a baroque building in Kozuchow (SW Poland)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartz, W., E-mail: wojciech.bartz@ing.uni.wroc.pl; Filar, T.

    Optical microscopic observations, scanning electron microscopy and microprobe with energy dispersive X-ray analysis, X-ray diffraction and differential thermal/thermogravimetric analysis allowed detailed characterization of rendering mortars from decorative details (figures of Saints) of a baroque building in Kozuchow (Lubuskie Voivodship, Western Poland). Two separate coats of rendering mortars have been distinguished, differing in the composition of their filler. The under coat mortar has filler composed of coarse-grained siliceous sand, whereas the finishing one has much finer grained filler, dominated by a mixture of charcoal and Fe-smelting slag, with minor amounts of quartz grains. Both mortars have air-hardening binder composed of gypsum and micritic calcite, exhibiting microcrystalline structure.

  1. Hydrodynamic design of generic pump components

    NASA Technical Reports Server (NTRS)

    Eastland, A. H. J.; Dodson, H. C.

    1991-01-01

    Inducer and impeller base geometries were defined for a fuel pump for a generic generator cycle. Blade surface data and inlet flowfield definition are available in sufficient detail to allow computational fluid dynamic analysis of the two components.

  2. Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.

    PubMed

    Franceschi, Pietro; Wehrens, Ron

    2014-04-01

    MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices.
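    The self-organizing map underlying this record can be sketched in its standard form (not the image-prototype variant the article develops): a small grid of prototype vectors is pulled toward the data, with neighboring units updated more weakly so the map self-organizes. All parameter values below are illustrative:

```python
import random

# Train a tiny 1-D SOM: prototypes move toward each sample, neighbors less so,
# and the neighborhood influence shrinks every epoch.
def train_som(data, n_units=4, epochs=20, lr=0.5, seed=0):
    rng = random.Random(seed)
    protos = [list(rng.choice(data)) for _ in range(n_units)]
    for epoch in range(epochs):
        neigh = 0.5 ** (epoch + 1)  # neighborhood influence decays over epochs
        for x in data:
            # best-matching unit = prototype closest to the sample
            bmu = min(range(n_units),
                      key=lambda u: sum((p - a) ** 2 for p, a in zip(protos[u], x)))
            for u in range(n_units):
                h = lr * (1.0 if u == bmu else neigh / (abs(u - bmu) + 1))
                protos[u] = [p + h * (a - p) for p, a in zip(protos[u], x)]
    return protos

# two well-separated clusters; some prototypes settle near each cluster
protos = train_som([(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (0.9, 1.0)])
```

    The article's twist is to make each prototype a spatial ion image rather than a mass spectrum; the training loop itself is unchanged, only the meaning of the vectors differs.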

  3. Chemometric analysis of multisensor hyperspectral images of precipitated atmospheric particulate matter.

    PubMed

    Ofner, Johannes; Kamilli, Katharina A; Eitenberger, Elisabeth; Friedbacher, Gernot; Lendl, Bernhard; Held, Andreas; Lohninger, Hans

    2015-09-15

    The chemometric analysis of multisensor hyperspectral data allows a comprehensive image-based analysis of precipitated atmospheric particles. Atmospheric particulate matter was precipitated on aluminum foils and analyzed by Raman microspectroscopy and subsequently by electron microscopy and energy dispersive X-ray spectroscopy. All obtained images were of the same spot of an area of 100 × 100 μm². The two hyperspectral data sets and the high-resolution scanning electron microscope images were fused into a combined multisensor hyperspectral data set. This multisensor data cube was analyzed using principal component analysis, hierarchical cluster analysis, k-means clustering, and vertex component analysis. The detailed chemometric analysis of the multisensor data allowed an extensive chemical interpretation of the precipitated particles, and their structure and composition led to a comprehensive understanding of atmospheric particulate matter.

  4. Benefits of detailed models of muscle activation and mechanics

    NASA Technical Reports Server (NTRS)

    Lehman, S. L.; Stark, L.

    1981-01-01

    Recent biophysical and physiological studies identified some of the detailed mechanisms involved in excitation-contraction coupling, muscle contraction, and deactivation. Mathematical models incorporating these mechanisms allow independent estimates of key parameters, direct interplay between basic muscle research and the study of motor control, and realistic model behaviors, some of which are not accessible to previous, simpler models. The existence of previously unmodeled behaviors has important implications for strategies of motor control and identification of neural signals. New developments in the analysis of differential equations make the more detailed models feasible for simulation in realistic experimental situations.

  5. Analysis of Radio Frequency Surveillance Systems for Air Traffic Control : Volume 1. Text.

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  6. IDENTIFICATION OF AN IDEAL REACTOR MODEL IN A SECONDARY COMBUSTION CHAMBER

    EPA Science Inventory

    Tracer analysis was applied to a secondary combustion chamber of a rotary kiln incinerator simulator to develop a computationally inexpensive networked ideal reactor model and allow for the later incorporation of detailed reaction mechanisms. Tracer data from sulfur dioxide trace...

  7. An analysis of radio frequency surveillance systems for air traffic control volume II: appendixes

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  8. Chemometric analysis of soil pollution data using the Tucker N-way method.

    PubMed

    Stanimirova, I; Zehl, K; Massart, D L; Vander Heyden, Y; Einax, J W

    2006-06-01

    N-way methods, particularly the Tucker method, are often the methods of choice when analyzing data sets arranged in three- (or higher) way arrays, which is the case for most environmental data sets. In the future, applying N-way methods will become an increasingly popular way to uncover hidden information in complex data sets. The reason for this is that classical two-way approaches such as principal component analysis are not as good at revealing the complex relationships present in data sets. This study describes in detail the application of a chemometric N-way approach, namely the Tucker method, in order to evaluate the level of pollution in soil from a contaminated site. The analyzed soil data set was five-way in nature. The samples were collected at different depths (way 1) from two locations (way 2) and the levels of thirteen metals (way 3) were analyzed using a four-step sequential extraction procedure (way 4), allowing detailed information to be obtained about the bioavailability and activity of the different binding forms of the metals. Furthermore, the measurements were performed under two conditions (way 5), inert and non-inert. The preferred Tucker model of definite complexity showed that there was no significant difference in measurements analyzed under inert or non-inert conditions. It also allowed two depth horizons, characterized by different accumulation pathways, to be distinguished, and it allowed the relationships between chemical elements and their biological activities and mobilities in the soil to be described in detail.
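    The N-way models in this record are built on mode-n unfolding ("matricization"): a multi-way array is flattened into a matrix whose columns are the mode-n fibers, which is then factored. A minimal sketch for a three-way array, using the common Kolda-Bader column ordering; the function name and example values are illustrative:

```python
# Unfold a 3-way nested-list tensor along the given mode (0, 1, or 2).
def unfold(tensor, mode):
    I, J, K = len(tensor), len(tensor[0]), len(tensor[0][0])
    n_cols = {0: J * K, 1: I * K, 2: I * J}[mode]
    mat = [[0.0] * n_cols for _ in range((I, J, K)[mode])]
    for i in range(I):
        for j in range(J):
            for k in range(K):
                if mode == 0:
                    mat[i][j + k * J] = tensor[i][j][k]   # columns: (j, k)
                elif mode == 1:
                    mat[j][i + k * I] = tensor[i][j][k]   # columns: (i, k)
                else:
                    mat[k][i + j * I] = tensor[i][j][k]   # columns: (i, j)
    return mat

# 2 x 2 x 2 example where each entry encodes its own index as 100*i + 10*j + k
X = [[[0, 1], [10, 11]], [[100, 101], [110, 111]]]
X0 = unfold(X, 0)  # rows indexed by mode 0
```

    In a Tucker analysis the factor matrix for each mode is obtained from the corresponding unfolding (e.g. by an SVD), which is why the unfolding convention matters for interpreting the loadings.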

  9. Analysis of the Underlying Cognitive Activity in the Resolution of a Task on Derivability of the Absolute-Value Function: Two Theoretical Perspectives

    ERIC Educational Resources Information Center

    Pino-Fan, Luis R.; Guzmán, Ismenia; Font, Vicenç; Duval, Raymond

    2017-01-01

    This paper presents a study of networking of theories between the theory of registers of semiotic representation (TRSR) and the onto-semiotic approach of mathematical cognition and instruction (OSA). The results obtained show complementarities between these two theoretical perspectives, which might allow more detailed analysis of the students'…

  10. Tracking the When, Where, and With Whom of Alcohol Use

    PubMed Central

    Freisthler, Bridget; Lipperman-Kreda, Sharon; Bersamin, Melina; Gruenewald, Paul J.

    2014-01-01

    Prevention researchers have found that drinking in different contexts is related to different alcohol problems. Where and with whom people drink affects the types of alcohol-related problems they experience. Consequently, identifying those contexts that result in the greatest number of problems provides a novel opportunity to target new prevention efforts aimed at those contexts. However, identifying these contexts poses methodological challenges to prevention research. To overcome these challenges, researchers need tools that allow them to gather detailed information about when and where people choose to drink and how contextual factors influence drinking risks. New data collection and analysis techniques, such as activity-space analysis, which examines movement through different contexts, and ecological momentary assessment, which captures microlevel contextual changes as individuals move through their days, can advance the field of alcohol studies by providing detailed information on the use of drinking contexts, particularly when combined. Data acquired through these methods allow researchers to better identify those contexts where and conditions under which drinking and problems related to drinking occur. Use of these methods will allow prevention practitioners to target prevention efforts to those contexts that place most drinkers at risk and tailor prevention efforts to each context for specific outcomes. PMID:26258998

  11. Analysis of BSRT Profiles in the LHC at Injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitterer, M.; Stancari, G.; Papadopoulou, S.

    The beam synchrotron radiation telescope (BSRT) at the LHC allows profiles of the transverse beam distribution to be taken, which can provide useful additional insight into the evolution of the transverse beam distribution. A python class has been developed [1] which allows the BSRT profiles, usually stored in binary format, to be read in, different analysis tools to be run, and plots of the statistical parameters and profiles as well as videos of the profiles to be generated. The detailed analysis will be described in this note. The analysis is based on the data obtained at injection energy (450 GeV) during MD1217 [2] and MD1415 [3], which will also be used as illustrative examples. A similar approach is also taken with a MATLAB-based analysis described in [4].
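    The "statistical parameters" extracted from such profiles are typically the intensity-weighted mean position and rms width of each binned profile. A minimal sketch of that computation; the bin values below are made up for illustration and are not BSRT data:

```python
import math

# Mean position and rms width of a binned transverse profile.
def profile_stats(positions, intensities):
    total = sum(intensities)
    mean = sum(x * w for x, w in zip(positions, intensities)) / total
    var = sum(w * (x - mean) ** 2 for x, w in zip(positions, intensities)) / total
    return mean, math.sqrt(var)

mean, sigma = profile_stats([-2, -1, 0, 1, 2], [1, 4, 6, 4, 1])
```

    Tracking (mean, sigma) turn by turn is what reveals the evolution of the transverse distribution; a Gaussian fit is the usual refinement when the profile has non-Gaussian tails.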

  12. The Luizi Structure (Democratic Republic of Congo) — First Confirmed Meteorite Impact Crater in Central Africa

    NASA Astrophysics Data System (ADS)

    Ferrière, L.; Lubala, F. R. T.; Osinski, G. R.; Kaseti, P. K.

    2011-03-01

    Our detailed analysis of the Luizi structure, combining a remote sensing study with geological field observations and petrographic examination of rock samples collected during our 2010 field campaign, allows us to confirm its meteorite impact origin.

  13. Simulations of the HDO and H2O-18 atmospheric cycles using the NASA GISS general circulation model - Sensitivity experiments for present-day conditions

    NASA Technical Reports Server (NTRS)

    Jouzel, Jean; Koster, R. D.; Suozzo, R. J.; Russell, G. L.; White, J. W. C.

    1991-01-01

    Incorporating the full geochemical cycles of stable water isotopes (HDO and H2O-18) into an atmospheric general circulation model (GCM) allows an improved understanding of global delta-D and delta-O-18 distributions and might even allow an analysis of the GCM's hydrological cycle. A detailed sensitivity analysis using the NASA/Goddard Institute for Space Studies (GISS) model II GCM is presented that examines the nature of isotope modeling. The tests indicate that delta-D and delta-O-18 values in nonpolar regions are not strongly sensitive to details in the model precipitation parameterizations. This result, while implying that isotope modeling has limited potential use in the calibration of GCM convection schemes, also suggests that certain necessarily arbitrary aspects of these schemes are adequate for many isotope studies. Deuterium excess, a second-order variable, does show some sensitivity to precipitation parameterization and thus may be more useful for GCM calibration.
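    The delta notation used throughout this record (delta-D, delta-O-18, deuterium excess) is the per-mil deviation of an isotope ratio from a reference standard (VSMOW for natural waters). A one-line sketch; the sample ratio below is made up for illustration:

```python
# Per-mil delta value of an isotope ratio relative to a standard.
def delta_per_mil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

VSMOW_D_H = 155.76e-6                       # D/H ratio of the VSMOW standard
d_D = delta_per_mil(143.3e-6, VSMOW_D_H)    # a deuterium-depleted sample
```

    Negative values mean the sample is depleted in the heavy isotope relative to ocean water, which is the typical situation for precipitation; deuterium excess is then defined from the two deltas as d = delta-D - 8 * delta-O-18.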

  14. Tradeoffs in manipulator structure and control. Part 4: Flexible manipulator analysis program. [user manual

    NASA Technical Reports Server (NTRS)

    Book, W. J.

    1974-01-01

    The Flexible Manipulator Analysis Program (FMAP) is a collection of FORTRAN coding to allow easy analysis of the flexible dynamics of mechanical arms. The user specifies the arm configuration and parameters and any or all of several frequency domain analyses to be performed, while the time domain impulse response is obtained by inverse Fourier transformation of the frequency response. A detailed explanation of how to use FMAP is provided.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro

    This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) the short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected with analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maximum contained in detail coefficients at different levels in the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
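    As a toy illustration of the multiresolution idea described above (not the authors' code; the signal and its parameters are invented, and a hand-rolled Haar wavelet stands in for whatever mother wavelet the study used), detail coefficients localize an abrupt transient that a plain spectrum would smear out:

    ```python
    import numpy as np

    def haar_dwt(x):
        """One level of the Haar DWT: approximation and detail coefficients."""
        x = x[:len(x) // 2 * 2]
        a = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass (approximation)
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass (detail)
        return a, d

    def mra_details(x, levels):
        """Multiresolution analysis: detail coefficients at successive levels."""
        details, a = [], np.asarray(x, dtype=float)
        for _ in range(levels):
            a, d = haar_dwt(a)
            details.append(d)
        return details

    # synthetic signal: a slow oscillation with an abrupt transient at sample 513
    t = np.arange(1024)
    sig = np.sin(2 * np.pi * 0.01 * t)
    sig[513:] += 2.0

    details = mra_details(sig, 3)
    # the largest level-1 detail coefficient localizes the discontinuity
    loc = int(np.argmax(np.abs(details[0])))
    print(2 * loc)  # → 512, adjacent to the transient at sample 513
    ```

    In a real analysis a library such as PyWavelets would supply the transform; the point here is only that detail-coefficient maxima pinpoint where, in time, the signal changes character.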

  16. Non-destructive digital imaging in poplar allows detailed analysis of adventitious rooting dynamics

    Treesearch

    R.J. Kodrzycki; R.B. Michaels; A.L. Friend; R.S. Zalesny; Ch.P. Mawata; D.W. McDonald

    2008-01-01

    The dynamics of root formation are difficult to observe directly over time without disturbing the rooting environment. A novel system for non-destructive, non-invasive root analysis (RootViz FS, Phenotype Screening Corp.) was evaluated for its ability to analyze root formation from cuttings over a 32-day period in three poplar genotypes (DN70, P. deltoides x...

  17. FHWA travel analysis framework : development of VMT forecasting models for use by the Federal Highway Administration

    DOT National Transportation Integrated Search

    2014-05-12

    This document details the process that the Volpe National Transportation Systems Center (Volpe) used to develop travel forecasting models for the Federal Highway Administration (FHWA). The purpose of these models is to allow FHWA to forecast future c...

  18. CALL--Past, Present, and Future.

    ERIC Educational Resources Information Center

    Bax, Stephen

    2003-01-01

    Provides a critical examination and reassessment of the history of computer assisted language learning (CALL), and argues for three new strategies--restricted, open, and integrated. Offers definitions and descriptions of the three approaches and argues that they allow a more detailed analysis of institutions and classrooms than earlier analyses.…

  19. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.

    PubMed

    Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus

    2014-12-01

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.

  20. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Richard Yorg

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in 2 phases with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less focus on scoping fire modeling. This was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges which will be discussed in the full paper.

  1. GIS-based Landing-Site Analysis and Passive Decision Support

    NASA Astrophysics Data System (ADS)

    van Gasselt, Stephan; Nass, Andrea

    2016-04-01

    The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies of surface processes and properties, in particular on objects such as Mars and the Moon, for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses are one major research aim to obtain ground truth, resource exploration and even the potential establishment of bases using autonomous platforms are others, and they require detailed investigation of settings in order to identify spots on the surface that are suitable for spacecraft to land and operate safely and over a long period of time. What was done using hardcopy material in the past is today carried out using either in-house developments or off-the-shelf spatial information system technology, which allows users to manage, integrate and analyse data as well as visualize results and create user-defined reports for performing assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing-site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for the integration of data and the evaluation of results from individual tasks in order to support decisions for landing-site selection.

  2. A three-dimensional viscous/potential flow interaction analysis method for multi-element wings: Modifications to the potential flow code to allow part-span, high-lift devices and close-interference calculations

    NASA Technical Reports Server (NTRS)

    Maskew, B.

    1979-01-01

    The description of the modified code includes details of a doublet subpanel technique in which panels that are close to a velocity calculation point are replaced by a subpanel set. This treatment gives the effect of a higher panel density without increasing the number of unknowns. In particular, the technique removes the close approach problem of the earlier singularity model in which distortions occur in the detailed pressure calculation near panel corners. Removal of this problem allowed a complete wake relaxation and roll-up iterative procedure to be installed in the code. The geometry package developed for the new technique and also for the more general configurations is based on a multiple patch scheme. Each patch has a regular array of panels, but arbitrary relationships are allowed between neighboring panels at the edges of adjacent patches. This provides great versatility for treating general configurations.

  3. Latent Partially Ordered Classification Models and Normal Mixtures

    ERIC Educational Resources Information Center

    Tatsuoka, Curtis; Varadi, Ferenc; Jaeger, Judith

    2013-01-01

    Latent partially ordered sets (posets) can be employed in modeling cognitive functioning, such as in the analysis of neuropsychological (NP) and educational test data. Posets are cognitively diagnostic in the sense that classification states in these models are associated with detailed profiles of cognitive functioning. These profiles allow for…

  4. Assessment of Southern California environment from ERTS-1

    NASA Technical Reports Server (NTRS)

    Bowden, L. W.; Viellenave, J. H.

    1973-01-01

    ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery and our increasing ability to enhance it have resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail. They include the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun directed toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.

  5. Interactive system for geomagnetic data analysis

    NASA Astrophysics Data System (ADS)

    Solovev, Igor

    2017-10-01

    This paper presents methods for analyzing geomagnetic field variations, implemented in the "Aurora" software system for complex analysis of geophysical parameters. The software system allows one to perform a detailed magnetic data analysis. The methods allow one to estimate the intensity of geomagnetic perturbations and to identify periods of increased geomagnetic activity. The software system is publicly available (http://aurorasa.ikir.ru:8580, http://www.ikir.ru:8280/lsaserver/MagneticPage.jsp). This research was supported by the Russian Science Foundation (Project No. 14-11-00194).

  6. Characterisation of the PXIE Allison-type emittance scanner

    DOE PAGES

    D'Arcy, R.; Alvarez, M.; Gaynier, J.; ...

    2016-01-26

    An Allison-type emittance scanner has been designed for PXIE at FNAL with the goal of providing fast and accurate phase space reconstruction. The device has been modified from previous LBNL/SNS designs to operate in both pulsed and DC modes with the addition of water-cooled front slits. Extensive calibration techniques and error analysis allowed confinement of uncertainty to the <5% level (with known caveats). With a 16-bit, 1 MHz electronics scheme the device is able to analyse a pulse with a resolution of 1 μs, allowing for analysis of neutralisation effects. As a result, this paper describes a detailed breakdown of the R&D, as well as post-run analysis techniques.

  7. Reduced modeling of signal transduction – a modular approach

    PubMed Central

    Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter

    2007-01-01

    Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed that allows an automatic, rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches, further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which makes it possible to dissect the model into smaller modules, called layers, that can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show the performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer-based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be formulated directly and are intuitively interpretable. Additionally, the method provides very good approximations, especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494

  8. Microfluidic electrochemical device and process for chemical imaging and electrochemical analysis at the electrode-liquid interface in-situ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Liu, Bingwen; Yang, Li

    2016-03-01

    A microfluidic electrochemical device and process are detailed that provide chemical imaging and electrochemical analysis under vacuum at the electrode-sample or electrode-liquid interface in situ. The electrochemical device allows investigation of various surface layers, including diffuse layers at selected depths populated with, e.g., adsorbed molecules, in which chemical transformation in electrolyte solutions occurs.

  9. KENNEDY SPACE CENTER, FLA. - One of the world’s highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data (shown here) in preparation for the shuttle fleet’s return to flight, is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. The system, developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., allows multiple-person collaboration, highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

    NASA Image and Video Library

    2004-02-04


  10. A Dynamic Simulation of Musculoskeletal Function in the Mouse Hindlimb During Trotting Locomotion

    PubMed Central

    Charles, James P.; Cappellari, Ornella; Hutchinson, John R.

    2018-01-01

    Mice are often used as animal models of various human neuromuscular diseases, and analysis of these models often requires detailed gait analysis. However, little is known of the dynamics of the mouse musculoskeletal system during locomotion. In this study, we used computer optimization procedures to create a simulation of trotting in a mouse, using a previously developed mouse hindlimb musculoskeletal model in conjunction with new experimental data, allowing muscle forces, activation patterns, and levels of mechanical work to be estimated. Analyzing musculotendon unit (MTU) mechanical work throughout the stride allowed a deeper understanding of their respective functions, with the rectus femoris MTU dominating the generation of positive and negative mechanical work during the swing and stance phases. This analysis also tested previous functional inferences of the mouse hindlimb made from anatomical data alone, such as the existence of a proximo-distal gradient of muscle function, thought to reflect adaptations for energy-efficient locomotion. The results do not strongly support the presence of this gradient within the mouse musculoskeletal system, particularly given relatively high negative net work output from the ankle plantarflexor MTUs, although more detailed simulations could test this further. This modeling analysis lays a foundation for future studies of the control of vertebrate movement through the development of neuromechanical simulations. PMID:29868576

  11. Tomcat-Projects_RF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warrant, Marilyn M.; Garcia, Rudy J.; Zhang, Pengchu

    2004-09-15

    Tomcat-Projects_RF is a software package for analyzing sensor data obtained from a database and displaying the results with JavaServer Pages (JSP). SQL views into the dataset are tailored to personnel having different roles in monitoring the items in a storage facility. For example, an inspector, a host treaty compliance officer, a system engineer and software developers were identified as users who would need to access data at different levels of detail. The analysis provides a high-level status of the storage facility and allows the user to drill down into the data details if desired.

  12. A virtual reality browser for Space Station models

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James

    1993-01-01

    The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.

  13. A first approach for digital representation and automated classification of toolmarks on locking cylinders using confocal laser microscopy

    NASA Astrophysics Data System (ADS)

    Clausing, Eric; Kraetzer, Christian; Dittmann, Jana; Vielhauer, Claus

    2012-10-01

    An important part of criminalistic forensics is the analysis of toolmarks. Such toolmarks often consist of many individual striations, scratches and dents that can support conclusions regarding the sequence of events or the tools used. To obtain qualified results with an automated analysis and contactless acquisition of such toolmarks, a detailed digital representation of the marks and of their orientation and placement relative to each other is required. For marks of firearms and tools, the desired result of an analysis is a conclusion as to whether or not a mark has been generated by a tool under suspicion. For toolmark analysis on locking cylinders, the aim is not the identification of the used tool but rather the identification of the opening method. The challenge of such an identification is that a one-to-one comparison of two images is not sufficient: although two marked objects may look completely different with regard to the specific location and shape of the found marks, they can still represent samples of the identical opening method. This paper provides a first approach for modelling toolmarks on lock pins and takes into consideration the different requirements necessary to generate a detailed and interpretable digital representation of these traces. These requirements are 'detail', i.e. adequate features that allow for a suitable representation and interpretation of single marks; 'meta detail', i.e. adequate representation of the context and connections between all marks; and 'distinctiveness', i.e. the possibility to reliably distinguish different sample types by the corresponding model. The model is evaluated with a set of 15 physical samples (resulting in 675 digital scans) of lock pins from cylinders opened with different opening methods, contactlessly scanned with a confocal laser microscope. The presented results suggest a high suitability for the intended purpose of opening-method determination.

  14. Experiment Design and Analysis Guide - Neutronics & Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  15. Identification of large geomorphological anomalies based on 2D discrete wavelet transform

    NASA Astrophysics Data System (ADS)

    Doglioni, A.; Simeone, V.

    2012-04-01

    The identification and analysis, based on quantitative evidence, of large geomorphological anomalies is an important stage in the study of large landslides. Numerical geomorphic analyses represent an interesting approach to this kind of study, allowing for a detailed and fairly accurate identification of hidden topographic anomalies that may be related to large landslides. Here a numerical geomorphic analysis of the Digital Terrain Model (DTM) is presented. The introduced approach is based on the 2D discrete wavelet transform (Antoine et al., 1993; Bruun and Nilsen, 2003; Booth et al., 2009). The 2D wavelet decomposition of the DTM, and in particular the analysis of the detail coefficients of the wavelet transform, can provide evidence of anomalies or singularities, i.e. discontinuities of the land surface. These discontinuities are not very evident from the DTM as it is, while the 2D wavelet transform allows for grid-based analysis of the DTM and for mapping the decomposition. In fact, the grid-based DTM can be treated as a matrix on which a discrete wavelet transform (Daubechies, 1992) is performed columnwise and linewise, which basically represent the horizontal and vertical directions. The outcomes of this analysis are low-frequency approximation coefficients and high-frequency detail coefficients. The detail coefficients are analyzed, since their variations are associated with discontinuities of the DTM. Detail coefficients are estimated by performing the 2D wavelet transform both in the horizontal (east-west) and in the vertical (north-south) direction. Detail coefficients are then mapped for both cases, thus allowing one to visualize and quantify potential anomalies of the land surface. Moreover, the wavelet decomposition can be pushed to further levels, assuming a higher scale number of the transform; this may return further interesting results in terms of identification of anomalies of the land surface.
In this kind of approach, the choice of a proper mother wavelet function is a delicate point, since it conditions the analysis and its outcomes. Therefore multiple levels as well as multiple wavelet analyses are considered. Here the introduced approach is applied to some interesting case studies in southern Italy, in particular for the identification of large anomalies associated with large landslides at the transition between the Apennine chain domain and the foredeep domain; the lower Biferno valley and the Fortore valley are analyzed here. Finally, the wavelet transforms are performed on multiple levels, thus addressing the question of which decomposition level is appropriate for an accurate analysis of a specific problem. Antoine J.P., Carrette P., Murenzi R., and Piette B., (1993), Image analysis with two-dimensional continuous wavelet transform, Signal Processing, 31(3), pp. 241-272, doi:10.1016/0165-1684(93)90085-O. Booth A.M., Roering J.J., and Taylor Perron J., (2009), Automated landslide mapping using spectral analysis and high-resolution topographic data: Puget Sound lowlands, Washington, and Portland Hills, Oregon, Geomorphology, 109(3-4), pp. 132-147, doi:10.1016/j.geomorph.2009.02.027. Bruun B.T., and Nilsen S., (2003), Wavelet representation of large digital terrain models, Computers and Geoscience, 29(6), pp. 695-703, doi:10.1016/S0098-3004(03)00015-3. Daubechies, I. (1992), Ten lectures on wavelets, SIAM.
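The columnwise/linewise decomposition described above can be sketched with a single-level 2D Haar transform (a simplified stand-in for the wavelets the authors cite; the synthetic DTM, slope and scarp position are invented for illustration):

```python
import numpy as np

SQ2 = np.sqrt(2.0)

def haar_dwt2(z):
    """Single-level 2D Haar transform: east-west pass, then north-south pass."""
    z = z[:z.shape[0] // 2 * 2, :z.shape[1] // 2 * 2]    # trim to even dimensions
    lo = (z[:, 0::2] + z[:, 1::2]) / SQ2                 # east-west low-pass
    hi = (z[:, 0::2] - z[:, 1::2]) / SQ2                 # east-west high-pass
    approx      = (lo[0::2, :] + lo[1::2, :]) / SQ2      # smoothed topography
    ns_detail   = (lo[0::2, :] - lo[1::2, :]) / SQ2      # north-south differences
    ew_detail   = (hi[0::2, :] + hi[1::2, :]) / SQ2      # east-west differences
    diag_detail = (hi[0::2, :] - hi[1::2, :]) / SQ2      # diagonal differences
    return approx, ns_detail, ew_detail, diag_detail

# synthetic DTM: a gentle north-south slope cut by a scarp between columns 64 and 65
rows, cols = np.mgrid[0:64, 0:128]
dtm = 0.05 * rows + np.where(cols >= 65, -5.0, 0.0)

approx, ns_detail, ew_detail, diag_detail = haar_dwt2(dtm)

# the scarp shows up as a single column of large east-west detail coefficients
col = int(np.argmax(np.abs(ew_detail).mean(axis=0)))
print(col)  # → 32, i.e. near column 65 at full DTM resolution
```

Mapping `ew_detail` and `ns_detail` over the grid is the step that makes hidden discontinuities of the land surface visible; deeper levels repeat the transform on `approx`.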

  16. Automatic network coupling analysis for dynamical systems based on detailed kinetic models.

    PubMed

    Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich

    2005-10-01

    We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
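The core step above, counting locally active dynamical modes via singular values of a sensitivity matrix, can be sketched as follows; the matrix here is synthetic (built to have two dominant directions), not derived from any kinetic model in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical sensitivity matrix S[i, j] = d(species_i)/d(parameter_j),
# constructed so that only 2 directions in state space genuinely matter
basis = rng.standard_normal((6, 2))
S = basis @ rng.standard_normal((2, 10)) + 1e-9 * rng.standard_normal((6, 10))

# error-controlled mode count: singular values above a relative tolerance
sv = np.linalg.svd(S, compute_uv=False)
n_modes = int(np.sum(sv > 1e-6 * sv[0]))
print(n_modes)  # → 2: a reduced model needs only two local dynamical modes
```

Repeating this along a state trajectory, interval by interval, gives the piecewise minimal model dimension the abstract describes.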

  17. Technology of Strengthening Steel Details by Surfacing Composite Coatings

    NASA Astrophysics Data System (ADS)

    Burov, V. G.; Bataev, A. A.; Rakhimyanov, Kh M.; Mul, D. O.

    2016-04-01

    The article considers the problem of forming wear-resistant metal-ceramic coatings on steel surfaces, using the results of our own investigations and an analysis of achievements made in the country and abroad. Increasing the wear resistance of the surface layers of steel parts is achieved by surfacing composite coatings with carbides or borides of metals as the disperse particles of the strengthening phase. The use of surfacing on wearing machine parts and mechanisms has a history of more than 100 years, yet engineering investigations in this field continue to this day. The use of heating sources that provide a high power density makes it possible to ensure temperature and time conditions of surfacing under which composites with particular service and functional properties are formed. High concentration of energy in the melt zone, which is created from powder mixtures and the hardened surface layer, allows a transition zone to be produced between the base material and the surfaced coating. Surfacing by an electron beam directed from vacuum into the atmosphere offers considerable technological advantages: it makes it possible to strengthen the surface layers of large-sized parts by surfacing powder mixtures without their preliminary compacting. A modified layer of the base metal with ceramic particles distributed in it is created as a result of heating the surfaced powders and the surface layer of the part by the electron beam. The surfacing technology allows the use of powders of refractory metals and graphite in the powder mixtures; these interact with one another and form the particles of the hardening phase of the composite coating. The chemical composition of the base and surfaced materials is considered to be the main factor that determines the character of the metallurgical processes in local melt zones as well as the structure and properties of the surfaced composite.

  18. Functional Module Search in Protein Networks based on Semantic Similarity Improves the Analysis of Proteomics Data*

    PubMed Central

    Boyanova, Desislava; Nilla, Santosh; Klau, Gunnar W.; Dandekar, Thomas; Müller, Tobias; Dittrich, Marcus

    2014-01-01

    The continuously evolving field of proteomics produces increasing amounts of data while improving the quality of protein identifications. Although quantitative measurements are becoming more popular, many proteomic studies are still based on non-quantitative methods for protein identification. These studies result in potentially large sets of identified proteins, where the biological interpretation of proteins can be challenging. Systems biology develops innovative network-based methods, which allow an integrated analysis of these data. Here we present a novel approach, which combines prior knowledge of protein-protein interactions (PPI) with proteomics data using functional similarity measurements of interacting proteins. This integrated network analysis exactly identifies network modules with a maximal consistent functional similarity reflecting biological processes of the investigated cells. We validated our approach on small (H9N2 virus-infected gastric cells) and large (blood constituents) proteomic data sets. Using this novel algorithm, we identified characteristic functional modules in virus-infected cells, comprising key signaling proteins (e.g. the stress-related kinase RAF1) and demonstrate that this method allows a module-based functional characterization of cell types. Analysis of a large proteome data set of blood constituents resulted in clear separation of blood cells according to their developmental origin. A detailed investigation of the T-cell proteome further illustrates how the algorithm partitions large networks into functional subnetworks each representing specific cellular functions. These results demonstrate that the integrated network approach not only allows a detailed analysis of proteome networks but also yields a functional decomposition of complex proteomic data sets and thereby provides deeper insights into the underlying cellular processes of the investigated system. PMID:24807868
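    As a toy sketch of the idea of scoring PPI edges by the functional similarity of the interacting proteins (the proteins, interactions and GO terms below are invented, and Jaccard overlap stands in for the semantic similarity measures used in practice):

    ```python
    # hypothetical GO-term annotation sets per protein (invented for illustration)
    annotations = {
        "RAF1":   {"GO:0035556", "GO:0006468"},
        "MAP2K1": {"GO:0035556", "GO:0006468"},
        "ALB":    {"GO:0006810"},
    }
    ppi_edges = [("RAF1", "MAP2K1"), ("RAF1", "ALB")]

    def jaccard(a, b):
        """Functional similarity of two annotation sets (0 = disjoint, 1 = identical)."""
        return len(a & b) / len(a | b)

    # score each interaction by the similarity of its partners' annotations;
    # edges inside one functional module score high, cross-module edges low
    scores = {edge: jaccard(annotations[edge[0]], annotations[edge[1]])
              for edge in ppi_edges}
    print(scores[("RAF1", "MAP2K1")], scores[("RAF1", "ALB")])  # → 1.0 0.0
    ```

    A module-search algorithm would then look for connected subnetworks whose edges consistently score high, which is the intuition behind the functional modules the abstract describes.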

  19. Privacy and Trust Attitudes in the Intent to Volunteer for Data-Tracking Research

    ERIC Educational Resources Information Center

    Smith, Catherine L.

    2016-01-01

    Introduction: The analysis of detailed interaction records is fundamental to development of user-centred systems. Researchers seeking such data must recruit volunteers willing to allow tracking of their interactions. This study examines privacy and trust attitudes in the intent to volunteer for research requiring installation of tracking software.…

  20. Constraints on primary and secondary particulate carbon sources using chemical tracer and 14C methods during CalNex-Bakersfield

    EPA Science Inventory

    The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling that is designed to allow for detailed and specific chemical analysis of particulate matter ...

  1. Examining IS Curriculum Profiles and the IS 2010 Model Curriculum Guidelines in AACSB-Accredited Schools

    ERIC Educational Resources Information Center

    Mills, Robert J.; Velasquez, Nicole Forsgren; Fadel, Kelly J.; Bell, Corbin C.

    2012-01-01

    The IS 2010 Model Curriculum Guidelines were developed to provide recommendations for standardized information systems curricula while simultaneously allowing for customization within individual programs. While some studies have examined program adherence to the IS 2010 Model Curriculum Guidelines, a more detailed analysis of IS curriculum…

  2. Timing the warm absorber in NGC4051

    NASA Astrophysics Data System (ADS)

    Silva, C.; Uttley, P.; Costantini, E.

    2015-07-01

    In this work we have combined spectral and timing analysis in the characterization of highly ionized outflows in Seyfert galaxies, the so-called warm absorbers. Here, we present our results on the extensive ~600 ks of XMM-Newton archival observations of the bright and highly variable Seyfert 1 galaxy NGC4051, whose spectrum has revealed a complex multi-component wind. Working simultaneously with RGS and PN data, we have performed a detailed analysis using a time-dependent photoionization code in combination with spectral and Fourier timing techniques. This method allows us to study in detail the response of the gas to variations in the ionizing flux of the central source. As a result, we will show the contribution of the recombining gas to the time delays of the most highly absorbed energy bands relative to the continuum (Silva, Uttley & Costantini in prep.), which is also vital information for interpreting the continuum lags associated with propagation and reverberation effects in the inner emitting regions. Furthermore, we will illustrate how this powerful method can be applied to other sources and warm-absorber configurations, allowing for a wide range of studies.
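
    The Fourier timing side of this method can be sketched numerically: the phase of the cross-spectrum between two band light curves, divided by 2πf, gives a frequency-resolved time lag. The synthetic light curves and the 4 s delay below are invented for illustration and follow one common sign convention (positive lag means the second band lags the first); this is not the NGC4051 analysis itself.

```python
import numpy as np

# Hypothetical sketch of a frequency-resolved Fourier time lag between
# two band light curves, as used in spectral-timing studies. The cross
# spectrum is defined here so that a positive lag means lc2 lags lc1.

def time_lag(lc1, lc2, dt):
    """Return frequencies and the lag of lc2 behind lc1 at each frequency."""
    x1 = np.fft.rfft(lc1)
    x2 = np.fft.rfft(lc2)
    f = np.fft.rfftfreq(len(lc1), d=dt)[1:]   # drop the zero frequency
    phase = np.angle(x1 * np.conj(x2))[1:]
    return f, phase / (2 * np.pi * f)

dt = 1.0
t = np.arange(1024) * dt
lc1 = np.sin(2 * np.pi * t / 64.0)
lc2 = np.sin(2 * np.pi * (t - 4.0) / 64.0)    # same signal, delayed by 4 s
f, lags = time_lag(lc1, lc2, dt)
i = np.argmin(np.abs(f - 1.0 / 64.0))         # bin at the signal frequency
print(round(float(lags[i]), 2))               # 4.0
```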

  3. Ten-minute analysis of drugs and metabolites in saliva by surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Shende, Chetan; Inscore, Frank; Maksymiuk, Paul; Farquharson, Stuart

    2005-11-01

    Rapid analysis of drugs in emergency room overdose patients is critical to selecting appropriate medical care. Saliva analysis has long been considered an attractive alternative to blood plasma analysis for this application. However, current clinical laboratory analysis methods involve extensive sample extraction followed by gas chromatography and mass spectrometry, and typically require as much as one hour to perform. In an effort to overcome this limitation we have been investigating metal-doped sol-gels to both separate drugs and their metabolites from saliva and generate surface-enhanced Raman spectra. We have incorporated the sol-gel in a disposable lab-on-a-chip format, and generally no more than a drop of sample is required. The detailed molecular vibrational information allows chemical identification, while the increase in Raman scattering by six orders of magnitude or more allows detection of µg/mL concentrations. Measurements of cocaine, its metabolite benzoylecgonine, and several barbiturates are presented.

  4. Polarized pressure dependence of the anisotropic dielectric functions of highly oriented poly(p-phenylene vinylene)

    NASA Astrophysics Data System (ADS)

    Morandi, V.; Galli, M.; Marabelli, F.; Comoretto, D.

    2010-04-01

    In this work, we combined an experimental technique and a detailed data analysis to investigate the influence of applied pressure on the anisotropic dielectric functions of highly oriented poly(p-phenylene vinylene) (PPV). The dielectric constants were derived from polarized reflectance spectra recorded through a diamond anvil cell up to 50 kbar. The presence of the diamond anvils strongly affects the measured spectra, requiring the development of an optical model able to take all spurious effects into account. A parametric procedure was then applied to derive the complex dielectric constants for both polarizations as a function of pressure. A detailed analysis of their pressure dependence allows addressing the role of intermolecular interactions and electron-phonon coupling in highly oriented PPV.

  5. Dendrobium micropropagation: a review.

    PubMed

    da Silva, Jaime A Teixeira; Cardoso, Jean Carlos; Dobránszki, Judit; Zeng, Songjun

    2015-05-01

    Dendrobium is one of the largest and most important (ornamentally and medicinally) orchid genera. Tissue culture is now an established method for the effective propagation of members of this genus. This review provides a detailed overview of the Dendrobium micropropagation literature. Through a chronological analysis, aspects such as explant, basal medium, plant growth regulators, culture conditions and final organogenic outcome are chronicled in detail. This review will allow Dendrobium specialists to use the information that has been documented to establish, more efficiently, protocols for their own germplasm and to improve in vitro culture conditions based on the optimized parameters detailed in this review. Not only will this expand the use for mass propagation, but will also allow for the conservation of important germplasm. Information on the in vitro responses of Dendrobium for developing efficient protocols for breeding techniques based on tissue culture, such as polyploidization, somatic hybridization, isolation of mutants and somaclonal variants and for synthetic seed and bioreactor technology, or for genetic transformation, is discussed in this review. This is the first such review on this genus and represents half a decade of literature dedicated to Dendrobium micropropagation.

  6. A Photofission Delayed γ-ray Spectra Calculation Tool for the Conception of a Nuclear Material Characterization Facility

    NASA Astrophysics Data System (ADS)

    Bernard, D.; Serot, O.; Simon, E.; Boucher, L.; Plumeri, S.

    2018-01-01

    Photon interrogation analysis is a nondestructive technique that allows fissile materials in nuclear waste packages to be identified and quantified. This paper details an automatic procedure which has been developed to simulate the delayed γ-ray spectra for several actinide photofissions. This calculation tool will be helpful for the detailed design (collimation, shielding, background noise optimization, etc.) and for the on-line analysis of such a facility.

  7. Cognitive approaches for patterns analysis and security applications

    NASA Astrophysics Data System (ADS)

    Ogiela, Marek R.; Ogiela, Lidia

    2017-08-01

    This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography, based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or encryption process. They also allow crypto-biometric solutions to be used to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, and a novel application of such systems to visual secret sharing is described. Visual shares for divided information can be created based on a threshold procedure, which may depend on personal abilities to recognize some image details visible on the divided images.
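
    As a minimal illustration of threshold-style secret sharing, the sketch below splits a binary "image" row into two shares: one uniformly random, the other the XOR of the secret with it. Either share alone reveals nothing; combining both recovers the secret. This toy (2, 2) XOR scheme is an assumption for illustration, not the cognitive scheme described in the paper.

```python
import secrets

# Toy (2, 2) visual-style secret sharing on a binary image row:
# share1 is random, share2 = secret XOR share1, so neither share alone
# reveals anything but XOR-ing both recovers the image exactly.

def make_shares(secret_bits):
    share1 = [secrets.randbelow(2) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]
    return share1, share2

def combine(share1, share2):
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]      # a tiny "image" row
s1, s2 = make_shares(secret)
print(combine(s1, s2))                  # [1, 0, 1, 1, 0, 0, 1, 0]
```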

  8. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    NASA Astrophysics Data System (ADS)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

    This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, at first, it describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows selecting a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept of operations techniques allowed generating a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at assessing the feasibility of these concepts, more detailed analyses have been carried out. Going on through the procedure, the designer should move from qualitative to quantitative evaluations, and for this reason, to support the trade-off analysis, an ad-hoc mission simulation software tool has been exploited. This support tool aims at estimating major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Eventually, one single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, highlighting also logistic, safety, and maintainability aspects.

  9. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.
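
    The interface flux a discontinuous Galerkin scheme evaluates for Burgers' equation can be illustrated with the classical exact (Godunov) flux. This standard formula is shown only as background; it is not the new stabilized flux formulas derived in the paper.

```python
# Classical Godunov (exact Riemann) flux for Burgers' equation
# f(u) = u^2 / 2, evaluated at an element interface with left and
# right trace values uL, uR. Standard background, not the paper's
# new flux formulas.

def burgers_flux(u):
    return 0.5 * u * u

def godunov_flux(uL, uR):
    if uL <= uR:
        # rarefaction: minimum of f over [uL, uR]
        if uL <= 0.0 <= uR:
            return 0.0                  # sonic point u = 0
        return min(burgers_flux(uL), burgers_flux(uR))
    # shock: maximum of f over [uR, uL]
    return max(burgers_flux(uL), burgers_flux(uR))

print(godunov_flux(1.0, -1.0))  # stationary shock -> 0.5
print(godunov_flux(-1.0, 1.0))  # transonic rarefaction -> 0.0
```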

  10. Performance analysis and dynamic modeling of a single-spool turbojet engine

    NASA Astrophysics Data System (ADS)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

    The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis investigates the equilibrium operating regimes and is based on appropriate modeling of turbojet operation at design and off-design regimes; it yields the performance analysis, summarized in the engine's operational maps (i.e., the altitude map, velocity map, and speed map) and the engine's universal map. The mathematical model that allows calculation of the design and off-design performance of a single-spool turbojet is detailed. An in-house code was developed and calibrated for the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor, and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, the fuel flow rate. The design and management of aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimization. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.

  11. Quantitative analysis of detailed lignin monomer composition by pyrolysis-gas chromatography combined with preliminary acetylation of the samples.

    PubMed

    Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S

    2001-11-15

    Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohol and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde in plants, has not been fully studied, mainly because of artifact formation during the lignin isolation procedure, partial loss of lignin components inherent in the chemical degradative methods, and difficulty in interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The lignin acetylation procedure helps prevent the secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks in the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHPs) as lignin model compounds were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.
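
    The quantification step reduces to normalising response-corrected peak areas into monomer fractions. The sketch below illustrates that arithmetic; the peak areas and response factors are invented for illustration and are not values from the study.

```python
# Hypothetical sketch: convert characteristic pyrogram peak areas into a
# lignin monomer composition by response-factor correction followed by
# normalisation. All areas and factors below are invented.

def monomer_composition(peak_areas, response_factors):
    corrected = {m: peak_areas[m] / response_factors[m] for m in peak_areas}
    total = sum(corrected.values())
    return {m: corrected[m] / total for m in corrected}

areas = {"p-coumaryl": 120.0, "coniferyl": 540.0, "sinapyl": 340.0}
factors = {"p-coumaryl": 1.2, "coniferyl": 1.0, "sinapyl": 0.85}
comp = monomer_composition(areas, factors)
print({m: round(f, 3) for m, f in comp.items()})
```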

  12. High Definition Confocal Imaging Modalities for the Characterization of Tissue-Engineered Substitutes.

    PubMed

    Mayrand, Dominique; Fradette, Julie

    2018-01-01

    Optimal imaging methods are necessary in order to perform a detailed characterization of thick tissue samples from either native or engineered tissues. Tissue-engineered substitutes are featuring increasing complexity including multiple cell types and capillary-like networks. Therefore, technical approaches allowing the visualization of the inner structural organization and cellular composition of tissues are needed. This chapter describes an optical clearing technique which facilitates the detailed characterization of whole-mount samples from skin and adipose tissues (ex vivo tissues and in vitro tissue-engineered substitutes) when combined with spectral confocal microscopy and quantitative analysis on image renderings.

  13. Applying cost accounting to operating room staffing in otolaryngology: time-driven activity-based costing and outpatient adenotonsillectomy.

    PubMed

    Balakrishnan, Karthik; Goico, Brian; Arjmand, Ellis M

    2015-04-01

    (1) To describe the application of a detailed cost-accounting method (time-driven activity-based costing) to operating room personnel costs, avoiding the proxy use of hospital and provider charges. (2) To model potential cost efficiencies using different staffing models with the case study of outpatient adenotonsillectomy. Prospective cost analysis case study. Tertiary pediatric hospital. All otolaryngology providers and otolaryngology operating room staff at our institution. Time-driven activity-based costing demonstrated precise per-case and per-minute calculation of personnel costs. We identified several areas of unused personnel capacity in a basic staffing model. Per-case personnel costs decreased by 23.2% by allowing a surgeon to run 2 operating rooms, despite doubling all other staff. Further cost reductions up to a total of 26.4% were predicted with additional staffing rearrangements. Time-driven activity-based costing allows detailed understanding of not only personnel costs but also how personnel time is used. This in turn allows testing of alternative staffing models to decrease unused personnel capacity and increase efficiency. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
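
    The core of time-driven activity-based costing is simple arithmetic: each role's cost rate per minute times the minutes that role spends on a case, summed over roles. The rates and times below are invented for illustration and are not the study's figures.

```python
# Hypothetical sketch of per-case personnel cost under time-driven
# activity-based costing (TDABC): sum over roles of (cost per minute
# of that role) x (minutes that role spends on the case).

def case_cost(roles):
    """roles: list of (cost_per_minute, minutes_on_case) tuples."""
    return sum(rate * minutes for rate, minutes in roles)

one_room = [
    (8.00, 45),   # surgeon: $/min, minutes per case (invented)
    (2.50, 60),   # nurse
    (3.00, 60),   # anesthetist
]
print(case_cost(one_room))  # 690.0
```

    Unused capacity shows up directly in this model: if a role is paid for more minutes than it spends on cases, the difference is costed but idle, which is what alternative staffing models try to reduce.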

  14. Analysis of Reaction Products and Conversion Time in the Pyrolisis of Cellulose and Wood Particles

    NASA Technical Reports Server (NTRS)

    Miller, R. S.; Bellan, J.

    1996-01-01

    A detailed mathematical model is presented for the temporal and spatial accurate modeling of solid-fluid reactions in porous particles for which volumetric reaction rate data is known a priori and both the porosity and the permeability of the particle are large enough to allow for continuous gas flow.

  15. Expanding the New Design: The NAEP 1985-86 Technical Report.

    ERIC Educational Resources Information Center

    Beaton, Albert E.; And Others

    This report supplies details of the design and data analysis of the 1986 National Assessment of Educational Progress (NAEP) to allow the reader to judge the utility of the design, data quality, reasonableness of assumptions, appropriateness of data analyses, and generalizability of inferences made from the data. After an introduction by A. E.…

  16. An Analysis of Closed-Loop Detailing in the Naval Helicopter Community

    DTIC Science & Technology

    2014-03-01

  17. Brought-Along Identities and the Dynamics of Ideology: Accomplishing Bivalent Stances in a Multilingual Interaction

    ERIC Educational Resources Information Center

    Williams, Ashley M.

    2008-01-01

    This paper examines how the interconnected aspects of the stance triangle (Du Bois 2007) allow speakers to tap into multiple ideological layers as they take a stance and reveal intra-ethnic group tensions. Using a detailed interaction analysis of a Chinese American family's multilingual interaction, the paper explores how such ideological dynamics…

  18. 117.6-kilobit telemetry from Mercury in-flight system analysis

    NASA Technical Reports Server (NTRS)

    Evanchuk, V. L.

    1974-01-01

    This paper discusses very specifically the mode of the Mariner Venus/Mercury 1973 (MVM'73) telecommunications system in the interplexed dual channel 117.6 kilobits per second (kbps) and 2.45 kbps telemetry. This mode, originally designed for only Venus encounter, was also used at Mercury despite significantly less performance margin. Detailed analysis and careful measurement of system performance before and during flight operations allowed critical operational decisions, which made maximum use of the system capabilities.

  19. Modern Methods of Rail Welding

    NASA Astrophysics Data System (ADS)

    Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.

    2017-10-01

    Existing rail welding methods, which enable the production of continuous welded rail track, are reviewed in this article. Analysis of these methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, as well as process technologies that reduce the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. Analysis of the existing rail welding methods makes it possible to identify a line of research for solving this problem.

  20. KENNEDY SPACE CENTER, FLA. - These towers are part of one of the world’s highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data in preparation for the shuttle fleet’s return to flight. The system is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. Developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., the system allows multiple-person collaboration, highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

    NASA Image and Video Library

    2004-02-04

  1. Clinical use and misuse of automated semen analysis.

    PubMed

    Sherins, R J

    1991-01-01

    During the past six years, there has been an explosion of technology which allows automated machine-vision for sperm analysis. CASA clearly provides an opportunity for objective, systematic assessment of sperm motion. But there are many caveats in using this type of equipment. CASA requires a disciplined and standardized approach to semen collection, specimen preparation, machine settings, calibration and avoidance of sampling bias. Potential sources of error can be minimized. Unfortunately, the rapid commercialization of this technology preceded detailed statistical analysis of such data to allow equally rapid comparisons of data between different CASA machines and among different laboratories. Thus, it is now imperative that we standardize use of this technology and obtain more detailed biological insights into sperm motion parameters in semen and after capacitation before we empirically employ CASA for studies of fertility prediction. In the basic science arena, CASA technology will likely evolve to provide new algorithms for accurate sperm motion analysis and give us an opportunity to address the biophysics of sperm movement. In the clinical arena, CASA instruments provide the opportunity to share and compare sperm motion data among laboratories by virtue of its objectivity, assuming standardized conditions of utilization. Identification of men with specific sperm motion disorders is certain, but the biological relevance of motility dysfunction to actual fertilization remains uncertain and surely the subject for further study.
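
    The sperm motion parameters a CASA instrument reports can be sketched from a tracked head position: curvilinear velocity (VCL) from the point-to-point path, straight-line velocity (VSL) from the endpoints, and linearity LIN = VSL/VCL. The track below is made up for illustration; real CASA processing also standardizes frame rate, calibration, and sampling, as the abstract stresses.

```python
import math

# Hypothetical sketch of basic CASA kinematics from a tracked sperm head:
# VCL (curvilinear velocity), VSL (straight-line velocity), and
# LIN = VSL / VCL. The track below is invented, not real CASA output.

def kinematics(track, dt):
    """track: list of (x, y) positions in micrometres, sampled every dt s."""
    path = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    straight = math.dist(track[0], track[-1])
    total_t = dt * (len(track) - 1)
    vcl = path / total_t            # speed along the actual zig-zag path
    vsl = straight / total_t        # speed along the straight chord
    return vcl, vsl, vsl / vcl

track = [(0, 0), (3, 4), (6, 0), (9, 4), (12, 0)]   # zig-zag path
vcl, vsl, lin = kinematics(track, dt=0.1)
print(round(vcl, 1), round(vsl, 1), round(lin, 2))  # 50.0 30.0 0.6
```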

  2. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
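
    The clustering step that separates regions of dissimilar electronic behaviour can be illustrated with a minimal two-centroid k-means on toy one-dimensional "spectral weights". The real analysis operates on full multivariate tunneling spectra; all numbers here are invented, and the toy assumes both clusters stay non-empty.

```python
# Hypothetical sketch: 2-means clustering of per-pixel "spectral weights"
# so that pixels with Te-like and Se-like electronic signatures separate.
# Toy 1-D data; the real analysis clusters full multivariate spectra.

def two_means(values, iters=20):
    c1, c2 = min(values), max(values)          # initial centroids
    for _ in range(iters):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1 = sum(g1) / len(g1)                 # assumes non-empty clusters
        c2 = sum(g2) / len(g2)
    return sorted([c1, c2])

# e.g. one low-weight and one high-weight electronic signature per pixel
weights = [0.1, 0.15, 0.12, 0.9, 0.85, 0.95]
print([round(c, 2) for c in two_means(weights)])  # [0.12, 0.9]
```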

  3. BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.

  4. Development of BEM for ceramic composites

    NASA Technical Reports Server (NTRS)

    Henry, D. P.; Banerjee, P. K.; Dargush, G. F.

    1991-01-01

    It is evident that for proper micromechanical analysis of ceramic composites, one needs to use a numerical method that is capable of idealizing the individual fibers or individual bundles of fibers embedded within a three-dimensional ceramic matrix. The analysis must be able to account for high stress or temperature gradients from diffusion of stress or temperature from the fiber to the ceramic matrix and allow for interaction between the fibers through the ceramic matrix. The analysis must be sophisticated enough to deal with the failure of fibers described by a series of increasingly sophisticated constitutive models. Finally, the analysis must deal with micromechanical modeling of the composite under nonlinear thermal and dynamic loading. This report details progress made towards the development of a boundary element code designed for the micromechanical studies of an advanced ceramic composite. Additional effort has been made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry.

  5. A Computational Observer For Performing Contrast-Detail Analysis Of Ultrasound Images

    NASA Astrophysics Data System (ADS)

    Lopez, H.; Loew, M. H.

    1988-06-01

    Contrast-Detail (C/D) analysis allows the quantitative determination of an imaging system's ability to display a range of varying-size targets as a function of contrast. Using this technique, a contrast-detail plot is obtained which can, in theory, be used to compare image quality from one imaging system to another. The C/D plot, however, is usually obtained by using data from human observer readings. We have shown earlier(7) that the performance of human observers in the task of threshold detection of simulated lesions embedded in random ultrasound noise is highly inaccurate and non-reproducible for untrained observers. We present an objective, computational method for the determination of the C/D curve for ultrasound images. This method utilizes digital images of the C/D phantom developed at CDRH, and lesion-detection algorithms that simulate the Bayesian approach using the likelihood function for an ideal observer. We present the results of this method, and discuss the relationship to the human observer and to the comparability of image quality between systems.
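
    For a known target in additive white Gaussian noise, the ideal observer's likelihood-ratio test reduces to thresholding a matched-filter output, the dot product of the data with the target template. The one-dimensional toy below is an assumption for illustration, not the CDRH phantom or the authors' exact Bayesian algorithm.

```python
import random

# Hypothetical sketch of an ideal-observer decision statistic for a known
# lesion profile in white Gaussian noise: the matched-filter score
# (dot product with the template). Toy 1-D "images", invented numbers.

random.seed(42)

def matched_filter_score(patch, template):
    return sum(p * t for p, t in zip(patch, template))

template = [0.0, 1.0, 2.0, 1.0, 0.0]            # known lesion profile
noise = [random.gauss(0, 0.5) for _ in template]
lesion_present = [t + n for t, n in zip(template, noise)]
lesion_absent = noise                            # same noise, no lesion

s_present = matched_filter_score(lesion_present, template)
s_absent = matched_filter_score(lesion_absent, template)
print(s_present > s_absent)                      # True
```

    With the same noise in both patches, the score difference is exactly the template's energy, which is why larger or higher-contrast targets are easier to detect at a given noise level.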

  6. Calibrating Detailed Chemical Analysis of M dwarfs

    NASA Astrophysics Data System (ADS)

    Veyette, Mark; Muirhead, Philip Steven; Mann, Andrew; Brewer, John; Allard, France; Homeier, Derek

    2018-01-01

    The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications, including studying the chemical evolution of the Galaxy, assessing membership in stellar kinematic groups, and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres have hindered similar analysis of M-dwarf stars. Large surveys of FGK abundances play an important role in developing methods to measure the compositions of M dwarfs by providing benchmark FGK stars that have widely separated M dwarf companions. These systems allow us to empirically calibrate metallicity-sensitive features in M dwarf spectra. However, current methods to measure metallicity in M dwarfs from moderate-resolution spectra are limited to measuring overall metallicity and largely rely on astrophysical abundance correlations in stellar populations. In this talk, I will discuss how large, homogeneous catalogs of precise FGK abundances are crucial to advancing chemical analysis of M dwarfs beyond overall metallicity to direct measurements of individual elemental abundances. I will present a new method to analyze high-resolution, NIR spectra of M dwarfs that employs an empirical calibration of synthetic M dwarf spectra to infer effective temperature, Fe abundance, and Ti abundance. This work is a step toward detailed chemical analysis of M dwarfs at a similar precision achieved for FGK stars.

  7. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    PubMed

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  8. Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.

    PubMed

    Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin

    2016-05-01

    Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and bring more detailed assignment of bands through the parallel study of erythritol [15N4] tetranitrate. In the case of powder diffraction, recently published data were verified, and 1H, 13C, and 15N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.

  9. Microfluidics for Single-Cell Genetic Analysis

    PubMed Central

    Thompson, A. M.; Paguirigan, A. L.; Kreutz, J. E.; Radich, J. P.; Chiu, D. T.

    2014-01-01

    The ability to correlate single-cell genetic information to cellular phenotypes will provide the kind of detailed insight into human physiology and disease pathways that is not possible to infer from bulk cell analysis. Microfluidic technologies are attractive for single-cell manipulation due to precise handling and low risk of contamination. Additionally, microfluidic single-cell techniques can allow for high-throughput and detailed genetic analyses that increase accuracy and decrease reagent cost compared to bulk techniques. Incorporating these microfluidic platforms into research and clinical laboratory workflows can fill an unmet need in biology, delivering the highly accurate, highly informative data necessary to develop new therapies and monitor patient outcomes. In this perspective, we describe the current and potential future uses of microfluidics at all stages of single-cell genetic analysis, including cell enrichment and capture, single-cell compartmentalization and manipulation, and detection and analyses. PMID:24789374

  10. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.

  11. Composition-driven Cu-speciation and reducibility in Cu-CHA zeolite catalysts: a multivariate XAS/FTIR approach to complexity (Electronic supplementary information available; see DOI: 10.1039/c7sc02266b)

    PubMed Central

    Martini, A.; Lomachenko, K. A.; Pankin, I. A.; Negri, C.; Berlier, G.; Beato, P.; Falsig, H.; Bordiga, S.; Lamberti, C.

    2017-01-01

    The small pore Cu-CHA zeolite is attracting increasing attention as a versatile platform to design novel single-site catalysts for deNOx applications and for the direct conversion of methane to methanol. Understanding at the atomic scale how the catalyst composition influences the Cu-species formed during thermal activation is a key step to unveil the relevant composition–activity relationships. Herein, we explore by in situ XAS the impact of Cu-CHA catalyst composition on temperature-dependent Cu-speciation and reducibility. Advanced multivariate analysis of in situ XANES in combination with DFT-assisted simulation of XANES spectra and multi-component EXAFS fits as well as in situ FTIR spectroscopy of adsorbed N2 allow us to obtain unprecedented quantitative structural information on the complex dynamics during the speciation of Cu-sites inside the framework of the CHA zeolite. PMID:29147509

  12. Language Geography from Microblogging Platforms

    NASA Astrophysics Data System (ADS)

    Mocanu, Delia; Baronchelli, Andrea; Perra, Nicola; Gonçalves, Bruno; Vespignani, Alessandro

    2013-03-01

    Microblogging platforms have now become major open source indicators for complex social interactions. With the advent of smartphones, the ever-increasing mobile Internet traffic gives us the unprecedented opportunity to complement studies of complex social phenomena with real-time location information. In this work, we show that the data nowadays accessible allow for detailed studies at different scales, ranging from country-level aggregate analysis to the analysis of linguistic communities within specific neighborhoods. The high resolution and coverage of these data permit us to investigate such issues as the linguistic homogeneity of different countries, touristic seasonal patterns within countries, and the geographical distribution of different languages in bilingual regions. This work highlights the potential of geolocalized studies of open data sources, which can provide an extremely detailed picture of the language geography.
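The core aggregation behind such maps, counting language tags per geographic cell, can be sketched in a few lines. The coordinates, language labels, and grid size below are invented for illustration, not the study's actual data:

```python
from collections import Counter

# Hypothetical geotagged posts: (latitude, longitude, language tag).
posts = [
    (50.85, 4.35, "fr"), (50.85, 4.36, "nl"), (50.84, 4.35, "fr"),
    (52.37, 4.90, "nl"), (52.36, 4.89, "nl"), (52.37, 4.91, "en"),
]

def cell(lat, lon, size=0.5):
    """Snap a coordinate to a coarse grid cell (cell size in degrees)."""
    return (round(lat / size) * size, round(lon / size) * size)

# Language counts per cell: the basis for maps of linguistic mixing.
counts = {}
for lat, lon, lang in posts:
    counts.setdefault(cell(lat, lon), Counter())[lang] += 1

for c, langs in sorted(counts.items()):
    print(c, dict(langs))
```

Shrinking the cell size moves the same computation from country-level aggregates down toward neighborhood-scale communities.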

  13. Non-minimally coupled condensate cosmologies: a phase space analysis

    NASA Astrophysics Data System (ADS)

    Carloni, Sante; Vignolo, Stefano; Cianci, Roberto

    2014-09-01

    We present an analysis of the phase space of cosmological models based on a non-minimal coupling between the geometry and a fermionic condensate. We observe that the strong constraint coming from the Dirac equations allows a detailed design of the cosmology of these models, and at the same time guarantees an evolution towards a state indistinguishable from general relativistic cosmological models. In this light, we show in detail how the use of some specific potentials can naturally reproduce a phase of accelerated expansion. In particular, we find for the first time that an exponential potential is able to induce two de Sitter phases separated by a power law expansion, which could be an interesting model for the unification of an inflationary phase and a dark energy era.

  14. Rip current evidence by hydrodynamic simulations, bathymetric surveys and UAV observation

    NASA Astrophysics Data System (ADS)

    Benassai, Guido; Aucelli, Pietro; Budillon, Giorgio; De Stefano, Massimo; Di Luccio, Diana; Di Paola, Gianluigi; Montella, Raffaele; Mucerino, Luigi; Sica, Mario; Pennetta, Micla

    2017-09-01

    The prediction of the formation, spacing and location of rip currents is a scientific challenge that can be achieved by means of different complementary methods. In this paper the analysis of numerical and experimental data, including RPAS (remotely piloted aircraft systems) observations, allowed us to detect the presence of rip currents and rip channels at the mouth of Sele River, in the Gulf of Salerno, southern Italy. The dataset used to analyze these phenomena consisted of two different bathymetric surveys, a detailed sediment analysis and a set of high-resolution wave numerical simulations, completed with Google Earth™ images and RPAS observations. The grain size trend analysis and the numerical simulations allowed us to identify the rip current occurrence, forced by topographically constrained channels incised on the seabed, which were compared with observations.

  15. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.
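How the individual uncertainties combine into the power factor PF = S²/ρ follows the standard root-sum-square propagation rule: the Seebeck term enters squared, so its relative error counts twice. A minimal sketch with hypothetical percentages (not the paper's values):

```python
import math

def power_factor_uncertainty(rel_seebeck, rel_resistivity):
    """Relative uncertainty of PF = S^2 / rho by standard propagation:
    the S^2 dependence doubles the Seebeck coefficient's relative error."""
    return math.sqrt((2 * rel_seebeck) ** 2 + rel_resistivity ** 2)

# Hypothetical example: 4% Seebeck and 3% resistivity uncertainty.
u = power_factor_uncertainty(0.04, 0.03)
print(f"{u:.1%}")  # 8.5%
```

The Seebeck term dominates, which is consistent with the abstract's emphasis on quantifying the cold-finger effect.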

  16. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
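The product-of-coefficients idea behind mediation analysis can be sketched with two ordinary least-squares fits: the exposure-to-mediator path a and the mediator-to-outcome path b (controlling for exposure), whose product estimates the mediated effect. The simulated data and effect sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal(n)                       # e.g. program exposure
m = 0.5 * x + rng.standard_normal(n)             # mediator (true a = 0.5)
y = 0.7 * m + 0.2 * x + rng.standard_normal(n)   # outcome  (true b = 0.7)

def slope(pred, resp):
    """OLS slope of resp on pred, with an intercept."""
    X = np.column_stack([np.ones_like(pred), pred])
    return np.linalg.lstsq(X, resp, rcond=None)[0][1]

a = slope(x, m)                                  # X -> M path
X2 = np.column_stack([np.ones(n), m, x])
b = np.linalg.lstsq(X2, y, rcond=None)[0][1]     # M -> Y path, X controlled
print(round(a * b, 2))  # mediated effect, near 0.5 * 0.7 = 0.35
```

In practice a standard error for a·b (e.g. via the delta method or bootstrap) accompanies the point estimate.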

  17. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006
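One of the quantification tasks described, counting clusters in a molecule coordinate list, can be sketched with single-linkage grouping over a KD-tree. The synthetic coordinates and the 10 nm linking radius below are assumptions for illustration, not a recommended analysis protocol:

```python
import numpy as np
from scipy.spatial import cKDTree

def count_clusters(coords, radius):
    """Group localizations whose pairwise gaps are within `radius`
    (single linkage via union-find) and return the cluster count."""
    tree = cKDTree(coords)
    parent = list(range(len(coords)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in tree.query_pairs(radius):
        parent[find(i)] = find(j)
    return len({find(i) for i in range(len(coords))})

# Two synthetic molecule clusters 30 nm apart, ~2 nm spread each.
rng = np.random.default_rng(2)
a = rng.normal([0, 0], 2, size=(50, 2))
b = rng.normal([30, 0], 2, size=(50, 2))
k = count_clusters(np.vstack([a, b]), radius=10.0)
print(k)  # 2
```

The choice of linking radius relative to the localization precision is exactly the kind of control-experiment question the review's guidelines address.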

  18. Transonic propulsion system integration analysis at McDonnell Aircraft Company

    NASA Technical Reports Server (NTRS)

    Cosner, Raymond R.

    1989-01-01

    The technology of Computational Fluid Dynamics (CFD) is becoming an important tool in the development of aircraft propulsion systems. Two of the most valuable features of CFD are: (1) quick acquisition of flow field data; and (2) complete description of flow fields, allowing detailed investigation of interactions. Current analysis methods complement wind tunnel testing in several ways. Herein, the discussion is focused on CFD methods. However, aircraft design studies need data from both CFD and wind tunnel testing. Each approach complements the other.

  19. NSTX Disruption Simulations of Detailed Divertor and Passive Plate Models by Vector Potential Transfer from OPERA Global Analysis Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. H. Titus, S. Avasaralla, A. Brooks, R. Hatcher

    2010-09-22

    The National Spherical Torus Experiment (NSTX) project is planning upgrades to the toroidal field, plasma current and pulse length. This involves the replacement of the center-stack, including the inner legs of the TF, OH, and inner PF coils. A second neutral beam will also be added. The increased performance of the upgrade requires qualification of the remaining components including the vessel, passive plates, and divertor for higher disruption loads. The hardware needing qualification is more complex than is typically accessible by large-scale electromagnetic (EM) simulations of the plasma disruptions. The usual method is to include simplified representations of components in the large EM models and attempt to extract forces to apply to more detailed models. This paper describes a more efficient approach of combining comprehensive modeling of the plasma and tokamak conducting structures, using the 2D OPERA code, with much more detailed treatment of individual components using ANSYS electromagnetic (EM) and mechanical analysis. This captures local eddy currents and the resulting loads in complex details, and allows efficient non-linear and dynamic structural analyses.

  20. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    PubMed

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  1. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. 
It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.

  2. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic. To determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered, in 35 (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium phosphate hexahydrate and xanthine stones. Fourier transform infrared spectroscopy provides an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities. This may prevent and/or delay stone recurrence.

  3. Interactions in the Dark Sector of Cosmology

    NASA Astrophysics Data System (ADS)

    Bean, Rachel

    The success of modern cosmology hinges on two dramatic augmentations beyond the minimalist assumption of baryonic matter interacting gravitationally through general relativity. The first assumption is that there must exist either new gravitational dynamics or a new component of the cosmic energy budget - dark matter - that allows structure to form and accounts for weak lensing and galactic rotation curves. The second assumption is that a further dynamical modification or energy component - dark energy - exists, driving late-time cosmic acceleration. The need for these is now firmly established through a host of observations, which have raised crucial questions, and present a deep challenge to fundamental physics. The central theme of this proposal is the detailed understanding of the nature of the dark sector through the inevitable interactions between its individual components and with the visible universe. Such interactions can be crucial to a given model's viability, affecting its capability to reproduce the cosmic expansion history; the detailed predictions for structure formation; the gravitational dynamics on astrophysical and solar system scales; the stability of the microphysical model, and its ultimate consistency. While many models are consistent with cosmology on the coarsest scales, as is often the case, the devil may lie in the details. In this proposal we plan a comprehensive analysis of these details, focusing on the interactions within the dark sector and between it and visible matter, and on how these interactions affect the observational and theoretical consistency of models. Since it is unlikely that there will be a silver bullet allowing us to isolate the cause of cosmic acceleration, it is critical to develop a coherent view of the landscape of proposed models, extract clear predictions, and determine what combination of experiments and observations might allow us to test these predictions.

  4. Evaluation of standard radiation atmosphere aerosol models for a coastal environment

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Suttles, J. T.; Sebacher, D. I.; Fuller, W. H.; Lecroy, S. R.

    1986-01-01

    Calculations are compared with data from an experiment to evaluate the utility of standard radiation atmosphere (SRA) models for defining aerosol properties in atmospheric radiation computations. Initial calculations with only SRA aerosols in a four-layer atmospheric column simulation allowed a sensitivity study and the detection of spectral trends in optical depth, which differed from measurements. Subsequently, a more detailed analysis provided a revision in the stratospheric layer, which brought calculations in line with both optical depth and skylight radiance data. The simulation procedure allows determination of which atmospheric layers influence both downwelling and upwelling radiation spectra.

  5. νμ → νe oscillations search in the OPERA experiment

    NASA Astrophysics Data System (ADS)

    Zemskova, S.

    2016-11-01

    The tracking capabilities of the OPERA detector allow the reconstruction of τ-leptons and electrons, making it possible to observe νμ → ντ oscillations in the appearance mode and to study νμ → νe oscillations in the νμ CNGS beam. Current results on the νμ → νe channel in the three-flavour mixing model are presented. The same data allow us to constrain the presence of additional sterile neutrino states. The analysis of the full 2008-2012 OPERA data set and work on its improvement are ongoing. Details of the achievements are presented.

  6. The Reactome pathway Knowledgebase

    PubMed Central

    Fabregat, Antonio; Sidiropoulos, Konstantinos; Garapati, Phani; Gillespie, Marc; Hausmann, Kerstin; Haw, Robin; Jassal, Bijay; Jupe, Steven; Korninger, Florian; McKay, Sheldon; Matthews, Lisa; May, Bruce; Milacic, Marija; Rothfels, Karen; Shamovsky, Veronica; Webber, Marissa; Weiser, Joel; Williams, Mark; Wu, Guanming; Stein, Lincoln; Hermjakob, Henning; D'Eustachio, Peter

    2016-01-01

    The Reactome Knowledgebase (www.reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism and other cellular processes as an ordered network of molecular transformations—an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression pattern surveys or somatic mutation catalogues from tumour cells. Over the last two years we redeveloped major components of the Reactome web interface to improve usability, responsiveness and data visualization. A new pathway diagram viewer provides a faster, clearer interface and smooth zooming from the entire reaction network to the details of individual reactions. Tool performance for analysis of user datasets has been substantially improved, now generating detailed results for genome-wide expression datasets within seconds. The analysis module can now be accessed through a RESTful interface, facilitating its inclusion in third-party applications. A new overview module allows the visualization of analysis results on a genome-wide Reactome pathway hierarchy using a single screen page. The search interface now provides auto-completion as well as a faceted search to narrow result lists efficiently. PMID:26656494

  7. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  8. GOplot: an R package for visually combining expression data with functional analysis.

    PubMed

    Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes

    2015-09-01

    Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides a deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate the findings. The R package GOplot is available via CRAN-The Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at: https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Grazing incidence beam expander

    NASA Astrophysics Data System (ADS)

    Akkapeddi, P. R.; Glenn, P.; Fuschetto, A.; Appert, Q.; Viswanathan, V. K.

    1985-01-01

    A Grazing Incidence Beam Expander (GIBE) telescope is being designed and fabricated to be used as an equivalent end mirror in a long laser resonator cavity. The design requirements for this GIBE flow down from a generic Free Electron Laser (FEL) resonator. The nature of the FEL gain volume (a thin, pencil-like, on-axis region) dictates that the output beam be very small. Such a thin beam with the high power levels characteristic of FELs would have to travel perhaps hundreds of meters or more before expanding enough to allow reflection from cooled mirrors. A GIBE, on the other hand, would allow placing these optics closer to the gain region and thus reduces the cavity lengths substantially. Results are presented relating to optical and mechanical design, alignment sensitivity analysis, radius of curvature analysis, laser cavity stability analysis of a linear stable concentric laser cavity with a GIBE. Fabrication details of the GIBE are also given.

  10. Grazing incidence beam expander

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akkapeddi, P.R.; Glenn, P.; Fuschetto, A.

    1985-01-01

    A Grazing Incidence Beam Expander (GIBE) telescope is being designed and fabricated to be used as an equivalent end mirror in a long laser resonator cavity. The design requirements for this GIBE flow down from a generic Free Electron Laser (FEL) resonator. The nature of the FEL gain volume (a thin, pencil-like, on-axis region) dictates that the output beam be very small. Such a thin beam with the high power levels characteristic of FELs would have to travel perhaps hundreds of meters or more before expanding enough to allow reflection from cooled mirrors. A GIBE, on the other hand, would allow placing these optics closer to the gain region and thus reduces the cavity lengths substantially. Results are presented relating to optical and mechanical design, alignment sensitivity analysis, radius of curvature analysis, laser cavity stability analysis of a linear stable concentric laser cavity with a GIBE. Fabrication details of the GIBE are also given.

  11. Analysis of the Emitted Wavelet of High-Resolution Bowtie GPR Antennas

    PubMed Central

    Rial, Fernando I.; Lorenzo, Henrique; Pereira, Manuel; Armesto, Julia

    2009-01-01

    Most Ground Penetrating Radars (GPR) cover a wide frequency range by emitting very short time wavelets. In this work, we study in detail the wavelet emitted by two bowtie GPR antennas with nominal frequencies of 800 MHz and 1 GHz. Knowledge of this emitted wavelet allows us to extract as much information as possible from recorded signals, using advanced processing techniques and computer simulations. Following previously published methodology used by Rial et al. [1], which ensures system stability and reliability in data acquisition, a thorough analysis of the wavelet in both time and frequency domain is performed. Most of the tests were carried out with air as propagation medium, allowing a proper analysis of the geometrical attenuation factor. Furthermore, we attempt to determine, for each antenna, a time zero in the records to allow us to correctly assign a position to the reflectors detected by the radar. Obtained results indicate that the time zero is not a constant value for the evaluated antennas, but instead depends on the characteristics of the material in contact with the antenna. PMID:22408523
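A common textbook idealization of such a short pulse is the Ricker wavelet (an assumption here, not the measured bowtie wavelet). This sketch shows the kind of time/frequency analysis described: the peak of the amplitude spectrum recovers a nominal 1 GHz centre frequency:

```python
import numpy as np

def ricker(t, f_peak):
    """Ricker (Mexican-hat) wavelet, a common stand-in for a GPR pulse."""
    a = (np.pi * f_peak * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Nominal 1 GHz wavelet sampled at 10 ps over a 4 ns window.
dt = 10e-12
n = 400
t = (np.arange(n) - n // 2) * dt
w = ricker(t, 1e9)

# The peak of the amplitude spectrum recovers the nominal frequency.
spec = np.abs(np.fft.rfft(w))
freqs = np.fft.rfftfreq(n, dt)
f_dominant = freqs[spec.argmax()]
print(f_dominant / 1e9)  # ~1.0 (GHz)
```

For a real antenna the emitted wavelet is distorted by the feed, shielding, and the medium in contact with the antenna, which is why the paper characterizes it experimentally.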

  12. [Mathematical model of micturition allowing a detailed analysis of free urine flowmetry].

    PubMed

    Valentini, F; Besson, G; Nelson, P

    1999-04-01

    A mathematical model of micturition allowing precise analysis of uroflowmetry curves (VBN method) is described together with some of its applications. The physiology of micturition and possible diagnostic hypotheses able to explain the shape of the uroflowmetry curve can be expressed by a series of differential equations. Integration of the system allows the validity of these hypotheses to be tested by simulation. A theoretical uroflowmetry is calculated in less than 1 second and analysis of a dysuric uroflowmetry takes about 5 minutes. The efficacy of the model is due to its rapidity and the precision of the comparisons between measured and predicted values. The method has been applied to almost one thousand curves. The uroflowmetries of normal subjects are reproduced without adjustment with a quadratic error of less than 1%, while those of dysuric patients require identification of one or two adaptive parameters characteristic of the underlying disease. These parameters remain constant during the same session, but vary with the disease and/or the treatment. This model could become a tool for noninvasive urodynamic studies.
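A toy model in the same spirit, though far simpler than the VBN equations, shows how integrating a differential equation yields a simulated flow curve. The outlet conductance K and initial volume are invented values for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy micturition model (NOT the VBN equations): outflow proportional
# to remaining bladder volume, giving an exponentially decaying flow.
K = 0.02      # 1/s, hypothetical outlet conductance
V0 = 400.0    # ml, hypothetical initial bladder volume

sol = solve_ivp(lambda t, V: -K * V, (0, 200), [V0], dense_output=True)
t = np.linspace(0, 200, 201)
V = sol.sol(t)[0]       # remaining volume over time
Q = K * V               # simulated uroflow (ml/s)

print(round(Q[0], 1))   # peak flow = K * V0 = 8.0 ml/s
print(round(V0 - V[-1]))  # voided volume approaches V0
```

Fitting such model parameters so the predicted curve matches a measured one is, in miniature, the identification step the abstract describes for dysuric patients.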

  13. Spatial/Spectral Identification of Endmembers from AVIRIS Data using Mathematical Morphology

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Martinez, Pablo; Gualtieri, J. Anthony; Perez, Rosa M.

    2001-01-01

    During the last several years, a number of airborne and satellite hyperspectral sensors have been developed or improved for remote sensing applications. Imaging spectrometry allows the detection of materials, objects and regions in a particular scene with a high degree of accuracy. Hyperspectral data typically consist of hundreds of thousands of spectra, so the analysis of this information is a key issue. Mathematical morphology theory is a widely used nonlinear technique for image analysis and pattern recognition. Although it is especially well suited to segment binary or grayscale images with irregular and complex shapes, its application in the classification/segmentation of multispectral or hyperspectral images has been quite rare. In this paper, we discuss a new completely automated methodology to find endmembers in the hyperspectral data cube using mathematical morphology. The extension of classic morphology to the hyperspectral domain allows us to integrate spectral and spatial information in the analysis process. In Section 3, some basic concepts about mathematical morphology and the technical details of our algorithm are provided. In Section 4, the accuracy of the proposed method is tested by its application to real hyperspectral data obtained from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Some details about these data and reference results, obtained by well-known endmember extraction techniques, are provided in Section 2. Finally, in Section 5 we present the main conclusions.
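    One common way to extend a morphological operator to vector-valued (hyperspectral) pixels is to order the pixels inside the structuring element by a spectral distance. The sketch below orders by distance to the window mean; that ordering relation is an assumption chosen for illustration, not necessarily the one used by the authors:

```python
import numpy as np

def spectral_dilate(cube, win=3):
    """Per-window 'dilation': keep the most spectrally extreme pixel."""
    h, w, bands = cube.shape
    r = win // 2
    out = np.zeros_like(cube)
    for i in range(h):
        for j in range(w):
            window = cube[max(0, i - r):i + r + 1,
                          max(0, j - r):j + r + 1].reshape(-1, bands)
            d = np.linalg.norm(window - window.mean(axis=0), axis=1)
            out[i, j] = window[np.argmax(d)]   # most extreme spectrum wins
    return out

cube = np.zeros((3, 3, 2))        # tiny synthetic image with 2 spectral bands
cube[1, 1] = 10.0                 # one spectrally "pure" (extreme) pixel
dilated = spectral_dilate(cube)   # the extreme spectrum spreads to its neighbors
```

    Endmember candidates are then the pixels that survive such extremity-based orderings, since spectrally pure pixels are the most distant from their local neighborhood.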

  14. Preliminary candidate advanced avionics system for general aviation

    NASA Technical Reports Server (NTRS)

    Mccalla, T. M.; Grismore, F. L.; Greatline, S. E.; Birkhead, L. M.

    1977-01-01

    An integrated avionics system design was carried out to the level which indicates subsystem function and the methods of overall system integration. Sufficient detail was included to allow identification of possible system component technologies, and to perform reliability, modularity, maintainability, cost, and risk analysis on the system design. Retrofit to older aircraft and the availability of this system to single-engine, two-place aircraft were also considered.

  15. Microlensing observations rapid search for exoplanets: MORSE code for GPUs

    NASA Astrophysics Data System (ADS)

    McDougall, Alistair; Albrow, Michael D.

    2016-02-01

    The rapid analysis of ongoing gravitational microlensing events has been integral to the successful detection and characterization of cool planets orbiting low-mass stars in the Galaxy. In this paper, we present an implementation of search and fit techniques on graphical processing unit (GPU) hardware. The method allows for the rapid identification of candidate planetary microlensing events and their subsequent follow-up for detailed characterization.
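    The fits in question rest on the standard point-source point-lens (PSPL) magnification curve. A plain CPU sketch follows; the GPU implementation itself is not reproduced, and the parameter values are illustrative:

```python
import math

def magnification(t, t0, u0, tE):
    """PSPL magnification A(u) with u(t) = sqrt(u0^2 + ((t - t0)/tE)^2)."""
    u = math.hypot(u0, (t - t0) / tE)
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

# Peak magnification of an event with impact parameter u0 = 0.1:
a_peak = magnification(0.0, t0=0.0, u0=0.1, tE=20.0)
```

    A grid of (t0, u0, tE) triples can be evaluated in parallel on a GPU, with the best-fitting triple flagging a candidate event for detailed follow-up.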

  16. How pleasant sounds promote and annoying sounds impede health: a cognitive approach.

    PubMed

    Andringa, Tjeerd C; Lanser, J Jolie L

    2013-04-08

    This theoretical paper addresses the cognitive functions via which quiet and generally pleasurable sounds promote, and annoying sounds impede, health. The article comprises a literature analysis and an interpretation of how the bidirectional influence of appraising the environment and the feelings of the perceiver can be understood in terms of core affect and motivation. This conceptual basis allows the formulation of a detailed cognitive model describing how sonic content, related to indicators of safety and danger, either allows full freedom over mind-states or forces the activation of a vigilance function with associated arousal. The model leads to a number of detailed predictions that can be used to provide existing soundscape approaches with a solid cognitive science foundation that may lead to novel approaches to soundscape design. These will take into account that louder sounds typically contribute to distal situational awareness while subtle environmental sounds provide proximal situational awareness. The role of safety indicators, mediated by proximal situational awareness and subtle sounds, should become more important in future soundscape research.

  17. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

    An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in a lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  18. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE PAGES

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; ...

    2014-12-02

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of the calculated density of states of chemically inhomogeneous FeTe1−xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
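    The general PCA-plus-clustering workflow can be sketched on synthetic spectra. Two artificial "chalcogen-like" signatures are separated below; none of this is the authors' code or data, only an illustration of the approach:

```python
import numpy as np

# Two synthetic spectral signatures with small additive noise.
rng = np.random.default_rng(0)
e = np.linspace(-1, 1, 50)                # energy axis (arbitrary units)
class_a = np.exp(-(e - 0.3) ** 2 / 0.02)  # "Te-like" signature
class_b = np.exp(-(e + 0.3) ** 2 / 0.02)  # "Se-like" signature
spectra = np.vstack(
    [class_a + 0.05 * rng.standard_normal(50) for _ in range(20)]
    + [class_b + 0.05 * rng.standard_normal(50) for _ in range(20)])

x = spectra - spectra.mean(axis=0)                # center the data
_, _, vt = np.linalg.svd(x, full_matrices=False)  # PCA via SVD
scores = x @ vt[0]                                # first-component scores
labels = (scores > 0).astype(int)                 # crude two-cluster split
```

    In practice the clustering step would use a proper algorithm (e.g. k-means) on several components, but even the first component cleanly separates the two synthetic signatures.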

  19. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578

  20. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous work by our group (Papa et al., JASTP, 2006), in which we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and the frequency of signals over time. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous work (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is here that we see our main goal. Some possible directions for future work are outlined.
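    The time-frequency tracking described above can be illustrated with a minimal Morlet wavelet transform; the implementation and the test signal are illustrative assumptions, not the analysis actually applied to the Vassouras records:

```python
import numpy as np

def morlet_cwt(x, scales, dt=1.0, w0=6.0):
    """Naive continuous wavelet transform with a complex Morlet wavelet."""
    t = (np.arange(len(x)) - len(x) / 2) * dt
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s)
        # Correlate the signal with the wavelet at this scale.
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

fs = 100.0
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 5 * t)   # synthetic 5 Hz test signal
power = np.abs(morlet_cwt(sig, scales=[0.1, 0.2, 0.4], dt=1 / fs)) ** 2
```

    The scale whose Morlet center frequency (roughly w0 / (2πs)) matches the 5 Hz signal carries the most power, which is how amplitude and frequency can be followed sample by sample.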

  1. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belianinov, Alex, E-mail: belianinova@ornl.gov; Ganesh, Panchapakesan; Lin, Wenzhi

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of the calculated density of states of chemically inhomogeneous FeTe1−xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  2. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site (http://proteomics.mcw.edu/vipdac).

  3. Active site specificity profiling datasets of matrix metalloproteinases (MMPs) 1, 2, 3, 7, 8, 9, 12, 13 and 14.

    PubMed

    Eckhard, Ulrich; Huesgen, Pitter F; Schilling, Oliver; Bellac, Caroline L; Butler, Georgina S; Cox, Jennifer H; Dufour, Antoine; Goebeler, Verena; Kappelhoff, Reinhild; Auf dem Keller, Ulrich; Klein, Theo; Lange, Philipp F; Marino, Giada; Morrison, Charlotte J; Prudova, Anna; Rodriguez, David; Starr, Amanda E; Wang, Yili; Overall, Christopher M

    2016-06-01

    The data described provide a comprehensive resource for the family-wide active site specificity portrayal of the human matrix metalloproteinase family. We used the high-throughput proteomic technique PICS (Proteomic Identification of protease Cleavage Sites) to comprehensively assay 9 different MMPs. We identified more than 4300 peptide cleavage sites, spanning both the prime and non-prime sides of the scissile peptide bond, allowing detailed subsite cooperativity analysis. The proteomic cleavage data were expanded by kinetic analysis using a set of 6 quenched-fluorescent peptide substrates designed using these results. These datasets represent one of the largest specificity profiling efforts with subsequent structural follow up for any protease family and put the spotlight on the specificity similarities and differences of the MMP family. A detailed analysis of this data may be found in Eckhard et al. (2015) [1]. The raw mass spectrometry data and the corresponding metadata have been deposited in PRIDE/ProteomeXchange with the accession number PXD002265.

  4. Detailed low-energy electron diffraction analysis of the (4×4) surface structure of C60 on Cu(111): Seven-atom-vacancy reconstruction

    NASA Astrophysics Data System (ADS)

    Xu, Geng; Shi, Xing-Qiang; Zhang, R. Q.; Pai, Woei Wu; Jeng, H. T.; Van Hove, M. A.

    2012-08-01

    A detailed and exhaustive structural analysis by low-energy electron diffraction (LEED) is reported for the C60-induced reconstruction of Cu(111), in the system Cu(111) + (4 × 4)-C60. A wide LEED energy range allows enhanced sensitivity to the crucial C60-metal interface that is buried below the 7-Å-thick molecular layer. The analysis clearly favors a seven-Cu-atom vacancy model (with Pendry R-factor Rp = 0.376) over a one-Cu-atom vacancy model (Rp = 0.608) and over nonreconstructed models (Rp = 0.671 for atop site and Rp = 0.536 for hcp site). The seven-Cu-atom vacancy forms a (4 × 4) lattice of bowl-like holes. In each hole, a C60 molecule can nestle by forming strong bonds (shorter than 2.30 Å) between 15 C atoms of the molecule and 12 Cu atoms of the outermost and second Cu layers.

  5. Analyzing Visibility Configurations.

    PubMed

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
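    The co-occurrence idea can be illustrated on a binary visibility map. This is the classical pixel-level version; the paper generalizes it to clusters of triangular surfaces:

```python
import numpy as np

# Binary visibility map: 1 = visible, 0 = occluded. A coherent occluded
# half produces strong diagonal entries; scattered occlusion would put
# more weight on the off-diagonal (0,1)/(1,0) entries.
vis = np.array([[1, 1, 0, 0],
                [1, 1, 0, 0],
                [1, 1, 0, 0]])

cooc = np.zeros((2, 2), dtype=int)
for a, b in zip(vis[:, :-1].ravel(), vis[:, 1:].ravel()):  # horizontal pairs
    cooc[a, b] += 1
```

    The resulting matrix is the feature vector fed to a classifier: its structure distinguishes coherent occlusion (useful for masking-aware level-of-detail decisions) from fragmented occlusion, even when the overall visibility fraction is identical.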

  6. Characterization of Deposits on Glass Substrate as a Tool in Failure Analysis: The Orbiter Vehicle Columbia Case Study

    NASA Technical Reports Server (NTRS)

    Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.

    2006-01-01

    In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposit formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven to be challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but also provides a potential guide for investigating future high altitude and high energy accidents.

  7. Reliability studies of Integrated Modular Engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
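    The binomial approximation mentioned above can be sketched for a k-of-n redundant engine cluster; the reliability values below are illustrative, not taken from the study:

```python
from math import comb

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n identical components (each with
    reliability p) operate, assuming independent failures."""
    return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

r_single = 0.98                                     # illustrative engine reliability
r_networked = k_of_n_reliability(3, 4, r_single)    # 3-of-4 engine-out capability
```

    Allowing the mission to succeed with one failed engine (3-of-4) raises system reliability above that of any single engine; Markov techniques extend this to systems whose failure and repair rates vary over time.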

  8. Universal Serial Bus Architecture for Removable Media (USB-ARM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-03-09

    USB-ARM creates operating system drivers which sit between removable media and the user and applications. The drivers isolate the media and submit the contents of the media to a virtual machine containing an entire scanning system. This scanning system may include traditional anti-virus, but also allows more detailed analysis of files, including dynamic run-time analysis, helping to prevent "zero-day" threats not already identified in anti-virus signatures. Once cleared, the media is presented to the operating system, at which point it becomes available to users and applications.

  9. Structural Analysis of Thermal Shields During a Quench of a Torus Magnet for the 12 GeV Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pastor, Orlando; Willard, Thomas; Ghoshal, Probir K.

    A toroidal magnet system consisting of six superconducting coils is being built for the Jefferson Lab 12- GeV accelerator upgrade project. This paper details the analysis of eddy current effects during a quench event on the aluminum thermal shield. The shield has been analyzed for mechanical stresses induced as a result of a coil quench as well as a fast discharge of the complete magnet system. The shield has been designed to reduce the eddy current effects and result in stresses within allowable limits.

  10. Reliability studies of integrated modular engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  11. Novel type of neutron polarization analysis using the multianalyzer-equipment of the three-axes spectrometer PUMA

    NASA Astrophysics Data System (ADS)

    Schwesig, Steffen; Maity, Avishek; Sobolev, Oleg; Ziegler, Fabian; Eckold, Götz

    2018-01-01

    The combination of polarization analysis and multianalyzer system available at the three axes spectrometer PUMA@FRM II allows the simultaneous determination of both spin states of the scattered neutrons and the absolute value of the polarization. The present paper describes the technical details along with the basic formalism used for the precise calibration. Moreover, the performance of this method is illustrated by several test experiments including first polarized inelastic studies of the magnetic excitations of CuO in the multiferroic and the uniaxial antiferromagnetic phases.

  12. Detailed Requirements Analysis for a Management Information System for the Department of Family Practice and Community Medicine at Silas B. Hays Army Community Hospital, Fort Ord, California

    DTIC Science & Technology

    1989-03-01

    Chapter II. Chapter III discusses the theory of information systems and the analysis and design of such systems. The last section of Chapter II introduces... improved personnel morale and job satisfaction. Doctors and hospital administrators are trying to recover from the medical computing lag which has... discussed below). The primary source of equipment authorizations is the Table of Distribution and Allowances (TDA), which shows the equipment authorized to be

  13. Reliability studies of integrated modular engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  14. Reliability studies of Integrated Modular Engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  15. Measuring silicon pore optics

    NASA Astrophysics Data System (ADS)

    Vacanti, Giuseppe; Barrière, Nicolas; Bavdaz, Marcos; Chatbi, Abdelhakim; Collon, Maximilien; Dekker, Daniëlle; Girou, David; Günther, Ramses; van der Hoeven, Roy; Krumrey, Michael; Landgraf, Boris; Müller, Peter; Schreiber, Swenja; Vervest, Mark; Wille, Eric

    2017-09-01

    While predictions based on metrology (local slope errors and detailed geometrical measurements) play an essential role in controlling the development of the manufacturing processes, X-ray characterization remains the ultimate indication of the actual performance of Silicon Pore Optics (SPO). For this reason, SPO stacks and mirror modules are routinely characterized at PTB's X-ray Pencil Beam Facility at BESSY II. Obtaining standard X-ray results quickly, right after the production of X-ray optics, is essential to ensuring that X-ray results can inform decisions taken in the lab. We describe the data analysis pipeline in operation at cosine, and how it allows us to go from stack production to full X-ray characterization in 24 hours.

  16. Grazing-incidence small angle x-ray scattering studies of nanoscale polymer gratings

    NASA Astrophysics Data System (ADS)

    Doxastakis, Manolis; Suh, Hyo Seon; Chen, Xuanxuan; Rincon Delgadillo, Paulina A.; Wan, Lingshu; Williamson, Lance; Jiang, Zhang; Strzalka, Joseph; Wang, Jin; Chen, Wei; Ferrier, Nicola; Ramirez-Hernandez, Abelardo; de Pablo, Juan J.; Gronheid, Roel; Nealey, Paul

    2015-03-01

    Grazing-Incidence Small Angle X-ray Scattering (GISAXS) offers the ability to probe large sample areas, providing three-dimensional structural information at high detail in a thin film geometry. In this study we exploit the application of GISAXS to structures formed at one step of the LiNe (Liu-Nealey) flow using chemical patterns for directed self-assembly of block copolymer films. Experiments conducted at the Argonne National Laboratory provided scattering patterns probing film characteristics in directions both parallel and normal to the surface. We demonstrate the application of new computational methods to construct models based on the measured scattering. Such analysis allows for the extraction of structural characteristics at unprecedented detail.

  17. Structure and information in spatial segregation

    PubMed Central

    2017-01-01

    Ethnoracial residential segregation is a complex, multiscalar phenomenon with immense moral and economic costs. Modeling the structure and dynamics of segregation is a pressing problem for sociology and urban planning, but existing methods have limitations. In this paper, we develop a suite of methods, grounded in information theory, for studying the spatial structure of segregation. We first advance existing profile and decomposition methods by posing two related regionalization methods, which allow for profile curves with nonconstant spatial scale and decomposition analysis with nonarbitrary areal units. We then formulate a measure of local spatial scale, which may be used for both detailed, within-city analysis and intercity comparisons. These methods highlight detailed insights in the structure and dynamics of urban segregation that would be otherwise easy to miss or difficult to quantify. They are computationally efficient, applicable to a broad range of study questions, and freely available in open source software. PMID:29078323
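    A toy version of an information-theoretic segregation measure is the mutual information between group membership and areal unit, computed from a table of counts. The counts below are invented, and the paper's methods are considerably richer (multiscalar profiles, regionalization); this only illustrates the core quantity:

```python
from math import log2

# Rows = areal units (tracts), columns = groups; a highly segregated toy city.
counts = [[90, 10], [10, 90]]
total = sum(sum(row) for row in counts)
p_unit = [sum(row) / total for row in counts]           # marginal over units
p_group = [sum(col) / total for col in zip(*counts)]    # marginal over groups

# Mutual information I(unit; group) in bits: 0 = no segregation,
# larger values = group and location are more informative about each other.
mi = sum((counts[i][j] / total)
         * log2((counts[i][j] / total) / (p_unit[i] * p_group[j]))
         for i in range(2) for j in range(2) if counts[i][j])
```

    Decomposition methods of the kind advanced in the paper split such a total into within-region and between-region contributions, which is what makes nonarbitrary areal units possible.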

  18. Structure and information in spatial segregation.

    PubMed

    Chodrow, Philip S

    2017-10-31

    Ethnoracial residential segregation is a complex, multiscalar phenomenon with immense moral and economic costs. Modeling the structure and dynamics of segregation is a pressing problem for sociology and urban planning, but existing methods have limitations. In this paper, we develop a suite of methods, grounded in information theory, for studying the spatial structure of segregation. We first advance existing profile and decomposition methods by posing two related regionalization methods, which allow for profile curves with nonconstant spatial scale and decomposition analysis with nonarbitrary areal units. We then formulate a measure of local spatial scale, which may be used for both detailed, within-city analysis and intercity comparisons. These methods highlight detailed insights in the structure and dynamics of urban segregation that would be otherwise easy to miss or difficult to quantify. They are computationally efficient, applicable to a broad range of study questions, and freely available in open source software. Published under the PNAS license.

  19. Model systems in heterogeneous catalysis: towards the design and understanding of structure and electronic properties.

    PubMed

    Pan, Q; Li, L; Shaikhutdinov, S; Fujimori, Y; Hollerer, M; Sterrer, M; Freund, H-J

    2018-05-29

    We discuss in this paper two case studies related to nano-particle catalyst systems. One concerns a model system for the Cr/SiO2 Phillips catalyst for ethylene polymerization, and here we present XPS data to complement the previously published TPD, IRAS and reactivity studies to elucidate the electronic structure of the system in some detail. The second case study provides additional information on Au nano-particles supported on ultrathin MgO(100)/Ag(100) films, where we had observed a specific activity of the particle's rim at the metal-oxide interface with respect to CO2 activation and oxalate formation, evidently connected to electron transfer through the MgO film from the metal substrate underneath. Here we present XPS and Auger data, which allow a detailed analysis of the observed chemical shifts. This analysis corroborates previous findings deduced via STM.
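    Combined XPS and Auger chemical-shift analysis commonly uses the modified Auger parameter; a one-line sketch with illustrative energies (the values are hypothetical, not taken from the paper):

```python
def auger_parameter(ke_auger_eV, be_photoelectron_eV):
    """Modified Auger parameter: alpha' = KE(Auger line) + BE(photoelectron line)."""
    return ke_auger_eV + be_photoelectron_eV

alpha = auger_parameter(335.0, 84.0)   # hypothetical Au-like values, in eV
```

    Because the kinetic and binding energies shift oppositely under surface charging, alpha' is charging-independent, which is what makes it useful for isolating genuine chemical-environment effects.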

  20. Direct optical detection of protein-ligand interactions.

    PubMed

    Gesellchen, Frank; Zimmermann, Bastian; Herberg, Friedrich W

    2005-01-01

    Direct optical detection provides an excellent means to investigate interactions of molecules in biological systems. The dynamic equilibria inherent to these systems can be described in greater detail by recording the kinetics of a biomolecular interaction. Optical biosensors allow direct detection of interaction patterns without the need for labeling. An overview covering several commercially available biosensors is given, with a focus on instruments based on surface plasmon resonance (SPR) and reflectometric interference spectroscopy (RIFS). Potential assay formats and experimental design, appropriate controls, and calibration procedures, especially when handling low molecular weight substances, are discussed. The single steps of an interaction analysis combined with practical tips for evaluation, data processing, and interpretation of kinetic data are described in detail. In a practical example, a step-by-step procedure for the analysis of a low molecular weight compound interaction with serum protein, determined on a commercial SPR sensor, is presented.
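    Kinetic analysis of a sensorgram typically starts from the 1:1 Langmuir interaction model; below is a sketch of its analytic association phase, with rate constants and analyte concentration chosen for illustration only:

```python
import math

def association_response(t, ka=1e5, kd=1e-3, conc=1e-7, rmax=100.0):
    """1:1 model dR/dt = ka*C*(Rmax - R) - kd*R with R(0) = 0, solved analytically.

    ka in 1/(M*s), kd in 1/s, conc in M, response units arbitrary (e.g. RU)."""
    kobs = ka * conc + kd
    return (ka * conc * rmax / kobs) * (1.0 - math.exp(-kobs * t))

r_eq = association_response(1e9)   # long-time limit = equilibrium response
```

    Fitting the observed rate kobs at several analyte concentrations yields ka from the slope and kd from the intercept of kobs versus C, which is the usual route to the kinetic constants discussed above.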

  1. Helios1A EoL: A Success. For the first Time a Long Final Thrust Scenario, Respecting the French Law on Space Operations

    NASA Astrophysics Data System (ADS)

    Guerry, Agnes; Moussi, Aurelie; Sartine, Christian; Beaumet, Gregory

    2013-09-01

    HELIOS-1A End Of Life (EOL) operations occurred in early 2012. Through this EOL operation, CNES wanted to set an example of French Space Act compliance. Because the satellite was not originally designed for such an EOL phase, the operation was delicate and risky. It was organized as a full project in order to assess every detail of the scenario with a dedicated mission analysis, to secure the operations through a detailed risk analysis at system level, and to consider the major failures that could occur during the EOL. A short scenario that achieved several objectives was eventually selected. The main objective of this project was to preserve the space environment. The operations were conducted on a "best effort" basis. The French Space Operations Act (FSOA) requirements were met: HELIOS-1A EOL operations were completed successfully.

  2. Gait Analysis From a Single Ear-Worn Sensor: Reliability and Clinical Evaluation for Orthopaedic Patients.

    PubMed

    Jarchi, Delaram; Lo, Benny; Wong, Charence; Ieong, Edmund; Nathwani, Dinesh; Yang, Guang-Zhong

    2016-08-01

    Objective assessment of detailed gait patterns after orthopaedic surgery is important for post-surgical follow-up and rehabilitation. The purpose of this paper is to assess the use of a single ear-worn sensor for clinical gait analysis. A reliability measure is devised for indicating the confidence level of the estimated gait events, allowing it to be used in free-walking environments and for facilitating clinical assessment of orthopaedic patients after surgery. Patient groups prior to or following anterior cruciate ligament (ACL) reconstruction and knee replacement were recruited to assess the proposed method. The ability of the sensor for detailed longitudinal analysis is demonstrated with a group of patients after lower limb reconstruction by considering parameters such as temporal and force-related gait asymmetry derived from gait events. The results suggest that the ear-worn sensor can be used for objective gait assessments of orthopaedic patients without the requirement and expense of an elaborate laboratory setup for gait analysis. It significantly simplifies the monitoring protocol and opens the possibilities for home-based remote patient assessment.

  3. Developments to the Sylvan stand structure model to describe wood quality changes in southern bottomland hardwood forests because of forest management

    Treesearch

    Ian R. Scott

    2009-01-01

    Growth models can produce a wealth of detailed information that is often difficult to interpret because it is typically presented as summary tables or as stand-view or landscape-view visualizations. We have developed new tools for use with the Sylvan model (Larsen 1994) that allow the analysis of wood-quality changes as a consequence of forest management....

  4. Design of the advanced regional aircraft, the DART-75

    NASA Technical Reports Server (NTRS)

    Elliott, Steve; Gislason, Jason; Huffstetler, Mark; Mann, Jon; Withers, Ashley; Zimmerman, Mark

    1992-01-01

    This design analysis is intended to show the capabilities of the DART-75, a 75-passenger medium-range regional transport. Included are detailed descriptions of the structures, performance, stability and control, weight and balance, and engine design. The design should allow the DART to become the premier regional aircraft of the future, owing to advanced features such as the canard, semi-composite construction, and advanced engines.

  5. A genome-wide screening of BEL-Pao like retrotransposons in Anopheles gambiae by the LTR_STRUC program.

    PubMed

    Marsano, Renè Massimiliano; Caizzi, Ruggiero

    2005-09-12

    The advanced status of the assembly of the nematoceran Anopheles gambiae genomic sequence allowed us to perform a genome-wide analysis searching for Long Terminal Repeats (LTRs) within a range of 10 kb by means of the LTR_STRUC tool. More than three hundred sequences were retrieved, and 210 were treated as putative complete retrotransposons and individually analysed with respect to known retrotransposons of A. gambiae and D. melanogaster. The results show that the vast majority of the retrotransposons analysed belong to the Ty3/gypsy class and only 8% to the Ty1/copia class. In addition, phylogenetic analysis allowed us to characterize in more detail the relationships within a large BEL-Pao lineage in which a single family was shown to harbour an additional env gene.

  6. MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.

    PubMed

    Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan

    2017-01-01

    Food-webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, differing only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs, such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
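    The predation motif described above can be written as a pair of coupled ordinary differential equations. The following sketch is illustrative only: MI-Sim itself is a MATLAB package, and the parameter values here are assumed, not taken from the software. It integrates a classic Lotka-Volterra predation motif with a fixed-step RK4 scheme:

```python
def predation_motif(state, r=1.0, a=0.8, e=0.5, m=0.4):
    """Lotka-Volterra predation motif: one producer (prey), one consumer."""
    prey, pred = state
    dprey = r * prey - a * prey * pred        # prey growth minus predation losses
    dpred = e * a * prey * pred - m * pred    # prey conversion minus predator mortality
    return dprey, dpred

def simulate(state, steps=5000, dt=0.01):
    """Fixed-step 4th-order Runge-Kutta integration of the motif ODEs."""
    traj = [state]
    for _ in range(steps):
        k1 = predation_motif(state)
        k2 = predation_motif([s + 0.5 * dt * k for s, k in zip(state, k1)])
        k3 = predation_motif([s + 0.5 * dt * k for s, k in zip(state, k2)])
        k4 = predation_motif([s + dt * k for s, k in zip(state, k3)])
        state = [s + dt / 6.0 * (p + 2 * q + 2 * u + v)
                 for s, p, q, u, v in zip(state, k1, k2, k3, k4)]
        traj.append(state)
    return traj

traj = simulate([1.0, 0.5])   # initial prey and predator densities
print(traj[-1])               # both populations remain positive (cycling)
```

    Tools like MI-Sim go beyond such time-stepping by locating steady states of these equations and analysing their stability; the simulation above is only the first of those capabilities.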

  7. Composition and stratigraphy of the paint layers: investigation on the Madonna dei Fusi by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Grassi, N.

    2005-06-01

    In the framework of the extensive study on the wood painting "Madonna dei fusi" attributed to Leonardo da Vinci, Ion Beam Analysis (IBA) techniques were used at the Florence accelerator laboratory to get information about the elemental composition of the paint layers. After a brief description of the basic principle and the general features of IBA techniques, we will illustrate in detail how the analysis allowed us to characterise the pigments of original and restored areas and the substrate composition, and to obtain information about the stratigraphy of the painting, also providing an estimate of the paint layer thickness.

  8. Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 1: Analysis

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Pirvics, J.

    1980-01-01

    The models and associated mathematics used within the SPHERBEAN computer program for prediction of the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings are presented. The analysis allows six degrees of freedom for each roller and three for each half of an optionally split cage. Roller skew, free lubricant, inertial loads, appropriate elastic and friction forces, and flexible outer ring are considered. Roller quasidynamic equilibrium is calculated for a bearing with up to 30 rollers per row, and distinct roller and flange geometries are specifiable. The user is referred to the material contained here for formulation assumptions and algorithm detail.

  9. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched the capabilities improve: new bands of data, higher-resolution imagery, the ability to derive better elevation information. The combination of this geospatial data to create land cover and usage maps helps inform catastrophe modelling systems. From Landsat 30 m resolution to 2.44 m QuickBird multispectral imagery, and from 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly be joined by a twin satellite enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn, affecting over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model able to simulate such an event requires much more accurate source data than can be provided from satellite or radar. Because these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to better model them. 
Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest digital airborne sensors, both optical and lidar, to produce the input layer for surface water flood modelling. A national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphical processing power. This product will prove particularly valuable for risk assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high resolution aerial photography and colour infrared imagery, a high definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surface. This allows even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. However, for some risks an even more detailed survey may be justified. In order to achieve this, Infoterra is testing new 360˚ mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, allowing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold level to be captured quickly and accurately. These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. 
This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new resolution of detail to aid urban flood modelling.

  10. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  12. Options to improve energy efficiency for educational building

    NASA Astrophysics Data System (ADS)

    Jahan, Mafruha

    The cost of energy is a major factor that must be considered for educational facility budget planning purposes. The analysis of energy-related issues and options can be complex and requires significant time and detailed effort. One way to facilitate the inclusion of energy option planning in facility planning efforts is to utilize a tool that allows a quick appraisal of the facility energy profile. Once such an appraisal is accomplished, it is then possible to rank energy improvement options consistently with other facility needs and requirements. After an energy efficiency option has been determined to have meaningful value in comparison with other facility planning options, it is then possible to use the initial appraisal as the basis for an expanded consideration of additional facility and energy use detail, using the same analytic system used for the initial appraisal. This thesis has developed a methodology and an associated analytic model to assist in these tasks and thereby improve the energy efficiency of educational facilities. A detailed energy efficiency analysis tool is described that utilizes specific university building characteristics such as size, architecture, envelope, lighting, occupancy, and thermal design to reduce annual energy consumption. Improving various aspects of an educational building's energy performance can be complex and can require significant time and experience to make decisions. The approach developed in this thesis initially assesses the energy design of a university building. This initial appraisal is intended to assist administrators in assessing the potential value of energy efficiency options for their particular facility. Subsequently, this scoping design can be extended as another stage of the model by local facility or planning personnel to add more detail and engineering aspects to the initial screening model. 
This approach can assist university planning efforts to identify the most cost effective combinations of energy efficiency strategies. The model analyzes and compares the payback periods of all proposed Energy Performance Measures (EPMs) to determine which has the greatest potential value.
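    The payback-period comparison described above can be illustrated with a minimal sketch. The EPM names, costs, and savings below are hypothetical examples, not figures from the thesis:

```python
def simple_payback_years(capital_cost, annual_savings):
    """Simple payback period: years needed for savings to recover the up-front cost."""
    if annual_savings <= 0:
        return float("inf")   # a measure that saves nothing never pays back
    return capital_cost / annual_savings

# Hypothetical energy performance measures: (capital cost in $, savings in $/year).
epms = {
    "LED lighting retrofit": (40_000, 12_000),
    "Envelope air sealing":  (15_000, 2_500),
    "Occupancy-based HVAC":  (60_000, 24_000),
}

# Rank measures by shortest payback, i.e. greatest potential value per dollar.
ranked = sorted(epms.items(), key=lambda kv: simple_payback_years(*kv[1]))
for name, (cost, savings) in ranked:
    print(f"{name}: {simple_payback_years(cost, savings):.1f} years")
```

    A fuller model would also discount future savings and account for maintenance costs, but simple payback is the common first screen for ranking options.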

  13. Dynamic Modelling under Uncertainty: The Case of Trypanosoma brucei Energy Metabolism

    PubMed Central

    Achcar, Fiona; Kerkhoven, Eduard J.; Bakker, Barbara M.; Barrett, Michael P.; Breitling, Rainer

    2012-01-01

    Kinetic models of metabolism require detailed knowledge of kinetic parameters. However, due to measurement errors or lack of data this knowledge is often uncertain. The model of glycolysis in the parasitic protozoan Trypanosoma brucei is a particularly well analysed example of a quantitative metabolic model, but so far it has been studied with a fixed set of parameters only. Here we evaluate the effect of parameter uncertainty. In order to define probability distributions for each parameter, information about the experimental sources and confidence intervals for all parameters were collected. We created a wiki-based website dedicated to the detailed documentation of this information: the SilicoTryp wiki (http://silicotryp.ibls.gla.ac.uk/wiki/Glycolysis). Using information collected in the wiki, we then assigned probability distributions to all parameters of the model. This allowed us to sample sets of alternative models, accurately representing our degree of uncertainty. Some properties of the model, such as the repartition of the glycolytic flux between the glycerol and pyruvate producing branches, are robust to these uncertainties. However, our analysis also allowed us to identify fragilities of the model leading to the accumulation of 3-phosphoglycerate and/or pyruvate. The analysis of the control coefficients revealed the importance of taking into account the uncertainties about the parameters, as the ranking of the reactions can be greatly affected. This work will now form the basis for a comprehensive Bayesian analysis and extension of the model considering alternative topologies. PMID:22379410
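    The sampling strategy described above can be sketched in a few lines: assign a probability distribution to each uncertain kinetic parameter, draw many parameter sets, and examine the resulting distribution of a model output. This illustrative sketch uses a single Michaelis-Menten reaction with assumed log-normal uncertainties, not the actual T. brucei glycolysis model:

```python
import random
import statistics

random.seed(0)  # reproducible sampling

def mm_flux(vmax, km, s=1.0):
    """Michaelis-Menten rate at substrate concentration s."""
    return vmax * s / (km + s)

def sample_params():
    """Draw one parameter set from log-normal uncertainty distributions
    (illustrative widths, standing in for documented measurement errors)."""
    vmax = random.lognormvariate(mu=0.0, sigma=0.2)   # centred on 1.0
    km = random.lognormvariate(mu=-0.7, sigma=0.3)    # centred near 0.5
    return vmax, km

# Propagate parameter uncertainty through the model output.
fluxes = [mm_flux(*sample_params()) for _ in range(10_000)]
print(statistics.mean(fluxes), statistics.stdev(fluxes))
```

    The spread of the output distribution shows how robust a model property is to parameter uncertainty; applying the same idea to control coefficients is what reveals rank changes of the kind the authors report.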

  14. Assessing the Predictability of Convection using Ensemble Data Assimilation of Simulated Radar Observations in an LETKF system

    NASA Astrophysics Data System (ADS)

    Lange, Heiner; Craig, George

    2014-05-01

    This study uses the Local Ensemble Transform Kalman Filter (LETKF) to perform storm-scale Data Assimilation of simulated Doppler radar observations into the non-hydrostatic, convection-permitting COSMO model. In perfect model experiments (OSSEs), it is investigated how the limited predictability of convective storms affects precipitation forecasts. The study compares a fine analysis scheme with small RMS errors to a coarse scheme that allows for errors in position, shape and occurrence of storms in the ensemble. The coarse scheme uses superobservations, a coarser grid for analysis weights, a larger localization radius and larger observation error that allow a broadening of the Gaussian error statistics. Three hour forecasts of convective systems (with typical lifetimes exceeding 6 hours) from the detailed analyses of the fine scheme are found to be advantageous to those of the coarse scheme during the first 1-2 hours, with respect to the predicted storm positions. After 3 hours in the convective regime used here, the forecast quality of the two schemes appears indiscernible, judging by RMSE and verification methods for rain-fields and objects. It is concluded that, for operational assimilation systems, the analysis scheme might not necessarily need to be detailed to the grid scale of the model. Depending on the forecast lead time, and on the presence of orographic or synoptic forcing that enhance the predictability of storm occurrences, analyses from a coarser scheme might suffice.

  15. [Application of evidence based medicine to the individual patient: the role of decision analysis].

    PubMed

    Housset, B; Junod, A F

    2003-11-01

    The objective of evidence based medicine (EBM) is to contribute to medical decision making by providing the best possible information in terms of validity and relevance. This allows the benefits and risks of a decision to be evaluated in a specific manner. The limitations and hazards of this approach are discussed in relation to a clinical case where the diagnosis of pulmonary embolism was under consideration. The individual details and the limited availability of some technical procedures illustrate the need to adapt the data of EBM to the circumstances. The choice between two diagnostic tests (d-dimers and ultrasound of the legs) and their optimal timing is analysed with integration of the consequences for the patient of the treatments proposed. This allows discussion of the concept of utility and the use of sensitivity analysis. If EBM is the cornerstone of rational and explicit practice, it should also allow for the constraints of real life. Decision analysis, which depends on the same critical demands as EBM but can also take account of the individual features of each patient and test the robustness of a decision, gives a unique opportunity to reconcile rigorous reasoning with individualisation of management.
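    The expected-utility and sensitivity-analysis concepts mentioned above can be shown with a toy threshold calculation. All utility values below are assumed for illustration and are not taken from the article:

```python
def expected_utility(p_disease, u_treat_disease, u_treat_healthy,
                     u_skip_disease, u_skip_healthy):
    """Expected utility of treating vs. not treating at disease probability p."""
    treat = p_disease * u_treat_disease + (1 - p_disease) * u_treat_healthy
    skip = p_disease * u_skip_disease + (1 - p_disease) * u_skip_healthy
    return treat, skip

# Illustrative utilities on a 0-1 scale (assumed): treatment helps greatly if
# embolism is present, but carries a small cost (e.g. bleeding risk) if not.
U = dict(u_treat_disease=0.95, u_treat_healthy=0.90,
         u_skip_disease=0.40, u_skip_healthy=1.00)

# One-way sensitivity analysis: sweep the probability of pulmonary embolism
# and find the threshold where the preferred decision flips to "treat".
threshold = next(p / 1000 for p in range(1001)
                 if expected_utility(p / 1000, **U)[0] >=
                    expected_utility(p / 1000, **U)[1])
print(f"treat when P(embolism) >= {threshold:.3f}")
```

    Varying the assumed utilities and re-computing the threshold is precisely the robustness test that decision analysis adds on top of EBM's evidence summaries.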

  16. Metamorphosis revealed: time-lapse three-dimensional imaging inside a living chrysalis.

    PubMed

    Lowe, Tristan; Garwood, Russell J; Simonsen, Thomas J; Bradley, Robert S; Withers, Philip J

    2013-07-06

    Studies of model insects have greatly increased our understanding of animal development. Yet, they are limited in scope to this small pool of model species: a small number of representatives for a hyperdiverse group with highly varied developmental processes. One factor behind this narrow scope is the challenging nature of traditional methods of study, such as histology and dissection, which can preclude quantitative analysis and do not allow the development of a single individual to be followed. Here, we use high-resolution X-ray computed tomography (CT) to overcome these issues, and three-dimensionally image numerous lepidopteran pupae throughout their development. The resulting models are presented in the electronic supplementary material, as are figures and videos, documenting a single individual throughout development. They provide new insight and details of lepidopteran metamorphosis, and allow the measurement of tracheal and gut volume. Furthermore, this study demonstrates early and rapid development of the tracheae, which become visible in scans just 12 h after pupation. This suggests that there is less remodelling of the tracheal system than previously expected, and is methodologically important because the tracheal system is an often-understudied character system in development. In the future, this form of time-lapse CT-scanning could allow faster and more detailed developmental studies on a wider range of taxa than is presently possible.

  17. Integration of ground-based laser scanner and aerial digital photogrammetry for topographic modelling of Vesuvio volcano

    NASA Astrophysics Data System (ADS)

    Pesci, Arianna; Fabris, Massimo; Conforti, Dario; Loddo, Fabiana; Baldi, Paolo; Anzidei, Marco

    2007-05-01

    This work deals with the integration of different surveying methodologies for the definition of very accurate Digital Terrain Models (DTM) and/or Digital Surface Models (DSM): in particular, aerial digital photogrammetry and terrestrial laser scanning were used to survey the Vesuvio volcano, allowing total coverage of the internal cone and surroundings (the whole surveyed area was about 3 km × 3 km). The possibility of reaching very high precision, especially from the laser scanner data set, allowed a detailed description of the morphology of the volcano. Comparisons of models obtained in repeated surveys allow a detailed map of residuals, providing a data set that can be used for detailed studies of morphological evolution. Moreover, the reflectivity information, highly correlated with material properties, allows the measurement and quantification of some morphological variations in areas where structural discontinuities and displacements are present.

  18. Design and fabrication of brayton cycle solar heat receiver

    NASA Technical Reports Server (NTRS)

    Mendelson, I.

    1971-01-01

    A detail design and fabrication of a solar heat receiver using lithium fluoride as the heat storage material was completed. A gas flow analysis was performed to achieve uniform flow distribution within overall pressure drop limitations. Structural analyses and allowable design criteria were developed for anticipated environments such as launch, pressure containment, and thermal cycling. A complete heat receiver assembly was fabricated almost entirely from the refractory alloy, niobium-1% zirconium.

  19. Drilling and Caching Architecture for the Mars2020 Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.

    2013-12-01

    We present a Sample Acquisition and Caching (SAC) architecture for the Mars2020 mission and detail how the architecture meets the sampling requirements described in the Mars2020 Science Definition Team (SDT) report. The architecture uses a 'One Bit per Core' approach. Having a dedicated bit for each rock core allows a reduction in the number of core transfer steps and actuators, and this reduces overall mission risk. It also alleviates the bit life problem, eliminates cross contamination, and aids in hermetic sealing. Added advantages are faster drilling time, lower power, lower energy, and lower Weight on Bit (which reduces Arm preload requirements). To enable replacing of core samples, the drill bits are based on the BigTooth bit design. The BigTooth bit cuts a core diameter slightly smaller than the imaginary hole inscribed by the inner surfaces of the bits. Hence the rock core can be ejected much more easily along the gravity vector. The architecture also has three additional types of bits that allow analysis of rocks. The Rock Abrasion and Brushing Bit (RABBit) allows brushing and grinding of rocks in the same way as the Rock Abrasion Tool does on MER. The PreView bit allows viewing and analysis of rock core surfaces. The Powder and Regolith Acquisition Bit (PRABit) captures regolith and rock powder either for in situ analysis or sample return; PRABit also allows sieving. The architecture can be viewed here: http://www.youtube.com/watch?v=_-hOO4-zDtE

  20. All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.

    PubMed

    Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R

    2016-09-21

    We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower than with established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, ultimately resulting in a yield improvement.

  1. Ageing management of french NPP civil work structures

    NASA Astrophysics Data System (ADS)

    Gallitre, E.; Dauffer, D.

    2011-04-01

    This paper presents EDF practice for concrete structure ageing management, from the analysis of ageing mechanisms to the formal procedure that allows the French company to extend the lifetime of its 900 MWe NPPs to 40 years; it also introduces the action plan for a 60-year lifetime extension. This practice is based on a methodology that identifies every ageing mechanism; both plant feedback and the state of the art are screened, and conclusions are drawn up into an "ageing analysis data sheet". This leads at first to a collection of 57 data sheets which give the mechanism identification, the components concerned, and an analysis grid designed to assess the safety risk. The analysis screens the reference documents describing the mechanism, the design lifetime hypotheses, the associated regulation or codification, feedback experience, accessibility, maintenance actions, repair possibilities and so on. It has to lead to a conclusion about the risk, taking into account monitoring and maintenance. If the data sheet conclusion is not clear enough, a more detailed report is launched: a formal detailed report that summarizes all theoretical knowledge and monitoring data, with the objective of proposing a solution for ageing management; this solution can include more inspections, specific research and development, or additional maintenance. After a first stage on the 900 MWe units, only two generic ageing management detailed reports have been needed for the civil engineering part: one about the reactor building containment, and one about other structures, which focuses on concrete swelling reactions. The second stage consists of deriving this generic analysis (ageing mechanisms and detailed reports) for every plant where a complete ageing report is required (one report for all equipment and structures of the plant, but specific to each reactor). 
This ageing management is a continuous process because the 57 generic data sheets set is updated every year and the detailed generic reports every five years. After this 40 year lifetime extension, EDF is preparing a 60 years lifetime action plan which includes R&D actions, specific industrial studies and also monitoring improvements.

  2. Wash water recovery system

    NASA Technical Reports Server (NTRS)

    Deckman, G.; Rousseau, J. (Editor)

    1973-01-01

    The Wash Water Recovery System (WWRS) is intended for processing shower bath water onboard a spacecraft. The WWRS uses flash evaporation, vapor compression, and pyrolytic reaction to process the wash water and recover potable water. Wash water flashing and foaming characteristics were evaluated, the physical properties of concentrated wash water were determined, and a long-term feasibility study of the system was performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.

  3. Applied cartographic communication: map symbolization for atlases.

    USGS Publications Warehouse

    Morrison, J.L.

    1984-01-01

    A detailed investigation of the symbolization used on general-purpose atlas reference maps, indicating how theories of cartographic communication can be put into practice. Two major points emerge: first, that a logical scheme can be constructed from existing cartographic research and applied to an analysis of the choice of symbolization on a map; second, that the same structure appears to allow the cartographer to specify symbolization as part of map design. An introductory review of cartographic communication is followed by an analysis of selected maps' use of point, area and line symbols, boundaries, text and colour. -after Author

  4. ExoMars 2016 Trace Gas Orbiter and Mars Express Coordinated Science Operations Planning

    NASA Astrophysics Data System (ADS)

    Cardesin Moinelo, Alejandro; Geiger, Bernhard; Costa, Marc; Breitfellner, Michel; Castillo, Manuel; Marin Yaseli de la Parra, Julia; Martin, Patrick; Merritt, Donald R.; Grotheer, Emmanuel; Aberasturi Vega, Miriam; Ashman, Mike; Frew, David; Garcia Beteta, Juan Jose; Metcalfe, Leo; Muñoz, Claudio; Muñoz, Michela; Titov, Dimitri; Svedhem, Hakan

    2018-05-01

    In this contribution we focus on the science opportunity analysis between the Mars Express and the ExoMars 2016 Trace Gas Orbiter missions and the observations that can be combined to improve the scientific outcome of both missions. In particular we will describe the long term analysis of geometrical conditions that allow for coordinated science observations for solar occultation and nadir pointing. We will provide details on the calculations and results for simultaneous and quasi-simultaneous opportunities, taking into account the observation requirements of the instruments and the operational requirements for feasibility checks.

  5. Analysis of Yb3+/Er3+-codoped microring resonator cross-grid matrices

    NASA Astrophysics Data System (ADS)

    Vallés, Juan A.; Gǎlǎtuş, Ramona

    2014-09-01

    An analytic model of the scattering response of a highly Yb3+/Er3+-codoped phosphate glass microring resonator matrix is used to obtain the transfer functions of an M x N cross-grid microring resonator structure. A detailed model is then used to calculate pump and signal propagation, including a microscopic statistical formalism describing the energy-transfer mechanisms induced at high dopant concentrations. Passive and active features are combined to realistically simulate the performance of the structure as a wavelength-selective amplifier or laser. This analysis allows these structures to be optimized for telecom or sensing applications.
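As an illustration of the kind of transfer function involved, the standard all-pass response of a single lossy microring can be sketched as follows; this is not the paper's M x N cross-grid model, and the coupling and loss values are purely illustrative:

```python
import numpy as np

def all_pass_transmission(phi, t=0.95, a=0.98):
    """Power transmission of a single all-pass microring resonator.

    phi : round-trip phase (rad); t : self-coupling coefficient;
    a : single-pass amplitude transmission (round-trip loss).
    Textbook transfer-matrix result; parameter values are illustrative.
    """
    num = t - a * np.exp(1j * phi)
    den = 1 - t * a * np.exp(1j * phi)
    return np.abs(num / den) ** 2

phi = np.linspace(0, 4 * np.pi, 1000)
T = all_pass_transmission(phi)
# Transmission dips occur at resonance, i.e. where phi is a multiple of 2*pi.
```

Cascading such single-ring responses through a coupling matrix is the usual starting point for cross-grid matrix models.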

  6. A combined Bodian-Nissl stain for improved network analysis in neuronal cell culture.

    PubMed

    Hightower, M; Gross, G W

    1985-11-01

    Bodian and Nissl procedures were combined to stain dissociated mouse spinal cord cells cultured on coverslips. The Bodian technique stains fine neuronal processes in great detail as well as an intracellular fibrillar network concentrated around the nucleus and in proximal neurites. The Nissl stain clearly delimits neuronal cytoplasm in somata and in large dendrites. A combination of these techniques allows the simultaneous depiction of neuronal perikarya and all afferent and efferent processes. Costaining with little background staining by either procedure suggests high specificity for neurons. This procedure could be exploited for routine network analysis of cultured neurons.

  7. A Reverse-Genetics Mutational Analysis of the Barley HvDWARF Gene Results in Identification of a Series of Alleles and Mutants with Short Stature of Various Degree and Disturbance in BR Biosynthesis Allowing a New Insight into the Process.

    PubMed

    Gruszka, Damian; Gorniak, Malgorzata; Glodowska, Ewelina; Wierus, Ewa; Oklestkova, Jana; Janeczko, Anna; Maluszynski, Miroslaw; Szarejko, Iwona

    2016-04-22

    Brassinosteroids (BRs) are plant steroid hormones regulating a broad range of physiological processes. The largest body of data on BR biosynthesis has been gathered in Arabidopsis thaliana; however, the process is far less well understood in monocot crops. To date, only four barley genes implicated in BR biosynthesis have been identified. Two of them, HvDWARF and HvBRD, encode BR-6-oxidases catalyzing the biosynthesis of castasterone, but their relationship is not yet understood. In the present study, the identification of the HvDWARF genomic sequence, its mutational and functional analysis, and the characterization of new mutants are reported. Various types of mutations located at different positions within the functional domains were identified and characterized, and their impact on mutant phenotype was analyzed. The identified homozygous mutants show reduced height of varying degree and disrupted skotomorphogenesis. Mutational analysis of the HvDWARF gene with a reverse-genetics approach allowed detailed functional analysis at the level of protein functional domains. The function of the HvDWARF gene and the mutants' phenotypes were also validated by measuring endogenous BR concentrations. These results provide new insight into BR biosynthesis in barley.

  8. Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study

    NASA Astrophysics Data System (ADS)

    Rehany, N.; Barsi, A.; Lovas, T.

    2017-11-01

    Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper investigates several data acquisition technologies: arm scanner, structured-light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object and surveyed with all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as reference. In addition to general statistics, the results were evaluated with both 3D deviation maps and 2D deviation graphs; the latter allow an even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, custom local-minimum maps were created that clearly visualize the level of detail each technology can provide. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the techniques. Our results prove that even amateur sensors operated by amateur users can provide high-quality datasets suitable for engineering analysis. Based on the results, the paper closes with an outlook on potential future investigations in this field.

  9. Mars Global Geologic Mapping: About Half Way Done

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Dohm, J. M.; Irwin, R.; Kolb, E. J.; Skinner, J. A., Jr.; Hare, T. M.

    2009-01-01

    We are in the third year of a five-year effort to map the geology of Mars using mainly Mars Global Surveyor, Mars Express, and Mars Odyssey imaging and altimetry datasets. Previously, we have reported on details of project management, mapping datasets (local and regional), initial and anticipated mapping approaches, and tactics of map unit delineation and description [1-2]. For example, we have seen how the multiple types and huge quantity of image data as well as more accurate and detailed altimetry data now available allow for broader and deeper geologic perspectives, based largely on improved landform perception, characterization, and analysis. Here, we describe mapping and unit delineation results thus far, a new unit identified in the northern plains, and remaining steps to complete the map.

  10. Methods and Algorithms for Computer-aided Engineering of Die Tooling of Compressor Blades from Titanium Alloy

    NASA Astrophysics Data System (ADS)

    Khaimovich, A. I.; Khaimovich, I. N.

    2018-01-01

    The article provides calculation algorithms for blank design and die-forging tooling for producing compressor blades for aircraft engines. The design system proposed in the article generates drafts of trimming and reducing dies automatically, leading to a significant reduction in production preparation time. A detailed analysis of the features of the blade structural elements was carried out; the adopted constraints and technological solutions made it possible to formulate generalized algorithms for forming the parting face of the die over the entire contour of the engraving for different die-forging configurations. The authors developed the algorithms and programs to calculate the three-dimensional point locations describing the configuration of the die cavity.

  11. Continuous, data-rich appraisal of surgical trainees' operative abilities: a novel approach for measuring performance and providing feedback.

    PubMed

    Roach, Paul B; Roggin, Kevin K; Selkov, Gene; Posner, Mitchell C; Silverstein, Jonathan C

    2009-01-01

    We developed a convenient mechanism, the Surgical Training and Assessment Tool (STAT), for detailed, continuous analysis of surgical trainees' operative abilities, and a simple method, Quality Based Surgical Training (QBST), for implementing it. Using a web-accessed computer program, attending physicians and trainees independently assessed the trainee's operative performance after every operative (training) case. Global attributes of surgical knowledge, skill, and independence were assessed, as were the key technical maneuvers of each operation. A system of hierarchical, expandable menus specific to each of hundreds of different surgical procedures allowed the assessments to be as detailed or as general as the users felt was necessary. In addition, freehand, unscripted commentary could be recorded via an optional "remarks" box. Finally, an independently chosen "overall" grade, scaled F through A+, concluded each assessment. Over a 31-month period, 72 different users (52 trainees, 20 attending physicians) submitted 3849 performance assessments on 2424 cases, covering 132 different case types and amassing 68,260 distinct data points. The mean number of data points per trainee was 1313; the median time spent per assessment was 60 seconds. Graphic displays allowed formative review of individual cases in real time and summative review of long-term trends. Appraisals of knowledge, skill, and independence were strongly correlated with and independently predictive of the overall competency grade (model r(2) = 0.68; test of predictive significance p < 0.001 for each rating). Trainee and attending physician scores were highly correlated (> 0.7) with one another. QBST/STAT achieves detailed, continuous analysis of surgical trainees' operative abilities and facilitates timely, specific, and thorough feedback on their performance in the operating theater. It also promotes trainee self-reflection and the generation of continuous, transparent, iterative training goals.

  12. How Pleasant Sounds Promote and Annoying Sounds Impede Health: A Cognitive Approach

    PubMed Central

    Andringa, Tjeerd C.; Lanser, J. Jolie L.

    2013-01-01

    This theoretical paper addresses the cognitive functions via which quiet and in general pleasurable sounds promote and annoying sounds impede health. The article comprises a literature analysis and an interpretation of how the bidirectional influence of appraising the environment and the feelings of the perceiver can be understood in terms of core affect and motivation. This conceptual basis allows the formulation of a detailed cognitive model describing how sonic content, related to indicators of safety and danger, either allows full freedom over mind-states or forces the activation of a vigilance function with associated arousal. The model leads to a number of detailed predictions that can be used to provide existing soundscape approaches with a solid cognitive science foundation that may lead to novel approaches to soundscape design. These will take into account that louder sounds typically contribute to distal situational awareness while subtle environmental sounds provide proximal situational awareness. The role of safety indicators, mediated by proximal situational awareness and subtle sounds, should become more important in future soundscape research. PMID:23567255

  13. Application of Particle Image Velocimetry and Reference Image Topography to jet shock cells using the hydraulic analogy

    NASA Astrophysics Data System (ADS)

    Kumar, Vaibhav; Ng, Ivan; Sheard, Gregory J.; Brocher, Eric; Hourigan, Kerry; Fouras, Andreas

    2011-08-01

    This paper examines the shock cell structure, vorticity and velocity field at the exit of an underexpanded jet nozzle using a hydraulic analogy and the Reference Image Topography technique. Understanding the flow in this region is important for the mitigation of screech, an aeroacoustic problem harmful to aircraft structures. Experiments are conducted on a water table, allowing detailed quantitative investigation of this important flow regime at a greatly reduced expense. Conventional Particle Image Velocimetry is employed to determine the velocity and vorticity fields of the nozzle exit region. Applying Reference Image Topography, the wavy water surface is reconstructed and when combined with the hydraulic analogy, provides a pressure map of the region. With this approach subtraction of surfaces is used to highlight the unsteady regions of the flow, which is not as convenient or quantitative with conventional Schlieren techniques. This allows a detailed analysis of the shock cell structures and their interaction with flow instabilities in the shear layer that are the underlying cause of jet screech.

  14. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is knowledge specification from the prose text of the guidelines. We describe a knowledge specification method based on a structured and systematic analysis of the text that allows detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary recommendation messages. Editing tools are also necessary to facilitate validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allowed quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the source text guidelines, however, still needs further improvement. The method used for computerization could help define a framework usable at the initial step of guideline development, in order to produce guidelines ready for electronic implementation.
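The specified knowledge can be represented, for instance, as a nested decision tree evaluated against patient facts. The tree below is a hypothetical toy example; the keys, answers, and recommendations are invented for illustration and are not taken from the EsPeR guidelines:

```python
# A node is (question_key, {answer: subtree}); a leaf is a recommendation
# string. All content here is illustrative, not from any real guideline.
TREE = ("smoker", {
    True:  ("age_over_50", {
        True:  "recommend annual screening",
        False: "recommend cessation counselling",
    }),
    False: "no specific action",
})

def evaluate(tree, facts):
    """Walk the decision tree using a dict of patient facts."""
    while isinstance(tree, tuple):
        key, branches = tree
        tree = branches[facts[key]]
    return tree

print(evaluate(TREE, {"smoker": True, "age_over_50": False}))
# -> recommend cessation counselling
```

A decision table for validation is then just the enumeration of all root-to-leaf paths of such a tree.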

  15. Software for storage and processing coded messages for the international exchange of meteorological information

    NASA Astrophysics Data System (ADS)

    Popov, V. N.; Botygin, I. A.; Kolochev, A. S.

    2017-01-01

    The approach represents the data of the international codes for the exchange of meteorological information using a metadescription: a formalism that associates the data with certain categories of resources. The metadata components were developed based on an analysis of data from surface meteorological observations, vertical sounding of the atmosphere, wind sounding of the atmosphere, weather radar observations, satellite observations and others. A common set of metadata components was formed, including classes, divisions and groups for a generalized description of the meteorological data. The structure and content of the main components of the generalized metadescription are presented in detail using the example of meteorological observations from land and sea stations. The functional structure of a distributed computing system is described; it allows large volumes of meteorological data to be stored for further processing in the analysis and forecasting of climatic processes.

  16. Multi-frequency data analysis in AFM by wavelet transform

    NASA Astrophysics Data System (ADS)

    Pukhova, V.; Ferrini, G.

    2017-10-01

    Interacting cantilevers in AFM experiments generate non-stationary, multi-frequency signals consisting of numerous excited flexural and torsional modes and their harmonics. The analysis of such signals is challenging, requiring special methodological approaches and a powerful mathematical apparatus. The most common approach is Fourier transform (FT) analysis. However, the FT gives accurate spectra only for stationary signals; for signals whose spectral content changes over time, it provides only an averaged spectrum. Hence, for non-stationary and rapidly varying signals, such as those from interacting cantilevers, a method that shows the spectral evolution in time is needed. One of the most powerful techniques, allowing a detailed time-frequency representation of signals, is the wavelet transform. Unlike the FT, it represents the energy associated with the signal at a particular frequency and time, providing a correlation between the spectral and temporal features of the signal. This is particularly important in AFM experiments because signal nonlinearities contain valuable information about tip-sample interactions and consequently about surface properties. The present work aims to show the advantages of the wavelet transform over the FT, using force curve analysis in dynamic force spectroscopy as an example.
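The time-frequency idea can be sketched with a minimal, hand-rolled complex Morlet continuous wavelet transform (a textbook implementation, not an AFM-specific tool; the test signal is an invented two-tone example):

```python
import numpy as np

def morlet_cwt(signal, dt, freqs, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet.

    Returns a (len(freqs), len(signal)) array of coefficients; the
    squared magnitude is the time-frequency energy distribution.
    """
    n = len(signal)
    out = np.empty((len(freqs), n), dtype=complex)
    t = (np.arange(n) - n // 2) * dt
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)              # scale for centre frequency f
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2)
        wavelet /= np.sqrt(s)                 # energy normalisation
        out[i] = np.convolve(signal, np.conj(wavelet[::-1]), mode="same")
    return out

# A signal whose frequency jumps from 50 Hz to 120 Hz halfway through:
dt = 1e-3
t = np.arange(0, 1, dt)
x = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))
coeffs = morlet_cwt(x, dt, freqs=np.linspace(20, 150, 40))
# np.abs(coeffs)**2 peaks near 50 Hz early and near 120 Hz late --
# a single Fourier spectrum would show both peaks but lose their timing.
```

The same map applied to an AFM deflection signal reveals when each flexural or torsional mode is excited during the tip-sample interaction.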

  17. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and processed by optical character recognition (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If a document is not in English, it is machine translated to English. The documents are searched for keywords and key features in either the native language or the translated English. The user can quickly review a document to determine whether it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically, on a single document or a batch of documents.

  18. [3D visualization and analysis of vocal fold dynamics].

    PubMed

    Bohr, C; Döllinger, M; Kniesburges, S; Traxdorf, M

    2016-04-01

    Visual investigation methods for the larynx mainly allow two-dimensional presentation of the three-dimensional vocal fold dynamics. The vertical component of the vocal fold dynamics is often neglected, yielding a loss of information. The latest studies show that the vertical dynamic components are in the range of the medio-lateral dynamics and play a significant role in the phonation process. This work presents a method for future 3D reconstruction and visualization of endoscopically recorded vocal fold dynamics. The setup contains a high-speed camera (HSC) and a laser projection system (LPS). The LPS projects a regular grid onto the vocal fold surfaces and, in combination with the HSC, allows a three-dimensional reconstruction of the vocal fold surface. Hence, quantitative information on displacements and velocities can be provided. The applicability of the method is demonstrated on one ex-vivo human larynx, one ex-vivo porcine larynx and one synthetic silicone larynx. The setup introduced allows the reconstruction of the entire visible vocal fold surface at each oscillation state. This enables a detailed analysis of the three-dimensional dynamics (i.e. displacements, velocities, accelerations) of the vocal folds. The next goal is the miniaturization of the LPS to allow clinical in-vivo analysis in humans. We anticipate new insight into the dependencies between 3D dynamic behavior and the quality of the acoustic outcome for healthy and disordered phonation.

  19. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

    Assessing the number and location of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistics frequently lack sufficient detail for an accurate assessment of the people potentially exposed to hazardous events, especially events that occur at the local scale, such as landslides. The present study applies dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those of a more common approach that uses basic census units as spatial units; these are the most disaggregated spatial data and the most detailed information available for regional studies in Portugal. Using the Portuguese census data and a layer of residential building footprints as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. With the census unit approach, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is generally overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for a first approximation of a more detailed estimate of exposed people. The dasymetric approach increases the spatial resolution of population data over large areas and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
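The core dasymetric step, redistributing a census-unit population over the residential buildings inside it in proportion to an ancillary variable, can be sketched as follows (the building IDs, floor areas, and population count are invented for illustration):

```python
def dasymetric_disaggregate(unit_population, building_areas):
    """Redistribute a census-unit population over its residential
    buildings proportionally to floor area (simple areal weighting)."""
    total = sum(building_areas.values())
    return {bid: unit_population * area / total
            for bid, area in building_areas.items()}

# One census unit with 900 inhabitants and three buildings (areas in m^2):
buildings = {"A": 1200.0, "B": 600.0, "C": 200.0}
pop = dasymetric_disaggregate(900, buildings)
# -> {'A': 540.0, 'B': 270.0, 'C': 90.0}
```

Intersecting the resulting building-level counts with susceptibility classes, instead of whole census units, is what reduces the overestimation described above.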

  20. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too demanding computationally, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method for setting up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows a much faster calculation time (a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs.
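A didactic sketch of the reservoir idea: a chain of CSTRs with first-order decay, integrated explicitly. The parameter values are invented, and the paper's PFR component and calibration step are omitted:

```python
import numpy as np

def cstr_chain_step(c, c_in, q, volumes, k, dt):
    """One explicit time step of pollutant transport through a chain of
    CSTRs: advective inflow/outflow plus first-order decay at rate k."""
    inflow = np.concatenate(([c_in], c[:-1]))   # each reservoir's upstream conc.
    dcdt = q * (inflow - c) / volumes - k * c
    return c + dt * dcdt

# Three reservoirs, steady flow, a decaying pollutant pulse at the inlet:
c = np.zeros(3)                          # concentrations (mg/L)
volumes = np.array([1e4, 2e4, 1.5e4])    # reservoir volumes (m^3), illustrative
q, k, dt = 5.0, 1e-5, 60.0               # flow (m^3/s), decay (1/s), step (s)
for step in range(2000):
    c_in = 10.0 if step < 500 else 0.0   # inlet pulse for the first 500 steps
    c = cstr_chain_step(c, c_in, q, volumes, k, dt)
# The pulse propagates downstream, smeared by each reservoir's residence
# time V/q, and eventually washes out.
```

Each reservoir costs a handful of floating-point operations per step, which is where the large speed-up over a full hydrodynamic grid comes from.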

  1. [A rare case of infant poisoning due to accidental administration of 1,2,3-triketohydrinden hydrate (ninhydrin)].

    PubMed

    Polak, Piotr; Sołtyszewski, Ireneusz; Niemcunowicz-Janica, Anna; Siwińska-Ziółkowska, Agnieszka; Widecka-Deptuch, Emilia; Lukasik, Marcin; Janica, Jerzy

    2007-01-01

    The paper presents a case of medical malpractice during testing for phenylketonuria. The authors analyzed all documents collected in the course of the investigation of an infant poisoning due to accidental administration of ninhydrin. The medical assessment was based on an extensive review of the case history, as well as on spectroscopy (FT-IR), chromatography and chemical analysis findings that confirmed the presence of the toxic substance in the evidence material collected during the initial investigation. The results confirmed the presence of ninhydrin in the tea cup and the teaspoon used to prepare the diagnostic medium; no ninhydrin was found in the other investigated materials. The employment of routine research methods, including GC-MS, FT-IR and UV-VIS, allowed the detection and identification of the pure chemical form of ninhydrin, as well as its colored complex with amino acids. The detailed case analysis, together with the extensive evidence material collected during the investigation, allowed the identification of the persons responsible for the accidental administration of the poisonous substance to the infant.

  2. Detailed product analysis during the low temperature oxidation of n-butane

    PubMed Central

    Herbinet, Olivier; Battin-Leclerc, Frédérique; Bax, Sarah; Le Gall, Hervé; Glaude, Pierre-Alexandre; Fournet, René; Zhou, Zhongyue; Deng, Liulin; Guo, Huijun; Xie, Mingfeng; Qi, Fei

    2013-01-01

    The products obtained from the low-temperature oxidation of n-butane in a jet-stirred reactor (JSR) have been analysed using two methods: gas chromatography of the outlet gas and reflectron time-of-flight mass spectrometry. The mass spectrometer was combined with tunable synchrotron vacuum-ultraviolet photoionization and coupled to the JSR via a molecular-beam sampling system. Experiments were performed at quasi-atmospheric pressure, at temperatures between 550 and 800 K, with a mean residence time of 6 s and a stoichiometric n-butane/oxygen/argon mixture (composition = 4/26/70 in mol%). 36 reaction products were quantified, including, in addition to the usual oxidation products, acetic acid, hydrogen peroxide, C1, C2 and C4 alkylhydroperoxides, and C4 ketohydroperoxides. Evidence of the possible formation of products (dihydrofuranes, furanones) derived from cyclic ethers has also been found. The performance of a detailed kinetic model from the literature has been assessed by simulating the formation of this extended range of species. These simulations have also allowed the analysis of possible pathways for the formation of some of the observed products. PMID:21031192

  3. High Fidelity System Simulation of Multiple Components in Support of the UEET Program

    NASA Technical Reports Server (NTRS)

    Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton

    2006-01-01

    The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework, with emphasis on the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry; this was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have distinct benefits and applicability. "Feedback" zooming allows information from a high-fidelity analysis to flow up and update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used where there is a clear physics-based strategy for flowing the high-fidelity analysis results up to the NPSS system model. The "feed-forward" zooming approach is more broadly useful: it enables detailed analysis at early stages of design for a specified set of critical operating points and uses those analysis results to drive design decisions early in the development process.

  4. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for the development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into the IAC or be allowed to communicate with it.

  5. The use of interactive graphic displays for interpretation of surface design parameters

    NASA Technical Reports Server (NTRS)

    Talcott, N. A., Jr.

    1981-01-01

    An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means of rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Display Data method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.

  6. Aswan High Dam in 6-meter Resolution from the International Space Station

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Astronaut photography of the Earth from the International Space Station has achieved resolutions close to those available from commercial remote sensing satellites, with many photographs having spatial resolutions of less than six meters. Astronauts take the photographs by hand and physically compensate for the motion of the spacecraft relative to the Earth while the images are being acquired. The achievement was highlighted in an article entitled 'Space Station Allows Remote Sensing of Earth to within Six Meters' published in this week's edition of Eos, Transactions of the American Geophysical Union. Lines painted on the runways at the Aswan Airport served to independently validate the spatial resolution of the camera sensor. For details, see Robinson, J. A. and Evans, C. A. 2002. Space Station Allows Remote Sensing of Earth to within Six Meters. Eos, Transactions, American Geophysical Union 83(17):185, 188. The image above is a detailed portion of digitized NASA photograph STS102-303-17 and was provided by the Earth Sciences and Image Analysis Laboratory at Johnson Space Center. Additional images taken by astronauts and cosmonauts can be viewed at the NASA-JSC Gateway to Astronaut Photography of Earth.

  7. Absorbable energy monitoring scheme: new design protocol to test vehicle structural crashworthiness.

    PubMed

    Ofochebe, Sunday M; Enibe, Samuel O; Ozoegwu, Chigbogu G

    2016-05-01

    In vehicle crashworthiness design optimization, detailed system evaluations capable of producing reliable results are typically achieved through high-order numerical computational (HNC) models such as dynamic finite element and mesh-free models. However, the application of these models, especially during optimization studies, is limited by their high demand on computational resources, the conditional stability of the solution process, and the lack of knowledge of viable parameter ranges for detailed optimization studies. The absorbable energy monitoring scheme (AEMS) presented in this paper suggests a new design protocol that attempts to overcome these problems in evaluating vehicle structures for crashworthiness. The AEMS studies the crash performance of vehicle components at various absorbable energy ratios based on a two-degree-of-freedom (2DOF) lumped-mass-spring (LMS) vehicle impact model, allowing prompt prediction of useful parameter values in a given design problem. The application of the classical one-dimensional LMS model to vehicle crash analysis is further improved in the present work by developing a critical load matching criterion, which allows quantitative interpretation of the results of the abstract model in a typical vehicle crash design. The adequacy of the proposed AEMS for preliminary vehicle crashworthiness design is demonstrated; however, its extension to full-scale design-optimization problems involving a full vehicle model with greater structural detail requires more theoretical development.
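    A 2DOF lumped-mass-spring impact model of the kind the AEMS builds on can be sketched roughly as follows; the masses, stiffnesses, compression-only barrier spring, and symplectic-Euler integration are assumptions chosen for this illustration, not the paper's parameters.

```python
# Illustrative 2DOF lumped-mass-spring (LMS) vehicle impact model of the kind
# AEMS builds on. Masses, stiffnesses, the compression-only barrier spring,
# and the symplectic-Euler integration are assumptions for this sketch, not
# the paper's parameters.

def crash_2dof(m1=400.0, m2=1000.0, k1=4.0e5, k2=8.0e5,
               v0=15.0, dt=1.0e-5, t_end=0.2):
    """Rigid-barrier impact: m1 (crush zone) meets the barrier through
    spring k1; m2 (cabin) is coupled to m1 through spring k2.
    Returns the peak energy absorbed by each spring (J)."""
    x1 = x2 = 0.0          # displacement toward the barrier (m)
    v1 = v2 = v0           # both masses arrive at v0 (m/s)
    e1_max = e2_max = 0.0
    for _ in range(int(t_end / dt)):
        f_barrier = k1 * x1 if x1 > 0.0 else 0.0  # barrier resists compression only
        f_couple = k2 * (x2 - x1)                 # cabin crowds the crush zone
        a1 = (-f_barrier + f_couple) / m1
        a2 = -f_couple / m2
        v1 += a1 * dt; v2 += a2 * dt              # symplectic Euler: v first,
        x1 += v1 * dt; x2 += v2 * dt              # then x
        e1_max = max(e1_max, 0.5 * k1 * max(x1, 0.0) ** 2)
        e2_max = max(e2_max, 0.5 * k2 * (x2 - x1) ** 2)
    return e1_max, e2_max

e1, e2 = crash_2dof()
print(round(e1 / (e1 + e2), 3))  # share of peak energy absorbed at the barrier
```

    Sweeping the stiffness ratio k1/k2 shifts the absorbable energy ratio between the two springs, which is the kind of quantity the AEMS monitors.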

  8. Near-infrared Spectroscopy to Reduce Prophylactic Fasciotomies for and Missed Cases of Acute Compartment Syndrome in Soldiers Injured in OEF/OIF

    DTIC Science & Technology

    2012-10-01

    studies demonstrated that NIRS measurement of hemoglobin oxygen saturation in the tibial compartment provided reliable and sensitive correlation to...pressure increases with muscle damage, there is not a complete loss of tissue oxygen saturation in the tissue over the 14 hours of the protocol. In...allow greater detail of information and flexibility in the analysis of tissue oxygenation levels. Although the 7610 oximeter has not been

  9. Assessing the Economic and Environmental Impacts Associated with Current Street Lighting Technologies

    DTIC Science & Technology

    2010-03-01

    AFIT/GEM/ENV/10-M01 Abstract Rising global energy demand and natural disasters continuously threaten energy supplies and prices. As a result, the...light bulbs. The study used the Process-Sum and Economic Input-Output Life-cycle Assessment (EIO-LCA) methods. The results of the study found that... results for this phase of the analysis. Summary This chapter has detailed the methodology used in this study. Using both LCCA and EIO-LCA allowed for

  10. High frequency ultrasound with color Doppler in dermatology*

    PubMed Central

    Barcaui, Elisa de Oliveira; Carvalho, Antonio Carlos Pires; Lopes, Flavia Paiva Proença Lobo; Piñeiro-Maceira, Juan; Barcaui, Carlos Baptista

    2016-01-01

    Ultrasonography is an imaging method classically used in dermatology to study changes in the hypodermis, such as nodules and infectious and inflammatory processes. The introduction of high-frequency, high-resolution equipment has enabled the observation of superficial structures, allowing differentiation between skin layers and providing details for the analysis of the skin and its appendages. This paper reviews the basic principles of high-frequency ultrasound and its applications in different areas of dermatology. PMID:27438191

  11. A review and critique of some models used in competing risk analysis.

    PubMed

    Gail, M

    1975-03-01

    We have introduced a notation which allows one to define competing risk models easily and to examine underlying assumptions. We have treated the actuarial model for competing risk in detail, comparing it with other models and giving useful variance formulae both for the case when times of death are available and for the case when they are not. The generality of these methods is illustrated by an example treating two dependent competing risks.
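    For reference, the actuarial estimate discussed above has the following standard life-table form (stated generically here; the paper's own notation and variance formulae are not reproduced):

```latex
% Actuarial (life-table) estimate for interval j and cause k:
%   d_{kj}: deaths from cause k during the interval,
%   n_j:    number at risk at the start of the interval,
%   w_j:    withdrawals during the interval (half-interval correction).
\hat{q}_{kj} = \frac{d_{kj}}{n_j - \tfrac{1}{2} w_j},
\qquad
\hat{p}_j = 1 - \sum_{k} \hat{q}_{kj}
\quad \text{(probability of surviving interval } j \text{)}.
```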

  12. Belowground neighbor perception in Arabidopsis thaliana studied by transcriptome analysis: roots of Hieracium pilosella cause biotic stress

    PubMed Central

    Schmid, Christoph; Bauer, Sibylle; Müller, Benedikt; Bartelheimer, Maik

    2013-01-01

    Root-root interactions are much more sophisticated than previously thought, yet the mechanisms of belowground neighbor perception remain largely obscure. Genome-wide transcriptome analyses allow detailed insight into plant reactions to environmental cues. A root interaction trial was set up to explore both morphological and whole genome transcriptional responses in roots of Arabidopsis thaliana in the presence or absence of an inferior competitor, Hieracium pilosella. Neighbor perception was indicated by Arabidopsis roots predominantly growing away from the neighbor (segregation), while solitary plants placed more roots toward the middle of the pot. Total biomass remained unaffected. Database comparisons in transcriptome analysis revealed considerable similarity between Arabidopsis root reactions to neighbors and reactions to pathogens. Detailed analyses of the functional category “biotic stress” using MapMan tools found the sub-category “pathogenesis-related proteins” highly significantly induced. A comparison to a study on intraspecific competition brought forward a core of genes consistently involved in reactions to neighbor roots. We conclude that beyond resource depletion roots perceive neighboring roots or their associated microorganisms by a relatively uniform mechanism that involves the strong induction of pathogenesis-related proteins. In an ecological context the findings reveal that belowground neighbor detection may occur independently of resource depletion, allowing for a time advantage for the root to prepare for potential interactions. PMID:23967000

  13. Fatigue design procedure for the American SST prototype

    NASA Technical Reports Server (NTRS)

    Doty, R. J.

    1972-01-01

    For supersonic airline operations, significantly higher environmental temperature is the primary new factor affecting structural service life. Methods for incorporating the influence of temperature in detailed fatigue analyses are shown along with current test indications. Thermal effects investigated include real-time compared with short-time testing, long-time temperature exposure, and stress-temperature cycle phasing. A method is presented which allows designers and stress analysts to check the fatigue resistance of structural design details. A communicative rating system is presented which defines the relative fatigue quality of a detail, so that the analyst can define the cyclic-load capability of the design detail by entering constant-life charts for varying detail quality. If necessary, this system then allows the designer to determine ways to improve the fatigue quality for better life, or to determine the operating stresses which will provide the required service life.

  14. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    PubMed

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using Amazon's cloud computing services, which permit access to ad hoc computing infrastructure scaled to the complexity of the experiment, so that its costs and runtimes can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  15. Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers

    DOE Data Explorer

    Ken Rhinefrank

    2016-07-25

    An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA proceeds coincident with the design process and is iterative, allowing design changes to overcome deficiencies identified in the analysis. Risk registers for the major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
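    The criticality ranking that such a risk register encodes is conventionally computed as a Risk Priority Number, the product of severity, occurrence, and detection ratings. A minimal sketch follows; the subsystems, failure modes, and 1-10 ratings are invented for illustration, not taken from the Stingray registers.

```python
# Minimal sketch of the criticality ranking behind an FMECA risk register.
# The subsystems, failure modes, and 1-10 ratings below are invented;
# RPN = severity * occurrence * detection is the conventional FMECA metric.

failure_modes = [
    # (subsystem, failure mode, severity, occurrence, detection)
    ("mooring",    "line fatigue",        9, 4, 6),
    ("PTO",        "seal leakage",        6, 6, 3),
    ("electrical", "connector corrosion", 5, 5, 7),
]

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: higher values are mitigated first."""
    return severity * occurrence * detection

register = sorted(
    ((rpn(s, o, d), subsystem, mode) for subsystem, mode, s, o, d in failure_modes),
    reverse=True,
)
for score, subsystem, mode in register:
    print(f"{score:4d}  {subsystem:10s} {mode}")
```

    Sorting by RPN is what lets mitigation effort be allocated to the worst failure modes first, early in the development cycle.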

  16. An economic toolkit for identifying the cost of emergency medical services (EMS) systems: detailed methodology of the EMS Cost Analysis Project (EMSCAP).

    PubMed

    Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W

    2012-02-01

    Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources. © 2012 by the Society for Academic Emergency Medicine.

  17. Shock wave viscosity measurements

    NASA Astrophysics Data System (ADS)

    Celliers, Peter

    2013-06-01

    Several decades ago a method was proposed and demonstrated to measure the viscosity of fluids at high pressure by observing the oscillatory damping of sinusoidal perturbations on a shock front. A detailed mathematical analysis of the technique carried out subsequently by Miller and Ahrens revealed its potential, as well as a deep level of complexity in the analysis. We revisit the ideas behind this technique in the context of a recent experimental development: two-dimensional imaging velocimetry. The new technique allows one to capture a broad spectrum of perturbations down to few micron scale-lengths imposed on a shock front from an initial perturbation. The detailed evolution of the perturbation spectrum is sensitive to the viscosity in the fluid behind the shock front. Initial experiments are aimed at examining the viscosity of shock compressed SiO2 just above the shock melting transition. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  18. Strain-induced macroscopic magnetic anisotropy from smectic liquid-crystalline elastomer-maghemite nanoparticle hybrid nanocomposites.

    PubMed

    Haberl, Johannes M; Sánchez-Ferrer, Antoni; Mihut, Adriana M; Dietsch, Hervé; Hirt, Ann M; Mezzenga, Raffaele

    2013-06-21

    We combine tensile strength analysis and X-ray scattering experiments to establish a detailed understanding of the microstructural coupling between liquid-crystalline elastomer (LCE) networks and embedded magnetic core-shell ellipsoidal nanoparticles (NPs). We study the structural and magnetic re-organization at different deformations and NP loadings, and the associated shape and magnetic memory features. In the quantitative analysis of a stretching process, the effect of the incorporated NPs on the smectic LCE is found to be prominent during the reorientation of the smectic domains and the softening of the nanocomposite. Under deformation, the soft response of the nanocomposite material allows the organization of the nanoparticles to yield a permanent macroscopically anisotropic magnetic material. Independent of the particle loading, the shape-memory properties and the smectic phase of the LCEs are preserved. Detailed studies on the magnetic properties demonstrate that the collective ensemble of individual particles is responsible for the macroscopic magnetic features of the nanocomposite.

  19. [The consequences of stroke for the artist Lovis Corinth].

    PubMed

    Bäzner, H; Hennerici, M G

    2006-09-01

    The artist Lovis Corinth suffered a right-hemispheric stroke at the age of 53 years and died only 14 years later. The huge body of work he produced after this life-threatening disease allows detailed analysis of his post-stroke artwork in comparison with his pre-stroke work. When a neurologist performs this analysis, an enormous diversity of subtle stroke sequelae can be discovered, mostly explained by left-sided hemineglect. These findings clearly go far beyond pure psychological processes. Moreover, Corinth is a motivating example for disabled patients because he was able to produce great artwork after his stroke. He struggled not only against a motor disability, which admittedly did not severely affect his artistic production, but also against severe neuropsychological deficits that did have clear consequences. Lovis Corinth left us the credo "True art means to use unreality". Taken together with the often-cited phrase "Drawing means to leave out (details)", a clear-cut interpretation for neurologists can be derived from the understanding of right-hemisphere lesions and subsequent left-sided neglect.

  20. Terminal Area Productivity Airport Wind Analysis and Chicago O'Hare Model Description

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Shapiro, Gerald

    1998-01-01

    This paper describes two results from a continuing effort to provide accurate cost-benefit analyses of the NASA Terminal Area Productivity (TAP) program technologies. Previous tasks have developed airport capacity and delay models and completed preliminary cost benefit estimates for TAP technologies at 10 U.S. airports. This task covers two improvements to the capacity and delay models. The first improvement is the completion of a detailed model set for the Chicago O'Hare (ORD) airport. Previous analyses used a more general model to estimate the benefits for ORD. This paper contains a description of the model details with results corresponding to current conditions. The second improvement is the development of specific wind speed and direction criteria for use in the delay models to predict when the Aircraft Vortex Spacing System (AVOSS) will allow use of reduced landing separations. This paper includes a description of the criteria and an estimate of AVOSS utility for 10 airports based on analysis of 35 years of weather data.

  1. X-ray vision of fuel sprays.

    PubMed

    Wang, Jin

    2005-03-01

    With brilliant synchrotron X-ray sources, microsecond time-resolved synchrotron X-ray radiography and tomography have been used to elucidate the detailed three-dimensional structure and dynamics of high-pressure high-speed fuel sprays in the near-nozzle region. The measurement allows quantitative determination of the fuel distribution in the optically impenetrable region owing to the multiple scattering of visible light by small atomized fuel droplets surrounding the jet. X-radiographs of the jet-induced shock waves prove that the fuel jets become supersonic under appropriate injection conditions and that the quantitative analysis of the thermodynamic properties of the shock waves can also be derived from the most direct measurement. In other situations where extremely axial-asymmetric sprays are encountered, mass deconvolution and cross-sectional fuel distribution models can be computed based on the monochromatic and time-resolved X-radiographic images collected from various rotational orientations of the sprays. Such quantitative analysis reveals the never-before-reported characteristics and most detailed near-nozzle mass distribution of highly transient fuel sprays.

  2. New perspectives on archaeological prospecting: Multispectral imagery analysis from Army City, Kansas, USA

    NASA Astrophysics Data System (ADS)

    Banks, Benjamin Daniel

    Aerial imagery analysis has a long history in European archaeology, but despite early attempts little progress has been made to promote its use in North America. Recent advances in multispectral satellite and aerial sensors are making aerial imagery analysis more effective, and more cost-effective, in North America. A site in northeastern Kansas is explored using multispectral aerial and satellite imagery, allowing buried features to be mapped. Many of the problems associated with early aerial imagery analysis are explored, such as knowledge of the archaeological processes that contribute to crop mark formation. Multispectral imagery provides a means of detecting and enhancing crop marks not easily distinguishable in visible-spectrum imagery. Unsupervised computer classifications of potential archaeological features permit their identification and interpretation, while supervised classifications incorporating limited amounts of geophysical data provide a more detailed understanding of the site. Supervised classifications allow the archaeological processes contributing to crop mark formation to be explored. Aerial imagery analysis is argued to be useful for a wide range of archaeological problems, reducing the person-hours and expense needed for site delineation and mapping. This technology may be especially useful for cultural resources management.
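    The unsupervised classification step can be illustrated with a toy k-means clustering of synthetic 4-band pixels, in which spectrally anomalous "crop mark" pixels separate into their own class. The band values and the deterministic seeding are invented for the sketch; this is not the study's actual workflow.

```python
# Toy illustration of unsupervised classification of multispectral pixels:
# two-class k-means separates spectrally anomalous "crop mark" pixels from
# background soil. Synthetic data and deterministic seeding, purely for
# illustration.
import numpy as np

rng = np.random.default_rng(0)
soil = rng.normal([0.30, 0.40, 0.55, 0.60], 0.02, size=(200, 4))   # background
marks = rng.normal([0.25, 0.35, 0.40, 0.45], 0.02, size=(20, 4))   # anomalies
pixels = np.vstack([soil, marks])

def kmeans(x: np.ndarray, iters: int = 20) -> np.ndarray:
    """Two-class k-means, deterministically seeded from the array's ends."""
    centers = x[[0, len(x) - 1]].copy()
    for _ in range(iters):
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(2)])
    return labels

labels = kmeans(pixels)
purity = (labels[-20:] == labels[-20]).mean()  # fraction of anomalies co-clustered
print(purity)
```

    A supervised classification would instead train on labeled pixels (e.g. locations confirmed by geophysical survey), which is why it can yield the more detailed interpretation the abstract describes.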

  3. A Comparison of Functional Models for Use in the Function-Failure Design Method

    NASA Technical Reports Server (NTRS)

    Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.

    2006-01-01

    When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur in products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle; those methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base it draws from, so it is of utmost importance to develop a knowledge base suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the detail necessary for an applicable knowledge base that can be used by designers in both new designs and redesigns. High-level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record these data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high-level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.

  4. Diagnostic emulation: Implementation and user's guide

    NASA Technical Reports Server (NTRS)

    Becher, Bernice

    1987-01-01

    The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.
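    The fault-insertion idea can be illustrated with a toy gate-level emulation: a small combinational circuit is evaluated fault-free and again with a stuck-at-0 fault forced on an internal net. This is a conceptual sketch, not the Langley implementation; the circuit and net names are invented.

```python
# Conceptual sketch of fault insertion in a gate-level emulation: a full
# adder is evaluated fault-free and again with a stuck-at-0 fault forced on
# an internal net. Illustrative only; not the Langley implementation.

def full_adder(a, b, cin, stuck=None):
    """Gate-level full adder; `stuck` maps an internal net name to a value
    that overrides whatever the logic computes (a stuck-at fault)."""
    stuck = stuck or {}
    def drive(name, value):
        # a stuck-at fault on a net overrides the computed value
        return stuck.get(name, value)
    x1 = drive("xor1", a ^ b)
    s  = drive("sum",  x1 ^ cin)
    c1 = drive("and1", a & b)
    c2 = drive("and2", x1 & cin)
    co = drive("cout", c1 | c2)
    return s, co

print(full_adder(1, 1, 0))                      # fault-free: sum 0, carry 1
print(full_adder(1, 1, 0, stuck={"and1": 0}))   # stuck-at-0: the carry is lost
```

    Comparing the faulty response against the fault-free one is exactly the controlled, accelerated observation of failure behavior that the technique enables.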

  5. Genome Evolution of Plant-Parasitic Nematodes.

    PubMed

    Kikuchi, Taisei; Eves-van den Akker, Sebastian; Jones, John T

    2017-08-04

    Plant parasitism has evolved independently on at least four separate occasions in the phylum Nematoda. The application of next-generation sequencing (NGS) to plant-parasitic nematodes has allowed a wide range of genome- or transcriptome-level comparisons, and these have identified genome adaptations that enable parasitism of plants. Current genome data suggest that horizontal gene transfer, gene family expansions, evolution of new genes that mediate interactions with the host, and parasitism-specific gene regulation are important adaptations that allow nematodes to parasitize plants. Sequencing of a larger number of nematode genomes, including plant parasites that show different modes of parasitism or that have evolved in currently unsampled clades, and using free-living taxa as comparators would allow more detailed analysis and a better understanding of the organization of key genes within the genomes. This would facilitate a more complete understanding of the way in which parasitism has shaped the genomes of plant-parasitic nematodes.

  6. Metallurgical Plant Optimization Through the use of Flowsheet Simulation Modelling

    NASA Astrophysics Data System (ADS)

    Kennedy, Mark William

    Modern metallurgical plants typically have complex flowsheets and operate on a continuous basis. Real-time interactions within such processes can be complex, and the impacts of streams such as recycles on process efficiency and stability can be highly unexpected prior to actual operation. Current desktop computing power, combined with state-of-the-art flowsheet simulation software like Metsim, allows thorough analysis of designs to explore the interaction between operating rate and the heat and mass balances, and in particular the potential negative impact of recycles. Using plant information systems, it is possible to combine real plant data with simple steady-state models via dynamic data exchange links, allowing near real-time de-bottlenecking of operations. Accurate analytical results can also be combined with detailed unit-operations models to enable feed-forward model-based control. This paper explores some examples of the application of Metsim to real-world engineering and plant operational issues.
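    The recycle problem mentioned above is the classic reason flowsheet solvers iterate: a minimal tear-stream sketch (invented numbers; Metsim performs this kind of convergence internally) shows how a single recycle loop inflates the steady-state flow through a unit above the fresh feed rate.

```python
# Why recycles complicate continuous flowsheets: a single recycle loop makes
# the steady-state flow through a unit larger than the fresh feed, and
# solvers find it by iterating a "tear" stream. Numbers are invented.

def converge_recycle(feed=100.0, split=0.3, tol=1e-9):
    """A unit processes (feed + recycle); a splitter returns `split` of the
    outlet as recycle. Iterate the tear stream to steady state."""
    recycle = 0.0
    for _ in range(1000):
        outlet = feed + recycle          # mixer + pass-through unit
        new_recycle = split * outlet     # splitter tear stream
        if abs(new_recycle - recycle) < tol:
            return outlet
        recycle = new_recycle
    raise RuntimeError("tear stream did not converge")

print(round(converge_recycle(), 6))  # analytic steady state: feed / (1 - split)
```

    Even this one-loop case settles at feed/(1 - split), well above the fresh feed; with several interacting recycles, the effect on equipment sizing and stability is exactly the kind of surprise the abstract warns about.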

  7. Fossil insect evidence for the end of the Western Settlement in Norse Greenland

    NASA Astrophysics Data System (ADS)

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the archaeological excavations, and insect remains were extracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail obtained by analysis of the dipterous (true fly) remains exceeds all previous work and provides insights otherwise unobtainable.

  8. The role of orbital dynamics and cloud-cloud collisions in the formation of giant molecular clouds in global spiral structures

    NASA Technical Reports Server (NTRS)

    Roberts, William W., Jr.; Stewart, Glen R.

    1987-01-01

    The role of orbit crowding and cloud-cloud collisions in the formation of GMCs and their organization into global spiral structure is investigated. Both N-body simulations of the cloud system and a detailed analysis of individual particle orbits are used to develop a conceptual understanding of how individual clouds participate in the collective density response. Detailed comparisons are made between a representative cloud-particle simulation, in which the cloud particles collide inelastically with one another and give birth to, and subsequently interact with, young star associations, and stripped-down simulations in which the cloud particles follow ballistic orbits in the absence of cloud-cloud collisions or any star formation processes. Orbit crowding is then related to the behavior of individual particle trajectories in the galactic potential field. A conceptual picture of how GMCs are formed in the clumpy interstellar media of spiral galaxies is formulated, and the results are compared in detail with those published by other authors.

  9. Fossil insect evidence for the end of the Western Settlement in Norse Greenland.

    PubMed

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the archaeological excavations, and insect remains were extracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail obtained by analysis of the dipterous (true fly) remains exceeds all previous work and provides insights otherwise unobtainable.

  10. Revealing Individual Lifestyles through Mass Spectrometry Imaging of Chemical Compounds in Fingerprints.

    PubMed

    Hinners, Paige; O'Neill, Kelly C; Lee, Young Jin

    2018-03-26

    Fingerprints, specifically the ridge details within the print, have long been used in forensic investigations for individual identification. Beyond the ridge detail, fingerprints contain useful chemical information, whose study has become of interest, especially with mass spectrometry imaging technologies. Mass spectrometry imaging visualizes the spatial relationship of each compound detected, providing ridge detail and chemical information in a single analysis. In this work, a range of exogenous fingerprint compounds that may reveal personal lifestyle were studied using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The studied compounds include various brands of bug sprays and sunscreens, as well as food oils, alcohols, and citrus fruits. Brand differentiation and source determination were possible based on the active ingredients or exclusive compounds left in fingerprints. Tandem mass spectrometry was performed for the key compounds so that they could be confidently identified in a single multiplex mass spectrometry imaging data acquisition.

  11. Intracellular O2 sensing probe based on cell-penetrating phosphorescent nanoparticles.

    PubMed

    Fercher, Andreas; Borisov, Sergey M; Zhdanov, Alexander V; Klimant, Ingo; Papkovsky, Dmitri B

    2011-07-26

    A new intracellular O2 (icO2) sensing probe is presented, which comprises a nanoparticle (NP) formulation of a cationic polymer Eudragit RL-100 and a hydrophobic phosphorescent dye Pt(II)-tetrakis(pentafluorophenyl)porphyrin (PtPFPP). Using the time-resolved fluorescence (TR-F) plate reader set-up, cell loading was investigated in detail, particularly the effects of probe concentration, loading time, serum content in the medium, cell type, density, etc. The use of a fluorescent analogue of the probe in conjunction with confocal microscopy and flow cytometry analysis revealed that cellular uptake of the NPs is driven by nonspecific energy-dependent endocytosis and that the probe localizes inside the cell close to the nucleus. Probe calibration in a biological environment was performed, which allowed conversion of measured phosphorescence lifetime signals into icO2 concentration (μM). Its analytical performance in icO2 sensing experiments was demonstrated by monitoring metabolic responses of mouse embryonic fibroblast cells under ambient and hypoxic macroenvironment. The NP probe was seen to generate stable and reproducible signals in different types of mammalian cells and robust responses to their metabolic stimulation, thus allowing accurate quantitative analysis. High brightness and photostability allow its use in screening experiments with cell populations on a commercial TR-F reader, and for single cell analysis on a fluorescent microscope.
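    Lifetime-to-concentration conversions of this kind are commonly modeled with a Stern-Volmer relation. The sketch below assumes the simple linear Stern-Volmer form with hypothetical calibration constants (`tau0_us`, `k_sv` are illustrative values, not the paper's; real probe calibrations in cells often require a modified or two-site model):

```python
def lifetime_to_o2(tau_us, tau0_us=57.0, k_sv=0.022):
    """Convert a phosphorescence lifetime (microseconds) into O2 concentration (uM)
    using the linear Stern-Volmer relation tau0/tau = 1 + K_SV * [O2].
    tau0_us (unquenched lifetime) and k_sv are hypothetical calibration constants."""
    return (tau0_us / tau_us - 1.0) / k_sv

# A fully deoxygenated sample shows the unquenched lifetime, so [O2] = 0
print(lifetime_to_o2(57.0))   # 0.0
# A halved lifetime corresponds to tau0/tau = 2, i.e. [O2] = 1 / K_SV
print(lifetime_to_o2(28.5))
```

    In practice, the calibration curve would be measured against known dissolved-O2 standards and the fitted constants substituted for the placeholders above.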

  12. Radiation damages during synchrotron X-ray micro-analyses of Prussian blue and zinc white historic paintings: detection, mitigation and integration

    NASA Astrophysics Data System (ADS)

    Gervais, Claire; Thoury, Mathieu; Réguer, Solenn; Gueriau, Pierre; Mass, Jennifer

    2015-11-01

    High-flux synchrotron techniques allow microspectroscopic analyses of artworks that were not feasible even a few years ago, allowing for a more detailed characterization of their constituent materials and a better understanding of their chemistry. However, interaction between high-flux photons and matter at the sub-microscale can generate damage that is not visually detectable. We show here different methodologies for revealing the damage induced by microscopic X-ray absorption near-edge structure spectroscopy (μXANES) analysis at the Fe and Zn K-edges of a painting dating from the turn of the twentieth century containing Prussian blue and zinc white. No significant degradation of the pigments was noticed, in agreement with the excellent condition of the painting. However, synchrotron radiation damage occurred at several levels, from chemical changes of the binder and modification of crystal defects in zinc oxide to Prussian blue photoreduction. It could be identified by using both the μXANES signal during analysis and photoluminescence imaging in the deep ultraviolet and visible ranges after analysis. We show that accurately recording damaged areas is a key step to prevent misinterpretation of results during future re-examination of the sample. We conclude by proposing good practices that could help in integrating radiation damage avoidance into the analytical pathway.

  13. Mars Global Geologic Mapping: Amazonian Results

    NASA Technical Reports Server (NTRS)

    Tanaka, K. L.; Dohm, J. M.; Irwin, R.; Kolb, E. J.; Skinner, J. A., Jr.; Hare, T. M.

    2008-01-01

    We are in the second year of a five-year effort to map the geology of Mars using mainly Mars Global Surveyor, Mars Express, and Mars Odyssey imaging and altimetry datasets. Previously, we have reported on details of project management, mapping datasets (local and regional), initial and anticipated mapping approaches, and tactics of map unit delineation and description [1-2]. For example, we have seen how the multiple types and huge quantity of image data, as well as the more accurate and detailed altimetry data now available, allow for broader and deeper geologic perspectives, based largely on improved landform perception, characterization, and analysis. Here, we describe early mapping results, which include updating of previous northern plains mapping [3], with delineation of mainly Amazonian units and regional fault mapping, as well as other advances.

  14. A stethoscope with wavelet separation of cardiac and respiratory sounds for real time telemedicine implemented on field-programmable gate array

    NASA Astrophysics Data System (ADS)

    Castro, Víctor M.; Muñoz, Nestor A.; Salazar, Antonio J.

    2015-01-01

    Auscultation is one of the most utilized physical examination procedures for listening to lung, heart and intestinal sounds during routine consults and emergencies. Heart and lung sounds overlap in the thorax. An algorithm was used to separate them based on the discrete wavelet transform with multi-resolution analysis, which decomposes the signal into approximations and details. The algorithm was implemented in software and in hardware to achieve real-time signal separation. The heart signal was found in detail eight and the lung signal in approximation six. The hardware separated the signals with a delay of 256 ms. Sending the wavelet decomposition data - instead of the separated full signal - allows telemedicine applications to function in real time over low-bandwidth communication channels.
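    The multi-resolution decomposition described above can be illustrated with a minimal Haar-wavelet analysis in NumPy. The paper does not specify its wavelet family or frame length, so the Haar filter and the 256-sample frame here are illustrative assumptions; only the idea (reconstruct one band from detail level 8, another from approximation level 6) follows the abstract:

```python
import numpy as np

def haar_dwt(x):
    """One Haar analysis step: split a signal into approximation and detail."""
    s = np.sqrt(2.0)
    return (x[0::2] + x[1::2]) / s, (x[0::2] - x[1::2]) / s

def haar_idwt(a, d):
    """One Haar synthesis step: merge approximation and detail back together."""
    s = np.sqrt(2.0)
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / s
    x[1::2] = (a - d) / s
    return x

def mra_bands(x, levels):
    """Decompose x into `levels` detail bands plus one final approximation."""
    details, a = [], x
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    return a, details

def reconstruct(a, details):
    """Invert mra_bands exactly (Haar is an orthogonal transform)."""
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

# A 256-sample frame supports 8 dyadic decomposition levels
rng = np.random.default_rng(0)
frame = rng.standard_normal(256)
a8, dets = mra_bands(frame, 8)

# "Heart" band: keep only the level-8 detail; "lung" band: only approximation 6
heart = reconstruct(np.zeros_like(a8),
                    [np.zeros_like(d) for d in dets[:7]] + [dets[7]])
a6, _ = mra_bands(frame, 6)
lung = reconstruct(a6, [np.zeros_like(d) for d in dets[:6]])

# Perfect-reconstruction check: all bands together give back the frame
full = reconstruct(a8, dets)
print(np.allclose(full, frame))  # True
```

    A hardware implementation would run the same analysis/synthesis filter pairs as streaming FIR stages, which is what makes the reported 256 ms latency achievable.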

  15. The evolving corona and evidence for jet launching from the supermassive black hole in Markarian 335

    NASA Astrophysics Data System (ADS)

    Wilkins, Daniel; Gallo, Luigi C.

    2015-01-01

    Through detailed analysis of the X-rays that are reflected from the accretion disc, it is possible to probe structures right down to the innermost stable circular orbit and event horizon around the supermassive black holes in AGN. By measuring the illumination pattern of the accretion disc, along with reverberation time lags between variability in the X-ray continuum and reflection, unprecedented detail of the geometry and spatial extent of the corona that produces the X-ray continuum has emerged when the observed data are combined with insight gained from general relativistic ray tracing simulations. We conducted detailed analysis of both the X-ray continuum and its reflection from the accretion disc in the narrow line Seyfert 1 galaxy Markarian 335, over observations spanning nearly a decade, to measure the underlying changes in the structure of the X-ray emitting corona that gave rise to more than an order of magnitude variation in luminosity. Underlying this long-timescale variability lie much more complex patterns of behaviour on short timescales. We are, for the first time, able to observe and measure the changes in the structure of the corona that give rise to transient phenomena, including a flare in the X-ray emission seen during a low flux state by Suzaku in July 2013. This flaring event was found to mark a reconfiguration of the corona, and there is evidence that the flare itself was caused by an aborted jet-launching event. More recently, detailed analysis of a NuSTAR target of opportunity observation is letting us understand the sudden increase in X-ray flux by a factor of 15 in Markarian 335 seen in September 2014. These observations allow us to trace the evolution of the X-ray emitting corona that gives rise to not only the extreme variability seen in the X-ray emission from AGN, but also the processes by which jets and other outflows are launched from the extreme environments around black holes.
This gives us important insight into the physical processes by which energy is liberated from black hole accretion flows and allows observational constraints to be placed upon theoretical models of how these extreme objects are powered.

  16. Metamorphosis revealed: time-lapse three-dimensional imaging inside a living chrysalis

    PubMed Central

    Lowe, Tristan; Garwood, Russell J.; Simonsen, Thomas J.; Bradley, Robert S.; Withers, Philip J.

    2013-01-01

    Studies of model insects have greatly increased our understanding of animal development. Yet, they are limited in scope to this small pool of model species: a small number of representatives for a hyperdiverse group with highly varied developmental processes. One factor behind this narrow scope is the challenging nature of traditional methods of study, such as histology and dissection, which can preclude quantitative analysis and do not allow the development of a single individual to be followed. Here, we use high-resolution X-ray computed tomography (CT) to overcome these issues, and three-dimensionally image numerous lepidopteran pupae throughout their development. The resulting models are presented in the electronic supplementary material, as are figures and videos, documenting a single individual throughout development. They provide new insight and details of lepidopteran metamorphosis, and allow the measurement of tracheal and gut volume. Furthermore, this study demonstrates early and rapid development of the tracheae, which become visible in scans just 12 h after pupation. This suggests that there is less remodelling of the tracheal system than previously expected, and is methodologically important because the tracheal system is an often-understudied character system in development. In the future, this form of time-lapse CT-scanning could allow faster and more detailed developmental studies on a wider range of taxa than is presently possible. PMID:23676900

  17. Pervasive Sound Sensing: A Weakly Supervised Training Approach.

    PubMed

    Kelly, Daniel; Caulfield, Brian

    2016-01-01

    Modern smartphones present an ideal device for pervasive sensing of human behavior. Microphones have the potential to reveal key information about a person's behavior. However, they have been utilized to a significantly lesser extent than other smartphone sensors in the context of human behavior sensing. We postulate that, in order for microphones to be useful in behavior sensing applications, the analysis techniques must be flexible and allow easy modification of the types of sounds to be sensed. A simplification of the training data collection process could allow a more flexible sound classification framework. We hypothesize that detailed training, a prerequisite for the majority of sound sensing techniques, is not necessary and that a significantly less detailed and time-consuming data collection process can be carried out, allowing even a nonexpert to conduct the collection, labeling, and training process. To test this hypothesis, we implement a diverse density-based multiple instance learning framework, to identify a target sound, and a bag trimming algorithm, which, using the target sound, automatically segments weakly labeled sound clips to construct an accurate training set. Experiments reveal that our hypothesis is a valid one and results show that classifiers, trained using the automatically segmented training sets, were able to accurately classify unseen sound samples with accuracies comparable to supervised classifiers, achieving average F-measures of 0.969 and 0.87 for two weakly supervised datasets.

  18. Dynamic Displacement Disorder of Cubic BaTiO3

    NASA Astrophysics Data System (ADS)

    Paściak, M.; Welberry, T. R.; Kulda, J.; Leoni, S.; Hlinka, J.

    2018-04-01

    The three-dimensional distribution of the x-ray diffuse scattering intensity of BaTiO3 has been recorded in a synchrotron experiment and simultaneously computed using molecular dynamics simulations of a shell model. Together, these have allowed the details of the disorder in paraelectric BaTiO3 to be clarified. The narrow sheets of diffuse scattering, related to the famous anisotropic longitudinal correlations of Ti ions, are shown to be caused by the overdamped anharmonic soft phonon branch. This finding demonstrates that the occurrence of narrow sheets of diffuse scattering agrees with a displacive picture of the cubic phase of this textbook ferroelectric material. The presented methodology allows one to go beyond the harmonic approximation in the analysis of phonons and phonon-related scattering.

  19. The Role of Data Archives in Synoptic Solar Physics

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin

    The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations, that is, a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple databases, independently maintained, to be seamlessly joined, allowing complex searches spanning multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.

  20. Shining light on neurons--elucidation of neuronal functions by photostimulation.

    PubMed

    Eder, Matthias; Zieglgänsberger, Walter; Dodt, Hans-Ulrich

    2004-01-01

    Many neuronal functions can be elucidated by techniques that allow for precise stimulation of defined regions of a neuron and its afferents. Photolytic release of neurotransmitters from 'caged' derivatives in the vicinity of visualized neurons in living brain slices meets this requirement. This technique allows the study of the subcellular distribution and properties of functional native neurotransmitter receptors. These are prerequisites for a detailed analysis of the expression and spatial specificity of synaptic plasticity. Photostimulation can further be used to rapidly map the synaptic connectivity between nearby and, more importantly, distant cells in a neuronal network. Here we give a personal review of some of the technical aspects of photostimulation and recent findings, which illustrate the advantages of this technique.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petiteau, Antoine; Auger, Gerard; Halloin, Hubert

    A new LISA simulator (LISACode) is presented. Its ambition is to achieve a new degree of sophistication, allowing one to map, as closely as possible, the impact of the different subsystems on the measurements. LISACode is not a detailed simulator at the engineering level but rather a tool whose purpose is to bridge the gap between the basic principles of LISA and a future, sophisticated end-to-end simulator. This is achieved by introducing, in a realistic manner, most of the ingredients that will influence LISA's sensitivity, as well as the application of TDI combinations. Many user-defined parameters allow the code to study different configurations of LISA, thus helping to finalize the definition of the detector. Another important use of LISACode is in generating time series for data analysis developments.

  2. Large Eddy Simulation of Gravitational Effects on Transitional and Turbulent Gas-Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Jaberi, Farhad A.

    2001-01-01

    The basic objective of this work is to assess the influence of gravity on "the compositional and the spatial structures" of transitional and turbulent diffusion flames via large eddy simulation (LES) and direct numerical simulation (DNS). The DNS is conducted for appraisal of the various closures employed in LES, and to study the effect of buoyancy on the small scale flow features. The LES is based on our "filtered mass density function" (FMDF) model. The novelty of the methodology is that it allows for reliable simulations with inclusion of "realistic physics." It also allows for detailed analysis of the unsteady large scale flow evolution and compositional flame structure which is not usually possible via Reynolds averaged simulations.

  3. Analysis of lead twist in modern high-performance grinding methods

    NASA Astrophysics Data System (ADS)

    Kundrák, J.; Gyáni, K.; Felhő, C.; Markopoulos, AP; Deszpoth, I.

    2016-11-01

    According to quality requirements for road vehicle shafts that bear dynamic seals, a twisted-pattern micro-geometrical topography is not allowed. It is a question whether newer modern grinding methods - such as quick-point grinding and peel grinding - can provide twist-free topography. According to industrial experience, twist-free surfaces can be made; however, with certain settings, some twist occurs. In this paper it is proved by detailed chip-geometrical analysis that the topography generated by the new procedures is theoretically twist-patterned because of the feeding motion of the CBN tool. The presented investigation was carried out with a single-grain wheel model and computer simulation.

  4. Navier-Stokes analysis of radial turbine rotor performance

    NASA Technical Reports Server (NTRS)

    Larosiliere, L. M.

    1993-01-01

    An analysis of flow through a radial turbine rotor using the three-dimensional, thin-layer Navier-Stokes code RVC3D is described. The rotor is a solid version of an air-cooled metallic radial turbine having thick trailing edges, shroud clearance, and scalloped-backface clearance. Results are presented at the nominal operating condition using both a zero-clearance model and a model simulating the effects of the shroud and scalloped-backface clearance flows. A comparison with the available test data is made and details of the internal flow physics are discussed, allowing a better understanding of the complex flow distribution within the rotor.

  5. KSC-04PD-0150

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. One of the world's highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data (shown here) in preparation for the shuttle fleet's return to flight, is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. The system, developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., allows multiple-person collaboration and highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

  6. KSC-04PD-0151

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. One of the world's highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data (shown here) in preparation for the shuttle fleet's return to flight, is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. The system, developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., allows multiple-person collaboration and highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

  7. KSC-04PD-0154

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. One of the world's highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data (shown here) in preparation for the shuttle fleet's return to flight, is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. The system, developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., allows multiple-person collaboration and highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

  8. KSC-04PD-0152

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. These towers are part of one of the world's highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data in preparation for the shuttle fleet's return to flight. The system is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. Developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., the system allows multiple-person collaboration and highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]

  9. Development of a Gravity-Insensitive Heat Pump for Lunar Applications

    NASA Technical Reports Server (NTRS)

    Cole, Gregory S.; Scaringe, Robert P.; Grzyll, Lawrence R.; Ewert, Michael K.

    2006-01-01

    Mainstream Engineering Corporation is developing a gravity-insensitive system that will allow a vapor-compression-cycle heat pump to be used in both microgravity (10(exp -6)g) and lunar (1/6 g) environments. System capacity is 5 kW to 15 kW at design refrigerant operating conditions of 4.44 C and 60 C evaporating and condensing temperatures, respectively. The current program, performed for NASA Johnson Space Center (JSC) and presented in this paper, includes compressor performance analysis, detailed system design, and thermal analysis. Future efforts, including prototype fabrication, integration of a solar power source and controls, ground-testing, and flight-testing support, are also discussed.

  10. Procedure for implementation of temperature-dependent mechanical property capability in the Engineering Analysis Language (EAL) system

    NASA Technical Reports Server (NTRS)

    Glass, David E.; Robinson, James C.

    1990-01-01

    A procedure is presented to allow the use of temperature dependent mechanical properties in the Engineering Analysis Language (EAL) System for solid structural elements. This is accomplished by including a modular runstream in the main EAL runstream. The procedure is applicable for models with multiple materials and with anisotropic properties, and can easily be incorporated into an existing EAL runstream. The procedure (which is applicable for EAL elastic solid elements) is described in detail, followed by a description of the validation of the routine. A listing of the EAL runstream used to validate the procedure is included in the Appendix.

  11. Fluorescence-Assisted Gamma Spectrometry for Surface Contamination Analysis

    NASA Astrophysics Data System (ADS)

    Ihantola, Sakari; Sand, Johan; Perajarvi, Kari; Toivonen, Juha; Toivonen, Harri

    2013-02-01

    A fluorescence-based alpha-gamma coincidence spectrometry approach has been developed for the analysis of alpha-emitting radionuclides. The thermalization of alpha particles in air produces UV light, which in turn can be detected over long distances. The simultaneous detection of UV and gamma photons allows detailed gamma analyses of a single spot of interest even in highly active surroundings. Alpha particles can also be detected indirectly from samples inside sealed plastic bags, which minimizes the risk of cross-contamination. The position-sensitive alpha-UV-gamma coincidence technique reveals the presence of alpha emitters and identifies the nuclides ten times faster than conventional gamma spectrometry.

  12. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  13. Ultra-high Q terahertz whispering-gallery modes in a silicon resonator

    NASA Astrophysics Data System (ADS)

    Vogt, Dominik Walter; Leonhardt, Rainer

    2018-05-01

    We report on the first experimental demonstration of terahertz (THz) whispering-gallery modes (WGMs) with an ultra-high quality factor of 1.5 × 10^4 at 0.62 THz. The WGMs are observed in a high resistivity float zone silicon spherical resonator coupled to a sub-wavelength silica waveguide. A detailed analysis of the coherent continuous wave THz spectroscopy measurements combined with a numerical model based on Mie-Debye-Aden-Kerker theory allows us to unambiguously identify the observed higher order radial THz WGMs.

  14. Corps Helicopter Attack Planning System (CHAPS). Positional Handbook. Appendix A. Messages. Appendix B. Statespace Construction Sample Session

    DTIC Science & Technology

    1989-10-01

    REVIEW MENU PROGRAM(S) CHAPS PURPOSE AND OVERVIEW The Do Review menu allows the user to select which missions to perform detailed analysis on and...input files must be resident on the computer you are running SUPR on. Any interface or file transfer programs must be successfully executed prior to... COMPUTER PROGRAM WAS DEVELOPED BY SYSTEMS CONTROL TECHNOLOGY FOR THE DEPUTY CHIEF OF STAFF/OPERATIONS, HQ USAFE. THE USE OF THE COMPUTER PROGRAM IS

  15. Autoantigenicity of nucleolar complexes.

    PubMed

    Welting, Tim J M; Raijmakers, Reinout; Pruijn, Ger J M

    2003-10-01

    Autoantibodies targeting nucleolar autoantigens (ANoA) are most frequently found in sera from patients with systemic sclerosis (SSc, also designated scleroderma) or with SSc overlap syndromes. During the last decade an extensive number of nucleolar components have been identified and this allowed a more detailed analysis of the identity of nucleolar autoantigens. This review intends to give an overview of the molecular composition of the major (families of) autoantigenic nucleolar complexes, to provide some insight into their functions and to summarise the data concerning their autoantigenicity.

  16. Design of a nano-layered tunable optical filter

    NASA Astrophysics Data System (ADS)

    Banerjee, A.; Awasthi, S. K.; Malaviya, U.; Ojha, S. P.

    2006-12-01

    A novel theory to design tunable band pass filters using one-dimensional nano-photonic structures is proposed. Periodic structures consisting of different dielectrics and semiconductor materials are considered. A detailed mathematical analysis is presented to predict allowed and forbidden bands of wavelengths with variation of angle of incidence and lattice parameters. It is possible to get desired ranges of the electromagnetic spectrum filtered with this structure by changing the incidence angle of light and/or changing the value of the lattice parameters.
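    Allowed and forbidden bands of a one-dimensional periodic stack of this kind are conventionally computed with the transfer (characteristic) matrix method. The sketch below uses a generic quarter-wave two-material stack at normal incidence with hypothetical refractive indices, not the specific materials or lattice parameters analyzed in the paper:

```python
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of one lossless dielectric layer at normal incidence."""
    delta = 2.0 * np.pi * n * d / wavelength          # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Intensity transmittance of a stack given as (index, thickness) pairs."""
    m = np.eye(2, dtype=complex)
    for n, d in layers:
        m = m @ layer_matrix(n, d, wavelength)
    b, c = m @ np.array([1.0, n_out])                 # field at the input face
    t = 2.0 * n_in / (n_in * b + c)                   # amplitude transmission
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave stack for design wavelength lam0: each layer is lam0 / (4 n) thick.
# n_hi and n_lo are hypothetical, stand-ins for the paper's dielectric pair.
lam0, n_hi, n_lo, periods = 1.0, 2.3, 1.45, 10
stack = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * periods

print(transmittance(stack, lam0) < 0.01)   # True: lam0 lies in a forbidden band
```

    Sweeping `wavelength` (or, with the oblique-incidence generalization, the angle of incidence) maps out the pass and stop bands, which is exactly the tunability mechanism the abstract describes.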

  17. Site-3 sea anemone toxins: molecular probes of gating mechanisms in voltage-dependent sodium channels.

    PubMed

    Smith, Jaime J; Blumenthal, Kenneth M

    2007-02-01

    Sea anemone toxins, whose biological function is the capture of marine prey, are invaluable tools for studying the structure and function of mammalian voltage-gated sodium channels. Their high degree of specificity and selectivity have allowed for detailed analysis of inactivation gating and assignment of molecular entities responsible for this process. Because of their ability to discriminate among channel isoforms, and their high degree of structural conservation, these toxins could serve as important lead compounds for future pharmaceutical design.

  18. Coup d’Oeil: Military Geography and the Operational Level of War

    DTIC Science & Technology

    1991-05-16

    COUP D'OEIL Every day I feel more and more in need of an atlas, as geography in the minutest details is essential to a true military education. I...categorizing terrain have provided the essential prerequisites for the development of the IPB process. The process allows for an in-depth technical analysis of...is theater which define the lines of operation, essential to the commander's plan. ... defined by a competent authority. CENTER OF GRAVITY Center of

  19. Frequency-Tracking CW Doppler Radar Solving Small-Angle Approximation and Null Point Issues in Non-Contact Vital Signs Monitoring.

    PubMed

    Mercuri, Marco; Liu, Yao-Hong; Lorato, Ilde; Torfs, Tom; Bourdoux, Andre; Van Hoof, Chris

    2017-06-01

    A Doppler radar operating as a phase-locked loop (PLL) in a frequency demodulator configuration is presented and discussed. The proposed radar has a unique architecture, using a single-channel mixer, and allows contactless detection of vital sign parameters while solving the null point issue and without requiring the small-angle approximation. Spectral analysis, simulations, and experimental results are presented and detailed to demonstrate the feasibility and the operating principle of the proposed radar architecture.

  20. Linear Spectral Analysis of Plume Emissions Using an Optical Matrix Processor

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Plume spectrometry provides a means to monitor the health of a burning rocket engine, and optical matrix processors provide a means to analyze the plume spectra in real time. By observing the spectrum of the exhaust plume of a rocket engine, researchers have detected anomalous behavior of the engine and have even determined the failure of some equipment before it would normally have been noticed. The spectrum of the plume is analyzed by isolating information about the various materials present to estimate what materials are being burned in the engine. Scientists at the Marshall Space Flight Center (MSFC) have implemented a high resolution spectrometer to discriminate the spectral peaks of the many species present in the plume. Researchers at the Stennis Space Center Demonstration Testbed Facility (DTF) have implemented a high resolution spectrometer observing a 1200-lb. thrust engine. At this facility, known concentrations of contaminants can be introduced into the burn, allowing for the confirmation of diagnostic algorithms. While the high resolution of the measured spectra has allowed greatly increased insight into the functioning of the engine, the large data flows generated limit the ability to perform real-time processing. The use of an optical matrix processor and the linear analysis technique described below may allow for detailed real-time analysis of the engine's health. A small optical matrix processor can perform the required mathematical analysis both faster and with less energy than a large electronic computer dedicated to the same spectral analysis routine.

  1. Unusual ISS Rate Signature

    NASA Technical Reports Server (NTRS)

    Laible, Michael R.

    2011-01-01

    On November 23, 2011, International Space Station Guidance, Navigation, and Control reported an unusual pitch rate disturbance. These disturbances were an order of magnitude greater than nominal rates. The Loads and Dynamics team was asked to review and analyze current accelerometer data to investigate this disturbance. This paper will cover the investigation process undertaken by the Loads and Dynamics group. It will detail the accelerometers used and the analysis performed. The analysis included performing a Fast Fourier Transform of the data to identify the mode of interest. This frequency data is then compared with a modal analysis of the ISS system model. Once this analysis is complete and the disturbance quantified, a forcing function was produced to replicate the disturbance. This allows the Loads and Dynamics team to report the load limit values for the hundreds of interfaces on the ISS.
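    The frequency-identification step can be sketched with NumPy's FFT routines. The 100 Hz sample rate and the 2 Hz synthetic disturbance below are hypothetical stand-ins, not the actual ISS accelerometer telemetry:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest non-DC peak in the spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs[1 + np.argmax(spectrum[1:])]   # skip the DC bin

fs = 100.0                       # hypothetical accelerometer sample rate, Hz
t = np.arange(0, 10, 1 / fs)     # 10 s record -> 0.1 Hz frequency resolution
accel = (0.5 * np.sin(2 * np.pi * 2.0 * t)
         + 0.05 * np.random.default_rng(1).standard_normal(t.size))

print(dominant_frequency(accel, fs))   # 2.0
```

    The identified peak frequency would then be matched against the ISS system model's predicted modal frequencies to name the mode of interest.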

  2. Metabolic reconstruction, constraint-based analysis and game theory to probe genome-scale metabolic networks.

    PubMed

    Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan

    2010-08-01

    With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well. Copyright © 2010 Elsevier Ltd. All rights reserved.
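Flux balance analysis, one of the constraint-based methods reviewed, maximizes an objective flux subject to the steady-state condition S·v = 0 and capacity bounds on each flux. A toy three-reaction network (invented for illustration, not from the paper) shows the idea:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass.
# Rows are metabolites A and B; columns are the three reactions.
S = np.array([[1, -1, 0],
              [0, 1, -1]])

# Flux balance analysis: maximize the biomass flux v3 subject to the
# steady-state constraint S v = 0 and capacity bounds on each flux.
# linprog minimizes, so the objective coefficient for v3 is negated.
res = linprog(c=[0, 0, -1],
              A_eq=S, b_eq=[0, 0],
              bounds=[(0, 10), (0, 100), (0, 100)])

print(res.x)   # optimal flux distribution
```

The uptake bound (10 units) limits the whole chain, so all three fluxes settle at that value.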

  3. Phenotypic mapping of metabolic profiles using self-organizing maps of high-dimensional mass spectrometry data.

    PubMed

    Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A

    2014-07-01

    A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering and subsequent prioritization of correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
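A self-organizing map of the kind underlying this workflow can be sketched in a few lines; this is a deliberately minimal 1-D map trained on synthetic 2-D data, far simpler than the high-dimensional mass spectrometry setting of the paper:

```python
import numpy as np

# Synthetic data: two clusters standing in for two metabolic phenotypes.
rng = np.random.default_rng(2)
data = np.vstack([rng.normal(0, 0.1, (50, 2)),
                  rng.normal(1, 0.1, (50, 2))])

n_units = 10
weights = rng.random((n_units, 2))   # 1-D grid of map units in data space
grid = np.arange(n_units)

for epoch in range(200):
    lr = 0.5 * (1 - epoch / 200)                         # decaying learning rate
    sigma = max(1.0, n_units / 2 * (1 - epoch / 200))    # shrinking neighborhood
    for x in rng.permutation(data):
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))    # best-matching unit
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))  # neighborhood weight
        weights += lr * h[:, None] * (x - weights)

# After training, units cover both clusters; nearby units map nearby data.
print(weights)
```

In the paper's workflow the trained map is rendered as a heat map per sample, and samples are compared by the visual (Gestalt) pattern of those maps.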

  4. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  5. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  6. Partial differential equation techniques for analysing animal movement: A comparison of different methods.

    PubMed

    Wang, Yi-Shan; Potts, Jonathan R

    2017-03-07

    Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
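The oldest PDE approximation discussed, the diffusion (Patlak-type) limit of an unbiased random walk, predicts that position variance grows as 2Dt with D = delta^2 / (2 tau) for step length delta and step time tau. A quick simulation check of that prediction, with invented parameters:

```python
import numpy as np

# Unbiased random walk: step length delta, step time tau. The diffusion
# approximation gives D = delta**2 / (2 * tau) and Var[x(t)] = 2 * D * t.
rng = np.random.default_rng(3)
delta, tau = 0.1, 0.01
n_walkers, n_steps = 5000, 500

steps = rng.choice([-delta, delta], size=(n_walkers, n_steps))
final = steps.sum(axis=1)          # walker positions after n_steps

D = delta**2 / (2 * tau)
t = n_steps * tau
print(final.var(), 2 * D * t)      # simulated vs predicted variance
```

For this simple symmetric walk the approximation is excellent; the paper's point is that for biased walks, central-place foraging, and non-smooth movement kernels, the corresponding approximations can fail badly.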

  7. An x ray archive on your desk: The Einstein CD-ROM's

    NASA Technical Reports Server (NTRS)

    Prestwich, A.; Mcdowell, J.; Plummer, D.; Manning, K.; Garcia, M.

    1992-01-01

    Data from the Einstein Observatory imaging proportional counter (IPC) and high resolution imager (HRI) were released on several CD-ROM sets. The sets released so far include pointed IPC and HRI observations in both simple image and detailed photon event list format, as well as the IPC slew survey. With the data on these CD-ROMs, the user can perform spatial analysis (e.g., surface brightness distributions), spectral analysis (with the IPC event lists), and timing analysis (with the IPC and HRI event lists). The next CD-ROM set will contain IPC unscreened data, allowing the user to perform custom screening to recover, for instance, data during times of lost aspect data or high particle background rates.

  8. High-Throughput Analysis of Sucrose Fatty Acid Esters by Supercritical Fluid Chromatography/Tandem Mass Spectrometry

    PubMed Central

    Hori, Katsuhito; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2014-01-01

    Supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry was applied to the profiling of sucrose fatty acid esters (SEs). The SFC conditions (column and modifier gradient) were optimized for the effective separation of SEs. In the column test, a silica gel reversed-phase column was selected. Then, the method was used for the detailed characterization of commercial SEs and the successful analysis of SEs containing different fatty acids. The present method allowed for fast and high-resolution separation of monoesters to tetra-esters within a shorter time (15 min) as compared to the conventional high-performance liquid chromatography. The applicability of our method for the analysis of SEs was thus demonstrated. PMID:26819875

  9. DATAMAP upgrade version 4.0

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.; Dejpour, Shabob R.

    1989-01-01

    The changes made to the data analysis and management program DATAMAP (Data from Aeromechanics Test and Analytics - Management and Analysis Package) are detailed. These changes are made to Version 3.07 (released February 1981) and are called Version 4.0. Version 4.0 improvements were performed by Sterling Software under contract to NASA Ames Research Center. The increased capabilities instituted in this version include the breakout of the source code into modules for ease of modification, the addition of a more accurate curve fit routine, the ability to handle higher frequency data, additional data analysis features, and improvements in the functionality of existing features. These modifications will allow DATAMAP to be used on more data sets and will make future modifications and additions easier to implement.

  10. Description of a Portable Wireless Device for High-Frequency Body Temperature Acquisition and Analysis

    PubMed Central

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

    We describe a device for dual channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility. PMID:22408473

  11. Description of a portable wireless device for high-frequency body temperature acquisition and analysis.

    PubMed

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

    We describe a device for dual channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility.

  12. Quasi-Static Indentation Analysis of Carbon-Fiber Laminates.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briggs, Timothy; English, Shawn Allen; Nelson, Stacy Michelle

    2015-12-01

    A series of quasi-static indentation experiments is conducted on carbon fiber reinforced polymer laminates with a systematic variation of thicknesses and fixture boundary conditions. Different deformation mechanisms and their resulting damage mechanisms are activated by changing the thickness and boundary conditions. The quasi-static indentation experiments have been shown to achieve damage mechanisms similar to impact and penetration, however without strain rate effects. The low rate allows for detailed analysis of the load response. Moreover, interrupted tests allow for the incremental analysis of various damage mechanisms and progressions. The experimentally tested specimens are non-destructively evaluated (NDE) with optical imaging, ultrasonics, and computed tomography. The load displacement responses and the NDE are then utilized in numerical simulations for the purpose of model validation and vetting. The accompanying numerical simulation work serves two purposes. First, the results further reveal the time sequence of events and the meaning behind load drops not clear from NDE. Second, the simulations demonstrate insufficiencies in the code and can then direct future efforts for development.

  13. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    NASA Astrophysics Data System (ADS)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
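The core of an OSSE-style assimilation experiment like those BEATBOX supports can be illustrated with a scalar ensemble update; this sketch uses an invented state, a noise-free simulated observation for simplicity, and a deterministic nudge with the Kalman gain K = P/(P + R), which is far simpler than BEATBOX's actual machinery:

```python
import numpy as np

# Toy observation simulation experiment: a synthetic "truth" generates an
# observation, and a biased prior ensemble is nudged toward it.
rng = np.random.default_rng(4)
truth = 5.0                   # synthetic truth (hidden in a real OSSE)
y = truth                     # noise-free simulated observation, for simplicity
obs_var = 0.25                # assumed observation error variance R

ens = rng.normal(3.0, 1.0, size=100)   # biased prior ensemble
P = ens.var(ddof=1)                    # background error variance
K = P / (P + obs_var)                  # scalar Kalman gain
ens_a = ens + K * (y - ens)            # analysis ensemble

# The analysis mean moves toward the truth.
print(ens.mean(), ens_a.mean())
```

Because the synthetic truth is known, the experiment can quantify exactly how much the assimilation reduced the error, which is the basic diagnostic an OSSE provides.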

  14. All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis

    PubMed Central

    Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L.; Terés, Lluís; Baumann, Reinhard R.

    2016-01-01

    We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions without the need for a cleanroom environment or inert atmosphere and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices such as inkjet printing suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacture of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to the established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is set on a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows the optimization of the manufacturing process, finally resulting in a yield improvement. PMID:27649784

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palanisamy, Giri

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever growing data volumes. It will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities of delivering data on demand that can be tailored explicitly for the user needs. This analysis ability will only be possible if the data follows a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  16. Opportunity cost based analysis of corporate eco-efficiency: a methodology and its application to the CO2-efficiency of German companies.

    PubMed

    Hahn, Tobias; Figge, Frank; Liesen, Andrea; Barkemeyer, Ralf

    2010-10-01

    In this paper, we propose the return-to-cost-ratio (RCR) as an alternative approach to the analysis of operational eco-efficiency of companies based on the notion of opportunity costs. RCR helps to overcome two fundamental deficits of existing approaches to eco-efficiency. (1) It translates eco-efficiency into managerial terms by applying the well-established notion of opportunity costs to eco-efficiency analysis. (2) RCR allows one to identify and quantify the drivers behind changes in corporate eco-efficiency. RCR is applied to the analysis of the CO2-efficiency of German companies in order to illustrate its usefulness for a detailed analysis of changes in corporate eco-efficiency as well as for the development of effective environmental strategies. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  17. Micro-Macro Analysis of Complex Networks

    PubMed Central

    Marchiori, Massimo; Possamai, Lino

    2015-01-01

    Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a “classic” approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this end, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail (“micro”) to a different scale level (“macro”), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability. PMID:25635812

  18. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.

  19. Experiment to evaluate feasibility of utilizing Skylab-EREP remote sensing data for tectonic analysis of the Bighorn Mountains region, Wyoming-Montana

    NASA Technical Reports Server (NTRS)

    Hoppin, R. A. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Excellent imagery has been obtained from SL-3 along track 5 across the Bighorn Mountains and track 19 across the northern Black Hills. The red band is by far the best of the four black and white films of S-190A. Excellent detail is visible of topography, structure, resistant lithologies, and culture with good resolution obtainable at high magnification (30X). The infrared bands do not have as good resolution and are grainy at high magnification. They are of use as a complement to the red band particularly for relief enhancement in areas of heavy green grass and forest cover. S-190B high definition black and white is comparable to the red band (S-190A) in detail. Its main advantage is larger initial scale and slightly better resolution. High resolution color transparencies along track 19 allow detailed delineation of cultivated land and strip mining. A group of folds northwest of Billings stand out clearly. Light colored units in northwestern Black Hills and in the badlands can be mapped in great detail.

  20. Erosion in extruder flow

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Fodor, Petru S.

    A detailed analysis of the fluid flow in Tadmor's unwound channel model of the single screw extruder is performed by combining numerical and analytical methods. Using the analytical solution for the longitudinal velocity field (in the limit of zero Reynolds number) allows us to devote all the computational resources solely for a detailed numerical solution of the transversal velocity field. This high resolution 3D model of the fluid flow in a single-screw extruder allows us to identify the position and extent of Moffatt eddies that impede mixing. We further consider the erosion of particles (e.g. carbon-black agglomerates) advected by the polymeric flow. We assume a particle to be made of primary fragments bound together. In the erosion process a primary fragment breaks out of a given particle. Particles are advected by the laminar flow and they disperse because of the shear stresses imparted by the fluid. The time evolution of the numbers of particles of different sizes is described by the Bateman coupled differential equations used to model radioactivity. Using the particle size distribution we compute an entropic fragmentation index which varies from 0 for a monodisperse system to 1 for an extreme poly-disperse system.
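The Bateman equations mentioned above have, for a two-member chain N1 -> N2 -> (lost), the closed form N2(t) = N0*lam1/(lam2-lam1)*(exp(-lam1*t) - exp(-lam2*t)). A simple Euler integration cross-checks it; the rates and initial population are illustrative, not taken from the erosion model:

```python
import numpy as np

# Two-member Bateman chain: dN1/dt = -lam1*N1, dN2/dt = lam1*N1 - lam2*N2.
# Decay rates and initial population are illustrative only.
lam1, lam2, N0 = 1.0, 0.5, 100.0

def bateman_n2(t):
    # Closed-form solution for the second chain member.
    return N0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))

# Forward-Euler integration as an independent cross-check.
dt, T = 1e-4, 2.0
n1, n2 = N0, 0.0
for _ in range(int(T / dt)):
    n1, n2 = n1 + dt * (-lam1 * n1), n2 + dt * (lam1 * n1 - lam2 * n2)

print(n2, bateman_n2(T))
```

In the erosion setting, the populations are particle-size classes and the "decay rates" are the per-size fragmentation rates set by the shear stresses in the flow.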

  1. Reconstruction of vessel structures from serial whole slide sections of murine liver samples

    NASA Astrophysics Data System (ADS)

    Schwier, Michael; Hahn, Horst K.; Dahmen, Uta; Dirsch, Olaf

    2013-03-01

    Image-based analysis of the vascular structures of murine liver samples is an important tool for scientists to understand liver physiology and morphology. Typical assessment methods are MicroCT, which allows for acquiring images of the whole organ while lacking resolution for fine details, and confocal laser scanning microscopy, which allows detailed insights into fine structures while lacking the broader context. Imaging of histological serial whole slide sections is a recent technology able to fill this gap, since it provides a fine resolution up to the cellular level, but on a whole organ scale. However, whole slide imaging is a modality providing only 2D images. Therefore the challenge is to use stacks of serial sections from which to reconstruct the 3D vessel structures. In this paper we present a semi-automatic procedure to achieve this goal. We employ an automatic method that detects vessel structures based on continuity and shape characteristics. Furthermore it supports the user to perform manual corrections where required. With our methods we were able to successfully extract and reconstruct vessel structures from a stack of 100 and a stack of 397 serial sections of a mouse liver lobe, thus proving the potential of our approach.

  2. A mobile phone application for the collection of opinion data for forest planning purposes.

    PubMed

    Kangas, Annika; Rasinmäki, Jussi; Eyvindson, Kyle; Chambers, Philip

    2015-04-01

    The last 30 years has seen an increase in environmental, socio-economic, and recreational objectives being considered throughout the forest planning process. In the Finnish context these are considered mainly at the regional level potentially missing out on more local issues and problems. Such local information would be most efficiently collected with a participatory GIS approach. A mobile participatory GIS application called Tienoo was developed as a tool for collecting location-specific opinions of recreational and aesthetical characteristics of forests and forest management. The application also contains information the user can access regarding the practical details of the area, for instance about the recreational infrastructure. The application was tested in Ruunaa National Hiking Area, North Karelia, Eastern Finland. Through this application it is possible to continuously collect geolocated preference information. As a result, the collected opinions have details which can be located in both time and space. This allows for the possibility to monitor the changes in opinions when the stands are treated, and it also allows for easily analyzing the effect of time of year on the opinions. It is also possible to analyze the effect of the spatial location and the forest characteristics on the opinions using GIS analysis.

  3. Cryptic or pseudocryptic: can morphological methods inform copepod taxonomy? An analysis of publications and a case study of the Eurytemora affinis species complex

    PubMed Central

    Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor

    2015-01-01

    Interest in cryptic species has increased significantly with current progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomical research. However, some species now considered to be cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the “cryptic or pseudocryptic” dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analyzing the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences, and the complex was therefore argued to be cryptic. However, subsequent detailed morphological analyses allowed a number of valid species to be described. Our study of this species complex, using in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the former cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, was significantly different between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxa. Despite increasing cryptic species designations, morphological techniques have great potential in determining copepod taxonomy. PMID:26120427
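The principal component analysis highlighted in the abstract can be sketched on synthetic morphometric data: two hypothetical forms differing in their mean measurements separate along the first component (values invented, not Eurytemora data):

```python
import numpy as np

# Synthetic morphometric measurements for two hypothetical forms,
# each row a specimen, each column a measured trait.
rng = np.random.default_rng(5)
g1 = rng.normal([10, 5, 2], 0.2, (30, 3))   # form 1
g2 = rng.normal([11, 4, 2], 0.2, (30, 3))   # form 2
X = np.vstack([g1, g2])

Xc = X - X.mean(axis=0)                      # center the data
# Principal components are the eigenvectors of the covariance matrix;
# eigh returns them in ascending eigenvalue order, so the last is PC1.
evals, evecs = np.linalg.eigh(np.cov(Xc.T))
pc1 = Xc @ evecs[:, -1]                      # specimen scores on PC1

# The two forms separate along the first principal component.
print(pc1[:30].mean(), pc1[30:].mean())
```

With real cryptic or pseudocryptic taxa the separation is subtler, which is why the paper adds measures such as fluctuating asymmetry on top of the multivariate analysis.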

  4. Elliptical Fourier descriptors of outline and morphological analysis in caudal view of foramen magnum of the tropical raccoon (Procyon cancrivorus) (Linnaeus, 1758).

    PubMed

    Samuel, O M; Casanova, P M; Olopade, J O

    2018-03-01

    To evaluate sexual-size dimorphism and attempt a categorization of inter-individual shapes of foramen magnum outlines using Fourier descriptors, which allow for shape outline evaluations with a resultant specimen character definition. Individual characterization and quantification of foramen magnum shapes in direct caudal view based on the elliptical Fourier technique was applied to 46 tropical raccoon skulls (26 females, 20 males). An incremental number of harmonics demonstrates the morphological contributions of such descriptors, with their relations to specific anatomical constructions established. The initial harmonics (1st to 3rd) described the general foramen shapes while the second group (4th to 12th) demonstrated fine morphological details. Sexual-size dimorphism was observed in 87.1% of females and 91.7% of males; normalization for size produced 75% in females and 83% in males. With respect to foramen magnum dimorphism analysis, the result obtained through elliptic Fourier analysis was comparatively better in detailed information on outline contours than earlier classical methods. The first four effective principal components defined 70.63% of its shape properties while the rest (22.51%) constituted fine details of morphology. Both size and shape seem important in sexual dimorphism in this species. This investigation suggests clinical, taxonomic, and anthropologic implications of foramen magnum characterization, and further postulates an increased possibility of volume reduction, cerebellar protrusion, and ontogenic foramen magnum shape irregularities in the sample population, with neurologic consequences especially among females. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
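The elliptic Fourier idea can be illustrated with the complex-contour form z = x + iy, whose +1 and -1 harmonics recover an ellipse's semi-axes; this is a simplification of the full elliptic Fourier descriptor formulation used in the paper:

```python
import numpy as np

# Sample a closed elliptical outline as a complex contour z = x + iy.
# For z = A*cos(t) + i*B*sin(t), the DFT concentrates the shape in the
# +1 and -1 harmonics: c[1] = (A + B)/2 and c[-1] = (A - B)/2.
A, B, N = 3.0, 1.0, 256
t = 2 * np.pi * np.arange(N) / N
z = A * np.cos(t) + 1j * B * np.sin(t)

c = np.fft.fft(z) / N
semi_major = abs(c[1]) + abs(c[-1])   # recovers A
semi_minor = abs(c[1]) - abs(c[-1])   # recovers B

print(semi_major, semi_minor)
```

Real foramen outlines are not ellipses, so higher harmonics carry the fine morphological detail; the abstract's 1st-to-3rd versus 4th-to-12th harmonic split reflects exactly that coarse-versus-fine decomposition.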

  5. Electromagnetic simulation of helicon plasma antennas for their electrostatic shield design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stratakos, Yorgos, E-mail: y.stratakos@gmail.com; Zeniou, Angelos, E-mail: a.zeniou@inn.demokritos.gr; Gogolides, Evangelos, E-mail: e.gogolides@inn.demokritos.gr

    A detailed electromagnetic parametric analysis of the helicon antenna (half Nagoya type) at 13.56 MHz is presented using CST Microwave Studio 2012. The antenna is used to excite plasma inside a dielectric cylinder similar to a commercial reactor. Instead of focusing on the plasma state, the authors focus on the penetration and the three-dimensional distribution of electric fields through the dielectric wall. Our aim is to reduce capacitive coupling, which produces unwanted longitudinal and radial electric fields. Comparison of the helicon antenna electromagnetic performance under diverse boundary conditions shows that one is allowed to use vacuum simulations without plasma present in the cylinder, or to approximate the plasma as a column of gyrotropic material with a tensor dielectric permittivity and a sheath of a few millimeters, in order to qualitatively predict the electric field distribution, thus avoiding a full plasma simulation. This way the analysis of the full problem is much faster and allows an optimal shield design. A detailed study of various shields shows that one can reduce the radial and axial fields by more than 1 order of magnitude compared to the unshielded antenna, while the azimuthal field is reduced only by a factor of 2. Optimal shield design in terms of pitch and spacing of openings is determined. Finally, an experimental proof of concept of the effect of shielding on reduced wall sputtering is provided, by monitoring the roughness created during oxygen plasma etching of an organic polymer.

  6. Proxy-based accelerated discovery of Fischer–Tropsch catalysts (Electronic supplementary information (ESI) available. See DOI: 10.1039/c4sc02116a)

    PubMed Central

    Boldrin, Paul; Gallagher, James R.; Combes, Gary B.; Enache, Dan I.; James, David; Ellis, Peter R.; Kelly, Gordon; Claridge, John B.

    2015-01-01

    Development of heterogeneous catalysts for complex reactions such as Fischer–Tropsch synthesis of fuels is hampered by difficult reaction conditions, slow characterisation techniques such as chemisorption and temperature-programmed reduction, and the need for long-term stability. High-throughput (HT) methods may help, but their use has until now focused on bespoke micro-reactors for direct measurements of activity and selectivity. These are specific to individual reactions and do not provide more fundamental information on the materials. Here we report using simpler HT characterisation techniques (XRD and TGA) along with ageing under Fischer–Tropsch reaction conditions to provide information analogous to metal surface area, degree of reduction and thousands of hours of stability testing time for hundreds of samples per month. The use of this method allowed the identification of a series of highly stable, high surface area catalysts promoted by Mg and Ru. In an advance over traditional multichannel HT reactors, the chemical and structural information we obtain on the materials allows us to identify the structural effects of the promoters and their effects on the modes of deactivation observed. PMID:29560180

  7. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    NASA Astrophysics Data System (ADS)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and resulting changes in active layer thickness. The aim of this study is to show possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aircraft vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.

  8. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    NASA Astrophysics Data System (ADS)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km² to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the 543 km² Wivenhoe catchment and a detailed 13 km² area within it), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
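    Waterway delineation from a DEM rests on flow-direction and flow-accumulation grids. As a hedged illustration (our own minimal sketch, not the GIS software used in the study), a D8 flow-accumulation pass over a small grid might look like:

```python
import numpy as np

def d8_flow_accumulation(dem):
    """D8 routing: each cell drains to its steepest downslope 8-neighbour.
    Returns the number of cells (including itself) draining through each cell;
    high values trace the stream network."""
    rows, cols = dem.shape
    acc = np.ones(dem.shape, dtype=int)
    # process cells from highest to lowest so upstream totals are final
    for idx in np.argsort(dem, axis=None)[::-1]:
        r, c = divmod(idx, cols)
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    # elevation drop per unit distance (diagonals are longer)
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best:
                        best, target = drop, (rr, cc)
        if target is not None:
            acc[target] += acc[r, c]
    return acc
```

    On a 3×3 bowl-shaped DEM every cell drains to the centre, so the centre accumulates all 9 cells; production tools add pit filling and flat-area resolution on top of this core step.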

  9. An interactive local flattening operator to support digital investigations on artwork surfaces.

    PubMed

    Pietroni, Nico; Massimiliano, Corsini; Cignoni, Paolo; Scopigno, Roberto

    2011-12-01

    Analyzing either high-frequency shape detail or any other 2D fields (scalar or vector) embedded over a 3D geometry is a complex task, since detaching the detail from the overall shape can be tricky. An alternative approach is to move to the 2D space, reducing shape reasoning to easier image-processing techniques. In this paper we propose a novel framework for the analysis of 2D information distributed over 3D geometry, based on a locally smooth parametrization technique that allows us to treat local 3D data in terms of image content. The proposed approach has been implemented as a sketch-based system that allows the user to design, with a few gestures, a set of (possibly overlapping) parameterizations of rectangular portions of the surface. We demonstrate that, due to the locality of the parametrization, the distortion is under an acceptable threshold, while discontinuities can be avoided since the parametrized geometry is always homeomorphic to a disk. We show the effectiveness of the proposed technique to solve specific Cultural Heritage (CH) tasks: the analysis of chisel marks over the surface of an unfinished sculpture and the local comparison of multiple photographs mapped over the surface of an artwork. For this very difficult task, we believe that our framework and the corresponding tool are the first steps toward a computer-based shape reasoning system, able to support CH scholars with a medium they are more used to. © 2011 IEEE

  10. DataHub: Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  11. DataHub - Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  12. Strategies for the profiling, characterisation and detailed structural analysis of N-linked oligosaccharides.

    PubMed

    Tharmalingam, Tharmala; Adamczyk, Barbara; Doherty, Margaret A; Royle, Louise; Rudd, Pauline M

    2013-02-01

    Many post-translational modifications, including glycosylation, are pivotal for the structural integrity, location and functional activity of glycoproteins. Sub-populations of proteins that are relocated or functionally changed by such modifications can change resting proteins into active ones, mediating specific effector functions, as in the case of monoclonal antibodies. To ensure safe and efficacious drugs it is essential to employ appropriate robust, quantitative analytical strategies that can (i) perform detailed glycan structural analysis, (ii) characterise specific subsets of glycans to assess known critical features of therapeutic activities, and (iii) rapidly profile glycan pools for at-line monitoring or high-level batch-to-batch screening. Here we focus on these aspects of glycan analysis, showing how state-of-the-art technologies are required at all stages during the production of recombinant glycotherapeutics. These data can provide insights into processing pathways and suggest markers for intervention at critical control points in bioprocessing and also critical decision points in disease and drug monitoring in patients. Importantly, these tools are now enabling the first glycome/genome studies in large populations, allowing the integration of glycomics into other 'omics platforms in a systems biology context.

  13. Stream network analysis from orbital and suborbital imagery, Colorado River Basin, Texas

    NASA Technical Reports Server (NTRS)

    Baker, V. R. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Orbital SL-2 imagery (Earth Terrain Camera S-190B), received September 5, 1973, was subjected to quantitative network analysis and compared to 7.5-minute topographic mapping (scale 1:24,000) and conventional U.S.D.A. black-and-white aerial photography (scale 1:22,200). Results can only be considered suggestive because detail on the SL-2 imagery was badly obscured by heavy cloud cover. The upper Bee Creek basin was chosen for analysis because it appeared in a relatively cloud-free portion of the orbital imagery. Drainage maps were drawn from the three sources, digitized into a computer-compatible format, and analyzed by the WATER system computer program. Even at its small scale (1:172,000) and with bad haze, the orbital photo showed much drainage detail. The contour-like character of the Glen Rose Formation's resistant limestone units allowed channel definition. The errors in pattern recognition can be attributed to local areas of dense vegetation and to other areas of very high albedo caused by surficial exposure of caliche. The latter effect caused particular difficulty in the determination of drainage divides.

  14. Insights into Substrate Specificity and Metal Activation of Mammalian Tetrahedral Aspartyl Aminopeptidase*

    PubMed Central

    Chen, Yuanyuan; Farquhar, Erik R.; Chance, Mark R.; Palczewski, Krzysztof; Kiser, Philip D.

    2012-01-01

    Aminopeptidases are key enzymes involved in the regulation of signaling peptide activity. Here, we present a detailed biochemical and structural analysis of an evolutionarily highly conserved aspartyl aminopeptidase called DNPEP. We show that this peptidase can cleave multiple physiologically relevant substrates, including angiotensins, and thus may play a key role in regulating neuron function. Using a combination of x-ray crystallography, x-ray absorption spectroscopy, and single particle electron microscopy analysis, we provide the first detailed structural analysis of DNPEP. We show that this enzyme possesses a binuclear zinc active site in which one of the zinc ions is readily exchangeable with other divalent cations such as manganese, which strongly stimulates the enzymatic activity of the protein. The plasticity of this metal-binding site suggests a mechanism for regulation of DNPEP activity. We also demonstrate that DNPEP assembles into a functionally relevant tetrahedral complex that restricts access of peptide substrates to the active site. These structural data allow rationalization of the enzyme's preference for short peptide substrates with N-terminal acidic residues. This study provides a structural basis for understanding the physiology and bioinorganic chemistry of DNPEP and other M18 family aminopeptidases. PMID:22356908

  15. Multivariate analysis of longitudinal rates of change.

    PubMed

    Bryan, Matthew; Heagerty, Patrick J

    2016-12-10

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed in the literature. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, 'accelerated time' methods have been developed which assume that covariates rescale time in longitudinal models for disease progression. In this manuscript, we detail an alternative multivariate model formulation that directly structures longitudinal rates of change and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Mechanical Design Studies of the MQXF Long Model Quadrupole for the HiLumi LHC

    DOE PAGES

    Pan, Heng; Anderssen, Eric; Ambrosio, Giorgio; ...

    2016-12-20

    The Large Hadron Collider Luminosity upgrade (HiLumi) program requires new low-β triplet quadrupole magnets, called MQXF, in the Interaction Region (IR) to increase the LHC peak and integrated luminosity. The MQXF magnets, designed and fabricated in collaboration between CERN and the U.S. LARP, will all have the same cross section. The MQXF long model, referred to as MQXFA, is a quadrupole using Nb3Sn superconducting technology with a 150 mm aperture and a 4.2 m magnetic length, and is the first long prototype of the final MQXF design. The MQXFA magnet is based on the previous LARP HQ and MQXFS designs. In this paper we present the baseline design of the MQXFA structure with detailed 3D numerical analysis. A detailed tolerance analysis of the baseline case has been performed using a 3D finite element model, which allows fast computation of structures modelled with actual tolerances. Tolerance sensitivity of each component is discussed to verify the actual tolerances to be achieved by vendors. Finally, a tolerance stack-up analysis is presented at the end of the paper.

  17. A rapid-screening approach to detect and quantify microplastics based on fluorescent tagging with Nile Red

    NASA Astrophysics Data System (ADS)

    Maes, Thomas; Jessop, Rebecca; Wellner, Nikolaus; Haupt, Karsten; Mayes, Andrew G.

    2017-03-01

    A new approach is presented for analysis of microplastics in environmental samples, based on selective fluorescent staining using Nile Red (NR), followed by density-based extraction and filtration. The dye adsorbs onto plastic surfaces and renders them fluorescent when irradiated with blue light. Fluorescence emission is detected using simple photography through an orange filter. Image-analysis allows fluorescent particles to be identified and counted. Magnified images can be recorded and tiled to cover the whole filter area, allowing particles down to a few micrometres to be detected. The solvatochromic nature of Nile Red also offers the possibility of plastic categorisation based on surface polarity characteristics of identified particles. This article details the development of this staining method and its initial cross-validation by comparison with infrared (IR) microscopy. Microplastics of different sizes could be detected and counted in marine sediment samples. The fluorescence staining identified the same particles as those found by scanning a filter area with IR-microscopy.
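    The counting step after staining, imaging and thresholding amounts to a connected-component count of bright regions. The toy implementation below is our own simplification, assuming a grayscale image array rather than the paper's filtered photographs:

```python
import numpy as np

def count_particles(img, thresh):
    """Count connected bright regions (4-connectivity) in a fluorescence
    image via thresholding and an iterative flood fill."""
    mask = img > thresh
    seen = np.zeros(mask.shape, dtype=bool)
    count = 0
    for r0 in range(mask.shape[0]):
        for c0 in range(mask.shape[1]):
            if mask[r0, c0] and not seen[r0, c0]:
                count += 1                     # new, unvisited particle
                stack = [(r0, c0)]
                seen[r0, c0] = True
                while stack:                   # flood-fill its pixels
                    r, c = stack.pop()
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            stack.append((rr, cc))
    return count
```

    Real pipelines would additionally record each region's area (to size particles) and its mean colour (to exploit Nile Red's solvatochromism for polarity-based categorisation).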

  18. Wavenumber-frequency Spectra of Pressure Fluctuations Measured via Fast Response Pressure Sensitive Paint

    NASA Technical Reports Server (NTRS)

    Panda, J.; Roozeboom, N. H.; Ross, J. C.

    2016-01-01

    The recent advancement in fast-response Pressure-Sensitive Paint (PSP) allows time-resolved measurements of unsteady pressure fluctuations from a dense grid of spatial points on a wind tunnel model. This capability allows for direct calculations of the wavenumber-frequency (k-ω) spectrum of pressure fluctuations. Such data, useful for the vibro-acoustics analysis of aerospace vehicles, are difficult to obtain otherwise. For the present work, time histories of pressure fluctuations on a flat plate subjected to vortex shedding from a rectangular bluff-body were measured using PSP. The light intensity levels in the photographic images were then converted to instantaneous pressure histories by applying calibration constants, which were calculated from a few dynamic pressure sensors placed at selective points on the plate. Fourier transform of the time-histories from a large number of spatial points provided k-ω spectra for pressure fluctuations. These data provide a first glimpse into the possibility of creating detailed forcing functions for vibro-acoustics analysis of aerospace vehicles, albeit for a limited frequency range.
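    The k-ω computation itself is a two-dimensional Fourier transform of the space-time pressure record. A minimal numpy sketch for a synthetic travelling wave (grid sizes and wave parameters are illustrative, not the experiment's):

```python
import numpy as np

# synthetic pressure field: a wave travelling at speed c = omega/k
nx, nt = 64, 256
dx, dt = 0.01, 1e-4                    # sensor spacing [m], sample period [s]
x = np.arange(nx) * dx
t = np.arange(nt) * dt
k0 = 2 * np.pi * 30.0                  # wavenumber [rad/m]  (30 cycles/m)
w0 = 2 * np.pi * 500.0                 # angular frequency [rad/s]  (500 Hz)
p = np.cos(k0 * x[None, :] - w0 * t[:, None])   # shape (nt, nx)

# 2-D FFT over time and space -> wavenumber-frequency (k-omega) spectrum
spec = np.abs(np.fft.fftshift(np.fft.fft2(p))) ** 2
freqs = np.fft.fftshift(np.fft.fftfreq(nt, dt))      # Hz
wavenums = np.fft.fftshift(np.fft.fftfreq(nx, dx))   # cycles/m

# the spectral peak sits near (f, k) = (500 Hz, 30 cycles/m), up to grid resolution
it, ix = np.unravel_index(np.argmax(spec), spec.shape)
```

    With dense PSP grids the same transform resolves propagating components (the acoustic and convective ridges) that sparse sensor arrays cannot separate.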

  19. Automatic analysis of dividing cells in live cell movies to detect mitotic delays and correlate phenotypes in time.

    PubMed

    Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl

    2009-11-01

    Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.

  20. A rapid-screening approach to detect and quantify microplastics based on fluorescent tagging with Nile Red

    PubMed Central

    Maes, Thomas; Jessop, Rebecca; Wellner, Nikolaus; Haupt, Karsten; Mayes, Andrew G.

    2017-01-01

    A new approach is presented for analysis of microplastics in environmental samples, based on selective fluorescent staining using Nile Red (NR), followed by density-based extraction and filtration. The dye adsorbs onto plastic surfaces and renders them fluorescent when irradiated with blue light. Fluorescence emission is detected using simple photography through an orange filter. Image-analysis allows fluorescent particles to be identified and counted. Magnified images can be recorded and tiled to cover the whole filter area, allowing particles down to a few micrometres to be detected. The solvatochromic nature of Nile Red also offers the possibility of plastic categorisation based on surface polarity characteristics of identified particles. This article details the development of this staining method and its initial cross-validation by comparison with infrared (IR) microscopy. Microplastics of different sizes could be detected and counted in marine sediment samples. The fluorescence staining identified the same particles as those found by scanning a filter area with IR-microscopy. PMID:28300146

  1. Microneedle-based analysis of the micromechanics of the metaphase spindle assembled in Xenopus laevis egg extracts

    PubMed Central

    Shimamoto, Yuta; Kapoor, Tarun M.

    2014-01-01

    To explain how micron-sized cellular structures generate and respond to forces, we need to characterize their micromechanical properties. Here we provide a protocol to build and use a dual force-calibrated microneedle-based set-up to quantitatively analyze the micromechanics of a metaphase spindle assembled in Xenopus laevis egg extracts. This cell-free extract system allows for controlled biochemical perturbations of spindle components. We describe how the microneedles are prepared and how they can be used to apply and measure forces. A multi-mode imaging system allows tracking of microtubules, chromosomes and needle tips. This set-up can be used to analyze the viscoelastic properties of the spindle on time-scales ranging from minutes to sub-seconds. A typical experiment, along with data analysis, is also detailed. We anticipate that our protocol can be readily extended to analyze the micromechanics of other cellular structures assembled in cell-free extracts. The entire procedure can take 3-4 days. PMID:22538847

  2. The analysis of latent fingermarks on polymer banknotes using MALDI-MS.

    PubMed

    Scotcher, K; Bradshaw, R

    2018-06-08

    In September 2016, the UK adopted a new Bank of England (BoE) £5 polymer banknote, followed by the £10 polymer banknote in September 2017. They are designed to be cleaner, stronger and have increased counterfeit resilience; however, fingermark development can be problematic from the polymer material, as various security features and coloured/textured areas have been found to alter the effectiveness of conventional fingermark enhancement techniques (FETs). As fingermarks are one of the most widely used forms of identification in forensic cases, it is important that maximum ridge detail be obtained in order to allow for comparison. This research explores the use of matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) profiling and imaging for the analysis of fingermarks deposited on polymer banknotes. The proposed methodology was able to obtain both physical and chemical information from fingermarks deposited in a range of scenarios including different note areas, depletion series, aged samples and following conventional FETs. The analysis of forensically important molecular targets within these fingermarks was also explored, focussing specifically on cocaine. The ability of MALDI-MS to provide ridge detail and chemical information highlights the forensic applicability of this technique and its potential for the analysis of fingermarks deposited onto this problematic surface.

  3. Water chemistry of tundra lakes in the periglacial zone of the Bellsund Fiord (Svalbard) in the summer of 2013.

    PubMed

    Szumińska, Danuta; Szopińska, Małgorzata; Lehmann-Konera, Sara; Franczak, Łukasz; Kociuba, Waldemar; Chmiel, Stanisław; Kalinowski, Paweł; Polkowska, Żaneta

    2018-05-15

    Climate changes observed in the Arctic (e.g. permafrost degradation, glacier retreat) may have a significant influence on sensitive polar wetlands. The main objectives of this paper are to define the chemical features of water within six small Arctic lakes located in Bellsund (Svalbard), in an area of continuous permafrost occurrence. The unique environmental conditions of the study area offer an opportunity to observe phenomena influencing water chemistry, such as chemical weathering, permafrost thawing, marine aerosols, atmospheric deposition and biological inputs. In the water samples collected during summer 2013, detailed tundra lake water chemistry characteristics regarding ions, trace elements, pH and specific electrolytic conductivity (SEC₂₅) were determined. Moreover, the water chemistry of the studied lakes was compared to water samples from the Tyvjobekken Creek and to precipitation samples. As a final step of data analysis, Principal Component Analysis (PCA) was performed. Detailed chemical analysis allowed us to conclude the following: (1) Ca²⁺, Mg²⁺, SO₄²⁻ and Sr are of geogenic origin; (2) NO₃⁻ present in the tundra lakes and the Tyvjobekken Creek water samples (ranging from 0.31 to 1.69 mg L⁻¹ and from 0.25 to 1.58 mg L⁻¹, respectively) may be of mixed origin, i.e. from biological processes and permafrost thawing; (3) the high contribution of non-sea-salt SO₄²⁻ (>80% in the majority of studied samples) indicates a considerable inflow of sulphate-rich air to the study area; (4) the high content of chlorides in the tundra lakes (range: 25.6-32.0% meq L⁻¹) indicates marine aerosol influence; (5) the PCA result shows that atmospheric transport may constitute a source of Mn, Co, Ni, Cu, Ga, Ba and Cd. However, further detailed inter-season and multi-seasonal studies of tundra lakes in the Arctic are recommended, especially in terms of detailed differentiation of source influences (atmospheric transport vs. permafrost degradation). Copyright © 2017 Elsevier B.V. All rights reserved.
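    The PCA step used here is standard and can be sketched with an SVD on autoscaled data. The snippet below uses random numbers in place of the (unavailable) field measurements; variable names are our own:

```python
import numpy as np

# hypothetical matrix: rows = water samples, columns = analytes
# (e.g. Ca, Mg, SO4, NO3, Cl, Mn)
rng = np.random.default_rng(0)
X = rng.normal(size=(18, 6))
Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscale before PCA

# PCA via SVD: sample scores and explained-variance ratios per component
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                # sample coordinates on the principal components
evr = s**2 / np.sum(s**2)    # fraction of variance captured by each PC
```

    Grouping of analytes on the leading component loadings (rows of Vt) is what supports source attributions such as "geogenic" versus "atmospheric transport".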

  4. Neutrino and axion bounds from the globular cluster M5 (NGC 5904).

    PubMed

    Viaux, N; Catelan, M; Stetson, P B; Raffelt, G G; Redondo, J; Valcarce, A A R; Weiss, A

    2013-12-06

    The red-giant branch (RGB) in globular clusters is extended to larger brightness if the degenerate helium core loses too much energy in "dark channels." Based on a large set of archival observations, we provide high-precision photometry for the Galactic globular cluster M5 (NGC 5904), allowing for a detailed comparison between the observed tip of the RGB and predictions based on contemporary stellar evolution theory. In particular, we derive 95% confidence limits of gₐₑ < 4.3×10⁻¹³ on the axion-electron coupling and μν < 4.5×10⁻¹² μB (Bohr magneton μB = e/2mₑ) on a neutrino dipole moment, based on a detailed analysis of statistical and systematic uncertainties. The cluster distance is the single largest source of uncertainty and can be improved in the future.

  5. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
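    The hashed aggregation of call detail records into graph edges can be sketched as follows; the records, extensions and cost-centre mapping below are invented for illustration, not drawn from the study:

```python
from collections import Counter

# hypothetical call-detail records: (calling_number, called_number, seconds)
cdrs = [
    ("x4101", "x4200", 25), ("x4101", "x4200", 31),
    ("x4300", "x4200", 240), ("x4101", "x4200", 18),
]

# cost-centre dictionary maps extensions to clinical units (assumed mapping)
cost_centre = {"x4101": "Pharmacy", "x4200": "ICU", "x4300": "Radiology"}

# hashed aggregation of repetitive calling/called pairs -> weighted edges
edges = Counter()
total_secs = Counter()
for src, dst, secs in cdrs:
    pair = (cost_centre[src], cost_centre[dst])
    edges[pair] += 1
    total_secs[pair] += secs

# emit an edge table for a graph tool: (source, target, calls, mean seconds)
rows = [(s, d, n, total_secs[(s, d)] / n) for (s, d), n in edges.items()]
```

    Filtering `rows` for high call counts with short mean durations surfaces exactly the repetitive, compact transfers the abstract flags as automation targets.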

  6. Structural Information from Single-molecule FRET Experiments Using the Fast Nano-positioning System

    PubMed Central

    Röcker, Carlheinz; Nagy, Julia; Michaelis, Jens

    2017-01-01

    Single-molecule Förster Resonance Energy Transfer (smFRET) can be used to obtain structural information on biomolecular complexes in real-time. To this end, multiple smFRET measurements are used to localize an unknown dye position inside a protein complex by means of trilateration. In order to obtain quantitative information, the Nano-Positioning System (NPS) uses probabilistic data analysis to combine structural information from X-ray crystallography with single-molecule fluorescence data to calculate not only the most probable position but also the complete three-dimensional probability distribution, termed the posterior, which indicates the experimental uncertainty. The concept was generalized for the analysis of smFRET networks containing numerous dye molecules. The latest version of NPS, Fast-NPS, features a new algorithm using Bayesian parameter estimation based on Markov chain Monte Carlo sampling and parallel tempering that allows for the analysis of large smFRET networks in a comparably short time. Moreover, Fast-NPS allows the calculation of the posterior under one of five different models for each dye, which account for the different spatial and orientational behavior exhibited by the dye molecules due to their local environment. Here we present a detailed protocol for obtaining smFRET data and applying Fast-NPS. We provide detailed instructions for the acquisition of the three input parameters of Fast-NPS: the smFRET values, as well as the quantum yield and anisotropy of the dye molecules. Recently, the NPS has been used to elucidate the architecture of an archaeal open promoter complex. These data are used to demonstrate the influence of the five different dye models on the posterior distribution. PMID:28287526

  7. Structural Information from Single-molecule FRET Experiments Using the Fast Nano-positioning System.

    PubMed

    Dörfler, Thilo; Eilert, Tobias; Röcker, Carlheinz; Nagy, Julia; Michaelis, Jens

    2017-02-09

    Single-molecule Förster Resonance Energy Transfer (smFRET) can be used to obtain structural information on biomolecular complexes in real-time. To this end, multiple smFRET measurements are used to localize an unknown dye position inside a protein complex by means of trilateration. In order to obtain quantitative information, the Nano-Positioning System (NPS) uses probabilistic data analysis to combine structural information from X-ray crystallography with single-molecule fluorescence data to calculate not only the most probable position but also the complete three-dimensional probability distribution, termed the posterior, which indicates the experimental uncertainty. The concept was generalized for the analysis of smFRET networks containing numerous dye molecules. The latest version of NPS, Fast-NPS, features a new algorithm using Bayesian parameter estimation based on Markov chain Monte Carlo sampling and parallel tempering that allows for the analysis of large smFRET networks in a comparably short time. Moreover, Fast-NPS allows the calculation of the posterior under one of five different models for each dye, which account for the different spatial and orientational behavior exhibited by the dye molecules due to their local environment. Here we present a detailed protocol for obtaining smFRET data and applying Fast-NPS. We provide detailed instructions for the acquisition of the three input parameters of Fast-NPS: the smFRET values, as well as the quantum yield and anisotropy of the dye molecules. Recently, the NPS has been used to elucidate the architecture of an archaeal open promoter complex. These data are used to demonstrate the influence of the five different dye models on the posterior distribution.
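Fast-NPS itself is a published tool with its own machinery; purely as a toy illustration of the underlying idea (Bayesian localization of a point from distance-like measurements via Metropolis sampling), the following self-contained sketch recovers a 2-D "dye" position from distances to three known anchor points. All positions, noise levels, and tuning parameters are invented for the example:

```python
import math
import random

def log_likelihood(pos, anchors, dists, sigma):
    # Gaussian measurement model: each measured distance d_i ~ N(|pos - a_i|, sigma)
    ll = 0.0
    for (ax, ay), d in zip(anchors, dists):
        model = math.hypot(pos[0] - ax, pos[1] - ay)
        ll += -((d - model) ** 2) / (2 * sigma ** 2)
    return ll

def metropolis(anchors, dists, sigma, n_steps=20000, step=0.2, seed=1):
    """Random-walk Metropolis sampler over candidate dye positions."""
    random.seed(seed)
    pos = (0.0, 0.0)
    ll = log_likelihood(pos, anchors, dists, sigma)
    samples = []
    for _ in range(n_steps):
        cand = (pos[0] + random.gauss(0, step), pos[1] + random.gauss(0, step))
        cand_ll = log_likelihood(cand, anchors, dists, sigma)
        # Accept uphill moves always, downhill moves with probability exp(delta)
        if cand_ll - ll > math.log(random.random()):
            pos, ll = cand, cand_ll
        samples.append(pos)
    return samples

# Three known anchor positions and noiseless distances to a "dye" at (1, 2)
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true = (1.0, 2.0)
dists = [math.hypot(true[0] - ax, true[1] - ay) for ax, ay in anchors]
samples = metropolis(anchors, dists, sigma=0.1)
burn = samples[5000:]                       # discard burn-in
mx = sum(s[0] for s in burn) / len(burn)    # posterior mean x
my = sum(s[1] for s in burn) / len(burn)    # posterior mean y
print(round(mx, 1), round(my, 1))
```

The retained samples approximate the full posterior, so credible regions (the "experimental uncertainty" above) come from the sample spread rather than a single point estimate. Fast-NPS additionally uses parallel tempering and dye-specific models, which this sketch omits.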

  8. High-resolution analysis of Quaternary calcretes: a coupled stable isotope and micromorphological approach

    NASA Astrophysics Data System (ADS)

    Adamson, Kathryn; Candy, Ian; Whitfield, Liz

    2015-04-01

    Pedogenic calcretes are abundant in arid and semi-arid regions, and they are widely used as proxy records of palaeoclimatic change. Calcrete oxygen (δ18O) and carbon (δ13C) isotopic signatures are indicative of temperature, aridity, or vegetation at the time of calcrete formation. Their microfabrics also reflect carbonate formation mechanisms in response to the prevailing environmental conditions. Many studies have explored calcrete micromorphology or stable isotope composition, but these techniques have not yet been applied simultaneously. This co-analysis is important as it allows us to establish whether calcrete morphology directly reflects environmental change. This study tests the potential of combining these analyses to examine the relationships between calcrete microfabrics, their isotopic signals, and Quaternary climate change. Calcretes from four river terraces of the Rio Alias in southeast Spain have been analysed in detail. On the basis of morphostratigraphic correlation (Maher et al., 2007) and Uranium-series ages (Candy et al., 2005), these span the period from 304 ± 26 ka (MIS 9) to the Holocene. The oldest profiles have therefore been exposed to multiple glacial-interglacial cycles. A total of 37 micromorphological profiles have been used to extract stable oxygen and carbon isotopic indicators from 77 microfacies. The morphological and isotopic complexity of the calcrete profiles increases with progressive age. The oldest samples display multiple calcretisation phases, and their microfabrics have a larger isotopic range than the younger samples. Alpha (non-biogenic) fabrics have higher δ13C and δ18O values than beta (biogenic) fabrics. Strong positive covariance between δ13C and δ18O within all profiles suggests that both isotopes are responding to the same environmental parameter. We suggest that this is relative aridity. 
The study demonstrates that the detailed co-analysis of calcrete micromorphology and stable isotope signatures allows calcrete formation patterns to be placed into a wider palaeoclimatic context. Importantly, this technique provides a level of detail that is not possible through bulk isotope sampling alone. It demonstrates the potential of this technique to more reliably constrain the palaeoenvironmental significance of secondary carbonates in dryland settings where other proxy records may be poorly preserved.
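The covariance argument above rests on a simple correlation between δ13C and δ18O across microfacies; a minimal sketch (all isotope values below are invented placeholders, not the study's measurements):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-microfacies values (per mil); strong positive covariance
d13c = [-8.1, -7.4, -6.9, -6.2, -5.8]
d18o = [-6.0, -5.5, -5.1, -4.4, -4.2]
r = pearson_r(d13c, d18o)
print(round(r, 2))
```

A value of r near +1 within a profile is what the study reads as both isotopes responding to the same environmental parameter.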

  9. Nanoscale Analysis of a Hierarchical Hybrid Solar Cell in 3D.

    PubMed

    Divitini, Giorgio; Stenzel, Ole; Ghadirzadeh, Ali; Guarnera, Simone; Russo, Valeria; Casari, Carlo S; Bassi, Andrea Li; Petrozza, Annamaria; Di Fonzo, Fabio; Schmidt, Volker; Ducati, Caterina

    2014-05-01

    A quantitative method for the characterization of nanoscale 3D morphology is applied to the investigation of a hybrid solar cell based on a novel hierarchical nanostructured photoanode. A cross section of the solar cell device is prepared by focused ion beam milling in a micropillar geometry, which allows a detailed 3D reconstruction of the titania photoanode by electron tomography. It is found that the hierarchical titania nanostructure facilitates polymer infiltration, thus favoring intermixing of the two semiconducting phases, essential for charge separation. The 3D nanoparticle network is analyzed with tools from stochastic geometry to extract information related to the charge transport in the hierarchical solar cell. In particular, the experimental dataset allows direct visualization of the percolation pathways that contribute to the photocurrent.

  10. Nanoscale Analysis of a Hierarchical Hybrid Solar Cell in 3D

    PubMed Central

    Divitini, Giorgio; Stenzel, Ole; Ghadirzadeh, Ali; Guarnera, Simone; Russo, Valeria; Casari, Carlo S; Bassi, Andrea Li; Petrozza, Annamaria; Di Fonzo, Fabio; Schmidt, Volker; Ducati, Caterina

    2014-01-01

    A quantitative method for the characterization of nanoscale 3D morphology is applied to the investigation of a hybrid solar cell based on a novel hierarchical nanostructured photoanode. A cross section of the solar cell device is prepared by focused ion beam milling in a micropillar geometry, which allows a detailed 3D reconstruction of the titania photoanode by electron tomography. It is found that the hierarchical titania nanostructure facilitates polymer infiltration, thus favoring intermixing of the two semiconducting phases, essential for charge separation. The 3D nanoparticle network is analyzed with tools from stochastic geometry to extract information related to the charge transport in the hierarchical solar cell. In particular, the experimental dataset allows direct visualization of the percolation pathways that contribute to the photocurrent. PMID:25834481

  11. Direct to consumer advertising in pharmaceutical markets.

    PubMed

    Brekke, Kurt R; Kuhn, Michael

    2006-01-01

    We study the effects of direct-to-consumer advertising (DTCA) in the prescription drug market. There are two pharmaceutical firms providing horizontally differentiated (branded) drugs. Patients differ in their susceptibility to the drugs. If DTCA is allowed, it can be employed to induce (additional) patient visits. Physicians perfectly observe the patients' type (of illness), but rely on drug information to prescribe the correct drug. Drug information is conveyed by marketing (detailing), creating a captive and a selective segment of physicians. First, we show that detailing, DTCA and price (if not regulated) are complementary strategies for the firms. Thus, allowing DTCA induces more detailing and higher prices. Second, firms benefit from DTCA if detailing competition is not too fierce, which is true if investing in detailing is sufficiently costly. Otherwise, firms are better off with a ban on DTCA. Finally, DTCA tends to lower welfare if insurance is generous (low copayments) and/or price regulation is lenient. The desirability of DTCA also depends on whether or not the regulator is concerned with firms' profit.

  12. Proteomic analysis of plasma-purified VLDL, LDL, and HDL fractions from atherosclerotic patients undergoing carotid endarterectomy: identification of serum amyloid A as a potential marker.

    PubMed

    Lepedda, Antonio J; Nieddu, Gabriele; Zinellu, Elisabetta; De Muro, Pierina; Piredda, Franco; Guarino, Anna; Spirito, Rita; Carta, Franco; Turrini, Francesco; Formato, Marilena

    2013-01-01

    Apolipoproteins are a very heterogeneous protein family, implicated in plasma lipoprotein structural stabilization, lipid metabolism, inflammation, and immunity. Obtaining detailed information on apolipoprotein composition and structure may contribute to elucidating lipoprotein roles in atherogenesis and to developing new therapeutic strategies for the treatment of lipoprotein-associated disorders. This study aimed at developing a comprehensive method for characterizing the apolipoprotein component of plasma VLDL, LDL, and HDL fractions from patients undergoing carotid endarterectomy, by means of two-dimensional electrophoresis (2-DE) coupled with mass spectrometry analysis, useful for identifying potential markers of plaque presence and vulnerability. The adopted method yielded reproducible 2-DE maps of exchangeable apolipoproteins from VLDL, LDL, and HDL. Twenty-three protein isoforms were identified by peptide mass fingerprinting analysis. Differential proteomic analysis revealed increased levels of acute-phase serum amyloid A protein (AP SAA) in all lipoprotein fractions, especially in LDL from atherosclerotic patients. The results were confirmed by western blotting analysis of each lipoprotein fraction, using apo AI levels for data normalization. The higher levels of AP SAA found in patients suggest a role of LDL as an AP SAA carrier into the subendothelial space of the artery wall, where AP SAA accumulates and may exert noxious effects.

  13. Dynamic analysis of apoptosis using cyanine SYTO probes: From classical to microfluidic cytometry

    PubMed Central

    Wlodkowic, Donald; Skommer, Joanna; Faley, Shannon; Darzynkiewicz, Zbigniew; Cooper, Jonathan M.

    2013-01-01

    Cell death is a stochastic process, often initiated and/or executed in a multi-pathway/multi-organelle fashion. Therefore, high-throughput single-cell analysis platforms are required to provide detailed characterization of kinetics and mechanisms of cell death in heterogeneous cell populations. However, there is still a largely unmet need for inert fluorescent probes suitable for prolonged kinetic studies. Here, we compare the use of an innovative adaptation of unsymmetrical SYTO dyes for dynamic real-time analysis of apoptosis in conventional as well as microfluidic chip-based systems. We show that cyanine SYTO probes allow non-invasive tracking of intracellular events over extended time. Easy handling and “stain–no wash” protocols open up new opportunities for high-throughput analysis and live-cell sorting. Furthermore, SYTO probes are easily adaptable for detection of cell death using automated microfluidic chip-based cytometry. Overall, the combined use of SYTO probes and a state-of-the-art Lab-on-a-Chip platform emerges as a cost-effective solution for automated drug screening compared to conventional Annexin V or TUNEL assays. In particular, it should allow for dynamic analysis of samples where low cell number has so far been an obstacle, e.g. primary cancer stem cells or circulating minimal residual tumors. PMID:19298813

  14. Detailed simulation of a Lobster-eye telescope.

    PubMed

    Putkunz, Corey T; Peele, Andrew G

    2009-08-03

    The concept of an x-ray telescope based on the optics of the eye of certain types of crustacea has been in currency for nearly thirty years. However, it is only in the last decade that the technology to make the telescope and the opportunity to mount it on a suitable space platform have combined to allow the idea to become a reality. Accordingly, we have undertaken a detailed simulation study, updating previous simplified models, to properly characterise the performance of the instrument in orbit. The study reveals details of how the particular characteristics of the lobster-eye optics affect the sensitivity of the instrument and allows us to implement new ideas in data extraction methods.

  15. [Assessment of pragmatics from verbal spoken data].

    PubMed

    Gallardo-Paúls, B

    2009-02-27

    Pragmatic assessment is usually complex, long and sophisticated, especially for professionals who lack specific linguistic education and interact with impaired speakers. To design a quick method of assessment that will provide a quick general evaluation of the pragmatic effectiveness of neurologically affected speakers. This first filter will allow us to decide whether a detailed analysis of the altered categories should follow. Our starting point was the PerLA (perception, language and aphasia) profile of pragmatic assessment designed for the comprehensive analysis of conversational data in clinical linguistics; this was then converted into a quick questionnaire. A quick protocol of pragmatic assessment is proposed and the results found in a group of children with attention deficit hyperactivity disorder are discussed.

  16. Statistical analysis for understanding and predicting battery degradations in real-life electric vehicle use

    NASA Astrophysics Data System (ADS)

    Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine

    2014-01-01

    This paper describes the statistical analysis of data parameters recorded during battery ageing in electric vehicle use. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during the experiments. Such a study enables us to identify the main ageing factors. Detailed statistical dependency explorations then identify the factors responsible for battery ageing phenomena. Predictive battery ageing models are built from this approach. The results thereby demonstrate and quantify a relationship between the measured variables and global observations of battery ageing, and also allow accurate battery ageing diagnosis through predictive models.
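A predictive ageing model of the kind described can be as simple as a least-squares regression of capacity on an operating variable; the sketch below uses a single hypothetical ageing factor (charge throughput) with invented numbers, not the paper's dataset or model:

```python
def ols(xs, ys):
    """Ordinary least squares for y = a + b*x (one ageing factor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical data: remaining capacity (%) vs. charge throughput (kAh)
throughput = [0, 2, 4, 6, 8]
capacity = [100.0, 98.9, 98.1, 96.8, 96.0]
a, b = ols(throughput, capacity)

# Diagnose/predict: expected capacity after a given throughput
predicted_at_10 = a + b * 10
print(round(a, 1), round(b, 3), round(predicted_at_10, 1))
```

The fitted slope b quantifies the fade rate per unit of the ageing factor; the study's multivariate dependency analysis generalizes this to several correlated operating conditions.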

  17. Second-order shaped pulses for solid-state quantum computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Pinaki

    2008-01-01

    We present the construction and detailed analysis of highly optimized self-refocusing pulse shapes for several rotation angles. We characterize the constructed pulses by the coefficients appearing in the Magnus expansion up to second order. This allows a semianalytical analysis of the performance of the constructed shapes in sequences and composite pulses by computing the corresponding leading-order error operators. Higher orders can be analyzed with the numerical technique suggested by us previously. We illustrate the technique by analyzing several composite pulses designed to protect against pulse amplitude errors, and decoupling sequences for potentially long chains of qubits with on-site and nearest-neighbor couplings.
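For reference, the Magnus expansion invoked above writes the evolution operator over a pulse of duration T as a single exponential, and the leading-order error operators come from its first terms (this is the standard textbook form, not notation specific to this record):

```latex
U(T) = \exp\Omega(T), \qquad \Omega(T) = \sum_{k \ge 1} \Omega_k(T),
\]
\[
\Omega_1(T) = -\,i \int_0^T H(t)\,\mathrm{d}t, \qquad
\Omega_2(T) = -\frac{1}{2} \int_0^T \mathrm{d}t_1 \int_0^{t_1} \mathrm{d}t_2\, \bigl[\,H(t_1),\, H(t_2)\,\bigr].
```

A self-refocusing shape is one engineered so that the unwanted contributions to Ω₁ and Ω₂ from the error terms in H(t) vanish at the end of the pulse.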

  18. A system for analysis and classification of voice communications

    NASA Technical Reports Server (NTRS)

    Older, H. J.; Jenney, L. L.; Garland, L.

    1973-01-01

    A method for analysis and classification of verbal communications typically associated with manned space missions or simulations was developed. The study was carried out in two phases. Phase 1 was devoted to identification of crew tasks and activities which require voice communication for accomplishment or reporting. Phase 2 entailed development of a message classification system and a preliminary test of its feasibility. The classification system permits voice communications to be analyzed to three progressively more specific levels of detail and to be described in terms of message content, purpose, and the participants in the information exchange. A coding technique was devised to allow messages to be recorded by an eight-digit number.

  19. Market assessment overview

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, H.

    1981-01-01

    Market assessment, refined with analysis disaggregated from a national level to the regional level and to specific market applications, resulted in more accurate and detailed market estimates. The development of an integrated set of computer simulations, coupled with refined market data, allowed progress in the ability to evaluate the worth of solar thermal parabolic dish systems. In-depth analyses of both electric and thermal market applications of these systems are described. The following market assessment studies were undertaken: (1) regional analysis of the near term market for parabolic dish systems; (2) potential early market estimate for electric applications; (3) potential early market estimate for industrial process heat/cogeneration applications; and (4) selection of thermal and electric application case studies for fiscal year 1981.

  20. Glass polymorphism in amorphous germanium probed by first-principles computer simulations

    NASA Astrophysics Data System (ADS)

    Mancini, G.; Celino, M.; Iesari, F.; Di Cicco, A.

    2016-01-01

    The low-density (LDA) to high-density (HDA) transformation in amorphous Ge at high pressure is studied by first-principles molecular dynamics simulations in the framework of density functional theory. Previous experiments are accurately reproduced, including the presence of a well-defined LDA-HDA transition above 8 GPa. The LDA-HDA density increase is found to be about 14%. Pair and bond-angle distributions were obtained in the 0-16 GPa pressure range, allowing a detailed analysis of the transition. The local fourfold coordination is transformed into an average HDA sixfold coordination associated with different local geometries, as confirmed by coordination number analysis and the shape of the bond-angle distributions.
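Coordination number analysis of the kind mentioned reduces to counting neighbors within a cutoff radius (typically the first minimum of the pair distribution function); a minimal sketch on a toy cluster, with illustrative positions and cutoff and no periodic boundary conditions:

```python
import math

def coordination_numbers(positions, cutoff):
    """Count neighbors within a distance cutoff for each atom (no periodic images)."""
    counts = []
    for i, p in enumerate(positions):
        n = 0
        for j, q in enumerate(positions):
            if i != j and math.dist(p, q) <= cutoff:
                n += 1
        counts.append(n)
    return counts

# Toy 5-atom cluster: one central atom with four neighbors at 2.45 Å
pos = [
    (0.0, 0.0, 0.0),
    (2.45, 0.0, 0.0),
    (-2.45, 0.0, 0.0),
    (0.0, 2.45, 0.0),
    (0.0, 0.0, 2.45),
]
cn = coordination_numbers(pos, cutoff=3.0)
print(cn[0])   # central atom: 4 neighbors
```

Averaging such per-atom counts over an MD trajectory is what turns the fourfold-to-sixfold change into a single diagnostic number for the LDA-HDA transition.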

  1. Conformational and vibrational reassessment of solid paracetamol

    NASA Astrophysics Data System (ADS)

    Amado, Ana M.; Azevedo, Celeste; Ribeiro-Claro, Paulo J. A.

    2017-08-01

    This work answers the need for a more detailed and accurate knowledge of the vibrational spectrum of the widely used analgesic/antipyretic drug commonly known as paracetamol. A comprehensive spectroscopic analysis - including infrared, Raman, and inelastic neutron scattering (INS) - is combined with a computational approach which takes account of the effects of intermolecular interactions in the solid state. This allows a full reassessment of the vibrational assignments for paracetamol, thus preventing the propagation of incorrect data analysis and misassignments already found in the literature. In particular, the vibrational modes involving the hydrogen-bonded N–H and O–H groups are correctly reallocated to bands shifted by up to 300 cm⁻¹ relative to previous assignments.

  2. 13. ATTIC, NORTH END, DETAIL SHOWING CONSTRUCTION OF STAIRHALL SKYLIGHT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. ATTIC, NORTH END, DETAIL SHOWING CONSTRUCTION OF STAIRHALL SKYLIGHT DOME: NOTE WINDOW IN ROOF WHICH ALLOWS LIGHT TO ENTER SKYLIGHT PANES - William C. Hasbrouck House, 99 Montgomery Street, Newburgh, Orange County, NY

  3. Stellar, remnant, planetary, and dark-object masses from astrometric microlensing

    NASA Technical Reports Server (NTRS)

    Boden, A.; Gould, A. P.; Bennett, D. P.; Depoy, D. L.; Gaudi, S. B.; Griest, K.; Han, C.; Paczynski, B.; Reid, I. N.

    2002-01-01

    With SIM, we will break the microlensing degeneracy, and allow detailed interpretation of individual microlensing events. We will thus develop a detailed census of the dark and luminous stellar population of the Galaxy.

  4. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. 
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  5. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for a quick signature comparison. 
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  6. SWMPrats.net: A Web-Based Resource for Exploring SWMP ...

    EPA Pesticide Factsheets

    SWMPrats.net is a web-based resource that provides accessible approaches to using SWMP data. The website includes a user forum with instructional ‘Plots of the Month’; links to workshop content; and a description of the SWMPr data analysis package for R. Interactive “widgets” allow users to skip the boring parts of data analysis and get right to the fun: visualization and exploration! There are three widgets, each performing a different analysis: system-wide overviews, detailed temporal summaries of a single variable at a single site, and inter-comparisons between sites or variables through time. Users can visually explore system-wide trends in data using the Trends Map widget. For a more detailed analysis, users can create monthly and annual graphs of single variables and locations in the Summary Plot widget. Lastly, users can compare two variables or NERRS locations through time using the Aggregation widget. For all widgets, users can adjust the time period of interest. Plots and tables can also be downloaded for use in outreach, education, or further analysis. The tools and forums are meant to build a community of practice to move SWMP data analysis forward. All widgets will be demonstrated live at the poster session. This abstract is for a poster presentation at the 2016 annual meeting for the National Estuarine Research Reserve System, Nov. 13-18. We will describe our online web resources for the analysis and interpretation of monitoring data.

  7. Implementation and reporting of causal mediation analysis in 2015: a systematic review in epidemiological studies.

    PubMed

    Liu, Shao-Hsien; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-07-20

    Causal mediation analysis is often used to understand the impact of variables along the causal pathway of an occurrence relation. How well studies apply and report the elements of causal mediation analysis remains unknown. We systematically reviewed epidemiological studies published in 2015 that employed causal mediation analysis to estimate direct and indirect effects of observed associations between an exposure and an outcome. We identified potential epidemiological studies through conducting a citation search within Web of Science and a keyword search within PubMed. Two reviewers independently screened studies for eligibility. For eligible studies, one reviewer performed data extraction, and a senior epidemiologist confirmed the extracted information. Empirical application and methodological details of the technique were extracted and summarized. Thirteen studies were eligible for data extraction. While the majority of studies reported and identified the effect measures, most studies lacked sufficient detail on the extent to which identifiability assumptions were satisfied. Although most studies addressed issues of unmeasured confounders either through empirical approaches or sensitivity analyses, the majority did not examine the potential bias arising from measurement error of the mediator. Some studies allowed for exposure-mediator interaction, and only a few presented results from models both with and without interactions. Power calculations were scarce. Reporting of causal mediation analysis is varied and suboptimal. Given that the application of causal mediation analysis will likely continue to increase, developing standards for reporting causal mediation analysis in epidemiological research would be prudent.

  8. Amygdala activity at encoding corresponds with memory vividness and with memory for select episodic details.

    PubMed

    Kensinger, Elizabeth A; Addis, Donna Rose; Atapattu, Ranga K

    2011-03-01

    It is well known that amygdala activity during encoding corresponds with subsequent memory for emotional information. It is less clear how amygdala activity relates to the subjective and objective qualities of a memory. In the present study, participants viewed emotional and neutral objects while undergoing a functional magnetic resonance imaging scan. Participants then took a memory test, identifying which verbal labels named a studied object and indicating the vividness of their memory for that object. They then retrieved episodic details associated with each object's presentation, selecting which object exemplar had been studied and indicating in which screen quadrant, study list, and with which encoding question the exemplar had been studied. Parametric analysis of the encoding data allowed examination of the processes that tracked with increasing memory vividness or with an increase in the diversity of episodic details remembered. Dissociable networks tracked these two increases, and amygdala activity corresponded with the former but not the latter. Subsequent-memory analyses revealed that amygdala activity corresponded with memory for exemplar type but not for other episodic features. These results emphasize that amygdala activity does not ensure accurate encoding of all types of episodic detail, yet it does support encoding of some item-specific details and leads to the retention of a memory that will feel subjectively vivid. The types of episodic details tied to amygdala engagement may be those that are most important for creating a subjectively vivid memory.

  9. Amygdala Activity at Encoding Corresponds with Memory Vividness and with Memory for Select Episodic Details

    PubMed Central

    Kensinger, Elizabeth A.; Addis, Donna Rose; Atapattu, Ranga K.

    2011-01-01

    It is well known that amygdala activity during encoding corresponds with subsequent memory for emotional information. It is less clear how amygdala activity relates to the subjective and objective qualities of a memory. In the present study, participants viewed emotional and neutral objects while undergoing a functional magnetic resonance imaging scan. Participants then took a memory test, identifying which verbal labels named a studied object and indicating the vividness of their memory for that object. They then retrieved episodic details associated with each object’s presentation, selecting which object exemplar had been studied and indicating in which screen quadrant, study list, and with which encoding question the exemplar had been studied. Parametric analysis of the encoding data allowed examination of the processes that tracked with increasing memory vividness or with an increase in the diversity of episodic details remembered. Dissociable networks tracked these two increases, and amygdala activity corresponded with the former but not the latter. Subsequent-memory analyses revealed that amygdala activity corresponded with memory for exemplar type but not for other episodic features. These results emphasize that amygdala activity does not ensure accurate encoding of all types of episodic detail, yet it does support encoding of some item-specific details and leads to the retention of a memory that will feel subjectively vivid. The types of episodic details tied to amygdala engagement may be those that are most important for creating a subjectively vivid memory. PMID:21262244

  10. Analytical chemistry in water quality monitoring during manned space missions

    NASA Astrophysics Data System (ADS)

    Artemyeva, Anastasia A.

    2016-09-01

    Water quality monitoring during human spaceflights is essential. However, most of the traditional methods require sample collection with subsequent ground analysis because of limitations in volume, power, safety, and gravity. Space missions are becoming longer; hence methods suitable for in-flight monitoring are in demand. Since 2009, water quality has been monitored in-flight with colorimetric methods allowing for detection of iodine and ionic silver. Organic compounds in water have been monitored with a second-generation total organic carbon analyzer, which has provided information on the amount of carbon in water at both the U.S. and Russian segments of the International Space Station since 2008. The disadvantage of this approach is the lack of compound-specific information. Recently developed methods and tools may allow more detailed information on water quality to be obtained in flight. Namely, microanalyzers based on potentiometric measurements have been designed for online detection of chloride, potassium, and nitrate ions and of ammonia. The recent application of the current, highly developed air quality monitoring system to water analysis was a logical step, because most of the target analytes are the same in air and water. An electro-thermal vaporizer was designed, manufactured, and coupled with the air quality control system. This development allows the analytes to be liberated from the aqueous matrix and then analyzed compound-specifically in the gas phase.

  11. STEREO In-situ Data Analysis

    NASA Astrophysics Data System (ADS)

    Schroeder, P. C.; Luhmann, J. G.; Davis, A. J.; Russell, C. T.

    2006-12-01

    STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long-duration, detailed observations of 1 AU magnetic field structures, plasma and suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument takes plasma ion composition measurements, completing STEREO's comprehensive in-situ perspective. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment makes it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICMEs) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots has been developed which integrates STEREO's in-situ data with data from a variety of other missions, including WIND and ACE. Also, an application program interface (API) is provided, allowing users to create custom software that ties directly into STEREO's data set. The API allows for more advanced forms of data mining than are currently available through most web-based data services. A variety of data access techniques and the development of cross-spacecraft data analysis tools allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.

  12. A quantitative framework for flower phenotyping in cultivated carnation (Dianthus caryophyllus L.).

    PubMed

    Chacón, Borja; Ballester, Roberto; Birlanga, Virginia; Rolland-Lagan, Anne-Gaëlle; Pérez-Pérez, José Manuel

    2013-01-01

    The most important breeding goals in ornamental crops concern plant appearance and flower characteristics, for which selection is performed visually on the direct offspring of crosses. We developed an image analysis toolbox for the acquisition of flower and petal images from cultivated carnation (Dianthus caryophyllus L.) that was validated by a detailed analysis of flower and petal size and shape in 78 commercial cultivars of D. caryophyllus, including 55 standard, 22 spray and 1 pot carnation cultivars. Correlation analyses allowed us to reduce the number of parameters accounting for the observed variation in flower and petal morphology. Convexity was used as a descriptor for the level of serration in flowers and petals. We used a landmark-based approach that allowed us to identify eight main principal components (PCs) accounting for most of the variance observed in petal shape. The effect and the strength of these PCs in standard and spray carnation cultivars are consistent with shared underlying mechanisms involved in the morphological diversification of petals in both subpopulations. Our results also indicate that neighbor-joining trees built with morphological data might infer certain phylogenetic relationships among carnation cultivars. Based on estimated broad-sense heritability values for some flower and petal features, different genetic determinants likely modulate the responses of flower and petal morphology to environmental cues in this species. We believe our image analysis toolbox could allow capturing flower variation in other species of high ornamental value.
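    The landmark-based principal component step described above can be sketched with plain NumPy. The toy landmark data below is invented, and the real toolbox's preprocessing (landmark alignment and scaling) is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_petals, n_landmarks = 60, 10
# Toy data: each petal outline is 10 (x, y) landmarks around a half circle,
# perturbed by random shape noise (real data would be aligned/scaled first).
theta = np.linspace(0.0, np.pi, n_landmarks)
base = np.column_stack([np.cos(theta), np.sin(theta)]).ravel()
shapes = base + 0.05 * rng.normal(size=(n_petals, 2 * n_landmarks))

# PCA via SVD of the mean-centred landmark matrix
centred = shapes - shapes.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance fraction per shape PC
scores = centred @ vt[:8].T              # petal scores on the first 8 PCs
print(scores.shape, explained[:3].round(3))
```

    The per-petal score matrix is what downstream steps (heritability estimates, neighbor-joining trees) would consume.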

  13. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

    Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
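    The reported cost figure can be understood as a levelized cost: discounted lifetime expenditures divided by discounted lifetime production. A minimal sketch with invented inputs follows; these numbers are not the H2A or HYSYS values from the study, so the resulting $/kg differs from the paper's $2.68/kg.

```python
import numpy as np

# Illustrative inputs (NOT the study's H2A values)
rate = 0.10                   # internal rate of return used as discount rate
years = 20                    # plant lifetime
capex = 400e6                 # $, installed capital
annual_opex = 60e6            # $/yr, electricity + natural gas + O&M
kg_per_year = 50_000 * 365    # 50,000 kg/day plant

t = np.arange(1, years + 1)
disc = (1 + rate) ** -t       # discount factor for each year
levelized_cost = (capex + np.sum(annual_opex * disc)) / np.sum(kg_per_year * disc)
print(f"${levelized_cost:.2f}/kg")
```

    The full H2A spreadsheet additionally handles taxes, depreciation, replacement schedules, and feedstock price escalation, which this sketch omits.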

  14. Development of Improved Models, Stochasticity, and Frameworks for the MIT Extensible Air Network Simulation

    NASA Technical Reports Server (NTRS)

    Clarke, John-Paul

    2004-01-01

    MEANS, the MIT Extensible Air Network Simulation, was created in February of 2001, and has been developed with support from NASA Ames since August of 2001. MEANS is a simulation tool which is designed to maximize fidelity without requiring data of such a low level as to preclude easy examination of alternative scenarios. To this end, MEANS is structured in a modular fashion to allow more detailed components to be brought in when desired, and left out when they would only be an impediment. Traditionally, one of the difficulties with high-fidelity models is that they require a level of detail in their data that is difficult to obtain. For analysis of past scenarios, the required data may not have been collected, or may be considered proprietary and thus difficult for independent researchers to obtain. For hypothetical scenarios, generation of the data is sufficiently difficult to be a task in and of itself. Often, simulations designed by a researcher will model exactly one element of the problem well and in detail, while assuming away other parts of the problem which are not of interest or for which data is not available. While these models are useful for working with the task at hand, they are very often not applicable to future problems. The MEANS simulation attempts to address these problems by using a modular design which provides components of varying fidelity for each aspect of the simulation. This allows the most accurate model for which data is available to be used. It also provides for easy analysis of sensitivity to data accuracy. This can be particularly useful in the case where accurate data is available for some subset of the situations that are to be considered. Furthermore, the ability to use the same model while examining effects on different parts of a system reduces the time spent learning the simulation, and provides for easier comparisons between changes to different parts of the system.

  15. An introduction to wavelet analysis in oceanography and meteorology - With application to the dispersion of Yanai waves

    NASA Technical Reports Server (NTRS)

    Meyers, Steven D.; Kelly, B. G.; O'Brien, J. J.

    1993-01-01

    Wavelet analysis is a relatively new technique that is an important addition to standard signal analysis methods. Unlike Fourier analysis that yields an average amplitude and phase for each harmonic in a dataset, the wavelet transform produces an instantaneous estimate or local value for the amplitude and phase of each harmonic. This allows detailed study of nonstationary spatial or time-dependent signal characteristics. The wavelet transform is discussed, examples are given, and some methods for preprocessing data for wavelet analysis are compared. By studying the dispersion of Yanai waves in a reduced gravity equatorial model, the usefulness of the transform is demonstrated. The group velocity is measured directly over a finite range of wavenumbers by examining the time evolution of the transform. The results agree well with linear theory at higher wavenumber but the measured group velocity is reduced at lower wavenumbers, possibly due to interaction with the basin boundaries.
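    A minimal frequency-domain Morlet transform illustrates the local amplitude estimate described above. The w0 = 6 wavelet, the scale grid, and the test signal are illustrative choices, not the paper's preprocessing.

```python
import numpy as np

def morlet_cwt(signal, dt, scales, w0=6.0):
    """Continuous wavelet transform with an analytic Morlet mother wavelet,
    computed scale by scale via the FFT."""
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)    # angular frequencies
    sig_hat = np.fft.fft(signal)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier transform of the Morlet wavelet at scale s (positive freqs only)
        psi_hat = np.pi**-0.25 * np.exp(-0.5 * (s * omega - w0)**2) * (omega > 0)
        out[i] = np.fft.ifft(sig_hat * psi_hat * np.sqrt(s))
    return out

# Test signal: a 5 Hz tone whose amplitude is localized around t = 5 s
dt = 0.01
t = np.arange(0, 10, dt)
sig = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 5) / 1.5) ** 2)
scales = 6.0 / (2 * np.pi * np.linspace(1.0, 10.0, 40))   # scale ~ w0 / omega_peak
power = np.abs(morlet_cwt(sig, dt, scales)) ** 2

# Unlike a global Fourier spectrum, the transform localizes the burst in time
t_peak = t[np.argmax(power.max(axis=0))]
print(t_peak)            # near 5.0 s
```

    The instantaneous amplitude and phase at each scale are what make the group-velocity measurement of the Yanai waves possible.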

  16. Striped Data Server for Scalable Parallel Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Jin; Gutsche, Oliver; Mandrichenko, Igor

    A columnar data representation is known to be an efficient way for data storage, specifically in cases when the analysis is often done based only on a small fragment of the available data structures. A data representation like Apache Parquet is a step forward from a columnar representation, which splits data horizontally to allow for easy parallelization of data analysis. Based on the general idea of columnar data storage, working on the [LDRD Project], we have developed a striped data representation, which, we believe, is better suited to the needs of High Energy Physics data analysis. A traditional columnar approach allows for efficient data analysis of complex structures. While keeping all the benefits of columnar data representations, the striped mechanism goes further by enabling easy parallelization of computations without requiring special hardware. We will present an implementation and some performance characteristics of such a data representation mechanism using a distributed no-SQL database or a local file system, unified under the same API and data representation model. The representation is efficient and at the same time simple, so that it allows for a common data model and APIs for a wide range of underlying storage mechanisms such as distributed no-SQL databases and local file systems. Striped storage adopts NumPy arrays as its basic data representation format, which makes it easy and efficient to use in Python applications. The Striped Data Server is a web service, which hides the server implementation details from the end user, easily exposes data to WAN users, and allows the use of well-known and mature data caching solutions to further increase data access efficiency. We are considering the Striped Data Server as the core of an enterprise-scale data analysis platform for High Energy Physics and similar areas of data processing. We have been testing this architecture with a 2 TB dataset from a CMS dark matter search and plan to expand it to the multi-100 TB or even PB scale. We will present the striped format, the Striped Data Server architecture, and performance test results.
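    The striping idea, each column stored as NumPy arrays cut into fixed-size row ranges that independent workers can process, can be sketched as follows. The event columns and selection cuts are invented, and this is not the Striped Data Server API.

```python
import numpy as np

def stripe(column, stripe_size):
    """Cut a 1-D column array into fixed-size stripes (row ranges)."""
    return [column[i:i + stripe_size] for i in range(0, len(column), stripe_size)]

rng = np.random.default_rng(2)
events = {                                   # columnar event data
    "pt":  rng.exponential(30.0, size=10_000),
    "eta": rng.uniform(-2.5, 2.5, size=10_000),
}
stripes = {name: stripe(col, 1_000) for name, col in events.items()}

# Matching stripes of different columns cover the same events, so each
# (pt, eta) stripe pair could go to an independent worker; partial results
# are then combined. Here we map over the pairs serially.
striped_total = sum(int(np.sum((pt > 50) & (np.abs(eta) < 2.0)))
                    for pt, eta in zip(stripes["pt"], stripes["eta"]))
direct_total = int(np.sum((events["pt"] > 50) & (np.abs(events["eta"]) < 2.0)))
print(striped_total == direct_total)         # striping does not change the answer
```

    Because each stripe is an ordinary NumPy array, the same analysis code runs unchanged whether stripes come from a local file or a no-SQL store.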

  17. Lovis Corinth: integrating hemineglect and spatial distortions.

    PubMed

    Bäzner, H; Hennerici, M G

    2007-01-01

    Lovis Corinth suffered a right-hemispheric stroke at the age of 53 years, but died only 14 years later. The huge number of artworks that he produced after this life-threatening disease allows a detailed analysis of his poststroke artwork in comparison to his prestroke artwork. When this analysis is performed by a neurologist, an enormous diversity of subtle stroke sequelae can be discovered, which can mostly be explained by a left-sided hemineglect. These findings clearly go far beyond pure psychological processes. Moreover, Corinth is a good and motivating example for patients suffering disability after a stroke, because he was able to produce great artwork after his stroke. Lovis Corinth was struggling against a motor disability that admittedly did not severely affect his artistic production, but he also had to fight against severe neuropsychological deficits that did have clear consequences for his artistic production. Corinth's credo was 'true art means to use unreality'. Taken together with the often cited phrase of 'drawing means to [details]', there will be a clear-cut interpretation for the neurologist that can be derived from the understanding of a right-hemisphere lesion and subsequent left-sided neglect.

  18. Atomistic modeling of the low-frequency mechanical modes and Raman spectra of icosahedral virus capsids

    NASA Astrophysics Data System (ADS)

    Dykeman, Eric C.; Sankey, Otto F.

    2010-02-01

    We describe a technique for calculating the low-frequency mechanical modes and frequencies of a large symmetric biological molecule where the eigenvectors of the Hessian matrix are determined with full atomic detail. The method, which follows order-N methods used in electronic structure theory, determines the subset of lowest-frequency modes while using group theory to reduce the complexity of the problem. We apply the method to three icosahedral viruses of various T numbers and sizes: the human viruses polio and hepatitis B, and the cowpea chlorotic mottle virus, a plant virus. From the normal-mode eigenvectors, we use a bond polarizability model to predict a low-frequency Raman scattering profile for the viruses. The full atomic detail in the displacement patterns combined with an empirical potential-energy model allows a comparison of the fully atomic normal modes with elastic network models and with normal-mode analysis using only dihedral degrees of freedom. We find that coarse-grained normal-mode analysis (particularly the elastic network model) can predict the displacement patterns for the first few (~10) low-frequency modes that are global and cooperative.
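    The core numerical step, diagonalizing a Hessian to obtain mode frequencies and displacement patterns, can be sketched on a toy system. A 1-D chain of unit masses and unit springs stands in for a capsid's million-atom Hessian; the paper's group-theoretic symmetry reduction is not shown.

```python
import numpy as np

# Toy "molecule": 4 unit masses in a chain coupled by unit springs.
# For this harmonic potential the Hessian is the path-graph Laplacian.
n = 4
hessian = np.zeros((n, n))
for i in range(n - 1):
    hessian[i, i] += 1.0
    hessian[i + 1, i + 1] += 1.0
    hessian[i, i + 1] -= 1.0
    hessian[i + 1, i] -= 1.0

# With unit masses, eigenvalues of the Hessian are squared mode frequencies
# and eigenvectors are the displacement patterns of the normal modes.
evals, modes = np.linalg.eigh(hessian)
freqs = np.sqrt(np.clip(evals, 0.0, None))
print(freqs.round(4))     # lowest mode is the zero-frequency rigid translation
```

    For a virus capsid the matrix is far too large for dense `eigh`, which is why the paper's order-N, symmetry-reduced scheme targets only the lowest-frequency subset.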

  19. Diagnostics of laser-produced plasmas based on the analysis of intensity ratios of He-like ions X-ray emission

    DOE PAGES

    Ryazantsev, S. N.; Skobelev, I. Yu.; Faenov, A. Ya.; ...

    2016-12-08

    In this paper, we detail the diagnostic technique used to infer spatially resolved electron temperatures and densities in experiments dedicated to investigating the generation of magnetically collimated plasma jets. It is shown that the relative intensities of the resonance transitions in emitting He-like ions can be used to measure the temperature in such recombining plasmas. The intensities of these transitions are sensitive to the plasma density in the range of 10^16–10^20 cm^-3 and to plasma temperatures ranging from 10 to 100 eV for ions with a nuclear charge Z_n ~ 10. We show how detailed calculations of the emissivity of F VIII ions allow us to determine the parameters of the plasma jets that were created using the ELFIE nanosecond laser facility (Ecole Polytechnique, France). Lastly, the diagnostic and analysis technique detailed here can be applied in a broader context than the one of this study, i.e., to diagnose any recombining plasma containing He-like fluorine ions.
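    The basic inversion step, matching a measured intensity ratio against a precomputed ratio-versus-temperature curve, can be sketched as a table lookup. The exponential ratio curve below is an invented stand-in; real curves come from collisional-radiative modelling of the He-like ion.

```python
import numpy as np

# Invented monotonic ratio(Te) curve standing in for a collisional-radiative
# calculation of He-like line intensity ratios.
te_grid = np.linspace(10.0, 100.0, 91)          # electron temperature, eV
ratio_grid = 2.0 * np.exp(-te_grid / 40.0)      # assumed diagnostic ratio

measured_ratio = 0.7
# np.interp needs ascending abscissae, so reverse the decreasing curve
te_inferred = np.interp(measured_ratio, ratio_grid[::-1], te_grid[::-1])
print(round(te_inferred, 1))                    # about 42 eV for this toy curve
```

    Density inference works the same way with a density-sensitive ratio, and spatial resolution comes from repeating the lookup for each position along the spectrometer's spatial axis.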

  20. Near-field electromagnetic holography for high-resolution analysis of network interactions in neuronal tissue

    PubMed Central

    Kjeldsen, Henrik D.; Kaiser, Marcus; Whittington, Miles A.

    2015-01-01

    Background: Brain function is dependent upon the concerted, dynamical interactions between a great many neurons distributed over many cortical subregions. Current methods of quantifying such interactions are limited by consideration only of single direct or indirect measures of a subsample of all neuronal population activity. New method: Here we present a new derivation of the electromagnetic analogy to near-field acoustic holography allowing high-resolution, vectored estimates of interactions between sources of electromagnetic activity that significantly improves this situation. In vitro voltage potential recordings were used to estimate pseudo-electromagnetic energy flow vector fields, current and energy source densities and energy dissipation in reconstruction planes at depth into the neural tissue parallel to the recording plane of the microelectrode array. Results: The properties of the reconstructed near-field estimate allowed both the utilization of super-resolution techniques to increase the imaging resolution beyond that of the microelectrode array, and facilitated a novel approach to estimating causal relationships between activity in neocortical subregions. Comparison with existing methods: The holographic nature of the reconstruction method allowed significantly better estimation of the fine spatiotemporal detail of neuronal population activity, compared with interpolation alone, beyond the spatial resolution of the electrode arrays used. Pseudo-energy flow vector mapping was possible with high temporal precision, allowing a near-realtime estimate of causal interaction dynamics. Conclusions: Basic near-field electromagnetic holography provides a powerful means to increase spatial resolution from electrode array data with careful choice of spatial filters and distance to reconstruction plane. More detailed approaches may provide the ability to volumetrically reconstruct activity patterns on neuronal tissue, but the ability to extract vectored data with the method presented already permits the study of dynamic causal interactions without bias from any prior assumptions on anatomical connectivity. PMID:26026581
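    The Fourier-domain core of such plane-to-plane field reconstruction is the angular-spectrum propagator, sketched below for a scalar field on a uniform grid. The paper's vectored pseudo-energy estimates and spatial filtering involve considerably more than this.

```python
import numpy as np

def angular_spectrum_propagate(field, dx, wavelength, dz):
    """Propagate a 2-D complex scalar field between parallel planes.
    Propagating components pick up a phase factor; evanescent components
    decay (or are amplified on back-propagation, dz < 0)."""
    ny, nx = field.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dx)
    k = 2 * np.pi / wavelength
    kz_sq = k**2 - kx[None, :]**2 - ky[:, None]**2
    kz = np.where(kz_sq >= 0, 1.0, 1j) * np.sqrt(np.abs(kz_sq))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))

# Round trip: propagating forward then backward recovers the source plane
rng = np.random.default_rng(3)
src = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
fwd = angular_spectrum_propagate(src, dx=1e-3, wavelength=1e-2, dz=1e-3)
back = angular_spectrum_propagate(fwd, dx=1e-3, wavelength=1e-2, dz=-1e-3)
print(np.allclose(back, src))    # True
```

    In near-field holography the back-propagation step amplifies the evanescent components that carry sub-wavelength detail, which is why regularizing spatial filters and the reconstruction distance must be chosen carefully.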

  1. Near-field electromagnetic holography for high-resolution analysis of network interactions in neuronal tissue.

    PubMed

    Kjeldsen, Henrik D; Kaiser, Marcus; Whittington, Miles A

    2015-09-30

    Brain function is dependent upon the concerted, dynamical interactions between a great many neurons distributed over many cortical subregions. Current methods of quantifying such interactions are limited by consideration only of single direct or indirect measures of a subsample of all neuronal population activity. Here we present a new derivation of the electromagnetic analogy to near-field acoustic holography allowing high-resolution, vectored estimates of interactions between sources of electromagnetic activity that significantly improves this situation. In vitro voltage potential recordings were used to estimate pseudo-electromagnetic energy flow vector fields, current and energy source densities and energy dissipation in reconstruction planes at depth into the neural tissue parallel to the recording plane of the microelectrode array. The properties of the reconstructed near-field estimate allowed both the utilization of super-resolution techniques to increase the imaging resolution beyond that of the microelectrode array, and facilitated a novel approach to estimating causal relationships between activity in neocortical subregions. The holographic nature of the reconstruction method allowed significantly better estimation of the fine spatiotemporal detail of neuronal population activity, compared with interpolation alone, beyond the spatial resolution of the electrode arrays used. Pseudo-energy flow vector mapping was possible with high temporal precision, allowing a near-realtime estimate of causal interaction dynamics. Basic near-field electromagnetic holography provides a powerful means to increase spatial resolution from electrode array data with careful choice of spatial filters and distance to reconstruction plane. 
More detailed approaches may provide the ability to volumetrically reconstruct activity patterns on neuronal tissue, but the ability to extract vectored data with the method presented already permits the study of dynamic causal interactions without bias from any prior assumptions on anatomical connectivity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Design and Analysis of a Turbopump for a Conceptual Expander Cycle Upper-Stage Engine

    NASA Technical Reports Server (NTRS)

    Dorney, Daniel J.; Rothermel, Jeffry; Griffin, Lisa W.; Thornton, Randall J.; Forbes, John C.; Skelly, Stephen E.; Huber, Frank W.

    2006-01-01

    As part of the development of technologies for rocket engines that will power spacecraft to the Moon and Mars, a program was initiated to develop a conceptual upper stage engine with wide flow range capability. The resulting expander cycle engine design employs a radial turbine to allow higher pump speeds and efficiencies. In this paper, the design and analysis of the pump section of the engine are discussed. One-dimensional meanline analyses and three-dimensional unsteady computational fluid dynamics simulations were performed for the pump stage. Configurations with both vaneless and vaned diffusers were investigated. Both the meanline analysis and computational predictions show that the pump will meet the performance objectives. Additional details describing the development of a water flow facility test are also presented.

  3. Further Analysis on the Mystery of the Surveyor III Dust Deposits

    NASA Technical Reports Server (NTRS)

    Metzger, Philip; Hintze, Paul; Trigwell, Steven; Lane, John

    2012-01-01

    The Apollo 12 lunar module (LM) landing near the Surveyor III spacecraft at the end of 1969 has remained the primary experimental verification of the predicted physics of plume ejecta effects from a rocket engine interacting with the surface of the moon. This was made possible by the return of the Surveyor III camera housing by the Apollo 12 astronauts, allowing detailed analysis of the composition of dust deposited by the LM plume. It was soon realized after the initial analysis of the camera housing that the LM plume tended to remove more dust than it had deposited. In the present study, coupons from the camera housing have been reexamined. In addition, plume effects recorded in landing videos from each Apollo mission have been studied for possible clues.

  4. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    NASA Technical Reports Server (NTRS)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  5. Provenance Challenges for Earth Science Dataset Publication

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2011-01-01

    Modern science is increasingly dependent on computational analysis of very large data sets. Organizing, referencing, publishing those data has become a complex problem. Published research that depends on such data often fails to cite the data in sufficient detail to allow an independent scientist to reproduce the original experiments and analyses. This paper explores some of the challenges related to data identification, equivalence and reproducibility in the domain of data intensive scientific processing. It will use the example of Earth Science satellite data, but the challenges also apply to other domains.

  6. H1 in RSA galaxies

    NASA Technical Reports Server (NTRS)

    Richter, Otto-G.

    1993-01-01

    The original Revised Shapley-Ames (RSA) galaxy sample of almost 1300 galaxies has been augmented with further bright galaxies from the RSA appendix as well as newer galaxy catalogs. A complete and homogeneous, strictly magnitude-limited all-sky sample of 2345 galaxies brighter than 13.4 in apparent blue magnitude was formed. New 21 cm H1 line observations for more than 600 RSA galaxies have been combined with all previously available H1 data from the literature. This new extensive data set allows detailed tests of widely accepted 'standard' reduction and analysis techniques.

  7. CDF Top Physics

    DOE R&D Accomplishments Database

    Tartarelli, G. F.; CDF Collaboration

    1996-05-01

    The authors present the latest results on top physics obtained by the CDF experiment at the Fermilab Tevatron collider. The data sample used for these analyses (about 110 pb^-1) represents almost the entire statistics collected by CDF during four years (1992–95) of data taking. This large data sample has allowed detailed studies of top production and decay properties. The results discussed here include the determination of the top quark mass, the measurement of the production cross section, the study of the kinematics of the top events, and a look at top decays.

  8. Kinetics of the electric double layer formation modelled by the finite difference method

    NASA Astrophysics Data System (ADS)

    Valent, Ivan

    2017-11-01

    The dynamics of electric double layer formation in a 100 mM NaCl solution under sudden potential steps of 10 and 20 mV were simulated using the Poisson-Nernst-Planck theory and the VLUGR2 solver for partial differential equations. The approach used was verified by comparing the obtained steady-state solution with the available exact solution. The simulations allowed for detailed analysis of the relaxation processes of the individual ions and the electric potential. Some computational aspects of the problem are discussed.
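    A diffusion-only caricature of such a finite-difference relaxation calculation is sketched below. The actual study couples Nernst-Planck ion fluxes to the Poisson equation; the parameter values here are merely typical, and only a concentration step relaxing toward steady state is shown.

```python
import numpy as np

D = 1.6e-9        # m^2/s, typical aqueous ion diffusivity
L = 1e-6          # m, domain size
nx, nt = 101, 20000
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # satisfies the explicit stability limit dt <= dx^2/(2D)

c = np.ones(nx)               # uniform initial (scaled) concentration
c[0] = 2.0                    # step change held at the electrode surface

for _ in range(nt):
    # explicit central-difference update of the interior points
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[-1] = 1.0               # bulk concentration held at the far boundary

# After enough steps the profile approaches the linear steady state
steady = np.linspace(2.0, 1.0, nx)
print(np.max(np.abs(c - steady)))
```

    The explicit scheme's time-step restriction is one of the computational aspects that motivates implicit adaptive solvers such as VLUGR2 for the stiff coupled problem.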

  9. The algorithmic details of polynomials application in the problems of heat and mass transfer control on the hypersonic aircraft permeable surfaces

    NASA Astrophysics Data System (ADS)

    Bilchenko, G. G.; Bilchenko, N. G.

    2018-03-01

    Mathematical modeling problems for the effective control of heat and mass transfer on the permeable surfaces of hypersonic aircraft are considered. The constructive and gas-dynamical restrictions on the control (the blowing) are analyzed for porous and perforated surfaces. Classes of functions that allow the controls to be realized while respecting the arising types of restrictions are suggested. Estimates of the computational complexity of applying W. G. Horner's scheme to the C. Hermite interpolation polynomial are given.
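    Horner's scheme itself is short; the complexity estimates in the abstract rest on it needing only n multiplications and n additions for a degree-n polynomial, versus roughly n^2/2 multiplications for naive term-by-term evaluation.

```python
def horner(coeffs, x):
    """Evaluate a polynomial given coefficients from highest to lowest degree.
    One multiplication and one addition per coefficient: O(n) total."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

# p(x) = 2x^3 - 6x^2 + 2x - 1 evaluated at x = 3
print(horner([2, -6, 2, -1], 3))   # -> 5.0
```

    The same recurrence evaluates a Hermite interpolant once its coefficients are in this nested form.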

  10. Evaluating performance of biomedical image retrieval systems – an overview of the medical image retrieval task at ImageCLEF 2004–2013

    PubMed Central

    Kalpathy-Cramer, Jayashree; de Herrera, Alba García Seco; Demner-Fushman, Dina; Antani, Sameer; Bedrick, Steven; Müller, Henning

    2014-01-01

    Medical image retrieval and classification have been extremely active research topics over the past 15 years. With the ImageCLEF benchmark in medical image retrieval and classification a standard test bed was created that allows researchers to compare their approaches and ideas on increasingly large and varied data sets including generated ground truth. This article describes the lessons learned in ten evaluation campaigns. A detailed analysis of the data also highlights the value of the resources created. PMID:24746250

  11. Baseline-dependent averaging in radio interferometry

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.; Willis, A. G.; Salvini, S.

    2018-05-01

    This paper presents a detailed analysis of the applicability and benefits of baseline-dependent averaging (BDA) in modern radio interferometers, and in particular the Square Kilometre Array. We demonstrate that BDA does not affect the information content of the data other than a well-defined decorrelation loss for which closed-form expressions are readily available. We verify these theoretical findings using simulations. We therefore conclude that BDA can be used reliably in modern radio interferometry, allowing a reduction of visibility data volume (and hence of the processing costs for handling visibility data) by more than 80 per cent.
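    The mechanics of BDA can be sketched as block-averaging each baseline's visibility stream with a window inversely proportional to baseline length. The window scaling and all sizes below are illustrative; real implementations derive the window from the allowed decorrelation loss.

```python
import numpy as np

def bda_window(baseline_m, max_baseline_m, max_window=64):
    """Averaging window in samples, inversely proportional to baseline length,
    so that time smearing is roughly equalized across baselines."""
    w = int(round(max_baseline_m / baseline_m))
    return max(1, min(max_window, w))

rng = np.random.default_rng(4)
n_times = 1024
baselines = [100.0, 1_000.0, 10_000.0]       # metres (illustrative)
vis = {b: rng.normal(size=n_times) + 1j * rng.normal(size=n_times)
       for b in baselines}

compressed = {}
for b, v in vis.items():
    w = bda_window(b, max(baselines))
    v = v[: (len(v) // w) * w]               # trim to a whole number of blocks
    compressed[b] = v.reshape(-1, w).mean(axis=1)

sizes = {b: len(c) for b, c in compressed.items()}
print(sizes)    # the shortest baselines are averaged (compressed) the most
```

    Because an interferometer has far more short baselines than long ones, this per-baseline compression is where the large overall data-volume reduction comes from.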

  12. Torus Breakdown and Homoclinic Chaos in a Glow Discharge Tube

    NASA Astrophysics Data System (ADS)

    Ginoux, Jean-Marc; Meucci, Riccardo; Euzzor, Stefano

    2017-12-01

    Starting from historical research, we used, like Van der Pol and Le Corbeiller, a cubic function for modeling the current-voltage characteristic of a direct current low-pressure plasma discharge tube, i.e. a neon tube. This led us to propose a new four-dimensional autonomous dynamical system that describes the experimentally observed phenomenon. Mathematical analysis and detailed numerical investigations of this fourth-order torus circuit then enabled us to highlight bifurcation routes from torus breakdown to homoclinic chaos following the Newhouse-Ruelle-Takens scenario.

  13. Custom Coordination Environments for Lanthanoids: Tripodal Ligands Achieve Near-Perfect Octahedral Coordination for Two Dysprosium-Based Molecular Nanomagnets.

    PubMed

    Lim, Kwang Soo; Baldoví, José J; Jiang, ShangDa; Koo, Bong Ho; Kang, Dong Won; Lee, Woo Ram; Koh, Eui Kwan; Gaita-Ariño, Alejandro; Coronado, Eugenio; Slota, Michael; Bogani, Lapo; Hong, Chang Seop

    2017-05-01

    Controlling the coordination sphere of lanthanoid complexes is a challenging and critical step toward controlling their relaxation properties. Here we present the synthesis of hexacoordinated dysprosium single-molecule magnets, where tripodal ligands achieve a near-perfect octahedral coordination. We perform a complete experimental and theoretical investigation of their magnetic properties, including a full single-crystal magnetic anisotropy analysis. The combination of electrostatic and crystal-field computational tools (SIMPRE and CONDON codes) allows us to explain the static behavior of these systems in detail.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voynikova, D. S., E-mail: desi-sl2000@yahoo.com; Gocheva-Ilieva, S. G., E-mail: snegocheva@yahoo.com; Ivanov, A. V., E-mail: aivanov-99@yahoo.com

    Numerous time series methods are used in environmental sciences, allowing the detailed investigation of air pollution processes. The goal of this study is to present an empirical analysis of various aspects of stochastic modeling, in particular the ARIMA/SARIMA methods. The subject of investigation is air pollution in the town of Kardzhali, Bulgaria, with two problematic pollutants: sulfur dioxide (SO2) and particulate matter (PM10). Various SARIMA transfer function models are built taking into account meteorological factors, data transformations, and the use of different horizons selected to predict future levels of concentrations of the pollutants.
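The paper fits full SARIMA transfer-function models; as a toy illustration of the autoregressive core of that family only, an AR(1) coefficient can be estimated by lag-1 least squares and iterated forward for a forecast (a sketch, not the study's methodology):

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + noise,
    the autoregressive building block of the ARIMA/SARIMA family."""
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(a * a for a in series[:-1])
    return num / den

def forecast_ar1(last_value, phi, steps):
    """Iterate the fitted recurrence forward to predict future levels."""
    out = []
    for _ in range(steps):
        last_value = phi * last_value
        out.append(last_value)
    return out

# a noiseless series built with phi = 0.5 recovers the coefficient exactly
x = [1.0]
for _ in range(50):
    x.append(0.5 * x[-1])
print(fit_ar1(x))  # → 0.5
```

Real pollutant series need the differencing, seasonal terms, and exogenous meteorological regressors the abstract mentions; libraries such as statsmodels provide those.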

  15. Smooth and rapid microwave synthesis of MIL-53(Fe) including superparamagnetic γ-Fe2O3 nanoparticles

    NASA Astrophysics Data System (ADS)

    Wengert, Simon; Albrecht, Joachim; Ruoss, Stephen; Stahl, Claudia; Schütz, Gisela; Schäfer, Ronald

    2017-12-01

    MIL-53(Fe) linked to superparamagnetic γ-Fe2O3 nanoparticles was created using time-efficient microwave synthesis. Intermediates as well as the final product have been characterized by Dynamic Light Scattering (DLS), Infrared Spectroscopy (FTIR) and Thermal Gravimetric Analysis (TGA). It is found that this route allows the production of Fe nanoparticles with typical sizes of about 80 nm that are embedded inside the metal-organic structures. Detailed magnetization measurements using SQUID magnetometry revealed a nearly reversible magnetization loop indicating essentially superparamagnetic behavior.
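The "nearly reversible magnetization loop" that signals superparamagnetism is classically modeled by the Langevin function; a short illustrative sketch (an idealized model, not a fit to the paper's SQUID data):

```python
import math

def langevin_magnetization(m_sat, x):
    """Anhysteretic magnetization of an ideal superparamagnet:
    M = Ms * (coth(x) - 1/x) with x = mu*H / (kB*T). A reversible,
    hysteresis-free loop of this shape is the superparamagnetic
    signature the abstract describes."""
    if abs(x) < 1e-6:
        return m_sat * x / 3.0  # small-argument limit avoids 0/0
    return m_sat * (1.0 / math.tanh(x) - 1.0 / x)

# saturates toward Ms at large fields and is odd in the field
print(round(langevin_magnetization(1.0, 100.0), 3))  # → 0.99
```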

  16. Technological variability in the Late Palaeolithic lithic industries of the Egyptian Nile Valley: The case of the Silsilian and Afian industries

    PubMed Central

    2017-01-01

    During the Nubia Salvage Campaign and the subsequent expeditions from the 1960’s to the 1980’s, numerous sites attributed to the Late Palaeolithic (~25–15 ka) were found in the Nile Valley, particularly in Nubia and Upper Egypt. This region is one of the few to have allowed human occupations during the dry Marine Isotope Stage 2 and is therefore key to understanding how human populations adapted to environmental changes at this time. This paper focuses on two sites located in Upper Egypt, excavated by the Combined Prehistoric Expedition: E71K18, attributed to the Afian industry and E71K20, attributed to the Silsilian industry. It aims to review the geomorphological and chronological evidence of the sites, present a technological analysis of the lithic assemblages in order to provide data that can be used in detailed comparative studies, which will allow discussion of technological variability in the Late Palaeolithic of the Nile Valley and its place within the regional context. The lithic analysis relies on the chaîne opératoire concept combined with an attribute analysis to allow quantification. This study (1) casts doubts on the chronology of E71K18 and related Afian industry, which could be older or younger than previously suggested, highlights (2) distinct technological characteristics for the Afian and the Silsilian, as well as (3) similar technological characteristics which allow to group them under a same broad techno-cultural complex, distinct from those north or south of the area. PMID:29281660

  17. An end-to-end X-IFU simulator: constraints on ICM kinematics

    NASA Astrophysics Data System (ADS)

    Roncarelli, M.; Gaspari, M.; Ettori, S.; Brighenti, F.

    2017-10-01

    In the coming years, the study of ICM physics will benefit from a completely new type of observation made available by the X-IFU microcalorimeter of the ATHENA X-ray telescope. X-IFU will combine energy and spatial resolution (2.5 eV and 5 arcsec), allowing it to map line emission and, potentially, to characterise ICM dynamics in unprecedented detail. I will present an end-to-end simulator aimed at describing the ability of X-IFU to characterise ICM velocity features. Starting from hydrodynamical simulations of ICM turbulence (Gaspari et al. 2013), we went through a detailed and realistic spectral analysis of simulated observations to derive mapped quantities of gas density, temperature, metallicity and, most notably, centroid shift and velocity broadening of the emission lines, with relative errors. Our results show that X-IFU will be able to map the ICM velocity features in great detail and provide precise measurements of the broadening power spectrum. This will provide interesting constraints on the characteristics of turbulent motions, both on large and small scales.
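The centroid-shift and line-broadening quantities mentioned above reduce, at their simplest, to a count-weighted moment of the line profile and a Doppler conversion; a minimal sketch (toy numbers, not the simulator's spectral-fitting pipeline):

```python
C_KM_S = 299792.458  # speed of light, km/s

def centroid_and_width(energies_kev, counts):
    """Count-weighted centroid and RMS width of a line profile."""
    tot = sum(counts)
    mu = sum(e * c for e, c in zip(energies_kev, counts)) / tot
    var = sum(c * (e - mu) ** 2 for e, c in zip(energies_kev, counts)) / tot
    return mu, var ** 0.5

def doppler_velocity(e_obs_kev, e_rest_kev):
    """Line-of-sight velocity from a centroid shift (non-relativistic);
    positive means the gas is receding (line redshifted)."""
    return C_KM_S * (e_rest_kev - e_obs_kev) / e_rest_kev

# an Fe line with rest energy 6.7 keV observed at 6.69 keV → ~447 km/s
print(round(doppler_velocity(6.69, 6.7), 1))
```

The same conversion applied to the RMS width gives the velocity broadening whose power spectrum the abstract discusses.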

  18. BAMS2 Workspace: a comprehensive and versatile neuroinformatic platform for collating and processing neuroanatomical connections

    PubMed Central

    Bota, Mihail; Talpalaru, Ştefan; Hintiryan, Houri; Dong, Hong-Wei; Swanson, Larry W.

    2014-01-01

    We present in this paper a novel neuroinformatic platform, the BAMS2 Workspace (http://brancusi1.usc.edu), designed for storing and processing information about gray matter region axonal connections. This de novo constructed module allows registered users to directly collate their data by using a simple and versatile visual interface. It also allows construction and analysis of sets of connections associated with gray matter region nomenclatures from any designated species. The Workspace includes a set of tools allowing the display of data in matrix and network formats, and the uploading of processed information in visual, PDF, CSV, and Excel formats. Finally, the Workspace can be accessed anonymously by third party systems to create individualized connectivity networks. All features of the BAMS2 Workspace are described in detail, and are demonstrated with connectivity reports collated in BAMS and associated with the rat sensory-motor cortex, medial frontal cortex, and amygdalar regions. PMID:24668342
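The matrix view of collated connection reports amounts to filling a region-by-region adjacency matrix; a minimal sketch of that idea (the region names and strengths are invented for the example, and this is not the BAMS2 API):

```python
def connection_matrix(reports, regions):
    """Build a region-by-region matrix from connection reports given as
    (source, target, strength) triples; rows are sources, columns
    targets, and unreported pairs stay at 0.0."""
    idx = {r: i for i, r in enumerate(regions)}
    m = [[0.0] * len(regions) for _ in regions]
    for src, dst, w in reports:
        m[idx[src]][idx[dst]] = w
    return m

# hypothetical reports between three illustrative regions
m = connection_matrix([("M1", "S1", 0.8), ("S1", "AMY", 0.3)],
                      ["M1", "S1", "AMY"])
print(m[0][1])  # → 0.8
```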

  19. PIXE analysis of caries related trace elements in tooth enamel

    NASA Astrophysics Data System (ADS)

    Annegarn, H. J.; Jodaikin, A.; Cleaton-Jones, P. E.; Sellschop, J. P. F.; Madiba, C. C. P.; Bibby, D.

    1981-03-01

    PIXE analysis has been applied to a set of twenty human teeth to determine trace element concentrations in enamel from areas susceptible to dental caries (mesial and distal contact points) and in areas less susceptible to the disease (buccal surfaces), with the aim of determining the possible roles of trace elements in the carious process. The samples were caries-free anterior incisors extracted for periodontal reasons from subjects 10-30 years of age. Prior to extraction of the sample teeth, a detailed dental history and examination was carried out for each individual. PIXE analysis, using a 3 MeV proton beam of 1 mm diameter, allowed the determination of Ca, Mn, Fe, Cu, Zn, Sr and Pb above detection limits. As demonstrated in this work, the enhanced sensitivity of PIXE analysis over electron microprobe analysis, and the capability of localised surface analysis compared with the pooled samples required for neutron activation analysis, make it a powerful and useful technique in dental analysis.

  20. Skeletal mechanism generation for surrogate fuels using directed relation graph with error propagation and sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niemeyer, Kyle E.; Sung, Chih-Jen; Raju, Mandhapati P.

    2010-09-15

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented with examples for three hydrocarbon components, n-heptane, iso-octane, and n-decane, relevant to surrogate fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP), by first applying DRGEP to efficiently remove many unimportant species prior to sensitivity analysis, which then removes further unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal. Skeletal mechanisms for n-heptane and iso-octane generated using the DRGEP, DRGASA, and DRGEPSA methods are presented and compared to illustrate the improvement of DRGEPSA. From a detailed reaction mechanism for n-alkanes covering n-octane to n-hexadecane with 2115 species and 8157 reactions, two skeletal mechanisms for n-decane generated using DRGEPSA, one covering a comprehensive range of temperature, pressure, and equivalence ratio conditions for autoignition and the other limited to high temperatures, are presented and validated. The comprehensive skeletal mechanism consists of 202 species and 846 reactions and the high-temperature skeletal mechanism consists of 51 species and 256 reactions. Both mechanisms are further demonstrated to reproduce well the results of the detailed mechanism in perfectly-stirred reactor and laminar flame simulations over a wide range of conditions. The comprehensive and high-temperature n-decane skeletal mechanisms are included as supplementary material with this article.
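The error-propagation idea behind DRGEP can be pictured as a max-product graph search: a species' importance to a target is the largest product of direct-interaction coefficients along any path connecting them, and species whose importance falls below a threshold are removed. A schematic sketch with invented coefficients (not values from the n-decane mechanism):

```python
import heapq

def drgep_importance(edges, targets):
    """DRGEP-style importance (sketch): Dijkstra-like max-product
    search from the target species. `edges[a]` maps a neighbor b to
    the direct-interaction coefficient r_ab in [0, 1]."""
    best = {t: 1.0 for t in targets}
    heap = [(-1.0, t) for t in targets]
    heapq.heapify(heap)
    while heap:
        neg_r, node = heapq.heappop(heap)
        r = -neg_r
        if r < best.get(node, 0.0):
            continue  # stale entry, a better path was already found
        for nbr, w in edges.get(node, {}).items():
            cand = r * w
            if cand > best.get(nbr, 0.0):
                best[nbr] = cand
                heapq.heappush(heap, (-cand, nbr))
    return best

# toy mechanism: fuel -> A -> B, plus a weak direct fuel -> B link
edges = {"fuel": {"A": 0.9, "B": 0.05}, "A": {"B": 0.5}}
imp = drgep_importance(edges, targets=["fuel"])
# B is reached more strongly through A (0.9 * 0.5) than directly (0.05)
print(imp["B"])  # → 0.45
```

In the full method, species below a cutoff are deleted and DRGASA then tests the borderline survivors one by one.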

  1. Be/X-Ray Pulsar Binary Science with LOFT

    NASA Technical Reports Server (NTRS)

    Wilson-Hodge, Colleen A.

    2011-01-01

    Accretion disks are ubiquitous in astronomical sources. Accretion powered pulsars are a good test bed for accretion disk physics, because unlike for other objects, the spin of the neutron star is directly observable allowing us to see the effects of angular momentum transfer onto the pulsar. The combination of a sensitive wide-field monitor and the large area detector on LOFT will enable new detailed studies of accretion powered pulsars which I will review. RXTE observations have shown an unusually high number of Be/X-ray pulsar binaries in the SMC. Unlike binaries in the Milky Way, these systems are all at the same distance, allowing detailed population studies using the sensitive LOFT WFM, potentially providing connections to star formation episodes. For Galactic accreting pulsar systems, LOFT will allow measurement of spectral variations within individual pulses, mapping the accretion column in detail for the first time. LOFT will also provide better constraints on magnetic fields in accreting pulsars, allowing measurements of cyclotron features, observations of transitions into the centrifugal inhibition regime, and monitoring of spin-up rate vs flux correlations. Coordinated multi-wavelength observations are crucial to extracting the best science from LOFT from these and numerous other objects.

  2. In-Space Radiator Shape Optimization using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael

    2006-01-01

    Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient and lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass. A small mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present the development of detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is defined as a two-dimensional radiator with a shape demonstrating the smallest mass for the greatest overall heat transfer; thus the end result is a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis design by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator, objective function definition (mass minimization and heat loss maximization), objective function evaluation via finite element analysis (thermal radiation analysis), and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input force is a driving temperature and the output reaction is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this radiator shape optimization problem fall into two classes: variable height along the width of the radiator, and a spline curve defining the material boundary of the radiator. The implementation of multiple design parameter schemes allows the user to have more confidence in the radiator optimization tool upon demonstration of convergence between the two design parameter schemes.
This tool easily allows the user to manipulate the driving temperature regions thus permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following that of the temperature distribution regions in the "cooler" portions of the radiator. The results closely follow the expected radiator shape.
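The evolutionary loop described above (parameterize, evaluate the objective, select, recombine, mutate) can be sketched with a toy stand-in for the finite-element thermal evaluation; the fitness function and every constant here are illustrative assumptions, not the paper's model:

```python
import random

def fitness(heights):
    """Toy objective: reward radiating area (proxy for heat loss) and
    penalize mass. The real tool scores candidates with a finite-element
    thermal radiation analysis instead of this closed form."""
    area = sum(heights)                  # crude surface proxy
    mass = sum(h * h for h in heights)   # crude mass proxy
    return area - 0.5 * mass

def evolve(n_genes=6, pop_size=30, generations=80, seed=1):
    """Minimal genetic algorithm: truncation selection, one-point
    crossover, Gaussian point mutation, deterministic via the seed."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 2.0) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_genes)            # point mutation
            child[i] = min(2.0, max(0.0, child[i] + rng.gauss(0.0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# the toy objective's analytic optimum is height 1 for every gene
best = evolve()
print([round(h, 1) for h in best])
```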

  3. 9. Detail, original door in south lean-to. The current project ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. Detail, original door in south lean-to. The current project will modify this opening to allow handicap access. - Interurban Electric Railway Bridge Yard Shop, Interstate 80 at Alameda County Postmile 2.0, Oakland, Alameda County, CA

  4. The physics behind the larger scale organization of DNA in eukaryotes.

    PubMed

    Emanuel, Marc; Radja, Nima Hamedani; Henriksson, Andreas; Schiessel, Helmut

    2009-07-01

    In this paper, we discuss in detail the organization of chromatin during a cell cycle at several levels. We show that current experimental data on large-scale chromatin organization have not yet reached the level of precision to allow for detailed modeling. We speculate in some detail about the possible physics underlying the larger scale chromatin organization.

  5. 3. Detail of airplane tail protruding out of hangar doors, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Detail of airplane tail protruding out of hangar doors, dock no. 491. Detail of canvas gasket allowing doors to close tightly around fuselage. View to north. - Offutt Air Force Base, Looking Glass Airborne Command Post, Nose Docks, On either side of Hangar Access Apron at Northwest end of Project Looking Glass Historic District, Bellevue, Sarpy County, NE

  6. Jahn-Teller versus quantum effects in the spin-orbital material LuVO3

    DOE PAGES

    Skoulatos, M.; Toth, S.; Roessli, B.; ...

    2015-04-13

    In this article, we report on combined neutron and resonant x-ray scattering results, identifying the nature of the spin-orbital ground state and magnetic excitations in LuVO3 as driven by the orbital parameter. In particular, we distinguish between models based on orbital-Peierls dimerization, taken as a signature of quantum effects in orbitals, and Jahn-Teller distortions, in favor of the latter. In order to solve this long-standing puzzle, polarized neutron beams were employed as a prerequisite to resolving details of the magnetic structure, which allowed quantitative intensity analysis of extended magnetic-excitation data sets. The results of this detailed study enabled us to draw definite conclusions about the classical versus quantum behavior of orbitals in this system and to discard previous claims that quantum effects dominate the orbital physics of LuVO3 and similar systems.

  7. Molecular and Biochemical Characterization of a Cytokinin Oxidase from Maize

    PubMed Central

    Bilyeu, Kristin D.; Cole, Jean L.; Laskey, James G.; Riekhof, Wayne R.; Esparza, Thomas J.; Kramer, Michelle D.; Morris, Roy O.

    2001-01-01

    It is generally accepted that cytokinin oxidases, which oxidatively remove cytokinin side chains to produce adenine and the corresponding isopentenyl aldehyde, play a major role in regulating cytokinin levels in planta. Partially purified fractions of cytokinin oxidase from various species have been studied for many years, but have yet to clearly reveal the properties of the enzyme or to define its biological significance. Details of the genomic organization of the recently isolated maize (Zea mays) cytokinin oxidase gene (ckx1) and some of its Arabidopsis homologs are now presented. Expression of an intronless ckx1 in Pichia pastoris allowed production of large amounts of recombinant cytokinin oxidase and facilitated detailed kinetic and cofactor analysis and comparison with the native enzyme. The enzyme is a flavoprotein containing covalently bound flavin adenine dinucleotide, but no detectable heavy metals. Expression of the oxidase in maize tissues is described. PMID:11154345
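Detailed kinetic analysis of an enzyme like this typically means fitting the Michaelis-Menten rate law; a minimal sketch (the Vmax and Km values below are placeholders, not the paper's measured constants):

```python
def michaelis_menten(substrate, vmax, km):
    """Michaelis-Menten rate law v = Vmax*[S] / (Km + [S]), the usual
    model behind kinetic characterization of an enzyme; saturates
    toward Vmax at high substrate concentration."""
    return vmax * substrate / (km + substrate)

# at [S] = Km the rate is exactly half of Vmax
print(michaelis_menten(5.0, 10.0, 5.0))  # → 5.0
```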

  8. Procedure Enabling Simulation and In-Depth Analysis of Optical Effects in Camera-Based Time-of-Flight Sensors

    NASA Astrophysics Data System (ADS)

    Baumgart, M.; Druml, N.; Consani, M.

    2018-05-01

    This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We take a raytracing-based approach and use the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
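The optical-path-length-as-master-parameter idea reduces, in its simplest form, to halving the excess round-trip path over a zero-depth reference; the continuous-wave phase relation is the equivalent formulation. A minimal sketch (illustrative numbers, not the paper's Zemax/Python implementation):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_path_length(opl_m, base_opl_m):
    """Depth from a simulated optical path length: half the excess of
    the illumination-to-pixel ray path over the zero-depth reference,
    since light travels out and back."""
    return 0.5 * (opl_m - base_opl_m)

def depth_from_phase(phase_rad, mod_freq_hz):
    """Equivalent continuous-wave ToF relation: the measured phase of
    the modulated signal encodes the round-trip distance,
    d = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# 2 m of extra round-trip path corresponds to a 1 m deep object
print(depth_from_path_length(2.5, 0.5))  # → 1.0
```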

  9. Profile of Students Uninterested in Practicum Class at Faculty of Engineering Universitas Negeri Surabaya

    NASA Astrophysics Data System (ADS)

    Munoto; Sondang, Meini; Satriana, FMS

    2018-04-01

    This study aims to determine the characteristics of students who were uninterested in attending practicum classes. It applied naturalistic qualitative research methods using participatory observation and interviews. The data validity was ensured by triangulation, detailed description, length of observation time, and detailed, thorough observation. The data were analyzed using domain analysis, followed by taxonomic, componential, and thematic analyses. The results indicate that uninterested students show negative behavior while attending laboratory practicums: they lack motivation, effective interaction, and attention, while their cognitive abilities vary from low to high. Other causes of these deficits were identified as well; improving those aspects allows students to raise their interest in practicum classes. The impact is to create well-skilled vocational teachers in conducting practicum classes in vocational schools and to make graduates better prepared for the workforce.

  10. DFT-derived reactive potentials for the simulation of activated processes: the case of CdTe and CdTe:S.

    PubMed

    Hu, Xiao Liang; Ciaglia, Riccardo; Pietrucci, Fabio; Gallet, Grégoire A; Andreoni, Wanda

    2014-06-19

    We introduce a new ab initio derived reactive potential for the simulation of CdTe within density functional theory (DFT) and apply it to calculate both static and dynamical properties of a number of systems (bulk solid, defective structures, liquid, surfaces) at finite temperature. In particular, we also consider cases with low sulfur concentration (CdTe:S). The analysis of DFT and classical molecular dynamics (MD) simulations performed with the same protocol leads to stringent performance tests and to a detailed comparison of the two schemes. Metadynamics techniques are used to empower both Car-Parrinello and classical molecular dynamics for the simulation of activated processes. For the latter, we consider surface reconstruction and sulfur diffusion in the bulk. The same procedures are applied using previously proposed force fields for CdTe and CdTeS materials, thus allowing for a detailed comparison of the various schemes.

  11. PTools: an opensource molecular docking library

    PubMed Central

    Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin

    2009-01-01

    Background: Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. Results: We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. Conclusion: The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation. PMID:19409097

  12. PTools: an opensource molecular docking library.

    PubMed

    Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin

    2009-05-01

    Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation.

  13. Photochemical methods to assay DNA photocleavage using supercoiled pUC18 DNA and LED or xenon arc lamp excitation.

    PubMed

    Prussin, Aaron J; Zigler, David F; Jain, Avijita; Brown, Jared R; Winkel, Brenda S J; Brewer, Karen J

    2008-04-01

    Methods for the study of DNA photocleavage are illustrated using a mixed-metal supramolecular complex [{(bpy)(2)Ru(dpp)}(2)RhCl(2)]Cl(5). The methods use supercoiled pUC18 plasmid as a DNA probe and either filtered light from a xenon arc lamp source or monochromatic light from a newly designed, high-intensity light-emitting diode (LED) array. Detailed methods for performing the photochemical experiments and analysis of the DNA photoproduct are delineated. Detailed methods are also given for building an LED array to be used for DNA photolysis experiments. The Xe arc source has a broad spectral range and high light flux. The LEDs have a high-intensity, nearly monochromatic output. Arrays of LEDs have the advantage of allowing tunable, accurate output to multiple samples for high-throughput photochemistry experiments at relatively low cost.

  14. Current Lewis Turbomachinery Research: Building on our Legacy of Excellence

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1997-01-01

    This Wu Chang-Hua lecture is concerned with the development of analysis and computational capability for turbomachinery flows which is based on detailed flow field physics. A brief review of the work of Professor Wu is presented as well as a summary of the current NASA aeropropulsion programs. Two major areas of research are described in order to determine our predictive capabilities using modern day computational tools evolved from the work of Professor Wu. In one of these areas, namely transonic rotor flow, it is demonstrated that a high level of accuracy is obtainable provided sufficient geometric detail is simulated. In the second case, namely turbine heat transfer, our capability is lacking for rotating blade rows and experimental correlations will provide needed information in the near term. It is believed that continuing progress will allow us to realize the full computational potential and its impact on design time and cost.

  15. Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2015-01-01

    This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.
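Combining empirical flap increments with clean-airfoil characteristics, as described above, usually takes a form like the following; the scaling by flapped-span fraction and sweep is the conventional semi-empirical shape, but the coefficients here are illustrative assumptions, not the paper's calibrated relationships:

```python
import math

def flapped_clmax(clmax_clean, dcl_flap, flap_span_fraction, sweep_deg):
    """Combine a clean maximum lift coefficient with an empirical flap
    increment, scaled by the flapped fraction of the span and the
    cosine of the quarter-chord sweep (a common semi-empirical form)."""
    return clmax_clean + (dcl_flap * flap_span_fraction
                          * math.cos(math.radians(sweep_deg)))

# an unswept wing with full-span flaps simply adds the full increment
print(flapped_clmax(1.5, 0.9, 1.0, 0.0))  # → 2.4
```

In the paper's methodology, section characteristics built this way feed a vortex-lattice analysis for the whole configuration.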

  16. UAV and SfM in Detailed Geomorphological Mapping of Granite Tors: An Example of Starościńskie Skały (Sudetes, SW Poland)

    NASA Astrophysics Data System (ADS)

    Kasprzak, Marek; Jancewicz, Kacper; Michniewicz, Aleksandra

    2017-11-01

    The paper presents an example of using photographs taken by unmanned aerial vehicles (UAVs) and processed using the structure-from-motion (SfM) procedure in a geomorphological study of rock relief. Subject to analysis is a small rock city in the West Sudetes (SW Poland), known as Starościńskie Skały and developed in coarse granite bedrock. The aims of this paper were, first, to compare UAV/SfM-derived data with the cartographical image based on traditional geomorphological field-mapping methods and with the digital elevation model derived from airborne laser scanning (ALS), and second, to test whether the proposed combination of UAV and SfM methods may be helpful in recognizing the detailed structure of granite tors. As a result of the conducted UAV flights and digital image post-processing in AgiSoft software, it was possible to obtain datasets (dense point cloud, texture model, orthophotomap, bare-ground-type digital terrain model (DTM)) that made it possible to visualize the surface of the study area in detail. In consequence, it was possible to distinguish even very small forms of rock-surface microrelief: joints, aplite veins, rills and karren, weathering pits, etc., otherwise difficult to map and measure. The study also includes a valorization of the particular datasets concerning microtopography, and allows discussion of the indisputable advantages of using the UAV/SfM-based DTM in geomorphic studies of tors and rock cities, even those located within forest, as in the presented case study.

  17. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets

    PubMed Central

    Johnson, Z. P.; Eady, R. D.; Ahmad, S. F.; Agravat, S.; Morris, T; Else, J; Lank, S. M.; Wiseman, R. W.; O’Connor, D. H.; Penedo, M. C. T.; Larsen, C. P.

    2012-01-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA- and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo, user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  18. Uronic polysaccharide degrading enzymes.

    PubMed

    Garron, Marie-Line; Cygler, Miroslaw

    2014-10-01

    In the past several years, progress has been made in the field of the structure and function of polysaccharide lyases (PLs). The number of classified polysaccharide lyase families has increased to 23, and more detailed analysis has allowed the identification of more closely related subfamilies, leading to a stronger correlation between each subfamily and a unique substrate. The number of as yet unclassified polysaccharide lyases has also increased, and we expect that sequencing projects will allow many of these unclassified sequences to emerge as new families. Progress in the structural analysis of PLs has provided at least one representative structure for each of the families and for two unclassified enzymes. The newly determined structures have folds observed previously in other PL families, and their catalytic mechanisms follow either the metal-assisted or the Tyr/His mechanisms characteristic of other PL enzymes. Comparison of PLs with glycoside hydrolases (GHs) shows several folds common to both classes, but only for the β-helix fold is there a strong indication of divergent evolution from a common ancestor. Analysis of bacterial genomes has identified gene clusters containing multiple polysaccharide-cleaving enzymes, the Polysaccharide Utilization Loci (PULs), and their gene complement suggests that they are organized to completely process a specific polysaccharide. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Annular Air Leaks in a liquid hydrogen storage tank

    NASA Astrophysics Data System (ADS)

    Krenn, AG; Youngquist, RC; Starr, SO

    2017-12-01

    Large liquid hydrogen (LH2) storage tanks are vital infrastructure for NASA, the DOD, and industrial users. Over time, air may leak into the evacuated, perlite-filled annular region of these tanks. Once inside, the extremely low temperatures will cause most of the air to freeze. If a significant mass of air is allowed to accumulate, severe damage can result from nominal draining operations. Collection of liquid air on the outer shell may chill it below its ductility range, resulting in fracture. Testing and analysis to quantify the thermal conductivity of perlite that has nitrogen frozen into its interstitial spaces, and to determine the void fraction of frozen nitrogen within a perlite/frozen-nitrogen mixture, are presented. General equations for evaluating methods of removing frozen air while avoiding fracture are developed. A hypothetical leak is imposed on an existing tank geometry and a full analysis of that leak is detailed. This analysis includes a thermal model of the tank and a time-to-failure calculation. Approaches to safely remove the frozen air are analyzed, leading to the conclusion that the most feasible approach is to allow the frozen air to melt and to use a water stream to prevent the outer shell from chilling.

  20. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    PubMed

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo.

  1. Ultrasonic vocalisation emitted by infant rodents: a tool for assessment of neurobehavioural development.

    PubMed

    Branchi, I; Santucci, D; Alleva, E

    2001-11-01

    Ultrasonic vocalisations (USVs) emitted by altricial rodent pups are whistle-like sounds with frequencies between 30 and 90 kHz. These signals play an important communicative role in mother-offspring interaction, since they elicit prompt caregiving responses in the dam. Both physical and social parameters modulate USV emission in the infant rodent. Recently, a more detailed analysis of the ultrasonic vocalisation pattern, considering the spectrographic structure of the sounds, has allowed a deeper investigation of this behaviour. For investigating neurobehavioural development, the analysis of USVs presents several advantages, mainly: (i) USVs are one of the few responses produced by very young mice that can be quantitatively analysed and elicited by quantifiable stimuli; (ii) USV production follows a clear ontogenetic profile from birth to PND 14-15, thus allowing longitudinal neurobehavioural analysis during very early postnatal ontogeny. The study of this ethologically and ecologically relevant behaviour represents a valid model for evaluating possible alterations in the neurobehavioural development of perinatally treated or genetically modified infant rodents. Furthermore, the role played by several receptor agonists and antagonists in modulating the USV rate makes this measure particularly important when investigating the effects of anxiogenic and anxiolytic compounds, and emotional behaviour in general.

  2. Finite-Element Analysis of a Mach-8 Flight Test Article Using Nonlinear Contact Elements

    NASA Technical Reports Server (NTRS)

    Richards, W. Lance

    1997-01-01

    A flight test article, called a glove, is required for a Mach-8 boundary-layer experiment to be conducted on a flight mission of the air-launched Pegasus® space booster. The glove is required to provide a smooth, three-dimensional, structurally stable, aerodynamic surface and includes instrumentation to determine when and where boundary-layer transition occurs during the hypersonic flight trajectory. A restraint mechanism has been invented to attach the glove to the wing of the space booster. The restraint mechanism securely attaches the glove to the wing in directions normal to the wing/glove interface surface, but allows the glove to thermally expand and contract to alleviate stresses in directions parallel to the interface surface. A finite-element analysis has been performed using nonlinear contact elements to model the complex behavior of the sliding restraint mechanism. This paper provides an overview of the glove design and presents details of the analysis that were essential to demonstrate the flight worthiness of the wing-glove test article. Results show that all glove components are well within the allowable stress and deformation requirements to satisfy the objectives of the flight research experiment.

  3. Structure of force networks in tapped particulate systems of disks and pentagons. II. Persistence analysis.

    PubMed

    Kondic, L; Kramár, M; Pugnaloni, Luis A; Carlevaro, C Manuel; Mischaikow, K

    2016-06-01

    In the companion paper [Pugnaloni et al., Phys. Rev. E 93, 062902 (2016)10.1103/PhysRevE.93.062902], we use classical measures based on force probability density functions (PDFs), as well as Betti numbers (quantifying the number of components, related to force chains, and loops), to describe the force networks in tapped systems of disks and pentagons. In the present work, we focus on the use of persistence analysis, which allows us to describe these networks in much more detail. This approach allows us not only to describe but also to quantify the differences between the force networks in different realizations of a system, in different parts of the considered domain, or in different systems. We show that persistence analysis clearly distinguishes the systems that are very difficult or impossible to differentiate using other means. One important finding is that the differences in force networks between disks and pentagons are most apparent when loops are considered: the quantities describing properties of the loops may differ significantly even if other measures (properties of components, Betti numbers, force PDFs, or the stress tensor) do not distinguish clearly or at all the investigated systems.
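
    For readers unfamiliar with the Betti numbers mentioned above, a short sketch may help. This is not the authors' code; it is a minimal, self-contained illustration of how β0 (number of components, related to force chains) and β1 (number of independent loops) can be computed for a contact network that has already been thresholded at some force level:

```python
def betti_numbers(edges):
    """Betti numbers of a simple contact graph: beta0 counts connected
    components, beta1 = E - V + beta0 counts independent loops."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    nodes = set()
    for a, b in edges:
        nodes.update((a, b))
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # union the two components
    beta0 = len({find(v) for v in nodes})
    beta1 = len(edges) - len(nodes) + beta0
    return beta0, beta1

# a triangle (one loop) plus an isolated contact: two components, one loop
print(betti_numbers([(1, 2), (2, 3), (3, 1), (4, 5)]))  # -> (2, 1)
```

    Persistence analysis goes further than such single-threshold counts by tracking how components and loops appear and merge as the force threshold is varied, which is what allows the finer distinctions described in the abstract.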

  4. SCOWLP classification: Structural comparison and analysis of protein binding regions

    PubMed Central

    Teyra, Joan; Paszkowski-Rogacz, Maciej; Anders, Gerd; Pisabarro, M Teresa

    2008-01-01

    Background Detailed information about protein interactions is critical for our understanding of the principles governing protein recognition mechanisms. The structures of many proteins have been experimentally determined in complex with different ligands bound either in the same or in different binding regions. Thus, the structural interactome requires the development of tools to classify protein binding regions. A proper classification may provide a general view of the regions that a protein uses to bind others and also facilitate a detailed comparative analysis of the interaction information for specific protein binding regions at the atomic level. Such a classification might be of potential use for deciphering protein interaction networks, understanding protein function, and rational engineering and design. Description Protein binding regions (PBRs) might ideally be described as well-defined, separated regions that share no interacting residues with one another. However, PBRs are often irregular and discontinuous and can share a wide range of interacting residues among them. The criteria used to define an individual binding region can often be arbitrary and may differ between binding regions within a protein family. Therefore, the rationale behind protein interface classification should aim to fulfil the requirements of the analysis to be performed. We extract detailed interaction information of protein domains, peptides and interfacial solvent from the SCOWLP database and classify the PBRs of each domain family. For this purpose, we define a similarity index based on the overlap of interacting residues mapped in pair-wise structural alignments. We perform our classification with agglomerative hierarchical clustering using the complete-linkage method. Our classification is calculated at different similarity cut-offs to allow flexibility in the analysis of PBRs, a feature especially interesting for protein families with conflicting binding regions. 
The hierarchical classification of PBRs is implemented in the SCOWLP database and extends the SCOP classification with three additional family sub-levels: Binding Region, Interface and Contacting Domains. SCOWLP contains 9,334 binding regions distributed within 2,561 families. In 65% of the cases we observe families containing more than one binding region, and 22% of the regions form complexes with more than one protein family. Conclusion The current SCOWLP classification and its web application represent a framework for the study of protein interfaces and the comparative analysis of protein family binding regions. This comparison can be performed at the atomic level and allows the user to study interactome conservation and variability. The new SCOWLP classification may be of great utility for the reconstruction of protein complexes, understanding protein networks and ligand design. SCOWLP will be updated with every SCOP release. The web application is available at . PMID:18182098
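
    The clustering step described above can be sketched in a few lines. The following is an illustrative reconstruction, not the SCOWLP implementation; the distance values are hypothetical, with distance taken as 1 minus the overlap-based similarity index:

```python
def complete_linkage(dist, cutoff):
    """Naive agglomerative hierarchical clustering with complete linkage.
    dist[i][j] is a pairwise distance (here, 1 - similarity index);
    merging stops once the closest pair of clusters exceeds `cutoff`."""
    clusters = [[i] for i in range(len(dist))]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # complete linkage: inter-cluster distance is the maximum
                # pairwise distance between members of the two clusters
                d = max(dist[a][b] for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        if best[0] > cutoff:
            break
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

# regions 0 and 1 share most interacting residues; region 2 is distinct
dist = [[0.0, 0.1, 0.9],
        [0.1, 0.0, 0.8],
        [0.9, 0.8, 0.0]]
print(complete_linkage(dist, cutoff=0.5))  # -> [[0, 1], [2]]
```

    Running the same procedure at several cut-offs yields the multi-level flexibility the abstract describes.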

  5. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With the recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters but also in the home environment, film archives full of movies in high definition and above are in the scope of UHD content providers. Movies captured with traditional film technology represent a virtually unlimited source of UHD content. The goal of maintaining complete image information is also related to the choice of scanning resolution and of the spatial resolution for further distribution. It might seem that scanning the film material at the highest possible resolution using state-of-the-art film scanners, and distributing it at this resolution, is the right choice. The information content of the digitized images is, however, limited, and various degradations moreover lead to its further reduction. Digital distribution of the content at the highest image resolution might therefore be unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or the film grain structure for archiving purposes. This paper deals with the analysis of the image detail content of archive film records. The resolution limit in the captured scene image and the factors that lower the final resolution are discussed. Methods are proposed to determine the spatial detail of the film picture based on the analysis of its digitized image data. These procedures allow recommendations to be derived for the optimal distribution of digitized video content intended for various display devices with lower resolutions. The obtained results are illustrated on a spatial-downsampling use case, and a performance evaluation of the proposed techniques is presented.

  6. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
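
    The principal component regression at the core of this analysis can be sketched as follows. This is an illustrative reconstruction with synthetic data, not the authors' analysis code; the pure-component current profiles and the choice of two retained components are assumptions for the example:

```python
import numpy as np

def pcr_fit(V, C, n_pcs=2):
    """Principal component regression: project mean-centred training
    voltammograms (rows of V) onto their leading principal components,
    then regress the known concentrations C on the resulting scores."""
    Vm, Cm = V.mean(axis=0), C.mean(axis=0)
    X = V - Vm
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    pcs = vt[:n_pcs]                      # retained principal components
    scores = X @ pcs.T
    coef, *_ = np.linalg.lstsq(scores, C - Cm, rcond=None)
    return pcs, coef, Vm, Cm

def pcr_predict(model, cv):
    pcs, coef, Vm, Cm = model
    return (cv - Vm) @ pcs.T @ coef + Cm

# synthetic training set: two analytes with hypothetical "pure" profiles
S = np.array([[1.0, 0.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 2.0, 0.0]])
C = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [1.0, 2.0]])
model = pcr_fit(C @ S, C)
print(pcr_predict(model, np.array([1.5, 0.5]) @ S))  # -> ~[1.5, 0.5]
```

    The failure mode documented in the paper corresponds to applying such a model to data whose current-concentration relationship differs from the training set (e.g., a different electrode): the projection then misassigns current across components, degrading resolution and quantitation.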

  7. The application of X-ray microtomography for the assessment of root resorption caused by the orthodontic treatment of premolars.

    PubMed

    Sawicka, Monika; Bedini, Rossella; Pecci, Raffaella; Pameijer, Cornelis Hans; Kmiec, Zbigniew

    2012-01-01

    The purpose of this study was to demonstrate a potential application of micro-computed tomography in the morphometric analysis of root resorption in extracted human first premolars subjected to orthodontic force. In one patient treated in the orthodontic clinic, two mandibular first premolars subjected to orthodontic force for 4 weeks and one control tooth were selected for micro-computed tomographic analysis. The hardware device used in this study was a desktop X-ray microfocus CT scanner (SkyScan 1072). The morphology of the root surfaces was assessed with the TView and Computer Tomography Analyzer (CTAn) software (SkyScan, bvba), which allowed analysis of all microscans, identification of root resorption craters, and measurement of their length, width and volume. The microscans showed in detail the surface morphology of the investigated teeth, and their analysis allowed the detection of 3 root resorption cavities in each of the orthodontically moved teeth and only one resorption crater in the control tooth. The volumes of the resorption craters in the orthodontically treated teeth were much larger than in the control tooth. Micro-computed tomography is a reproducible technique for the three-dimensional, non-invasive assessment of root morphology ex vivo, and the TView and CTAn software packages enable accurate morphometric measurements of root resorption.

  8. Current and Emerging Technologies for the Analysis of the Genome-Wide and Locus-Specific DNA Methylation Patterns.

    PubMed

    Tost, Jörg

    2016-01-01

    DNA methylation is the most studied epigenetic modification, and altered DNA methylation patterns have been identified in cancer and, more recently, in many other complex diseases. Furthermore, DNA methylation is influenced by a variety of environmental factors, and the analysis of DNA methylation patterns might allow previous exposure to be deciphered. Although a large number of techniques to study DNA methylation either genome-wide or at specific loci have been devised, they are all based on a limited number of principles for differentiating the methylation state, viz., methylation-specific/methylation-dependent restriction enzymes, antibodies or methyl-binding proteins, chemical-based enrichment, or bisulfite conversion. Second-generation sequencing has largely replaced microarrays as the readout platform and is also becoming more popular for locus-specific DNA methylation analysis. In this chapter, the currently used methods for both genome-wide and locus-specific analysis of 5-methylcytosine and its oxidative derivatives, such as 5-hydroxymethylcytosine, are reviewed in detail, and the advantages and limitations of each approach are discussed. Furthermore, emerging technologies avoiding PCR amplification and allowing a direct readout of DNA methylation are summarized, together with novel applications such as the detection of DNA methylation in single cells or in circulating cell-free DNA.
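
    As a minimal illustration of the bisulfite-conversion readout mentioned above (an illustrative sketch, not a method from the chapter), the methylation level at a single CpG site is commonly summarized as the fraction of sequencing reads that remained unconverted:

```python
def methylation_level(methylated_reads, unmethylated_reads):
    """Beta value for one CpG site after bisulfite sequencing: cytosines
    protected by methylation stay C, unmethylated ones read as T."""
    total = methylated_reads + unmethylated_reads
    return methylated_reads / total if total else float("nan")

print(methylation_level(8, 2))  # -> 0.8 (80% methylated at this site)
```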

  9. Cohort profile: the chronic kidney disease prognosis consortium.

    PubMed

    Matsushita, Kunihiro; Ballew, Shoshana H; Astor, Brad C; Jong, Paul E de; Gansevoort, Ron T; Hemmelgarn, Brenda R; Levey, Andrew S; Levin, Adeera; Wen, Chi-Pang; Woodward, Mark; Coresh, Josef

    2013-12-01

    The Chronic Kidney Disease Prognosis Consortium (CKD-PC) was established in 2009 to provide comprehensive evidence about the prognostic impact of two key kidney measures used to define and stage CKD, estimated glomerular filtration rate (eGFR) and albuminuria, on mortality and kidney outcomes. CKD-PC currently consists of 46 cohorts with data on these kidney measures and outcomes from >2 million participants spanning 40 countries/regions. CKD-PC published four meta-analysis articles in 2010-11, providing key evidence for an international consensus on the definition and staging of CKD and an update for CKD clinical practice guidelines. The consortium continues to work on more detailed analyses (subgroups, different eGFR equations, other exposures and outcomes, and risk prediction). CKD-PC preferably collects individual participant data but also applies a novel distributed analysis model, in which each cohort runs statistical analyses locally and shares only the analysed outputs for meta-analyses. This distributed model allows the inclusion of cohorts that cannot share individual participant-level data. In accordance with its agreements with cohorts, CKD-PC will not share data with third parties but is open to including further eligible cohorts, and each cohort can opt in or out for each topic. CKD-PC has established a productive and effective collaboration, allowing flexible participation and complex meta-analyses for studying CKD.
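
    The distributed analysis model can be sketched simply: each cohort shares only its locally computed effect estimate and standard error, and these summaries are pooled by inverse-variance weighting. This is an illustrative fixed-effect sketch with hypothetical numbers, not the consortium's actual pipeline:

```python
import math

def fixed_effect_meta(summaries):
    """Inverse-variance-weighted fixed-effect meta-analysis over
    per-cohort (estimate, standard_error) pairs; no individual
    participant data are needed."""
    weights = [1.0 / se ** 2 for _, se in summaries]
    pooled = sum(w * est for (est, _), w in zip(summaries, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# hypothetical log hazard ratios shared by three cohorts
pooled, se = fixed_effect_meta([(0.30, 0.10), (0.25, 0.05), (0.40, 0.20)])
print(round(pooled, 3), round(se, 3))  # -> 0.267 0.044
```

    The precisely measured cohort (smallest standard error) dominates the pooled estimate, which is why only these two aggregates need to cross institutional boundaries.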

  10. Two-photon confocal microscopy in wound healing

    NASA Astrophysics Data System (ADS)

    Navarro, Fernando A.; So, Peter T. C.; Driessen, Antoine; Kropf, Nina; Park, Christine S.; Huertas, Juan C.; Lee, Hoon B.; Orgill, Dennis P.

    2001-04-01

    Advances in histopathology and immunohistochemistry have allowed for precise microanatomic detail of tissues. Two-Photon Confocal Microscopy (TPCM) is a new technology useful in the non-destructive analysis of tissue. Laser light excites the natural fluorophores NAD(P)H and NADP+, and the scattering patterns of the emitted light are analyzed to reconstruct microanatomic features. Guinea pig skin was studied using TPCM and skin preparation methods including chemical depilation and tape stripping. Results of TPCM were compared with conventional hematoxylin and eosin microscopy. Two-dimensional images were rendered from the three-dimensional reconstructions. Images of deeper layers, including basal cells and the dermo-epidermal junction, improved after removing the stratum corneum with chemical depilation or tape stripping. TPCM allows good resolution of corneocytes, basal cells and collagen fibers and shows promise as a non-destructive method to study wound healing.

  11. Trajectory Browser: An Online Tool for Interplanetary Trajectory Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Foster, Cyrus James

    2013-01-01

    The trajectory browser is a web-based tool developed at the NASA Ames Research Center for finding preliminary trajectories to planetary bodies and for providing relevant launch date, time-of-flight and (Delta)V requirements. The site hosts a database of transfer trajectories from Earth to planets and small-bodies for various types of missions such as rendezvous, sample return or flybys. A search engine allows the user to find trajectories meeting desired constraints on the launch window, mission duration and (Delta)V capability, while a trajectory viewer tool allows the visualization of the heliocentric trajectory and the detailed mission itinerary. The anticipated user base of this tool consists primarily of scientists and engineers designing interplanetary missions in the context of pre-phase A studies, particularly for performing accessibility surveys to large populations of small-bodies.
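
    A search engine of this kind can be sketched as a constraint filter over a precomputed trajectory database. The record fields, names and values below are hypothetical, for illustration only; they are not the Trajectory Browser's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    target: str
    launch_year: int   # hypothetical field names, for illustration
    days: float        # time of flight
    dv_kms: float      # total delta-V requirement, km/s

def search(db, max_dv, max_days, launch_years=None):
    """Return trajectories meeting delta-V, duration and launch-window
    constraints, sorted by increasing delta-V."""
    hits = [t for t in db
            if t.dv_kms <= max_dv and t.days <= max_days
            and (launch_years is None or t.launch_year in launch_years)]
    return sorted(hits, key=lambda t: t.dv_kms)

# toy database of precomputed transfers
db = [Trajectory("433 Eros", 2026, 350, 5.2),
      Trajectory("433 Eros", 2027, 420, 4.6),
      Trajectory("Mars", 2026, 310, 6.1)]
print([t.launch_year for t in search(db, max_dv=5.0, max_days=500)])  # -> [2027]
```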

  12. Camera, Hand Lens, and Microscope Probe (CHAMP): An Instrument Proposed for the 2009 MSL Rover Mission

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Beegle, Luther W.; Boynton, John E.; Lee, Pascal; Shidemantle, Ritch; Fisher, Ted

    2004-01-01

    The Camera, Hand Lens, and Microscope Probe (CHAMP) will allow examination of martian surface features and materials (terrain, rocks, soils, samples) on spatial scales ranging from kilometers to micrometers, thus enabling both microscopy and context imaging with high operational flexibility. CHAMP is designed to allow the detailed and quantitative investigation of a wide range of geologic features and processes on Mars, leading to a better quantitative understanding of the evolution of the martian surface environment through time. In particular, CHAMP will provide key data that will help understand the local region explored by Mars Surface Laboratory (MSL) as a potential habitat for life. CHAMP will also support other anticipated MSL investigations, in particular by helping identify and select the highest priority targets for sample collection and analysis by the MSL's analytical suite.

  13. Computer program for analysis of split-Stirling-cycle cryogenic coolers

    NASA Technical Reports Server (NTRS)

    Brown, M. T.; Russo, S. C.

    1983-01-01

    A computer program for predicting the detailed thermodynamic performance of split-Stirling-cycle refrigerators has been developed. The mathematical model includes the refrigerator cold head, free-displacer/regenerator, gas transfer line, and provision for modeling a mechanical or thermal compressor. To allow for dynamic processes (such as aerodynamic friction and heat transfer), temperature, pressure, and mass flow rate are varied by subdividing the refrigerator into an appropriate number of fluid and structural control volumes. Of special importance to the modeling of cryogenic coolers is the inclusion of real gas properties and the allowance for variation of thermophysical properties, such as thermal conductivities, specific heats and viscosities, with temperature and/or pressure. The resulting model therefore comprehensively simulates the split-cycle cooler both spatially and temporally by reflecting the effects of dynamic processes and real material properties.

  14. Analysis of problems and failures in the measurement of soil-gas radon concentration.

    PubMed

    Neznal, Martin; Neznal, Matěj

    2014-07-01

    Long-term experience in the field of soil-gas radon concentration measurement allows the most frequent causes of failures that can appear in practice, when various types of measurement methods and soil-gas sampling techniques are used, to be described and explained. The concept of minimal sampling depth, which depends on the volume of the soil-gas sample and on the soil properties, is presented in detail. Taking the minimal sampling depth into account when planning measurements helps avoid the most common mistakes. Ways to identify influencing parameters, to avoid dilution of soil-gas samples by atmospheric air, and to recognise inappropriate sampling methods are discussed. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. 78 FR 66929 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis AGENCY: Policy and... Federal Register notice informing the public of its intent to conduct a detailed economic impact analysis... subject to a detailed economic impact analysis. DATES: The Federal Register notice published on August 5...

  16. Network and biosignature analysis for the integration of transcriptomic and metabolomic data to characterize leaf senescence process in sunflower.

    PubMed

    Moschen, Sebastián; Higgins, Janet; Di Rienzo, Julio A; Heinz, Ruth A; Paniego, Norma; Fernandez, Paula

    2016-06-06

    In recent years, high-throughput technologies have led to an increase in datasets from omics disciplines, allowing the understanding of the complex regulatory networks associated with biological processes. Leaf senescence is a complex mechanism controlled by multiple genetic and environmental variables, which has a strong impact on crop yield. Transcription factors (TFs) are key proteins in the regulation of gene expression, regulating different signaling pathways; their function is crucial for triggering and/or regulating different aspects of the leaf senescence process. The study of TF interactions and their integration with metabolic profiles under different developmental conditions, especially for a non-model organism such as sunflower, will open new insights into the details of gene regulation of leaf senescence. Weighted Gene Correlation Network Analysis (WGCNA) and BioSignature Discoverer (BioSD, Gnosis Data Analysis, Heraklion, Greece) were used to integrate transcriptomic and metabolomic data. WGCNA allowed the detection of 10 metabolites and 13 TFs, whereas BioSD allowed the detection of 1 metabolite and 6 TFs as potential biomarkers. The comparative analysis demonstrated that three transcription factors were detected through both methodologies, highlighting them as potentially robust biomarkers associated with leaf senescence in sunflower. The complementary use of network and BioSignature Discoverer analysis of transcriptomic and metabolomic data provided a useful tool for identifying candidate genes and metabolites that may have a role in the triggering and development of the leaf senescence process. The WGCNA tool allowed us to design and test a hypothetical network in order to infer relationships across selected transcription factor and metabolite candidate biomarkers involved in leaf senescence, whereas BioSignature Discoverer selected transcripts and metabolites that discriminate between different ages of sunflower plants. 
The methodology presented here would help to elucidate and predict novel networks and potential biomarkers of leaf senescence in sunflower.
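
    At its core, the WGCNA step builds a soft-thresholded correlation network. The code below is an illustrative sketch only (the actual WGCNA R package adds topological-overlap and module-detection machinery); the soft-threshold power beta=6 is a conventional default assumed here, and the toy profiles are hypothetical:

```python
import numpy as np

def wgcna_adjacency(expr, beta=6):
    """Unsigned WGCNA-style adjacency: absolute Pearson correlation
    between features (columns of expr), raised to a soft-threshold
    power so weak correlations are suppressed rather than cut off."""
    corr = np.corrcoef(expr, rowvar=False)
    adj = np.abs(corr) ** beta
    np.fill_diagonal(adj, 0.0)  # no self-connections
    return adj

def connectivity(adj):
    """Whole-network connectivity; hub features score highest."""
    return adj.sum(axis=0)

# toy profiles over four samples: features 0 and 1 co-vary perfectly
expr = np.array([[1.0, 2.0, 4.0],
                 [2.0, 4.0, 1.0],
                 [3.0, 6.0, 3.0],
                 [4.0, 8.0, 2.0]])
adj = wgcna_adjacency(expr)
print(round(adj[0, 1], 6))  # -> 1.0: the correlated pair keeps full weight
```

    Modules (and candidate biomarkers such as hub TFs) are then obtained by clustering this adjacency matrix.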

  17. Combined experimental and theoretical study of fast atom diffraction on the β2(2×4) reconstructed GaAs(001) surface

    NASA Astrophysics Data System (ADS)

    Debiossac, M.; Zugarramurdi, A.; Khemliche, H.; Roncin, P.; Borisov, A. G.; Momeni, A.; Atkinson, P.; Eddrief, M.; Finocchi, F.; Etgens, V. H.

    2014-10-01

    A grazing incidence fast atom diffraction (GIFAD or FAD) setup, installed on a molecular beam epitaxy chamber, has been used to characterize the β2(2×4) reconstruction of a GaAs(001) surface at 530 °C under an As4 overpressure. Using a 400-eV 4He beam, high-resolution diffraction patterns with up to eighty well-resolved diffraction orders are observed simultaneously, providing a detailed fingerprint of the surface structure. Experimental diffraction data are in good agreement with results from quantum scattering calculations based on an ab initio projectile-surface interaction potential. Along with exact calculations, we show that a straightforward semiclassical analysis allows the features of the diffraction chart to be linked to the main characteristics of the surface reconstruction topography. Our results demonstrate that GIFAD is a technique suitable for measuring in situ the subtle details of complex surface reconstructions. We have performed measurements at very small incidence angles, where the kinetic energy of the projectile motion perpendicular to the surface can be reduced to less than 1 meV. This allowed the depth of the attractive van der Waals potential well to be estimated as -8.7 meV, in very good agreement with results reported in the literature.

  18. Towards rationally redesigning bacterial signaling systems using information encoded in abundant sequence data

    NASA Astrophysics Data System (ADS)

    Cheng, Ryan; Morcos, Faruck; Levine, Herbert; Onuchic, Jose

    2014-03-01

    An important challenge in biology is to distinguish the subset of residues that allow bacterial two-component signaling (TCS) proteins to preferentially interact with their correct TCS partner such that they can bind and transfer signal. Detailed knowledge of this information would allow one to search sequence-space for mutations that can systematically tune the signal transmission between TCS partners as well as re-encode a TCS protein to preferentially transfer signals to a non-partner. Motivated by the notion that this detailed information is found in sequence data, we explore the mutual sequence co-evolution between signaling partners to infer how mutations can positively or negatively alter their interaction. Using Direct Coupling Analysis (DCA) for determining evolutionarily conserved interprotein interactions, we apply a DCA-based metric to quantify mutational changes in the interaction between TCS proteins and demonstrate that it accurately correlates with experimental mutagenesis studies probing the mutational change in the in vitro phosphotransfer. Our methodology serves as a potential framework for the rational design of TCS systems as well as a framework for the system-level study of protein-protein interactions in sequence-rich systems. This research has been supported by the NSF INSPIRE award MCB-1241332 and by the CTBP sponsored by the NSF (Grant PHY-1308264).

  19. Microscopic modeling of gas-surface scattering: II. Application to argon atom adsorption on a platinum (111) surface

    NASA Astrophysics Data System (ADS)

    Filinov, A.; Bonitz, M.; Loffhagen, D.

    2018-06-01

    A new combination of first-principles molecular dynamics (MD) simulations with a rate equation model presented in the preceding paper (paper I) is applied to analyze in detail the scattering of argon atoms from a platinum (111) surface. The combined model is based on a classification of all atom trajectories according to their energies into trapped, quasi-trapped and scattering states. The number of particles in each of the three classes obeys coupled rate equations. The coefficients in the rate equations are the transition probabilities between these states, which are obtained from MD simulations. While these rates are generally time-dependent, after a characteristic time scale t_E of several tens of picoseconds they become stationary, allowing for a rather simple analysis. Here, we investigate this time scale by analyzing in detail the temporal evolution of the energy distribution functions of the adsorbate atoms. We separately study the energy loss distribution function of the atoms and the distribution function of in-plane and perpendicular energy components. Further, we compute the sticking probability of argon atoms as a function of incident energy, angle and lattice temperature. Our model is important for plasma-surface modeling as it allows accurate simulations to be extended to longer time scales.
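    Once the transition rates become stationary, the three-state model reduces to a small linear system of rate equations. The sketch below integrates such a system with forward Euler; the rate constants are invented for illustration (in the paper they are extracted from the MD simulations), and scattering states are treated as an absorbing sink so total population is conserved.

```python
# Hedged sketch of the trapped (T) / quasi-trapped (Q) / scattering (S)
# rate-equation model. Rate constants below are illustrative assumptions.

def evolve(n_t, n_q, n_s, rates, dt, steps):
    """Forward-Euler integration of dN_i/dt = sum_j (k_ji N_j - k_ij N_i)."""
    k_tq, k_qt, k_qs, k_ts = rates  # T->Q, Q->T, Q->S, T->S (per ps)
    for _ in range(steps):
        dn_t = k_qt * n_q - (k_tq + k_ts) * n_t
        dn_q = k_tq * n_t - (k_qt + k_qs) * n_q
        dn_s = k_ts * n_t + k_qs * n_q  # absorbing sink: no outflow from S
        n_t += dt * dn_t
        n_q += dt * dn_q
        n_s += dt * dn_s
    return n_t, n_q, n_s

# All atoms initially trapped; integrate 200 ps with a 0.1 ps step.
n_t, n_q, n_s = evolve(1.0, 0.0, 0.0, (0.05, 0.02, 0.04, 0.01), 0.1, 2000)
```

    Because the three increments sum to zero by construction, the total population is conserved at every step, which is a quick sanity check on any implementation of such a model.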

  20. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  1. Data analysis of the COMPTEL instrument on the NASA gamma ray observatory

    NASA Technical Reports Server (NTRS)

    Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.

    1992-01-01

    The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide field of view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods. Straightforward event projection onto the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications for the gamma ray source results of this background-limited telescope. The COMPTEL collaboration applies a software system of analysis utilities, organized around a database management system. This system is foreseen to assist guest investigators at the various collaboration sites and external sites, allowing different levels of cooperation with the COMPTEL institutes depending on the type of data to be studied.

  2. Method of confidence domains in the analysis of noise-induced extinction for tritrophic population system

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana

    2017-09-01

    A problem of the analysis of noise-induced extinction in multidimensional population systems is considered. To investigate the conditions of extinction caused by random disturbances, a new approach based on the stochastic sensitivity function technique and confidence domains is suggested and applied to a tritrophic population model of interacting prey, predator and top predator. This approach allows us to analyze constructively the probabilistic mechanisms of the transition to noise-induced extinction from both equilibrium and oscillatory regimes of coexistence. Within this analysis, a method of principal directions for reducing the dimension of the confidence domains is suggested. The principal subspace of the dispersion of random states is defined by the ratio of the eigenvalues of the stochastic sensitivity matrix. A detailed analysis of two scenarios of noise-induced extinction, depending on the parameters of the tritrophic system considered, is carried out.
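    The "principal directions" idea rests on an eigendecomposition of the stochastic sensitivity matrix: the eigenvalues set the dispersion of random states along each eigenvector, so the confidence domain can be reduced to its dominant axis when one eigenvalue dominates. A minimal sketch for the 2x2 case follows; the matrix entries and the semi-axis formula parameters (fiducial probability factor, noise intensity) are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: principal directions of a 2x2 stochastic sensitivity matrix.
import math

def eigen_2x2_symmetric(a, b, c):
    """Eigenvalues/leading eigenvector of [[a, b], [b, c]], largest first."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc
    # Eigenvector for lam1 (diagonal case b == 0 handled separately).
    v1 = (1.0, 0.0) if b == 0 and a >= c else ((b, lam1 - a) if b else (0.0, 1.0))
    norm = math.hypot(*v1)
    return (lam1, lam2), (v1[0] / norm, v1[1] / norm)

def confidence_semi_axes(eigvals, fiducial=3.0, noise=0.05):
    """Illustrative semi-axes: r_i proportional to sqrt(eigenvalue)."""
    return tuple(fiducial * noise * math.sqrt(2.0 * lam) for lam in eigvals)

# Arbitrary illustrative sensitivity matrix with one dominant direction.
eigvals, principal_dir = eigen_2x2_symmetric(8.0, 2.0, 1.0)
axes = confidence_semi_axes(eigvals)
```

    When the eigenvalue ratio is large, as here, the confidence ellipse is strongly elongated and the extinction analysis can be restricted to the single principal direction.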

  3. Coupling gas chromatography and electronic nose detection for detailed cigarette smoke aroma characterization.

    PubMed

    Rambla-Alegre, Maria; Tienpont, Bart; Mitsui, Kazuhisa; Masugi, Eri; Yoshimura, Yuta; Nagata, Hisanori; David, Frank; Sandra, Pat

    2014-10-24

    Aroma characterization of whole cigarette smoke samples using sensory panels or electronic nose (E-nose) devices is difficult due to the masking effect of major constituents and of the solvent used for the extraction step. On the other hand, GC in combination with olfactometry detection does not allow study of the delicate balance and synergistic effects of aroma solutes. To overcome these limitations, a new instrumental set-up consisting of heart-cutting gas chromatography using a capillary-flow-technology-based Deans switch and low-thermal-mass GC in combination with an electronic nose device is presented as an alternative to GC-olfactometry. This new hyphenated GC-E-nose configuration is used for the characterization of cigarette smoke aroma. The system allows the transfer, combination or omission of selected GC fractions before injection into the E-nose. Principal component analysis (PCA) and discriminant factor analysis (DFA) allowed clear visualization of the differences among cigarette brands and classification of the brands independently of their nicotine content. Omission and perceptual interaction tests could also be carried out using this configuration. The results are promising and suggest that the GC-E-nose hyphenation is a good approach to measuring the contribution of individual compounds to the whole cigarette smoke aroma. Copyright © 2014 Elsevier B.V. All rights reserved.
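    The PCA step that separates brands can be illustrated with a tiny stand-in: project each sample's sensor-channel vector onto the first principal component and inspect the scores. The data below are invented (each row one smoke sample, each column one sensor channel), and power iteration is used here as a dependency-free substitute for a library PCA.

```python
# Hedged sketch of brand separation by the first principal component.
# All sensor readings are fabricated for illustration.

def first_pc(samples, iters=200):
    """First principal component via power iteration on X^T X (pure Python)."""
    n, d = len(samples), len(samples[0])
    means = [sum(row[j] for row in samples) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in samples]  # center
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
        w = [sum(proj[i] * x[i][j] for i in range(n)) for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    scores = [sum(x[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Two hypothetical "brands" that differ mainly on the first two channels.
brand_a = [[1.0, 2.0, 0.1], [1.2, 2.1, 0.0], [0.9, 1.9, 0.2]]
brand_b = [[3.0, 4.0, 0.1], [3.1, 4.2, 0.0], [2.9, 3.8, 0.2]]
direction, scores = first_pc(brand_a + brand_b)
```

    In a plot of the first two component scores, samples of the same brand would cluster together, which is essentially the visualization the abstract refers to.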

  4. Application of long-term microdialysis in circadian rhythm research

    PubMed Central

    Borjigin, Jimo; Liu, Tiecheng

    2008-01-01

    Our laboratory has pioneered long-term microdialysis to monitor pineal melatonin secretion in living animals across multiple circadian cycles. There are numerous advantages of this approach for rhythm analysis: (1) we can precisely define melatonin onset and offset phases; (2) melatonin is a reliable and stable neuroendocrine output of the circadian clock (versus behavioral output which is sensitive to stress or other factors); (3) melatonin measurements can be performed extremely frequently, permitting high temporal resolution (10 min sampling intervals), which allows detection of slight changes in phase; (4) the measurements can be performed for more than four weeks, allowing perturbations of the circadian clock to be followed long-term in the same animals; (5) this is an automated process (microdialysis coupled with on-line HPLC analysis), which increases accuracy and bypasses the labor-intensive and error-prone manual handling of dialysis samples; and (6) our approach allows real-time investigation of circadian rhythm function and permits appropriate timely adjustments of experimental conditions. The longevity of microdialysis probes, the key to the success of this approach, depends at least in part on the methods of the construction and implantation of dialysis probes. In this article, we have detailed the procedures of construction and surgical implantation of microdialysis probes used currently in our laboratory, which are significantly improved from our previous methods. PMID:18045670

  5. Integrating atomistic molecular dynamics simulations, experiments, and network analysis to study protein dynamics: strength in unity.

    PubMed

    Papaleo, Elena

    2015-01-01

    In recent years, we have observed remarkable improvements in the field of protein dynamics. Indeed, we can now study protein dynamics in atomistic detail over several timescales with a rich portfolio of experimental and computational techniques. On one side, this provides us with the possibility to validate simulation methods and physical models against a broad range of experimental observables. On the other side, it also allows a complementary and comprehensive view of protein structure and dynamics. What is needed now is a better understanding of the link between the dynamic properties that we observe and the functional properties of these important cellular machines. To make progress in this direction, we need to improve the physical models used to describe proteins and solvent in molecular dynamics, as well as to strengthen the integration of experiments and simulations to overcome their respective limitations. Moreover, now that we have the means to study protein dynamics in great detail, we need new tools to understand the information embedded in the protein ensembles and in their dynamic signature. With this aim in mind, we should enrich the current tools for analysis of biomolecular simulations with attention to the effects that can be propagated over long distances and are often associated with important biological functions. In this context, approaches inspired by network analysis can make an important contribution to the analysis of molecular dynamics simulations.

  6. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
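    The core levelized-cost calculation that a tool of this kind performs can be sketched directly: discount each year's costs and generation to present value, then take the ratio. All plant parameters below (capital cost, O&M, fuel price, capacity factor, discount rate) are illustrative assumptions, not values from the NETL reports or from Power L-CAT itself.

```python
# Hedged sketch of a levelized cost of electricity (LCOE) calculation.

def levelized_cost(capital, fixed_om, fuel_cost_mwh, capacity_mw,
                   capacity_factor, discount_rate, lifetime_yr):
    """$/MWh: present value of lifetime costs over PV of lifetime generation."""
    annual_mwh = capacity_mw * capacity_factor * 8760.0
    pv_cost, pv_mwh = capital, 0.0  # capital spent at year 0
    for year in range(1, lifetime_yr + 1):
        df = (1.0 + discount_rate) ** -year  # discount factor
        pv_cost += (fixed_om + fuel_cost_mwh * annual_mwh) * df
        pv_mwh += annual_mwh * df
    return pv_cost / pv_mwh

# Hypothetical 500 MW plant, 85% capacity factor, 30-year life.
lcoe = levelized_cost(capital=1.1e9, fixed_om=2.0e7, fuel_cost_mwh=25.0,
                      capacity_mw=500.0, capacity_factor=0.85,
                      discount_rate=0.08, lifetime_yr=30)
```

    Sensitivity analysis of the kind the abstract mentions then amounts to sweeping one input (say, the discount rate or fuel price) and re-evaluating this ratio.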

  7. Malassezia globosa and restricta: breakthrough understanding of the etiology and treatment of dandruff and seborrheic dermatitis through whole-genome analysis.

    PubMed

    Dawson, Thomas L

    2007-12-01

    Dandruff and seborrheic dermatitis (D/SD) share an etiology dependent upon three factors: sebum, microbial metabolism (specifically, Malassezia yeasts), and individual susceptibility. Advances in microbiological and analytical techniques permit a more detailed understanding of these etiologic factors, especially the role of Malassezia. Malassezia are lipid-dependent and demonstrate adaptation allowing them to exploit a narrow niche on sebum-rich skin. Work in our and our collaborators' laboratories has focused on understanding these adaptations by detailed analysis of biochemistry and gene expression. We have shown that Malassezia globosa and M. restricta predominate on dandruff scalp, that oleic acid alone can initiate dandruff-like desquamation, that M. globosa is the most likely initiating organism by virtue of its high lipase activity, and that an M. globosa lipase is expressed on human scalp. Considering the importance of M. globosa in D/SD (and the overall importance of commensal fungi), we have sequenced the M. globosa and M. restricta genomes. Genomic analysis indicates key adaptations to the skin environment, several of which yield important clues to the role Malassezia play in human disease. This work offers the promise of defining new treatments to D/SD that are targeted at changing the level or activities of Malassezia genes.

  8. Three-Way Analysis of Spectrospatial Electromyography Data: Classification and Interpretation

    PubMed Central

    Kauppi, Jukka-Pekka; Hahne, Janne; Müller, Klaus-Robert; Hyvärinen, Aapo

    2015-01-01

    Classifying multivariate electromyography (EMG) data is an important problem in prosthesis control as well as in neurophysiological studies and diagnosis. With modern high-density EMG sensor technology, it is possible to capture the rich spectrospatial structure of the myoelectric activity. We hypothesize that multi-way machine learning methods can efficiently utilize this structure in classification as well as reveal interesting patterns in it. To this end, we investigate the suitability of existing three-way classification methods to EMG-based hand movement classification in the spectrospatial domain, as well as extend these methods by sparsification and regularization. We propose to use Fourier-domain independent component analysis as preprocessing to improve classification and interpretability of the results. In high-density EMG experiments on hand movements across 10 subjects, three-way classification yielded higher average performance compared with state-of-the-art classification based on temporal features, suggesting that the three-way analysis approach can efficiently utilize detailed spectrospatial information of high-density EMG. Phase and amplitude patterns of features selected by the classifier in finger-movement data were found to be consistent with known physiology. Thus, our approach can accurately resolve hand and finger movements on the basis of detailed spectrospatial information, and at the same time allows for physiological interpretation of the results. PMID:26039100

  9. Coupled micromorphological and stable isotope analysis of Quaternary calcrete development

    NASA Astrophysics Data System (ADS)

    Adamson, Kathryn; Candy, Ian; Whitfield, Liz

    2015-09-01

    Pedogenic calcretes are widespread in arid and semi-arid regions. Using calcrete profiles from four river terraces of the Rio Alias in southeast Spain, this study explores the potential of using detailed micromorphological and stable isotopic analysis to more fully understand the impacts of Quaternary environmental change on calcrete development. The four profiles increase in carbonate complexity with progressive age, reflecting calcretisation over multiple glacial-interglacial cycles since MIS 9 (c. 300 ka). Calcrete profiles contain a mixture of Alpha (non-biogenic) and Beta (biogenic) microfabrics. Alpha fabrics have higher δ13C and δ18O values. The profiles contain a range of crystal textures, but there is little difference between the δ13C and δ18O values of spar, microspar, and micrite cements. Strong positive covariance between δ13C and δ18O suggests that both isotopes are responding to the same environmental parameter, which is inferred to be relative aridity. The study reveals that the detailed co-analysis of calcrete micromorphology and stable isotope signatures can allow patterns of calcrete formation to be placed into a wider palaeoclimatic context. This demonstrates the potential of this technique to more reliably constrain the palaeoenvironmental significance of secondary carbonates in dryland settings where other proxy records may be poorly preserved.

  10. Surface radiant flux densities inferred from LAC and GAC AVHRR data

    NASA Astrophysics Data System (ADS)

    Berger, F.; Klaes, D.

    To infer surface radiant flux densities from current (NOAA-AVHRR, ERS-1/2 ATSR) and future meteorological (Envisat AATSR, MSG, METOP) satellite data, the complex, modular analysis scheme SESAT (Strahlungs- und Energieflüsse aus Satellitendaten) was developed (Berger, 2001). This scheme allows the determination of cloud types, optical and microphysical cloud properties, as well as surface and TOA radiant flux densities. After testing of SESAT in Central Europe and the Baltic Sea catchment (more than 400 scenes, including a detailed validation against various surface measurements), it was applied to a large number of NOAA-16 AVHRR overpasses covering the globe. For the analysis, two different spatial resolutions, local area coverage (LAC) and global area coverage (GAC), were considered. Therefore, all inferred results, such as cloud cover, cloud properties and radiant properties, could be intercompared. Specific emphasis is placed on the surface radiant flux densities (all radiative balance components), with results presented for different regions, such as South America, Southern Africa, North America, Europe, and Indonesia. Applying SESAT, energy flux densities such as latent and sensible heat flux densities could additionally be determined. A statistical analysis of all results, including a detailed discussion of the two spatial resolutions, will close this study.

  11. Multivariate Analysis of Longitudinal Rates of Change

    PubMed Central

    Bryan, Matthew; Heagerty, Patrick J.

    2016-01-01

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed by Roy and Lin [1]; Proust-Lima, Letenneur and Jacqmin-Gadda [2]; and Gray and Brookmeyer [3] among others. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, Gray and Brookmeyer [3] introduce an “accelerated time” method which assumes that covariates rescale time in longitudinal models for disease progression. In this manuscript we detail an alternative multivariate model formulation that directly structures longitudinal rates of change, and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. PMID:27417129

  12. Structure and Function of Iron-Loaded Synthetic Melanin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yiwen; Xie, Yijun; Wang, Zhao

    We describe a synthetic method for increasing and controlling the iron loading of synthetic melanin nanoparticles and use the resulting materials to perform a systematic quantitative investigation of their structure-property relationship. A comprehensive analysis by magnetometry, electron paramagnetic resonance, and nuclear magnetic relaxation dispersion reveals the complexities of their magnetic behavior and how these intraparticle magnetic interactions manifest in useful material properties such as their performance as MRI contrast agents. This analysis allows predictions of the optimal iron loading through a quantitative modeling of the antiferromagnetic coupling that arises from proximal iron ions. This study provides a detailed understanding of this complex class of synthetic biomaterials and gives insight into interactions and structures prevalent in naturally occurring melanins.

  13. [MALDI-TOF mass spectrometry in the investigation of large high-molecular biological compounds].

    PubMed

    Porubl'ova, L V; Rebriiev, A V; Hromovyĭ, T Iu; Minia, I I; Obolens'ka, M Iu

    2009-01-01

    MALDI-TOF (Matrix-Assisted Laser Desorption/Ionization Time-of-Flight) mass spectrometry has in recent years become a tool of choice for the analysis of biological polymers. The wide mass range, high accuracy, informativity and sensitivity make it a superior method for the analysis of all kinds of high-molecular biological compounds, including proteins, nucleic acids and lipids. MALDI-TOF-MS is particularly suitable for the identification of proteins by mass fingerprint or microsequencing, and it has therefore become an important technique of proteomics. Furthermore, the method allows detailed analysis of post-translational protein modifications, protein-protein and protein-nucleic acid interactions. Recently, the method has also been successfully applied to nucleic acid sequencing as well as screening for mutations.

  14. The High Resolution Chandra X-Ray Spectrum of 3C273

    NASA Technical Reports Server (NTRS)

    Fruscione, Antonella; Lavoie, Anthony (Technical Monitor)

    2000-01-01

    The bright quasar 3C273 was observed by Chandra in January 2000 for 120 ksec as a calibration target. It was observed with all detector-plus-grating combinations (ACIS+HETG, ACIS+LETG, and HRC+LETG), yielding an X-ray spectrum across the entire 0.1-10 keV band with unprecedented spectral resolution. At about 10 arcsec from the nucleus, an X-ray jet is also clearly visible and resolved in the 0th-order images. While the jet is much fainter than the nuclear source, the Chandra spatial resolution allows, for the first time, spectral analysis of both components separately. We will present detailed spectral analysis with particular emphasis on possible absorption features and comparison with simultaneous BeppoSAX data.

  15. Tube-Forming Assays.

    PubMed

    Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy

    2016-01-01

    Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.

  16. The active site structure of tetanus neurotoxin resolved by multiple scattering analysis in X-Ray absorption spectroscopy.

    PubMed Central

    Meneghini, C; Morante, S

    1998-01-01

    A detailed study of the X-ray absorption spectrum of tetanus neurotoxin in the K-edge EXAFS region of the zinc absorber is presented that allows the complete identification of the amino acid residues coordinated to the zinc active site. A very satisfactory interpretation of the experimental data can be given if multiple scattering contributions are included in the analysis. Comparing the absorption spectrum of tetanus neurotoxin to that of two other structurally similar zinc-endopeptidases, thermolysin and astacin, in which the zinc coordination mode is known from crystallographic data, we conclude that in tetanus neurotoxin, besides a water molecule, zinc is coordinated to two histidines and a tyrosine. PMID:9746536

  17. Analysis of Black Bearing Balls from a Space Shuttle Body Flap Actuator

    NASA Technical Reports Server (NTRS)

    Sovinski, Marjorie F.; Street, Kenneth W.

    2005-01-01

    A significantly deteriorated ball bearing mechanism from a body flap actuator on Space Shuttle OV-103 was disassembled and the balls submitted for analysis in conjunction with Return to Flight activities. The OV-103 balls, referred to as the "black balls", were subjected to X-ray photoelectron spectroscopy (XPS), Fourier transform infrared (FT-IR) and Raman microspectroscopy, surface profilometry, and optical and electron microscopy. The spectroscopic results in combination with microscopy analysis allowed a determination of the lubricant degradation pathway. The chemical attack mechanism does not adequately explain the unique visual appearance of the black balls. Numerous efforts to duplicate the phenomena causing this unique surface structure and appearance have been unsuccessful. Further detail will be presented supporting these conclusions, along with plausible explanations of the balls' unique black appearance.

  18. Skeletal Mechanism Generation of Surrogate Jet Fuels for Aeropropulsion Modeling

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Jen; Niemeyer, Kyle E.

    2010-05-01

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented with skeletal reductions of two important hydrocarbon components, n-heptane and n-decane, relevant to surrogate jet fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP): DRGEP is first applied to efficiently remove many unimportant species, and sensitivity analysis then eliminates further unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each previous method, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal.
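    The DRGEP stage can be sketched as a graph search: each species' overall importance is the maximum over all paths from a target species of the product of direct interaction coefficients, and species below a threshold are dropped before the costlier sensitivity-analysis stage. The tiny "mechanism" graph, its coefficients, and the 0.2 threshold below are invented for illustration; they are not from the n-heptane/n-decane reductions.

```python
# Hedged sketch of DRGEP-style importance propagation on a species graph.
import heapq

def drgep_importance(graph, target):
    """Dijkstra-style search maximizing products of edge coefficients in [0, 1]."""
    best = {target: 1.0}
    heap = [(-1.0, target)]
    while heap:
        neg_r, sp = heapq.heappop(heap)
        r = -neg_r
        if r < best.get(sp, 0.0):
            continue  # stale queue entry
        for nbr, coeff in graph.get(sp, {}).items():
            cand = r * coeff  # error propagation: product along the path
            if cand > best.get(nbr, 0.0):
                best[nbr] = cand
                heapq.heappush(heap, (-cand, nbr))
    return best

# Invented directed relation graph with interaction coefficients.
graph = {"fuel": {"O2": 0.9, "HO2": 0.3}, "O2": {"OH": 0.8},
         "HO2": {"H2O2": 0.5}, "OH": {}, "H2O2": {}}
imp = drgep_importance(graph, "fuel")
skeletal = {sp for sp, r in imp.items() if r >= 0.2}  # drop low-importance species
```

    A maximizing search works here because all coefficients lie in [0, 1], so path products only shrink, mirroring how propagated error attenuates with graph distance from the target.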

  19. A Look Inside HIV Resistance through Retroviral Protease Interaction Maps

    PubMed Central

    Kontijevskis, Aleksejs; Prusis, Peteris; Petrovska, Ramona; Yahorava, Sviatlana; Mutulis, Felikss; Mutule, Ilze; Komorowski, Jan; Wikberg, Jarl E. S

    2007-01-01

    Retroviruses affect a large number of species, from fish and birds to mammals and humans, with negative global socioeconomic impacts. Here the authors report and experimentally validate a novel approach for the analysis of the molecular networks that are involved in the recognition of substrates by retroviral proteases. Using multivariate analysis of the sequence-based physiochemical descriptions of 61 retroviral proteases comprising wild-type proteases, natural mutants, and drug-resistant forms of proteases from nine different viral species in relation to their ability to cleave 299 substrates, the authors mapped the physicochemical properties and cross-dependencies of the amino acids of the proteases and their substrates, which revealed a complex molecular interaction network of substrate recognition and cleavage. The approach allowed a detailed analysis of the molecular-chemical mechanisms involved in substrate cleavage by retroviral proteases. PMID:17352531

  20. An analysis of a candidate control algorithm for a ride quality augmentation system

    NASA Technical Reports Server (NTRS)

    Suikat, Reiner; Donaldson, Kent; Downing, David R.

    1987-01-01

    This paper presents a detailed analysis of a candidate algorithm for a ride quality augmentation system. The algorithm consists of a full-state feedback control law based on optimal control output weighting, estimators for angle of attack and sideslip, and a maneuvering algorithm. The control law is shown to perform well by both frequency and time domain analysis. The rms vertical acceleration is reduced by about 40 percent over the whole mission flight envelope. The estimators for the angle of attack and sideslip avoid the often inaccurate or costly direct measurement of those angles. The maneuvering algorithm will allow the augmented airplane to respond to pilot inputs. The design characteristics and performance are documented by the closed-loop eigenvalues; rms levels of vertical, lateral, and longitudinal acceleration; and representative time histories and frequency response.

  1. A Graphical User Interface for Software-assisted Tracking of Protein Concentration in Dynamic Cellular Protrusions.

    PubMed

    Saha, Tanumoy; Rathmann, Isabel; Galic, Milos

    2017-07-11

    Filopodia are dynamic, finger-like cellular protrusions associated with migration and cell-cell communication. In order to better understand the complex signaling mechanisms underlying filopodial initiation, elongation and subsequent stabilization or retraction, it is crucial to determine the spatio-temporal protein activity in these dynamic structures. To analyze protein function in filopodia, we recently developed a semi-automated tracking algorithm that adapts to filopodial shape-changes, thus allowing parallel analysis of protrusion dynamics and relative protein concentration along the whole filopodial length. Here, we present a detailed step-by-step protocol for optimized cell handling, image acquisition and software analysis. We further provide instructions for the use of optional features during image analysis and data representation, as well as troubleshooting guidelines for all critical steps along the way. Finally, we also include a comparison of the described image analysis software with other programs available for filopodia quantification. Together, the presented protocol provides a framework for accurate analysis of protein dynamics in filopodial protrusions using image analysis software.

  2. Simultaneous measurement of temperature and emissivity of lunar regolith simulant using dual-channel millimeter-wave radiometry.

    PubMed

    McCloy, J S; Sundaram, S K; Matyas, J; Woskov, P P

    2011-05-01

    Millimeter wave (MMW) radiometry can be used for simultaneous measurement of emissivity and temperature of materials under extreme environments (high temperature, pressure, and corrosive environments). The state-of-the-art dual channel MMW passive radiometer with active interferometric capabilities at 137 GHz described here allows for radiometric measurements of sample temperature and emissivity up to at least 1600 °C with simultaneous measurement of sample surface dynamics. These capabilities have been used to demonstrate dynamic measurement of melting of powders of simulated lunar regolith and static measurement of emissivity of solid samples. The paper presents the theoretical background and basis for the dual-receiver system, describes the hardware in detail, and demonstrates the data analysis. Post-experiment analysis of emissivity versus temperature allows further extraction from the radiometric data of millimeter wave viewing beam coupling factors, which provide corroboratory evidence to the interferometric data of the process dynamics observed. These results show the promise of the MMW system for extracting quantitative and qualitative process parameters for industrial processes and access to real-time dynamics of materials behavior in extreme environments.
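    The dual-channel retrieval described above can be reduced to a short identity: the passive channel measures a radiometric brightness temperature proportional to emissivity times physical temperature, while the active interferometric channel supplies the surface reflectivity; for an opaque sample, Kirchhoff's law gives emissivity as one minus reflectivity, from which the true temperature follows. The numbers below are illustrative, not calibration data from the instrument.

```python
# Hedged sketch of a dual-channel emissivity/temperature retrieval.
# Assumes an opaque sample (emissivity = 1 - reflectivity) and neglects
# background reflections, which a real radiometer calibration would include.

def retrieve(t_brightness, reflectivity):
    """Return (emissivity, physical temperature) from the two channels."""
    emissivity = 1.0 - reflectivity          # Kirchhoff's law, opaque body
    return emissivity, t_brightness / emissivity

# Hypothetical readings: 1200 K brightness, 25% reflectivity.
e, t = retrieve(t_brightness=1200.0, reflectivity=0.25)
```

    The same division shows why an emissivity estimate is essential: without it, a radiometer would systematically underestimate the temperature of any non-black surface.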

  3. Calculation of Multistage Turbomachinery Using Steady Characteristic Boundary Conditions

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.

    1998-01-01

    A multiblock Navier-Stokes analysis code for turbomachinery has been modified to allow analysis of multistage turbomachines. A steady averaging-plane approach was used to pass information between blade rows. Characteristic boundary conditions written in terms of perturbations about the mean flow from the neighboring blade row were used to allow close spacing between the blade rows without forcing the flow to be axisymmetric. In this report the multiblock code is described briefly and the characteristic boundary conditions and the averaging-plane implementation are described in detail. Two approaches for averaging the flow properties are also described. A two-dimensional turbine stator case was used to compare the characteristic boundary conditions with standard axisymmetric boundary conditions. Differences were apparent but small in this low-speed case. The two-stage fuel turbine used on the space shuttle main engines was then analyzed using a three-dimensional averaging-plane approach. Computed surface pressure distributions on the stator blades and endwalls and computed distributions of blade surface heat transfer coefficient on three blades showed very good agreement with experimental data from two tests.

  4. Community Landscapes: An Integrative Approach to Determine Overlapping Network Module Hierarchy, Identify Key Nodes and Predict Network Dynamics

    PubMed Central

    Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter

    2010-01-01

    Background: Network communities help the functional organization and evolution of complex networks. However, developing a method that is both fast and accurate, and that provides the modular overlaps and partitions of a heterogeneous network, has proven rather difficult. Methodology/Principal Findings: Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence-function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes; and (4) help to predict network dynamics. Conclusions/Significance: The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084

  5. Air shower measurements with the LOPES radio antenna array

    NASA Astrophysics Data System (ADS)

    Lopes Collaboration; Haungs, A.; Apel, W. D.; Arteaga, J. C.; Asch, T.; Auffenberg, J.; Badea, F.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Brüggemann, M.; Buchholz, P.; Buitink, S.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Finger, M.; Fuhrmann, D.; Gemmeke, H.; Ghia, P. L.; Glasstetter, R.; Grupen, C.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Kickelbick, D.; Kolotaev, Y.; Krömer, O.; Kuijpers, J.; Lafebre, S.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Navarra, G.; Nehls, S.; Nigl, A.; Oehlschläger, J.; Over, S.; Petcu, M.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Schröder, F.; Sima, O.; Singh, K.; Stümpert, M.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Walkowiak, W.; Weindl, A.; Wochele, J.; Wommer, M.; Zabierowski, J.; Zensus, J. A.; LOPES Collaboration

    2009-06-01

    LOPES is set up at the location of the KASCADE-Grande extensive air shower experiment in Karlsruhe, Germany, and aims to measure and investigate radio pulses from extensive air showers. Since radio waves suffer very little attenuation, radio measurements allow the detection of very distant or highly inclined showers. These waves can be recorded day and night, and provide a bolometric measure of the leptonic shower component. LOPES is designed as a digital radio interferometer using high bandwidths and fast data processing, and profits from the reconstructed air shower observables of KASCADE-Grande. The LOPES antennas are absolutely amplitude-calibrated, allowing reconstruction of the electric field strength, which can be compared with predictions from detailed Monte Carlo simulations. We report on the analysis of correlations present in the radio signals measured by the LOPES 30 antenna array. Additionally, LOPES operates antennas of a different type (LOPESSTAR) which are optimized for an application at the Pierre Auger Observatory. The status, recent results of the data analysis, and further perspectives of LOPES and the possible large-scale application of this new detection technique are discussed.

  6. Failure Analysis in Platelet Molded Composite Systems

    NASA Astrophysics Data System (ADS)

    Kravchenko, Sergii G.

    Long-fiber discontinuous composite systems in the form of chopped prepreg tapes provide an advanced, structural-grade molding compound allowing for fabrication of complex three-dimensional components. Understanding of the process-structure-property relationship is essential for application of prepreg platelet molded components, especially because of their possibly irregular, disordered, heterogeneous morphology. Herein, a structure-property relationship was analyzed in composite systems of many platelets. Regular and irregular morphologies were considered. Platelet-based systems with more ordered morphology possess superior mechanical performance. While regular morphologies allow for a careful inspection of failure mechanisms derived from the morphological characteristics, irregular morphologies are representative of the composite architectures resulting from uncontrolled deposition and molding with chopped prepregs. Progressive failure analysis (PFA) was used to study damage and deformation up to ultimate failure in a platelet-based composite system. Computational damage mechanics approaches were utilized to conduct the PFA. The developed computational models granted understanding of how the composite structure details, namely the platelet geometry and system morphology (geometrical arrangement and orientation distribution of platelets), define the effective mechanical properties of a platelet-molded composite system: its stiffness, strength and variability in properties.

  7. Enabling the democratization of the genomics revolution with a fully integrated web-based bioinformatics platform, Version 1.5 and 1.x.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chain, Patrick; Lo, Chien-Chi; Li, Po-E

    EDGE bioinformatics was developed to help biologists process Next Generation Sequencing data (in the form of raw FASTQ files), even if they have little to no bioinformatics expertise. EDGE is a highly integrated and interactive web-based platform that is capable of running many of the standard analyses that biologists require for viral, bacterial/archaeal, and metagenomic samples. EDGE provides the following analytical workflows: quality trimming and host removal, assembly and annotation, comparisons against known references, taxonomy classification of reads and contigs, whole genome SNP-based phylogenetic analysis, and PCR analysis. EDGE provides an intuitive web-based interface for user input, allows users to visualize and interact with selected results (e.g. JBrowse genome browser), and generates a final detailed PDF report. Results in the form of tables, text files, graphic files, and PDFs can be downloaded. A user management system allows tracking of an individual’s EDGE runs, along with the ability to share, post publicly, delete, or archive their results.

  8. Cloning of a CACTA transposon-like insertion in intron I of tomato invertase Lin5 gene and identification of transposase-like sequences of Solanaceae species.

    PubMed

    Proels, Reinhard K; Roitsch, Thomas

    2006-03-01

    Very few CACTA transposon-like sequences have been described in Solanaceae species. Sequence information has been restricted to partial transposase (TPase)-like fragments, and no target gene of CACTA-like transposon insertion has been described in tomato to date. In this manuscript, we report on a CACTA transposon-like insertion in intron I of tomato (Lycopersicon esculentum) invertase gene Lin5 and TPase-like sequences of several Solanaceae species. Consensus primers deduced from the TPase region of the tomato CACTA transposon-like element allowed the amplification of similar sequences from various Solanaceae species of different subfamilies including Solaneae (Solanum tuberosum), Cestreae (Nicotiana tabacum) and Datureae (Datura stramonium). This demonstrates the ubiquitous presence of CACTA-like elements in Solanaceae genomes. The obtained partial sequences are highly conserved, and allow further detection and detailed analysis of CACTA-like transposons throughout Solanaceae species. These CACTA-like transposon sequences also make it possible to evaluate their use for genome analysis, functional studies of genes, and studies of the evolutionary relationships between plant species.

  9. STEREO in-situ data analysis

    NASA Astrophysics Data System (ADS)

    Schroeder, P.; Luhmann, J.; Davis, A.; Russell, C.

    STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long-duration, detailed observations of 1 AU magnetic field structures, plasma, and suprathermal electrons and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument will make plasma ion composition measurements, completing STEREO's comprehensive in-situ perspective. Stereoscopic 3D information from the STEREO SECCHI imagers and SWAVES radio experiment will make it possible to use both multipoint and quadrature studies to connect interplanetary coronal mass ejections (ICMEs) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots is being developed, which will integrate STEREO's in-situ data with data from a variety of other missions, including WIND and ACE. Also, an application program interface (API) will be provided, allowing users to create custom software that ties directly into STEREO's data set. The API will allow for more advanced forms of data mining than are currently available through most data web services. A variety of data access techniques and the development of cross-spacecraft data analysis tools will allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and therefore to maximize the mission's scientific return.

  10. A Quantitative Framework for Flower Phenotyping in Cultivated Carnation (Dianthus caryophyllus L.)

    PubMed Central

    Chacón, Borja; Ballester, Roberto; Birlanga, Virginia; Rolland-Lagan, Anne-Gaëlle; Pérez-Pérez, José Manuel

    2013-01-01

    The most important breeding goals in ornamental crops concern plant appearance and flower characteristics, and selection is performed visually on the direct offspring of crosses. We developed an image analysis toolbox for the acquisition of flower and petal images from cultivated carnation (Dianthus caryophyllus L.) that was validated by a detailed analysis of flower and petal size and shape in 78 commercial cultivars of D. caryophyllus, including 55 standard, 22 spray and 1 pot carnation cultivars. Correlation analyses allowed us to reduce the number of parameters accounting for the observed variation in flower and petal morphology. Convexity was used as a descriptor for the level of serration in flowers and petals. We used a landmark-based approach that allowed us to identify eight main principal components (PCs) accounting for most of the variance observed in petal shape. The effect and the strength of these PCs in standard and spray carnation cultivars are consistent with shared underlying mechanisms involved in the morphological diversification of petals in both subpopulations. Our results also indicate that neighbor-joining trees built with morphological data might infer certain phylogenetic relationships among carnation cultivars. Based on estimated broad-sense heritability values for some flower and petal features, different genetic determinants likely modulate the responses of flower and petal morphology to environmental cues in this species. We believe our image analysis toolbox could allow capturing flower variation in other species of high ornamental value. PMID:24349209
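
    A common way to define the convexity descriptor mentioned above is the ratio between the area enclosed by an outline and the area of its convex hull, which equals 1 for convex shapes and drops below 1 as serration increases. The sketch below is a minimal pure-Python illustration of that general definition, not the authors' toolbox; all function names and test shapes are illustrative.

```python
def polygon_area(pts):
    # Shoelace formula (absolute value) for a simple polygon.
    n = len(pts)
    s = 0.0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices in counter-clockwise order.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def convexity(outline):
    # Ratio of outline area to convex-hull area: 1.0 for convex shapes,
    # smaller for serrated (toothed) petal or flower outlines.
    return polygon_area(outline) / polygon_area(convex_hull(outline))

# A square is convex ...
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
# ... while a four-pointed star (a strongly serrated outline) is not.
star = [(0, 2), (0.5, 0.5), (2, 0), (0.5, -0.5),
        (0, -2), (-0.5, -0.5), (-2, 0), (-0.5, 0.5)]
print(convexity(square))  # 1.0
print(convexity(star))    # 0.5
```

    In practice the outline points would come from a segmented petal image; the descriptor itself is scale- and rotation-invariant, which makes it convenient for comparing cultivars.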

  11. DETAIL OF "FEET" OF MAIN TRUSS NORTH END. NOTE PLATES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF "FEET" OF MAIN TRUSS NORTH END. NOTE PLATES ON WHICH FEET REST ALLOWING EXPANSION OF TRUSS AS IT EXPANDS AND SHRINKS UNDER THE SUN - Missouri & North Arkansas Railroad Bridge, Spanning Middle Fork Little Red River, Shirley, Van Buren County, AR

  12. Scanning mirror for infrared sensors

    NASA Technical Reports Server (NTRS)

    Anderson, R. H.; Bernstein, S. B.

    1972-01-01

    A high resolution, long life, angle-encoded scanning mirror, built for application in an infrared attitude sensor, is described. The mirror uses a Moiré-fringe-type optical encoder and a unique torsion bar suspension together with a magnetic drive to meet stringent operational and environmental requirements at a minimum weight and with minimum power consumption. Details of the specifications, design, and construction are presented with an analysis of the mirror suspension that allows accurate prediction of performance. The emphasis is on mechanical design considerations, and brief discussions are included on the encoder and magnetic drive to provide a complete view of the mirror system and its capabilities.

  13. Phage diabody repertoires for selection of large numbers of bispecific antibody fragments.

    PubMed

    McGuinness, B T; Walter, G; FitzGerald, K; Schuler, P; Mahoney, W; Duncan, A R; Hoogenboom, H R

    1996-09-01

    Methods for the generation of large numbers of different bispecific antibodies are presented. Cloning strategies are detailed to create repertoires of bispecific diabody molecules with variability at one or both of the antigen binding sites. This diabody format, when combined with the power of phage display technology, allows the generation and analysis of thousands of different bispecific molecules. Selection for binding presumably also selects for more stable diabodies. Phage diabody libraries enable screening or selection of the best bispecific combination with regard to binding affinity, epitope recognition and pairing before manufacture of the best candidate.

  14. A noise model for the evaluation of defect states in solar cells

    PubMed Central

    Landi, G.; Barone, C.; Mauro, C.; Neitzert, H. C.; Pagano, S.

    2016-01-01

    A theoretical model, combining trapping/detrapping and recombination mechanisms, is formulated to explain the origin of random current fluctuations in silicon-based solar cells. In this framework, the comparison between dark and photo-induced noise allows the determination of important electronic parameters of the defect states. A detailed analysis of the electric noise, at different temperatures and for different illumination levels, is reported for crystalline silicon-based solar cells, in the pristine form and after artificial degradation with high energy protons. The evolution of the dominating defect properties is studied through noise spectroscopy. PMID:27412097
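
    For a single trapping/detrapping level with characteristic time tau, the textbook expectation is a Lorentzian contribution to the current noise spectrum, flat at low frequency and rolling off above a corner frequency set by the trap time constant. The sketch below illustrates that generic form only; it is not the authors' combined trapping/recombination model, and the symbols S0 and tau are illustrative.

```python
import math

def lorentzian_psd(f, s0, tau):
    # Generation-recombination (trapping/detrapping) noise from a single
    # two-level trap: S(f) = S0 / (1 + (2*pi*f*tau)**2).
    return s0 / (1.0 + (2.0 * math.pi * f * tau) ** 2)

s0, tau = 1e-18, 1e-3               # plateau level and trap time constant (illustrative)
f_c = 1.0 / (2.0 * math.pi * tau)   # corner frequency, where S(f_c) = S0 / 2

print(lorentzian_psd(0.0, s0, tau))       # low-frequency plateau, equal to S0
print(lorentzian_psd(f_c, s0, tau) / s0)  # half-power point at the corner frequency
```

    Fitting measured spectra with a sum of such Lorentzians (plus flicker and thermal terms) is one standard way to extract trap time constants as a function of temperature and illumination.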

  15. Probing pre-inflationary anisotropy with directional variations in the gravitational wave background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furuya, Yu; Niiyama, Yuki; Sendouda, Yuuiti, E-mail: furuya@tap.st.hirosaki-u.ac.jp, E-mail: niiyama@tap.st.hirosaki-u.ac.jp, E-mail: sendouda@hirosaki-u.ac.jp

    We perform a detailed analysis of a primordial gravitational-wave background amplified during a Kasner-like pre-inflationary phase, allowing for general triaxial anisotropies. It is found that the predicted angular distribution map of gravitational-wave intensity on large scales exhibits topologically distinctive patterns according to the degree of the pre-inflationary anisotropy, thereby serving as a potential probe of the pre-inflationary early universe with future all-sky observations of gravitational waves. We also derive an observational limit on the amplitude of such anisotropic gravitational waves from the B-mode polarisation of the cosmic microwave background.

  16. Proof of principle for staircase auger chip removal theory

    NASA Technical Reports Server (NTRS)

    Barron, Jeffrey B.; Brewer, Steve; Kerns, Kenneth; Moody, Kyle; Rossi, Richard A.

    1987-01-01

    A proof-of-principle design based on the staircase auger theory is provided for lunar drilling. The drill is designed to drill holes 30 meters deep and 0.1 meters in diameter. The action of the auger is 0.01 meter strokes at a varying number of strokes per second. A detailed analysis of the interaction between the auger and the particles was done to optimize the parameters of the auger. This optimum design will allow for proper heat removal and reasonable drilling time. The drill bit is designed to scoop the particles into the auger while efficiently cutting through the Moon's surface.

  17. Memristor emulator causes dissimilarity on a coupled memristive systems

    NASA Astrophysics Data System (ADS)

    Sabarathinam, S.; Prasad, Awadhesh

    2018-04-01

    The memristor is known as the basic fourth passive solid-state circuit element. It is gaining increasing attention for creating the next generation of electronic devices and is commonly used as a fundamental chaotic circuit element, although often with arbitrary (typically piecewise-linear or cubic) flux-charge characteristics. In the present work, the effects of the memristor emulator are studied in a coupled memristive chaotic oscillator for the first time. We confirm that the emulator allows synchronization between the oscillators and causes dissimilarity between the systems when the coupling strength and the coefficient of the memristor emulator are increased. A detailed statistical analysis was performed to confirm this phenomenon.
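
    The cubic flux-charge characteristic mentioned above is commonly idealized (in Chua-style memristor models generally, not in this paper's specific emulator) as q(φ) = aφ + bφ³, so the memductance is W(φ) = dq/dφ = a + 3bφ² and the device current under an applied voltage v is i = W(φ)·v. A minimal sketch with illustrative coefficients a and b:

```python
def charge(phi, a=0.4, b=0.1):
    # Cubic flux-charge characteristic q(phi) = a*phi + b*phi**3.
    return a * phi + b * phi ** 3

def memductance(phi, a=0.4, b=0.1):
    # W(phi) = dq/dphi = a + 3*b*phi**2; the memristor current is
    # i = W(phi) * v for an applied voltage v.
    return a + 3.0 * b * phi ** 2

# Sanity check: W(phi) matches a centered finite difference of q(phi).
phi, h = 1.5, 1e-6
numeric = (charge(phi + h) - charge(phi - h)) / (2.0 * h)
print(abs(numeric - memductance(phi)))  # ~0 (numerical round-off only)
```

    In a coupled-oscillator simulation, the flux φ would itself be a state variable integrated from the voltage across the device, which is what makes the element history-dependent.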

  18. Structural zooming research and development of an interactive computer graphical interface for stress analysis of cracks

    NASA Technical Reports Server (NTRS)

    Gerstle, Walter

    1989-01-01

    Engineering problems sometimes involve the numerical solution of boundary value problems over domains containing geometric features with widely varying scales. Often, a detailed solution is required at one or more of these features. Small details in large structures may have profound effects upon global performance. Conversely, large-scale conditions may affect local performance. Many man-hours and CPU-hours are currently spent in modeling such problems. With the structural zooming technique, it is now possible to design an integrated program which allows the analyst to interactively focus upon a small region of interest, to modify the local geometry, and then to obtain highly accurate responses in that region which reflect both the properties of the overall structure and the local detail. A boundary integral equation analysis program, called BOAST, was recently developed for the stress analysis of cracks. This program can accurately analyze two-dimensional linear elastic fracture mechanics problems with far less computational effort than existing finite element codes. An interactive computer graphical interface to BOAST was written. The graphical interface would have several requirements: it would be menu-driven, with mouse input; all aspects of input would be entered graphically; the results of a BOAST analysis would be displayed pictorially, but the user would also be able to probe interactively to get numerical values of displacement and stress at desired locations within the analysis domain; the entire procedure would be integrated into a single, easy-to-use package; and it would be written using calls to the graphics package called HOOPS. The program is nearing completion. All of the preprocessing features are working satisfactorily and have been debugged. The postprocessing features are under development, and rudimentary postprocessing should be available by the end of the summer.
The program was developed and run on a VAX workstation, and must be ported to the SUN workstation. This activity is currently underway.

  19. Non-destructive determination of floral staging in cereals using X-ray micro computed tomography (µCT).

    PubMed

    Tracy, Saoirse R; Gómez, José Fernández; Sturrock, Craig J; Wilson, Zoe A; Ferguson, Alison C

    2017-01-01

    Accurate floral staging is required to aid research into pollen and flower development, in particular male development. Pollen development is highly sensitive to stress and is critical for crop yields. Research into male development under environmental change is important to help target increased yields. This is hindered in monocots, as the flower develops internally in the pseudostem. Floral staging studies therefore typically rely on destructive analysis, such as removal from the plant, fixation, staining and sectioning. This time-consuming analysis prevents follow-up studies and analysis past the point of floral staging. This study focuses on using X-ray µCT scanning to obtain quick and detailed non-destructive internal 3D phenotypic information that allows accurate staging of Arabidopsis thaliana L. and barley (Hordeum vulgare L.) flowers. X-ray µCT has previously relied on fixation methods for above-ground tissue; therefore, two contrast agents (Lugol's iodine and bismuth) were evaluated in Arabidopsis and barley in planta to circumvent this step. 3D models and 2D slices were generated from the X-ray µCT images, providing insightful information normally only available through destructive, time-consuming processes such as sectioning and microscopy. Barley growth and development were also monitored over three weeks by X-ray µCT to observe flower development in situ. By measuring spike size in the developing tillers, accurate non-destructive staging at the flower and anther stages could be performed; this staging was confirmed using traditional destructive microscopic analysis. The use of X-ray micro computed tomography (µCT) scanning of living plant tissue offers immense benefits for plant phenotyping, for successive developmental measurements and for accurate developmental timing of scientific measurements.
Nevertheless, X-ray µCT remains underused in plant sciences, especially in above-ground organs, despite its unique potential in delivering detailed non-destructive internal 3D phenotypic information. This work represents a novel application of X-ray µCT that could enhance research undertaken in monocot species to enable effective non-destructive staging and developmental analysis for molecular genetic studies and to determine effects of stresses at particular growth stages.

  20. Compensation Rules for Climate Policy in the Electricity Sector

    ERIC Educational Resources Information Center

    Burtraw, Dallas; Palmer, Karen

    2008-01-01

    Most previous cap and trade programs have distributed emission allowances for free to incumbent producers. However, in the electricity sector the value of CO2 allowances may be far in excess of costs to industry, and giving them away to firms diverts allowance value from other purposes. Using a detailed simulation model, this paper…

  1. Magnetic mapping for structural geology and geothermal exploration in Guadeloupe, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Mercier de Lépinay, Jeanne; Munschy, Marc; Geraud, Yves; Diraison, Marc; Navelot, Vivien; Verati, Christelle; Corsini, Michel; Lardeaux, Jean Marc; Favier, Alexiane

    2017-04-01

    This work is implemented through the GEOTREF program, which benefits from the support of both ADEME and the French public funds "Investments for the future". The program focuses on the exploration for geothermal resources in Guadeloupe, Lesser Antilles, where a geothermal power plant (Bouillante, Basse-Terre) has been in production since 1986. In Les Saintes archipelago, in the south of Guadeloupe, the outcrop analysis of Terre-de-Haut Island reveals an exhumed geothermal paleo-system that is thought to be an analogue of the active Bouillante geothermal system. We show that a detailed marine magnetic survey with a quantitative interpretation can bring information about the offshore structures around Les Saintes archipelago in order to extend the geological limits and structural elements. A similar survey and workflow are also conducted offshore Basse-Terre, where more geophysical data are already available. In order to correctly link the offshore and onshore structures, the magnetic survey must be close enough to the shoreline and sufficiently detailed to correctly outline the tectonic structures. An appropriate solution for such a survey is to use a three-component magnetometer aboard a speedboat. Such a boat allows more navigation flexibility than a classic oceanic vessel towing a magnetometer; it can sail at higher speed on calm seas and closer to the shoreline. This kind of magnetic acquisition is only viable because the magnetic effect of the ship can be compensated using the same algorithms as those used for airborne magnetometry. The use of potential field transforms allows a large variety of structures to be highlighted, providing insights to build a general understanding of the nature and distribution of the magnetic sources. In particular, we use the tilt angle operator to better identify the magnetic lineaments offshore in order to compare them to the faults identified onshore during the outcrop analysis.
    All the major fault and fracture directions observed onshore are well represented in the magnetic lineaments, except the main N90-110 system, which is almost absent. We also invert the magnetic data to obtain a magnetization intensity map. This inversion assumes a magnetized layer of constant depth and a constant magnetization direction. The calculated variations on the map are consistent with field measurements showing that hydrothermalized rocks have a magnetic susceptibility two orders of magnitude lower than that of fresh ones. Our interpretation and the onshore structural and petrographic analysis allow us to recognize the offshore extension of the hydrothermalized area, as well as different structural orientations.
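
    The tilt angle (tilt derivative) operator used above is commonly defined as the arctangent of the ratio of the vertical derivative of the anomaly field to the magnitude of its horizontal gradient. The sketch below shows only that standard pointwise definition, not the authors' processing chain; the gradient values passed in are illustrative inputs that would normally come from gridded survey data.

```python
import math

def tilt_angle(d_dz, d_dx, d_dy):
    # Tilt derivative: atan2(vertical derivative, horizontal gradient magnitude).
    # Bounded in (-pi/2, pi/2]; its zero crossings roughly track source edges,
    # which is why it helps outline magnetic lineaments.
    return math.atan2(d_dz, math.hypot(d_dx, d_dy))

# Equal vertical and horizontal gradients give a tilt of 45 degrees.
print(math.degrees(tilt_angle(1.0, 1.0, 0.0)))  # 45.0
# Over a purely vertical gradient the tilt reaches +90 degrees.
print(math.degrees(tilt_angle(1.0, 0.0, 0.0)))  # 90.0
```

    Because the arctangent normalizes amplitude away, the tilt map responds similarly to shallow weak sources and deep strong ones, which is what makes it useful for tracing structural lineaments regardless of magnetization contrast.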

  2. Interior detail, building 810, view to south showing operable door ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail, building 810, view to south showing operable door sections to allow closure around aircraft fuselage section, 135 mm lens plus electronic flash lighting. - Travis Air Force Base, B-36 Hangar, Between Woodskill Avenue & Ellis, adjacent to Taxiway V & W, Fairfield, Solano County, CA

  3. Annotation and structural elucidation of bovine milk oligosaccharides and determination of novel fucosylated structures

    PubMed Central

    Aldredge, Danielle L; Geronimo, Maria R; Hua, Serenus; Nwosu, Charles C; Lebrilla, Carlito B; Barile, Daniela

    2013-01-01

    Bovine milk oligosaccharides (BMOs) are recognized by the dairy and food industries, as well as by infant formula manufacturers, as novel, high-potential bioactive food ingredients. Recent studies revealed that bovine milk contains complex oligosaccharides structurally related to those previously thought to be present in only human milk. These BMOs are microbiotic modulators involved in important biological activities, including preventing pathogen binding to the intestinal epithelium and serving as nutrients for a selected class of beneficial bacteria. Only a small number of BMO structures are fully elucidated. To better understand the potential of BMOs as a class of biotherapeutics, their detailed structural analysis is needed. This study initiated the development of a structure library of BMOs and a comprehensive evaluation of structure-related specificity. The bovine milk glycome was profiled by high-performance mass spectrometry and advanced separation techniques to obtain a comprehensive catalog of BMOs, including several novel, lower abundant neutral and fucosylated oligosaccharides that are often overlooked during analysis. Structures were identified using isomer-specific tandem mass spectrometry and targeted exoglycosidase digestions to produce a BMO library detailing retention time, accurate mass and structure to allow their rapid identification in future studies. PMID:23436288
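
    A library recording accurate mass (and retention time) per structure supports the rapid identification described above by matching an observed mass within a parts-per-million tolerance. The following is a hypothetical minimal sketch of that lookup pattern, not the published BMO library: the entry names, mass values and 10 ppm tolerance are illustrative assumptions.

```python
def ppm_error(observed, reference):
    # Signed mass error in parts per million.
    return (observed - reference) / reference * 1e6

def match_library(observed_mass, library, tol_ppm=10.0):
    # Return the names of library entries whose accurate mass lies within
    # tol_ppm of the observed mass; retention time could be filtered the
    # same way as a second dimension.
    return [name for name, mass in library.items()
            if abs(ppm_error(observed_mass, mass)) <= tol_ppm]

# Hypothetical entries (illustrative neutral monoisotopic masses).
library = {"3'-sialyllactose": 633.2116, "lacto-N-neotetraose": 707.2488}
print(match_library(633.2121, library))  # ["3'-sialyllactose"]
```

    Real workflows would add charge-state/adduct handling and confirm candidates by tandem MS, as the abstract describes.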

  4. Analysis of Invasion Dynamics of Matrix-Embedded Cells in a Multisample Format.

    PubMed

    Van Troys, Marleen; Masuzzo, Paola; Huyck, Lynn; Bakkali, Karima; Waterschoot, Davy; Martens, Lennart; Ampe, Christophe

    2018-01-01

    In vitro tests of cancer cell invasion are the "first line" tools of preclinical researchers for screening the multitude of chemical compounds or cell perturbations that may aid in halting or treating cancer malignancy. In order to have predictive value or to contribute to designing personalized treatment regimes, these tests need to take into account the cancer cell environment and measure effects on invasion in sufficient detail. The in vitro invasion assays presented here are a trade-off between feasibility in a multisample format and mimicking the complexity of the tumor microenvironment. They allow testing multiple samples and conditions in parallel using 3D-matrix-embedded cells and deal with the heterogeneous behavior of an invading cell population in time. We describe the steps to take, the technical problems to tackle and useful software tools for the entire workflow: from the experimental setup to the quantification of the invasive capacity of the cells. The protocol is intended to guide researchers to standardize experimental set-ups and to annotate their invasion experiments in sufficient detail. In addition, it provides options for image processing and a solution for storage, visualization, quantitative analysis, and multisample comparison of acquired cell invasion data.

  5. DEM Modeling of a Flexible Barrier Impacted by a Dry Granular Flow

    NASA Astrophysics Data System (ADS)

    Albaba, Adel; Lambert, Stéphane; Kneib, François; Chareyre, Bruno; Nicot, François

    2017-11-01

    Flexible barriers are widely used as protection structures against natural hazards in mountainous regions, in particular for containing granular materials such as debris flows, snow avalanches and rock slides. This article presents a discrete element method-based model developed with the aim of investigating the response of flexible barriers in such contexts. It accounts for the particular mechanical and geometrical characteristics of both the granular flow and the barrier in a single framework, with limited assumptions. The model, developed with the YADE software, is described in detail, as well as its calibration. In particular, cables are modeled as continuous bodies. Besides, the model naturally considers the sliding of rings along supporting cables. The model is then applied to a generic flexible barrier to demonstrate its capacity to account for the behavior of the different components. A detailed analysis of the forces in the different components showed that energy dissipators (ED) had limited influence on the total force applied to the barrier and on its retaining capacity, but greatly influenced the load transmission within the barrier and the force in the anchors. A sensitivity analysis showed that the barrier's response changes significantly according to the choice of ED activation force and incoming flow conditions.

  6. Microsatellite Analysis for Identification of Individuals Using Bone from the Extinct Steller's Sea Cow (Hydrodamalis gigas).

    PubMed

    Warner, Jeffery F; Harpole, Michael G; Crerar, Lorelei D

    2017-01-01

    Microsatellite DNA can provide more detailed population genetic information than mitochondrial DNA, which is normally used in research on ancient bone. The methods detailed in this chapter can be applied to any type of bone; in this example, however, four microsatellite loci were isolated from Steller's sea cow (Hydrodamalis gigas) using published primers for manatee and dugong microsatellites. The primers DduC05 (Broderick et al., Mol Ecol Notes 6:1275-1277, 2007), Tmakb60, TmaSC5 (Pause et al., Mol Ecol Notes 6:1073-1076, 2007), and TmaE11 (Garcia-Rodriguez et al., Mol Ecol 12:2161-2163, 2000) all successfully amplified microsatellites from H. gigas. The DNA samples were from bone collected on Bering or St. Lawrence Islands. DNA was analyzed using primers with the fluorescent label FAM-6. Sequenced alleles were then used to indicate differences in the number of repeats and thus differences between individuals. This is the first time that H. gigas microsatellite loci have been isolated. These techniques for microsatellite analysis of ancient bone allow an estimate of population size for a newly discovered St. Lawrence Island sea cow population.
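
    The core of such an analysis, distinguishing individuals by the number of motif repeats in sequenced alleles, can be sketched in a few lines. This is an illustrative sketch, not the chapter's protocol; the dinucleotide motif and allele sequences below are hypothetical.

```python
# Illustrative sketch: infer repeat counts from sequenced microsatellite
# alleles. The motif and flanking sequences are hypothetical examples.
import re

def repeat_count(sequence: str, motif: str) -> int:
    """Return the number of repeats in the longest uninterrupted run of `motif`."""
    runs = re.findall(f"(?:{motif})+", sequence)
    return max((len(r) // len(motif) for r in runs), default=0)

# Two hypothetical alleles at the same dinucleotide (CA) locus:
allele_a = "GGT" + "CA" * 12 + "TTG"
allele_b = "GGT" + "CA" * 15 + "TTG"

# A difference in repeat number distinguishes individuals at this locus.
print(repeat_count(allele_a, "CA"))  # 12
print(repeat_count(allele_b, "CA"))  # 15
```

    In practice, repeat lengths are read from fragment sizes of fluorescently labeled PCR products rather than from raw sequence, but the comparison logic is the same.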

  7. Utilities on the info highway: Part two

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burkhart, L.A.

    1994-05-15

    This article describes federal government legislation that would allow electric and gas utilities to provide telecommunications services. The final law will probably allow all utilities to provide telecommunications services, even those regulated by PUHCA. Details of the House and Senate bills are described.

  8. Computer-based analysis of microvascular alterations in a mouse model for Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Heinzer, Stefan; Müller, Ralph; Stampanoni, Marco; Abela, Rafael; Meyer, Eric P.; Ulmann-Schuler, Alexandra; Krucker, Thomas

    2007-03-01

    Vascular factors associated with Alzheimer's disease (AD) have recently gained increased attention. To investigate changes in vascular, particularly microvascular architecture, we developed a hierarchical imaging framework to obtain large-volume, high-resolution 3D images from brains of transgenic mice modeling AD. In this paper, we present imaging and data analysis methods which allow compiling unique characteristics from several hundred gigabytes of image data. Image acquisition is based on desktop micro-computed tomography (µCT) and local synchrotron-radiation µCT (SRµCT) scanning with a nominal voxel size of 16 µm and 1.4 µm, respectively. Two visualization approaches were implemented: stacks of Z-buffer projections for fast data browsing, and progressive-mesh based surface rendering for detailed 3D visualization of the large datasets. In a first step, image data was assessed visually via a Java client connected to a central database. Identified characteristics of interest were subsequently quantified using global morphometry software. To obtain even deeper insight into microvascular alterations, tree analysis software was developed providing local morphometric parameters such as number of vessel segments or vessel tortuosity. In the context of ever increasing image resolution and large datasets, computer-aided analysis has proven both powerful and indispensable. The hierarchical approach maintains the context of local phenomena, while proper visualization and morphometry provide the basis for detailed analysis of the pathology related to structure. Beyond analysis of microvascular changes in AD this framework will have significant impact considering that vascular changes are involved in other neurodegenerative diseases as well as in cancer, cardiovascular disease, asthma, and arthritis.

  9. Automation of a N-S S and C Database Generation for the Harrier in Ground Effect

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Chaderjian, Neal M.; Pandya, Shishir; Kwak, Dochan (Technical Monitor)

    2001-01-01

    A method of automating the generation of a time-dependent, Navier-Stokes static stability and control database for the Harrier aircraft in ground effect is outlined. Reusable, lightweight components are described which allow different facets of the computational fluid dynamic simulation process to utilize a consistent interface to a remote database. These components also allow changes and customizations to be easily incorporated into the solution process to enhance performance, without relying upon third-party support. An analysis of the multi-level parallel solver OVERFLOW-MLP is presented, and the results indicate that it is feasible to utilize large numbers of processors (≈100) even with a grid system with a relatively small number of cells (≈10^6). A more detailed discussion of the simulation process, as well as refined data for the scaling of the OVERFLOW-MLP flow solver, will be included in the full paper.

  10. Digital avionics design and reliability analyzer

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that gate-level hardware emulation will be utilized. The primary benefit of emulation to reliability analysis is that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur, which permits controlled and accelerated testing of system reaction to hardware failures. A trade study led to the decision to specify a two-machine system, consisting of an emulation computer connected to a general-purpose computer. Potential computers to serve as the emulation computer are also evaluated.

  11. Evolution, Energy Landscapes and the Paradoxes of Protein Folding

    PubMed Central

    Wolynes, Peter G.

    2014-01-01

    Protein folding has been viewed as a difficult problem of molecular self-organization. The search problem involved in folding however has been simplified through the evolution of folding energy landscapes that are funneled. The funnel hypothesis can be quantified using energy landscape theory based on the minimal frustration principle. Strong quantitative predictions that follow from energy landscape theory have been widely confirmed both through laboratory folding experiments and from detailed simulations. Energy landscape ideas also have allowed successful protein structure prediction algorithms to be developed. The selection constraint of having funneled folding landscapes has left its imprint on the sequences of existing protein structural families. Quantitative analysis of co-evolution patterns allows us to infer the statistical characteristics of the folding landscape. These turn out to be consistent with what has been obtained from laboratory physicochemical folding experiments signalling a beautiful confluence of genomics and chemical physics. PMID:25530262

  12. CAD system of design and engineering provision of die forming of compressor blades for aircraft engines

    NASA Astrophysics Data System (ADS)

    Khaimovich, I. N.

    2017-10-01

    The article provides calculation algorithms for blank design and die-forming tooling to produce compressor blades for aircraft engines. The design system proposed in the article generates drafts of trimming and reducing dies automatically, leading to a significant reduction in production preparation time. A detailed analysis of the features of the blade's structural elements was carried out; the adopted constraints and technological solutions made it possible to formulate generalized algorithms for forming the parting face of the die over the entire contour of the impression for different configurations of die forgings. The author developed algorithms and programs to calculate the locations of the three-dimensional points describing the configuration of the die cavity. As a result, a generic mathematical model of the final die block was obtained in the form of a three-dimensional array of base points. This model is the basis for creating the engineering documentation of the technological equipment and the means of its control.

  13. Evolved Late-Type Star FUV Spectra: Mass Loss and Fluorescence

    NASA Technical Reports Server (NTRS)

    Harper, Graham M.

    2005-01-01

    This proposal was for a detailed analysis of the far ultraviolet (FUV) photoionizing radiation that provides crucial input physics for mass loss studies; e.g., observations of the flux below 1044 Å allow us to constrain the Ca II/Ca III balance and make significant progress beyond previous optical studies of stellar mass loss and circumstellar photochemistry. Our target selection provided the good spectral-type coverage required to help unravel the Ca II/Ca III balance as mass-loss rates increase by over three orders of magnitude from K5 III to M5 III. We also explored the relationship between the FUV radiation field and other UV diagnostics to allow us to empirically estimate the FUV radiation field for the vast majority of stars that are too faint to be observed with FUSE, and to improve upon their uncertain mass-loss rates.

  14. Whole Genome Amplification of Labeled Viable Single Cells Suited for Array-Comparative Genomic Hybridization.

    PubMed

    Kroneis, Thomas; El-Heliebi, Amin

    2015-01-01

    Understanding the details of a complex biological system makes it necessary to dismantle it down to its components. Immunostaining techniques allow identification of several distinct cell types, thereby providing insight into intercellular heterogeneity. Staining often reveals that the most remarkable cells are the rarest. To further characterize the target cells at the molecular level, single-cell techniques are necessary. Here, we describe the immunostaining, micromanipulation, and whole genome amplification of single cells for the purpose of genomic characterization. First, we exemplify the preparation of cell suspensions from cultured cells as well as the isolation of peripheral mononucleated cells from blood. The target cell population is then subjected to immunostaining. After cytocentrifugation, target cells are isolated by micromanipulation and forwarded to whole genome amplification. For whole genome amplification, we use GenomePlex(®) technology, allowing downstream genomic analysis such as array-comparative genomic hybridization.

  15. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    NASA Astrophysics Data System (ADS)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution may be required to identify areas with potential for fatigue damage initiation. Early detection of fatigue-critical areas can drive a simplification of the problem size, leading to considerable improvement in solution time and model handling while allowing the critical areas to be processed in greater detail. The proposed technique is applied to a real-life industrial case in a comparative assessment against established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  16. Fluid-Driven Deformation of a Soft Porous Medium

    NASA Astrophysics Data System (ADS)

    Lutz, Tyler; Wilen, Larry; Wettlaufer, John

    2017-11-01

    Viscous drag forces resisting the flow of fluid through a soft porous medium are maintained by restoring forces associated with deformations in the solid matrix. We describe experimental measurements of the deformation of foam under a pressure-driven flow of water along a single axis. Image analysis techniques allow tracking of the foam displacement while pressure sensors allow measurement of the fluid pressure. Experiments are performed for a series of different pressure heads ranging from 10 to 90 psi, and the results are compared to theory. This work builds on previous measurements of the fluid-induced deformation of a bed of soft hydrogel spheres. Compared to the hydrogel system, foams have the advantage that the constituents of the porous medium do not rearrange during an experiment, but they have the disadvantage of having a high friction coefficient with any boundaries. We detail strategies to characterize and mitigate the effects of friction on the observed foam deformations.

  17. Design and validation of an advanced entrained flow reactor system for studies of rapid solid biomass fuel particle conversion and ash formation reactions

    NASA Astrophysics Data System (ADS)

    Wagner, David R.; Holmgren, Per; Skoglund, Nils; Broström, Markus

    2018-06-01

    The design and validation of a newly commissioned entrained flow reactor is described in the present paper. The reactor was designed for advanced studies of fuel conversion and ash formation in powder flames, and the capabilities of the reactor were experimentally validated using two different solid biomass fuels. The drop tube geometry was equipped with a flat flame burner to heat and support the powder flame, optical access ports, a particle image velocimetry (PIV) system for in situ conversion monitoring, and probes for extraction of gases and particulate matter. A detailed description of the system is provided based on simulations and measurements, establishing the detailed temperature distribution and gas flow profiles. Mass balance closures of approximately 98% were achieved by combining gas analysis and particle extraction. Biomass fuel particles were successfully tracked using shadow imaging PIV, and the resulting data were used to determine the size, shape, velocity, and residence time of converting particles. Successful extractive sampling of coarse and fine particles during combustion, while retaining their morphology, was demonstrated, opening the way for detailed, time-resolved studies of rapid ash transformation reactions; in the validation experiments, clear and systematic fractionation trends for K, Cl, S, and Si were observed for the two fuels tested. The combination of in situ access, accurate residence time estimations, and precise particle sampling for subsequent chemical analysis allows for a wide range of future studies, with implications and possibilities discussed in the paper.

  18. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
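
    The relational approach described above can be illustrated with a minimal sketch: scan metadata kept in SQL so that selecting data for an analysis becomes a query rather than file bookkeeping. The schema, column names, and paths below are hypothetical illustrations, not the system described in the paper.

```python
# Minimal sketch of metadata-as-a-database: selecting fMRI runs for a group
# analysis via a SQL query. Schema and file paths are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE scans (
    subject TEXT, task TEXT, run INTEGER, tr_seconds REAL, path TEXT)""")
conn.executemany(
    "INSERT INTO scans VALUES (?, ?, ?, ?, ?)",
    [("s01", "language", 1, 2.0, "/data/s01/run1.nii"),
     ("s01", "rest",     1, 2.0, "/data/s01/rest1.nii"),
     ("s02", "language", 1, 2.0, "/data/s02/run1.nii")])

# Select all language-task runs for a group analysis:
rows = conn.execute(
    "SELECT subject, path FROM scans WHERE task = ? ORDER BY subject",
    ("language",)).fetchall()
print(rows)  # [('s01', '/data/s01/run1.nii'), ('s02', '/data/s02/run1.nii')]
```

    The same query interface extends naturally to joins against behavioral or demographic tables, which is where the approach pays off over flat file hierarchies.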

  19. Visual analysis of mass cytometry data by hierarchical stochastic neighbour embedding reveals rare cell types.

    PubMed

    van Unen, Vincent; Höllt, Thomas; Pezzotti, Nicola; Li, Na; Reinders, Marcel J T; Eisemann, Elmar; Koning, Frits; Vilanova, Anna; Lelieveldt, Boudewijn P F

    2017-11-23

    Mass cytometry allows high-resolution dissection of the cellular composition of the immune system. However, the high dimensionality, large size, and non-linear structure of the data pose considerable challenges for data analysis. In particular, dimensionality-reduction-based techniques like t-SNE offer single-cell resolution but are limited in the number of cells that can be analyzed. Here we introduce Hierarchical Stochastic Neighbor Embedding (HSNE) for the analysis of mass cytometry data sets. HSNE constructs a hierarchy of non-linear similarities that can be interactively explored with a stepwise increase in detail up to the single-cell level. We apply HSNE to a study on gastrointestinal disorders and three other available mass cytometry data sets. We find that HSNE efficiently replicates previous observations and identifies rare cell populations that were previously missed due to downsampling. Thus, HSNE removes the scalability limit of conventional t-SNE analysis, a feature that makes it highly suitable for the analysis of massive high-dimensional data sets.
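
    For context, the conventional t-SNE baseline that HSNE extends can be sketched with scikit-learn on synthetic data standing in for marker intensities. HSNE itself is not part of scikit-learn and is not reproduced here; population sizes, marker count, and perplexity below are illustrative choices.

```python
# Conventional t-SNE on cytometry-like data: embed cells from a 10-marker
# space into 2D. Two well-separated synthetic populations stand in for
# real marker-intensity measurements.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
pop_a = rng.normal(0.0, 1.0, size=(100, 10))   # synthetic population A
pop_b = rng.normal(5.0, 1.0, size=(100, 10))   # synthetic population B
cells = np.vstack([pop_a, pop_b])

# One 2D coordinate per cell; this is the step whose cost limits cell counts.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(cells)
print(embedding.shape)  # (200, 2)
```

    The scalability limit mentioned in the abstract arises because this embedding step is run on all cells at once; HSNE instead explores a hierarchy of landmarks and descends to single cells only where needed.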

  20. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows the chemical environment of a wide variety of materials to be studied. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of certain spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as of the associated uncertainties, is presented. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
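
    The multivariate route mentioned above can be sketched as PCA applied to a series of spectra. The spectra below are synthetic Gaussian peaks standing in for the Kβ lines of two oxidation states; peak positions, widths, and the energy range are illustrative assumptions, not measured values.

```python
# Sketch: PCA on a series of synthetic emission spectra formed as mixtures
# of two pure components. Peak positions/widths are illustrative only.
import numpy as np
from sklearn.decomposition import PCA

energy = np.linspace(6470.0, 6500.0, 300)  # eV grid, illustrative

def peak(center, width=2.0):
    """Gaussian line profile standing in for a Kβ emission line."""
    return np.exp(-((energy - center) ** 2) / (2 * width ** 2))

# Mixtures of two pure spectra with varying fractions (0% to 100%).
pure_a, pure_b = peak(6485.0), peak(6487.5)
fractions = np.linspace(0.0, 1.0, 11)
spectra = np.array([f * pure_a + (1 - f) * pure_b for f in fractions])

pca = PCA(n_components=2).fit(spectra)
# A noise-free two-component mixture series varies along a single direction,
# so the first principal component captures essentially all the variance.
print(pca.explained_variance_ratio_[0] > 0.99)  # True
```

    With real spectra, the score along that first component tracks the mixing fraction, which is how PCA can read off oxidation-state composition without fitting individual lines.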

  1. Adaptation of Laser Microdissection Technique for the Study of a Spontaneous Metastatic Mammary Carcinoma Mouse Model by NanoString Technologies

    PubMed Central

    Saylor, Karen L.; Anver, Miriam R.; Salomon, David S.; Golubeva, Yelena G.

    2016-01-01

    Laser capture microdissection (LCM) of tissue is an established tool in medical research for the collection of distinct cell populations under direct microscopic visualization for molecular analysis. LCM samples have been successfully analyzed in a number of genomic and proteomic downstream molecular applications. However, the LCM sample collection and preparation procedure has to be adapted to each downstream analysis platform. In this manuscript we describe in detail the adaptation of LCM methodology for the collection and preparation of fresh frozen samples for NanoString analysis, based on a study of a model of mouse mammary gland carcinoma and its lung metastasis. Our adaptation of the LCM sample preparation and workflow to the requirements of the NanoString platform allowed samples with high RNA quality to be acquired. The NanoString analysis of such samples provided sensitive detection of genes of interest and their associated molecular pathways. NanoString is a reliable gene expression analysis platform that can be effectively coupled with LCM. PMID:27077656

  2. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  3. Time-resolved confocal fluorescence microscopy: novel technical features and applications for FLIM, FRET and FCS using a sophisticated data acquisition concept in TCSPC

    NASA Astrophysics Data System (ADS)

    Koberling, Felix; Krämer, Benedikt; Kapusta, Peter; Patting, Matthias; Wahl, Michael; Erdmann, Rainer

    2007-05-01

    In recent years, time-resolved fluorescence measurement and analysis techniques have become a standard in single molecule microscopy. However, in terms of equipment and experimental implementation, they are typically still an add-on and offer only limited possibilities for studying their mutual dependencies with common intensity and spectral information. In contrast, we use a specially designed instrument with an unrestricted photon data acquisition approach which allows spatial, temporal, spectral and intensity information to be stored in a generalized format preserving the full experimental information. This format allows us not only to easily study dependencies between various fluorescence parameters but also to use, for example, the photon arrival time for sorting and weighting the detected photons to improve the significance of common FCS and FRET analysis schemes. The power of this approach is demonstrated for different techniques: in FCS experiments, the accuracy of concentration determination can be easily improved by a simple time-gated photon analysis that suppresses the fast-decaying background signal. A more detailed analysis of the arrival times even allows FCS curves to be separated for species which differ in their fluorescence lifetime but, for example, cannot be distinguished spectrally. In multichromophoric systems, such as a photonic wire which undergoes unidirectional multistep FRET, the lifetime information significantly complements the intensity-based analysis and helps to assign the respective FRET partners. Moreover, together with pulsed excitation, the time-correlated analysis directly enables the use of alternating multi-colour laser excitation. This pulsed interleaved excitation (PIE) can be used to identify and rule out inactive FRET molecules, which cause interfering artefacts in standard FRET efficiency analysis. We used a piezo-scanner-based confocal microscope with compact picosecond diode lasers as excitation sources. The timing performance can be significantly increased by using new SPAD detectors which enable, in conjunction with new TCSPC electronics, an overall IRF width of less than 120 ps while maintaining single molecule sensitivity.
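
    The time-gated photon selection described above can be sketched with synthetic TCSPC arrival times: photons arriving shortly after the laser pulse, where the fast-decaying background dominates, are discarded. The lifetimes, photon counts, and gate position below are illustrative assumptions.

```python
# Sketch of time-gated photon analysis: keep only photons whose arrival time
# relative to the laser pulse falls after the fast background decay.
# All arrival times are synthetic; lifetimes and gate are illustrative.
import numpy as np

rng = np.random.default_rng(1)
# Fast-decaying background (0.2 ns lifetime) mixed with fluorescence (3 ns).
background = rng.exponential(0.2, size=5000)
signal = rng.exponential(3.0, size=5000)
arrival_ns = np.concatenate([background, signal])

gate_ns = 1.0  # discard photons arriving within 1 ns of the pulse
gated = arrival_ns[arrival_ns > gate_ns]

# The gate removes most background photons while retaining most signal.
frac_background_kept = np.mean(background > gate_ns)
frac_signal_kept = np.mean(signal > gate_ns)
print(frac_background_kept < 0.05, frac_signal_kept > 0.5)  # True True
```

    Building the FCS correlation only from the gated photons is what improves the concentration accuracy: the fast background no longer contributes spurious correlation amplitude.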

  4. 50 years of mass balance observations at Vernagtferner, Eastern Alps

    NASA Astrophysics Data System (ADS)

    Braun, Ludwig; Mayer, Christoph

    2016-04-01

    The determination and monitoring of the seasonal and annual glacier mass balances of Vernagtferner, Austria, was started in 1964 by the Commission of Glaciology, Bavarian Academy of Sciences. Detailed and continuous climate and runoff measurements have complemented this mass balance series since 1974. Vernagtferner has attracted the attention of scientists since the beginning of the 17th century due to its rapid advances and the resulting glacier lake outburst floods in the Ötztal valley. This is one reason for the first photogrammetric survey in 1889, which was followed by frequent topographic surveys, adding up to more than ten digital elevation models of the glacier until today. By including the known maximum glacier extent at the end of the Little Ice Age in 1845, the geodetic glacier volume balances cover a time span of almost 170 years. The 50 years of glacier mass balance and 40 years of water balance in the drainage basin are therefore embedded in a considerably longer period of glacier evolution, allowing an interpretation within an extended frame of climatology and ice dynamics. The direct mass balance observations cover not only the period of Alpine-wide strong glacier mass loss since the beginning of the 1990s; the data also contain the last period of glacier advances between 1970 and 1990. The combination of the observed surface mass exchange and the determined periodic volumetric changes allows a detailed analysis of the dynamic reaction of the glacier over the period of half a century. The accompanying meteorological observations are the basis for relating these reactions to the climatic changes during this period. Vernagtferner is therefore one of the few glaciers in the world where a very detailed glacier-climate reaction was observed for many decades and can be realistically reconstructed back to the end of the Little Ice Age.

  5. High resolution top-down experimental strategies on the Orbitrap platform.

    PubMed

    Scheffler, Kai; Viner, Rosa; Damoc, Eugen

    2018-03-20

    Top-down mass spectrometry (MS) strategies allow in-depth characterization of proteins by fragmentation of the entire molecule(s) inside a mass spectrometer without requiring prior proteolytic digestion. Importantly, the fragmentation techniques on commercially available mass spectrometers have become more versatile over the past decade, with different characteristics in regards to the type and wealth of fragment ions that can be obtained while preserving labile protein post-translational modifications. Due to these and other improvements, top-down MS has become of broader interest and has started to be applied in more disciplines, such as the quality control of recombinant proteins, analysis and characterization of biopharmaceuticals, and clinical biochemistry to probe protein forms as potential disease biomarkers. This article provides a technical overview and guidance for data acquisition strategies on the Orbitrap platform for single proteins and low complexity protein mixtures. A protein standard mixture composed of six recombinant proteins is also introduced and analysis strategies are discussed in detail. The article provides a detailed overview and guidance on how to choose from the variety of available methods for protein characterization by top-down analysis on the Orbitrap platform. Technical details are provided explaining important observations and phenomena when working with intact proteins and data from a number of different samples should serve to provide a solid understanding on how experiments were and should be setup and to set the right expectations on the outcome of these types of experiments. Additionally, a new intact protein standard sample is introduced that will help as a QC sample to check the instrument's hardware and method setup conditions as a requirement for obtaining high quality data from biologically relevant samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Recognition of large scale deep-seated landslides in vegetated areas of Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, C. W.; Tarolli, P.; Tseng, C. M.; Tseng, Y. H.

    2012-04-01

    In August 2009, Typhoon Morakot triggered thousands of landslides and debris flows; according to government reports, 619 people died, 76 went missing, and the economic loss was estimated at hundreds of millions of USD. The large deep-seated landslides in particular are critical and deserve attention, since they can be reactivated during intense events and evolve into destructive failures. They are also difficult to recognize in the field, especially under dense forest cover. A detailed and constantly updated inventory map of such phenomena, and the recognition of their topographic signatures, represents a key tool for landslide mapping and risk mitigation. The aim of this work is to test the performance of a newly developed method for the automatic extraction of geomorphic features related to landslide crowns, developed by Tarolli et al. (2010), in support of field surveys, in order to develop a detailed and accurate inventory map of such phenomena. The methodology is based on the detection of thresholds derived from a statistical analysis of the variability of landform curvature in high-resolution LiDAR-derived topography. The analysis suggested that the method performed well, with respect to field analysis, in localizing and extracting features related to deep-seated landslides. Thanks to the capability of LiDAR to detect bare-ground elevation even in forested areas, it was possible to recognize landslide features in detail even in remote regions that are difficult to access. Reference: Tarolli, P., Sofia, G., Dalla Fontana, G. (2010). Geomorphic features extraction from high-resolution topography: landslide crowns and bank erosion, Natural Hazards, doi:10.1007/s11069-010-9695-2

  7. Selectively Sized Graphene-Based Nanopores for in Situ Single Molecule Sensing

    PubMed Central

    2015-01-01

    The use of nanopore biosensors is set to be extremely important in developing precise single molecule detectors and providing highly sensitive advanced analysis of biological molecules. The precise tailoring of nanopore size is a significant step toward achieving this, as it would allow a nanopore to be tuned to a corresponding analyte. The work presented here details a methodology for selectively opening nanopores in real time. The tunable nanopores on a quartz nanopipette platform are fabricated using the electroetching of a graphene-based membrane constructed from individual graphene nanoflakes (ø ∼30 nm). The device design allows for in situ opening of the graphene membrane, from fully closed to fully opened (ø ∼25 nm), a feature that has yet to be reported in the literature. The translocation of DNA is studied as the pore size is varied, allowing subfeatures of DNA to be detected with slower DNA translocations at smaller pore sizes, and trends to be observed as the pore is opened. This approach opens the door to creating a device that can be targeted to detect specific analytes. PMID:26204996
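
    Translocation events in such experiments are, at their simplest, transient current blockades below a threshold. The sketch below detects them in a synthetic current trace; the baseline, noise level, blockade depth, and threshold are all hypothetical illustrations, not the paper's values.

```python
# Sketch: count DNA translocation events as transient current blockades
# below a threshold. The current trace is synthetic; all amplitudes are
# hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(2)
baseline_pa = 1000.0  # hypothetical open-pore current, pA
current = baseline_pa + rng.normal(0.0, 5.0, size=2000)
# Inject two hypothetical blockade events (current drops while DNA passes).
current[500:520] -= 200.0
current[1400:1430] -= 200.0

threshold = baseline_pa - 100.0
below = current < threshold
# Each rising edge of the below-threshold mask marks one event.
events = int(np.sum(np.diff(below.astype(int)) == 1))
print(events)  # 2
```

    Real analyses additionally record each event's dwell time and blockade depth, which is where the subfeatures of DNA mentioned above become visible at smaller pore sizes.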

  8. Mapping of Low-Frequency Raman Modes in CVD-Grown Transition Metal Dichalcogenides: Layer Number, Stacking Orientation and Resonant Effects

    PubMed Central

    O’Brien, Maria; McEvoy, Niall; Hanlon, Damien; Hallam, Toby; Coleman, Jonathan N.; Duesberg, Georg S.

    2016-01-01

    Layered inorganic materials, such as the transition metal dichalcogenides (TMDs), have attracted much attention due to their exceptional electronic and optical properties. Reliable synthesis and characterization of these materials must be developed if these properties are to be exploited. Herein, we present low-frequency Raman analysis of MoS2, MoSe2, WSe2 and WS2 grown by chemical vapour deposition (CVD). Raman spectra are acquired over large areas allowing changes in the position and intensity of the shear and layer-breathing modes to be visualized in maps. This allows detailed characterization of mono- and few-layered TMDs which is complementary to well-established (high-frequency) Raman and photoluminescence spectroscopy. This study presents a major stepping stone in fundamental understanding of layered materials as mapping the low-frequency modes allows the quality, symmetry, stacking configuration and layer number of 2D materials to be probed over large areas. In addition, we report on anomalous resonance effects in the low-frequency region of the WS2 Raman spectrum. PMID:26766208

  9. Beyond conventional dose-response curves: Sensorgram comparison in SPR allows single concentration activity and similarity assessment.

    PubMed

    Gassner, C; Karlsson, R; Lipsmeier, F; Moelleken, J

    2018-05-30

    Previously we have introduced two SPR-based assay principles (dual-binding assay and bridging assay), which allow the determination of two out of three possible interaction parameters for bispecific molecules within one assay setup: two individual interactions to both targets, and/or one simultaneous/overall interaction, which potentially reflects the inter-dependency of both individual binding events. However, activity and similarity are determined by comparing report points over a concentration range, which also mirrors the way data are generated by conventional ELISA-based methods. So far, binding kinetics have not been specifically considered in generic approaches for activity assessment. Here, we introduce an improved slope-ratio model which, together with a sensorgram-comparison-based similarity assessment, allows the development of a detailed, USP-conformal ligand binding assay using only a single sample concentration. We compare this novel analysis method to the usual concentration-range approach for both SPR-based assay principles and discuss its impact on data quality and increased sample throughput. Copyright © 2018 Elsevier B.V. All rights reserved.
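The slope-ratio idea can be illustrated with a minimal sketch: relative activity is taken as the ratio of the initial association-phase slopes of a test and a reference sensorgram measured at a single concentration. This is only a generic illustration of the principle, not the paper's improved model or its USP-conformal statistics; the data, response units, and the `t_max` cutoff are all hypothetical.

```python
def initial_slope(times, response, t_max=30.0):
    """Least-squares slope of the early, approximately linear part of a
    sensorgram (response units per second), using points up to t_max."""
    pts = [(t, r) for t, r in zip(times, response) if t <= t_max]
    n = len(pts)
    mt = sum(t for t, _ in pts) / n
    mr = sum(r for _, r in pts) / n
    return (sum((t - mt) * (r - mr) for t, r in pts)
            / sum((t - mt) ** 2 for t, _ in pts))

# Hypothetical single-concentration sensorgrams (time in s, response in RU).
times = [float(t) for t in range(0, 61, 5)]
reference = [0.80 * t for t in times]    # reference-standard lot
test_lot = [0.60 * t for t in times]     # test lot binds more slowly
relative_activity = initial_slope(times, test_lot) / initial_slope(times, reference)
```

With these synthetic slopes the test lot comes out at 75% relative activity; a real assay would add the sensorgram-shape comparison the abstract describes on top of this ratio.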

  10. Using McStas for modelling complex optics, using simple building bricks

    NASA Astrophysics Data System (ADS)

    Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim

    2011-04-01

    The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all those parts, e.g. a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to define “components inside components”, or meta-components, which combine the functionality of several simple components to achieve more complex behaviour, i.e. four single mirror plates together defining a guide. We will here show that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.

  11. Measurement of 0.511-MeV gamma rays with a balloon-borne Ge/Li/ spectrometer

    NASA Technical Reports Server (NTRS)

    Ling, J. C.; Mahoney, W. A.; Willett, J. B.; Jacobson, A. S.

    1977-01-01

    A collimated high-resolution gamma ray spectrometer was flown on a balloon over Palestine, Texas, on June 10, 1974, to obtain measurements of the terrestrial and extraterrestrial 0.511-MeV gamma rays. The spectrometer consists of four 40-cu-cm Ge(Li) crystals operating in the energy range 0.06-10 MeV; this cluster of detectors is surrounded by a CsI(Na) anticoincidence shield. This system is used primarily to allow measurements of the two escape peaks associated with high-energy gamma ray lines. It also allows a measurement of the background component of the 0.511-MeV flux produced by beta(+) decays in materials inside the CsI(Na) shield. It is shown that the measurements of the atmospheric fluxes are consistent with earlier results after allowance is made for an additional component of the background due to beta(+) decays produced by neutron- and proton-initiated interactions with materials in and near the detector. Results of the extraterrestrial flux require an extensive detailed analysis of the time-varying background because of activation buildup and balloon spatial drifts.

  12. Modeling of natural risks in GIS, decision support in the Civil Protection and Emergency Planning

    NASA Astrophysics Data System (ADS)

    Santos, M.; Martins, L.; Moreira, S.; Costa, A.; Matos, F.; Teixeira, M.; Bateira, C.

    2012-04-01

    The assessment of natural hazards in Civil Protection is essential to the prevention and mitigation of emergency situations. This paper presents the results of developing susceptibility maps for landslides, floods, forest fires and soil erosion, using GIS (Geographic Information System) tools, in two municipalities - Santo Tirso and Trofa - in the district of Oporto, in the northwest of Portugal. The mapping of natural hazards fits within the legislative framework of the Municipal Civil Protection (Law No. 65/2007 of 12 November) and provides the key elements for planning and preparing an appropriate response in case any of the processes/phenomena occur, thus optimizing the procedures for protection and relief provided by the Municipal Civil Protection Service. Susceptibility mapping for landslides, floods, forest fires and soil erosion was performed with GIS tools. The methodology used to compile the mapping of landslides, forest fires and soil erosion was based on the modeling of different conditioning factors, validated with field work and event records. The mapping of susceptibility to floods and flooding was developed through mathematical parameters (statistical, hydrologic and hydraulic), supported by field work and the recognition of the individual characteristics of each sector under analysis, and subsequently analyzed in a GIS environment. The maps were produced at 1:5000 scale, which allows not only the identification of large areas affected by the spatial dynamics of the processes/phenomena, but also a more detailed analysis, especially when combined with geographic information system (GIS) tools, thus allowing the study of more specific situations that require a quick response. The maps developed in this study are fundamental to the understanding, prediction and prevention of the susceptibilities and risks present in the municipalities, and are a valuable tool in the process of Emergency Planning, since they identify priority areas of intervention for further detailed analysis, promote mechanisms to safeguard against injury, and anticipate potential interventions that can minimize the risk.

  13. Conformational dynamics of abasic DNA upon interactions with AP endonuclease 1 revealed by stopped-flow fluorescence analysis.

    PubMed

    Kanazhevskaya, Lyubov Yu; Koval, Vladimir V; Vorobjev, Yury N; Fedorova, Olga S

    2012-02-14

    Apurinic/apyrimidinic (AP) sites are abundant DNA lesions arising from exposure to UV light, ionizing radiation, alkylating agents, and oxygen radicals. In human cells, AP endonuclease 1 (APE1) recognizes this mutagenic lesion and initiates its repair via a specific incision of the phosphodiester backbone 5' to the AP site. We have investigated the detailed mechanism of APE1 function using fluorescently labeled DNA substrates. A fluorescent adenine analogue, 2-aminopurine, was introduced into DNA substrates adjacent to the abasic site to serve as an on-site reporter of conformational transitions in the DNA during the catalytic cycle. Application of a pre-steady-state stopped-flow technique allows us to observe, in real time, changes in the fluorescence intensity corresponding to different stages of the process. We also detected the intrinsic Trp fluorescence of the enzyme during interactions with 2-aPu-containing substrates. Our data have revealed conformational flexibility of the abasic DNA being processed by APE1. Quantitative analysis of the fluorescence traces has yielded a minimal four-step kinetic scheme and the corresponding rate constants. The results obtained from the stopped-flow data show a substantial influence of the 2-aPu base location on the completion of certain reaction steps. Using detailed molecular dynamics simulations of the DNA substrates, we have attributed structural distortions of AP-DNA to the realization of specific binding, effective locking, and incision of the damaged DNA. These findings allowed us to accurately discern the step that corresponds to insertion of specific APE1 amino acid residues into the abasic DNA void during stabilization of the precatalytic complex.

  14. Statistical comparisons of gravity wave features derived from OH airglow and SABER data

    NASA Astrophysics Data System (ADS)

    Gelinas, L. J.; Hecht, J. H.; Walterscheid, R. L.

    2017-12-01

    The Aerospace Corporation's near-IR camera (ANI), deployed at the Andes Lidar Observatory (ALO), Cerro Pachon, Chile (30S, 70W) since 2010, images the bright OH Meinel (4,2) airglow band. The imager provides detailed observations of gravity waves and instability dynamics, as described by Hecht et al. (2014). The camera employs a wide-angle lens that views a 73 by 73 degree region of the sky, approximately 120 km x 120 km at 85 km altitude. An image cadence of 30 s allows for detailed spectral analysis of the horizontal components of wave features, including the evolution and decay of instability features. The SABER instrument on NASA's TIMED spacecraft provides remote soundings of kinetic temperature profiles from the lower stratosphere to the lower thermosphere. Horizontal and vertical filtering techniques allow SABER temperatures to be analyzed for gravity wave variances [Walterscheid and Christensen, 2016]. Here we compare the statistical characteristics of horizontal wave spectra, derived from airglow imagery, with vertical wave variances derived from SABER temperature profiles. The analysis is performed for a period of strong mountain wave activity over the Andes between June and September 2012. Hecht, J. H., et al. (2014), The life cycle of instability features measured from the Andes Lidar Observatory over Cerro Pachon on March 24, 2012, J. Geophys. Res. Atmos., 119, 8872-8898, doi:10.1002/2014JD021726. Walterscheid, R. L., and A. B. Christensen (2016), Low-latitude gravity wave variances in the mesosphere and lower thermosphere derived from SABER temperature observation and compared with model simulation of waves generated by deep tropical convection, J. Geophys. Res. Atmos., 121, 11,900-11,912, doi:10.1002/2016JD024843.

  15. Detailed Skylab ECS consumables analysis for the interim revision flight plan (November, 1972, SL-1 launch)

    NASA Technical Reports Server (NTRS)

    Wells, C.; Kolkhorst, H. E.

    1971-01-01

    The consumables analysis was performed for the Skylab 2, 3, and 4 Preliminary Reference Interim Revision Flight Plan. The analysis and the results are based on the mission requirements as specified in the flight plan and on other available data. The results indicate that the consumables requirements for the Skylab missions leave the following remaining nominal margins (in percent) of oxygen, nitrogen, and water: 83.5, 90.8, and 88.7 for mission SL-2; 57.1, 64.1, and 67.3 for SL-3; and 30.8, 44.3, and 46.5 for SL-4. Performance of experiment M509 as scheduled in the flight plan results in venting the cluster atmosphere overboard. This is due to the addition of nitrogen for propulsion and to the additional oxygen introduced into the cabin when the experiment is performed with the crewman suited.

  16. Development of a BPM Lock-In Diagnostic System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Dickson

    2003-05-12

    A system has been developed for the acquisition and analysis of high-rate, time-coherent BPM data across Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF). This system will allow the acquisition of Beam Position Monitor (BPM) position and intensity information at a rate in excess of 7 kHz for approximately 200 BPMs in a time-synchronous manner. By inducing minute sinusoidal transverse beam motion in the CEBAF injector, with known phase relative to the synchronized BPM acquisition, it is possible to derive several types of useful information. Analysis of the BPM intensity data, which is proportional to beam current, by beating the signal with an in-phase sinusoidal representation of the transverse kick can localize beam scraping to a particular BPM. Similarly, real-time optics information may be deduced from an analysis of BPM position data. This paper details the frequency lock-in technique applied and presents its status.
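The lock-in principle described here — beating the measured intensity with an in-phase reference sinusoid and averaging, so that only the component at the kick frequency survives — can be sketched as follows. This is a generic illustration, not Jefferson Lab code; the kick frequency, noise level, and signal amplitude are invented for the example, and only the ~7 kHz acquisition rate is taken from the text.

```python
import numpy as np

def lock_in_amplitude(signal, f_ref, fs, phase=0.0):
    """Estimate the amplitude of a known-frequency component by
    multiplying with an in-phase reference and averaging (low-pass)."""
    t = np.arange(len(signal)) / fs
    ref = np.sin(2 * np.pi * f_ref * t + phase)
    return 2.0 * float(np.mean(signal * ref))  # factor 2 restores amplitude

# Synthetic "BPM intensity": a small induced wiggle buried in noise.
fs, f_kick = 7000.0, 45.0                  # ~7 kHz acquisition, invented kick
rng = np.random.default_rng(0)
t = np.arange(70000) / fs                  # 10 s of data
data = (1.0 + 0.02 * np.sin(2 * np.pi * f_kick * t)
            + 0.1 * rng.standard_normal(t.size))
amp = lock_in_amplitude(data, f_kick, fs)  # recovers ≈ 0.02
```

Averaging over an integer number of kick periods rejects both the DC beam-current level and the broadband noise, which is why a 2% modulation is recoverable under 10% noise.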

  17. Spatiotemporal Bayesian analysis of Lyme disease in New York state, 1990-2000.

    PubMed

    Chen, Haiyan; Stratton, Howard H; Caraco, Thomas B; White, Dennis J

    2006-07-01

    Mapping ordinarily increases our understanding of nontrivial spatial and temporal heterogeneities in disease rates. However, the large number of parameters required by the corresponding statistical models often complicates detailed analysis. This study investigates the feasibility of a fully Bayesian hierarchical regression approach to the problem and identifies how it outperforms two more popular methods: crude rate estimates (CRE) and empirical Bayes standardization (EBS). In particular, we apply a fully Bayesian approach to the spatiotemporal analysis of Lyme disease incidence in New York state for the period 1990-2000. These results are compared with those obtained by CRE and EBS in Chen et al. (2005). We show that the fully Bayesian regression model not only gives more reliable estimates of disease rates than the other two approaches but also allows for tractable models that can accommodate more numerous sources of variation and unknown parameters.
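As background on the methods being compared, empirical Bayes standardization stabilizes crude disease rates by shrinking each area's rate toward a pooled estimate, with sparsely populated areas shrunk the most. Below is a minimal sketch of that shrinkage idea only, not the specific model of Chen et al. (2005) or the fully Bayesian hierarchical model; the weight formula, prior strength `k`, and all counts are hypothetical.

```python
def eb_shrink(cases, pops, k=50000.0):
    """Shrink each area's crude rate toward the pooled rate; k is a
    hypothetical prior-strength (pseudo-population), so small areas
    are pulled hardest toward the pooled rate."""
    m = sum(cases) / sum(pops)
    rates = []
    for c, p in zip(cases, pops):
        w = p / (p + k)               # data weight grows with population
        rates.append(w * (c / p) + (1 - w) * m)
    return rates

# Hypothetical county data: case counts and populations.
cases = [2, 40, 400]
pops = [1000, 100000, 1000000]
crude = [c / p for c, p in zip(cases, pops)]
shrunk = eb_shrink(cases, pops)
```

The tiny county's high crude rate (2 cases in 1000 people) is pulled strongly toward the pooled rate, while the large county's estimate barely moves — the instability that CRE maps suffer from and that both EBS and the fully Bayesian model address.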

  18. Waste isolation pilot plant (WIPP) borehole plugging program description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, C.L.; Hunter, T.O.

    1979-08-01

    The tests and experiments described attempt to provide a mix of borehole (with limited access) and in-mine (with relatively unlimited access) environments in which the various issues involved can be assessed. The Bell Canyon Test provides the opportunity to instrument and analyze a plug in a high-pressure region. The Shallow Hole Test permits application of the best plugging techniques and then access to both the top and bottom of the plug for further analysis. The Diagnostic Test Hole permits recovery of bench-scale samples for analysis and establishes an in-borehole laboratory in which to conduct testing and analysis in all strata from the surface into the salt horizon. The additional in-mine experiments provide the opportunity to investigate specific effects on plugs in the salt region in more detail and allow evaluation of instrumentation systems.

  19. Multilingual Sentiment Analysis: State of the Art and Independent Comparison of Techniques.

    PubMed

    Dashtipour, Kia; Poria, Soujanya; Hussain, Amir; Cambria, Erik; Hawalah, Ahmad Y A; Gelbukh, Alexander; Zhou, Qiang

    With the advent of the Internet, people actively express their opinions about products, services, events, political parties, etc., in social media, blogs, and website comments. The amount of research work on sentiment analysis is growing explosively. However, the majority of research efforts are devoted to English-language data, while a great share of information is available in other languages. We present a state-of-the-art review on multilingual sentiment analysis. More importantly, we compare our own implementations of existing approaches on common data. The precision observed in our experiments is typically lower than that reported by the original authors, which we attribute to the lack of detail in the original presentation of those approaches. Thus, we compare the existing works by what they really offer to the reader, including whether they allow for accurate implementation and for reliable reproduction of the reported results.

  20. In vivo RNAi in the Drosophila Follicular Epithelium: Analysis of Stem Cell Maintenance, Proliferation, and Differentiation.

    PubMed

    Riechmann, Veit

    2017-01-01

    In vivo RNAi in Drosophila facilitates simple and rapid analysis of gene functions in a cell- or tissue-specific manner. The versatility of the UAS-GAL4 system allows one to control exactly where and when during development the function of a gene is depleted. The epithelium of the ovary is a particularly good model for studying, in a living animal, how stem cells are maintained and how their descendants proliferate and differentiate. Here I provide basic information about the publicly available reagents for in vivo RNAi, and I describe how the oogenesis system can be applied to analyze stem cells and epithelial development at a histological level. Moreover, I give helpful hints to optimize the use of the UAS-GAL4 system for RNAi induction in the follicular epithelium. Finally, I provide detailed step-by-step protocols for ovary dissection, antibody stainings, and ovary mounting for microscopic analysis.

  1. Unmasking the masked Universe: the 2M++ catalogue through Bayesian eyes

    NASA Astrophysics Data System (ADS)

    Lavaux, Guilhem; Jasche, Jens

    2016-01-01

    This work describes a full Bayesian analysis of the Nearby Universe as traced by galaxies of the 2M++ survey. The analysis is run in two sequential steps. The first step self-consistently derives the luminosity-dependent galaxy biases, the power spectrum of matter fluctuations and the matter density fields within a Gaussian statistics approximation. The second step makes a detailed analysis of the three-dimensional large-scale structures, assuming a fixed bias model and a fixed cosmology, and allows for the reconstruction of both the final density field and the initial conditions at z = 1000. From these, we derive fields that self-consistently extrapolate the observed large-scale structures. We give two examples of these extrapolations and their utility for the detection of structures: the visibility of the Sloan Great Wall, and the detection and characterization of the Local Void using DIVA, a Lagrangian-based technique to classify structures.

  2. Technical Manual for the SAM Physical Trough Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Laboratories and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. This model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  3. Developing Automated Spectral Analysis Tools for Interstellar Features Extraction to Support Construction of the 3D ISM Map

    NASA Astrophysics Data System (ADS)

    Puspitarini, L.; Lallement, R.; Monreal-Ibero, A.; Chen, H.-C.; Malasan, H. L.; Aprilia; Arifyanto, M. I.; Irfan, M.

    2018-04-01

    One of the ways to obtain a detailed 3D ISM map is by gathering interstellar (IS) absorption data toward widely distributed background target stars at known distances (line-of-sight/LOS data). The radial and angular evolution of the LOS measurements allows the inference of the ISM spatial distribution. For better spatial resolution, one needs a large number of LOS data, which requires building fast tools to measure IS absorption. One such tool is a global analysis that fits two different diffuse interstellar bands (DIBs) simultaneously. We derived the equivalent width (EW) ratio of the two DIBs recorded in each spectrum of the target stars. The variability of this ratio can be used to study IS environmental conditions or to detect DIB families.
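The equivalent width underlying the EW ratio is EW = ∫(1 − F/F_c) dλ, the wavelength-integrated fractional absorption below the continuum F_c. A minimal sketch on a synthetic spectrum follows; the band centers, depths and widths are illustrative (loosely modeled on the well-known λ5780/λ5797 DIB pair) and are not the authors' data or pipeline.

```python
import numpy as np

def equivalent_width(wl, flux, continuum=1.0):
    """EW = integral of (1 - F/Fc) over wavelength, evaluated as a
    simple Riemann sum on a uniformly spaced wavelength grid."""
    return float(np.sum(1.0 - flux / continuum) * (wl[1] - wl[0]))

# Synthetic spectrum: two Gaussian absorption bands on a flat continuum.
wl = np.linspace(5770.0, 5810.0, 4001)      # angstroms, illustrative grid

def band(center, depth, sigma):
    return depth * np.exp(-0.5 * ((wl - center) / sigma) ** 2)

flux = 1.0 - band(5780.0, 0.10, 1.0) - band(5797.0, 0.05, 0.8)
left, right = wl < 5790.0, wl >= 5790.0     # split between the two bands
ratio = equivalent_width(wl[left], flux[left]) / \
        equivalent_width(wl[right], flux[right])
```

For a Gaussian band of depth d and width σ the analytic EW is d·σ·√(2π), so this example's ratio is (0.10·1.0)/(0.05·0.8) = 2.5, which the numerical sum reproduces.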

  4. Mathematical Analysis and Optimization of Infiltration Processes

    NASA Technical Reports Server (NTRS)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  5. A modified version of fluctuating asymmetry, potential for the analysis of Aesculus hippocastanum L. compound leaves.

    PubMed

    Velickovic, Miroslava

    2008-01-01

    My research interest was to create a new, simple and tractable mathematical framework for analyzing fluctuating asymmetry (FA) in Aesculus hippocastanum L. palmately compound leaves (each compound leaf with 7 obovate, serrate leaflets). FA, being random differences in the development of both sides of a bilaterally symmetrical character, has been proposed as an indicator of environmental and genetic stress. In the present paper the well-established Palmer's procedure for FA has been modified to improve the suitability of the chosen index (FA1) for use in compound leaf asymmetry analysis. The processing steps are described in detail, allowing these modifications to be applied to Palmer's other FA indices as well as to the compound leaves of other plant species.
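Palmer's FA1 index referenced above is simply the mean absolute right-minus-left difference of a bilateral trait. A minimal sketch of the unmodified index, assuming one paired measurement per leaf; the leaflet widths are hypothetical, and the paper's compound-leaf modification is not reproduced here.

```python
def fa1(right, left):
    """Palmer's FA1 index: mean absolute (right - left) difference."""
    assert len(right) == len(left)
    return sum(abs(r - l) for r, l in zip(right, left)) / len(right)

# Hypothetical leaflet widths (mm) measured on the right and left sides
# of the midrib, one pair per compound leaf.
right_mm = [12.1, 11.8, 13.0, 12.4]
left_mm = [11.9, 12.2, 12.6, 12.4]
index = fa1(right_mm, left_mm)   # (0.2 + 0.4 + 0.4 + 0.0) / 4 = 0.25
```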

  6. Sunrise/sunset thermal shock disturbance analysis and simulation for the TOPEX satellite

    NASA Technical Reports Server (NTRS)

    Dennehy, C. J.; Welch, R. V.; Zimbelman, D. F.

    1990-01-01

    It is shown here that during normal on-orbit operations the TOPEX low-earth orbiting satellite is subjected to an impulsive disturbance torque caused by rapid heating of its solar array when entering and exiting the earth's shadow. Error budgets and simulation results are used to demonstrate that this sunrise/sunset torque disturbance is the dominant Normal Mission Mode (NMM) attitude error source. The detailed thermomechanical modeling, analysis, and simulation of this torque is described, and the predicted on-orbit performance of the NMM attitude control system in the face of the sunrise/sunset disturbance is presented. The disturbance results in temporary attitude perturbations that exceed NMM pointing requirements. However, they are below the maximum allowable pointing error which would cause the radar altimeter to break lock.

  7. Galactic Astronomy in the Ultraviolet

    NASA Astrophysics Data System (ADS)

    Rastorguev, A. S.; Sachkov, M. E.; Zabolotskikh, M. V.

    2017-12-01

    We propose a number of prospective observational programs for the ultraviolet space observatory WSO-UV, which seem to be of great importance to modern galactic astronomy. The programs include the search for binary Cepheids; the search and detailed photometric study and the analysis of radial distribution of UV-bright stars in globular clusters ("blue stragglers", blue horizontal-branch stars, RR Lyrae variables, white dwarfs, and stars with UV excesses); the investigation of stellar content and kinematics of young open clusters and associations; the study of spectral energy distribution in hot stars, including calculation of the extinction curves in the UV, optical and NIR; and accurate definition of the relations between the UV-colors and effective temperature. The high angular resolution of the observatory allows accurate astrometric measurements of stellar proper motions and their kinematic analysis.

  8. PSIDD (2): A Prototype Post-Scan Interactive Data Display System for Detailed Analysis of Ultrasonic Scans

    NASA Technical Reports Server (NTRS)

    Cao, Wei; Roth, Don J.

    1997-01-01

    This article presents the description of PSIDD(2), a post-scan interactive data display system for ultrasonic contact scan and single measurement analysis. PSIDD(2) was developed in conjunction with ASTM standards for ultrasonic velocity and attenuation coefficient contact measurements. This system has been upgraded from its original version PSIDD(1) and improvements are described in this article. PSIDD(2) implements a comparison mode where the display of time domain waveforms and ultrasonic properties versus frequency can be shown for up to five scan points on one plot. This allows the rapid contrasting of sample areas exhibiting different ultrasonic properties as initially indicated by the ultrasonic contact scan image. This improvement plus additional features to be described in the article greatly facilitate material microstructural appraisal.

  9. PDCO: Polarizational-directional correlation from oriented nuclei

    NASA Astrophysics Data System (ADS)

    Droste, Ch.; Rohoziński, S. G.; Starosta, K.; Morek, T.; Srebrny, J.; Magierski, P.

    1996-02-01

    A general formula is given for the correlation between two polarized gamma rays (γ1 and γ2) emitted in a cascade from an oriented nucleus (oriented, for example, by a heavy-ion reaction). It allows one to calculate the angular correlation between: (a) the linear polarizations of γ1 and γ2; (b) the polarization of γ1 and the direction of γ2, or vice versa; (c) the directions of γ1 and γ2 (DCO). The formula, discussed in detail for case (b), can be used in the analysis of data coming from modern multidetector gamma-ray spectrometers that contain new-generation detectors (e.g. CLOVER) sensitive to polarization. The analysis of polarization together with the DCO ratio can lead to a unique spin/parity assignment and a mixing ratio determination.

  10. Forget about data, deliver results

    NASA Astrophysics Data System (ADS)

    Walter, Roland

    2015-12-01

    High-energy astrophysics space missions have pioneered and demonstrated the power of legacy data sets for generating new discoveries, especially when analysed in ways the original researchers could not have anticipated. The only way to ensure that the data of present observatories can be effectively used in the future is to allow users to perform on-the-fly data analysis that straightforwardly produces scientific results for any sky position, time interval and energy interval, without requiring mission-specific software or detailed instrumental knowledge. Providing a straightforward interface to complex data and data analysis makes the data, and the process of generating science results, available to the public and to higher education, and promotes the visibility of the investment in science to society. This is a fundamental step in transmitting the values of science and in evolving towards a knowledge society.

  11. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O³) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O³ tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O³ tool. Instructions for preparing input data for the O³ tool are detailed in this user's manual.

  12. Polar ring galaxies in the Galaxy Zoo

    NASA Astrophysics Data System (ADS)

    Finkelman, Ido; Funes, José G.; Brosch, Noah

    2012-05-01

    We report observations of 16 candidate polar-ring galaxies (PRGs) identified by the Galaxy Zoo project in the Sloan Digital Sky Survey (SDSS) data base. Deep images of five galaxies are available in the SDSS Stripe82 data base, while to reach similar depth we observed the remaining galaxies with the 1.8-m Vatican Advanced Technology Telescope. We derive integrated magnitudes and u-r colours for the host and ring components and show continuum-subtracted Hα+[N II] images for seven objects. We present a basic morphological and environmental analysis of the galaxies and discuss their properties in comparison with other types of early-type galaxies. Follow-up photometric and spectroscopic observations will allow a kinematic confirmation of the nature of these systems and a more detailed analysis of their stellar populations.

  13. Measurement of Galactic Logarithmic Spiral Arm Pitch Angle Using Two-dimensional Fast Fourier Transform Decomposition

    NASA Astrophysics Data System (ADS)

    Davis, Benjamin L.; Berrier, Joel C.; Shields, Douglas W.; Kennefick, Julia; Kennefick, Daniel; Seigar, Marc S.; Lacy, Claud H. S.; Puerari, Ivânio

    2012-04-01

    A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing two-dimensional fast Fourier transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow comparison of spiral galaxy pitch angle to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques.
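For intuition on what the pitch angle measures: a logarithmic spiral satisfies r = r₀·exp(θ·tan φ), so ln r is linear in θ with slope tan φ. The sketch below recovers φ from that slope on synthetic arm coordinates. This is a deliberately simplified stand-in, assuming the arm is already traced as (θ, r) points; the authors' two-dimensional FFT method extracts the same quantity directly from galaxy images, which this fit does not reproduce.

```python
import math

def pitch_angle_deg(thetas, radii):
    """For r = r0 * exp(theta * tan(phi)), ln(r) is linear in theta with
    slope tan(phi); fit the slope by least squares and invert."""
    n = len(thetas)
    ys = [math.log(r) for r in radii]
    mt = sum(thetas) / n
    my = sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(thetas, ys))
             / sum((t - mt) ** 2 for t in thetas))
    return math.degrees(math.atan(slope))

# Synthetic arm traced with a 15-degree pitch angle.
phi = math.radians(15.0)
thetas = [0.05 * i for i in range(200)]
radii = [2.0 * math.exp(t * math.tan(phi)) for t in thetas]
estimate = pitch_angle_deg(thetas, radii)   # recovers ≈ 15.0 degrees
```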

  14. Missing in Amazonian jungle: a case report of skeletal evidence for dismemberment.

    PubMed

    Delabarde, Tania; Ludes, Bertrand

    2010-07-01

    This case study presents the results of the recovery and analysis of three sets of disarticulated and incomplete human remains found in Ecuador, within the Amazonian jungle. The recovered body parts sustained extensive sharp-force trauma on different aspects of the skeleton. The anthropological examination (bone reassembly, biological profile) was followed by a detailed analysis of cut marks, including a basic experimental study on pig bones to demonstrate that dismemberment may have occurred within a certain amount of time after death. Despite the location (deep in the Amazonian jungle) and the perpetrator's actions (dismemberment and dispersion of body parts in a river), forensic work both in the field and in the laboratory allowed the identification of the victims and the reconstruction of the sequence of events.

  15. A topological multilayer model of the human body.

    PubMed

    Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João

    2015-11-04

    Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information. Their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that allows an interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique for internal anatomical structures. The expanded identification capability and a new neighbourhood analysis function are the tools provided by this model.

  16. Fluorescence recovery after photo-bleaching as a method to determine local diffusion coefficient in the stratum corneum.

    PubMed

    Anissimov, Yuri G; Zhao, Xin; Roberts, Michael S; Zvyagin, Andrei V

    2012-10-01

Fluorescence recovery after photo-bleaching experiments were performed in human stratum corneum in vitro. Fluorescence multiphoton tomography was used, which allowed the dimensions of the photobleached volume to be at the micron scale and located fully within the lipid phase of the stratum corneum. Analysis of the fluorescence recovery data with simplified mathematical models yielded a diffusion coefficient for the small-molecular-weight organic fluorescent dye Rhodamine B in the stratum corneum lipid phase of about (3-6) × 10^-9 cm^2 s^-1. It was concluded that the presented method can be used for detailed analysis of localised diffusion coefficients in the stratum corneum phases for various fluorescent probes. Copyright © 2012 Elsevier B.V. All rights reserved.
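The abstract's "simplified mathematical models" are not specified; a common back-of-the-envelope relation for a circular bleach spot is the half-time approximation D ≈ 0.224 w²/t½. A minimal sketch assuming that approximation (the function name and example numbers are illustrative, not from the paper):

```python
# Estimate a diffusion coefficient from FRAP recovery data using the
# classic half-time approximation for a circular bleach spot of radius w:
# D ~ 0.224 * w^2 / t_half (an assumption; the paper's own models may differ).

def frap_diffusion_coefficient(spot_radius_cm, t_half_s):
    """Return D in cm^2/s from bleach-spot radius (cm) and recovery half-time (s)."""
    return 0.224 * spot_radius_cm ** 2 / t_half_s

# Example: a 1 micron (1e-4 cm) spot recovering with t_half = 5 s
D = frap_diffusion_coefficient(1e-4, 5.0)
print(f"D = {D:.2e} cm^2/s")  # on the order of 10^-10 cm^2/s
```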

  17. Detection and Analysis of Circular RNAs by RT-PCR.

    PubMed

    Panda, Amaresh C; Gorospe, Myriam

    2018-03-20

Gene expression in eukaryotic cells is tightly regulated at the transcriptional and posttranscriptional levels. Posttranscriptional processes, including pre-mRNA splicing, mRNA export, mRNA turnover, and mRNA translation, are controlled by RNA-binding proteins (RBPs) and noncoding (nc)RNAs. The vast family of ncRNAs comprises diverse regulatory RNAs, such as microRNAs and long noncoding (lnc)RNAs, but also the poorly explored class of circular (circ)RNAs. Although first discovered more than three decades ago by electron microscopy, only the advent of high-throughput RNA-sequencing (RNA-seq) and the development of innovative bioinformatic pipelines have begun to allow the systematic identification of circRNAs (Szabo and Salzman, 2016; Panda et al., 2017b; Panda et al., 2017c). However, the validation of true circRNAs identified by RNA sequencing requires other molecular biology techniques including reverse transcription (RT) followed by conventional or quantitative (q) polymerase chain reaction (PCR), and Northern blot analysis (Jeck and Sharpless, 2014). RT-qPCR analysis of circular RNAs using divergent primers has been widely used for the detection, validation, and sometimes quantification of circRNAs (Abdelmohsen et al., 2015 and 2017; Panda et al., 2017b). As detailed here, divergent primers designed to span the circRNA backsplice junction sequence can specifically amplify the circRNAs and not the counterpart linear RNA. In sum, RT-PCR analysis using divergent primers allows direct detection and quantification of circRNAs.

  18. Guidance simulation and test support for differential GPS flight experiment

    NASA Technical Reports Server (NTRS)

    Geier, G. J.; Loomis, P. V. W.; Cabak, A.

    1987-01-01

Three separate tasks that supported the test preparation, test operations, and post-test analysis of the NASA Ames flight test evaluation of the differential Global Positioning System (GPS) are presented. Task 1 consisted of the design, coding, and testing of a navigation filter to make optimal use of GPS in differential mode. The filter can be configured to accept inputs from external sensors such as an accelerometer and a barometric or radar altimeter. The filter runs in real time onboard a NASA helicopter and processes raw pseudorange and delta-range measurements from a single-channel sequential GPS receiver. The Kalman filter software interfaces are described in detail, followed by a description of the filter algorithm, including the basic propagation and measurement update equations. The performance during flight tests is reviewed and discussed. Task 2 describes a refinement of the lateral and vertical steering algorithms developed under a previous contract. The refinements include modification of the internal logic to allow more diverse in-flight initialization procedures, further data smoothing, and compensation for system-induced time delays. Task 3 describes the TAU Corp participation in the analysis of the real-time Kalman navigation filter. Its performance was compared to that of the Z-set filter in flight and to laser tracker position data during post-test analysis. This analysis allowed a more nearly optimal selection of the filter parameters.
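The propagation and measurement update equations mentioned for Task 1 follow the standard Kalman predict/update cycle. A minimal scalar sketch, assuming a random-walk state model (a generic textbook illustration, not the flight filter's actual state vector or tuning):

```python
# Minimal scalar Kalman filter: one propagation (time update) followed by one
# measurement update. State model is an assumed random walk for illustration.

def kalman_step(x, P, z, Q, R):
    """One predict/update cycle.
    x, P: prior state estimate and its variance
    z:    new measurement;  Q, R: process and measurement noise variances
    """
    # Propagation: for a random walk the state estimate is unchanged, variance grows
    x_pred = x
    P_pred = P + Q
    # Measurement update: blend prediction and measurement via the Kalman gain
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Example: noisy measurements of a roughly constant value near 10.0
x, P = 0.0, 1.0
for z in [9.8, 10.2, 10.1, 9.9]:
    x, P = kalman_step(x, P, z, Q=0.01, R=0.5)
print(round(x, 2))  # estimate converges toward 10
```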

  19. The Role of Faulting on the Growth of a Carbonate Platform: Evidence from 3D Seismic Analysis and Section Restoration

    NASA Astrophysics Data System (ADS)

    Nur Fathiyah Jamaludin, Siti; Pubellier, Manuel; Prasad Ghosh, Deva; Menier, David; Pierson, Bernard

    2014-05-01

Tectonics, in addition to other environmental factors, impacts the growth of carbonate platforms and plays an important role in shaping their internal architecture. The development and healing of faults and fractures in carbonate environments have not been explored sufficiently. Using 3D seismic and well data, we attempt to reconstruct the structural evolution of a Miocene carbonate platform in the Central Luconia Province, offshore Malaysia. Luconia Province is located off the NW coast of Borneo and has become one of the largest carbonate factories in SE Asia. Seismic interpretations, including seismic attribute analysis, are applied to the carbonate platform to discern its sedimentological and structural details. Detailed seismic interpretations highlight the relationships of carbonate deposition with syn-depositional faulting. Branching conjugate faults are common in this carbonate platform and have become a template for reef growth, attesting to lateral facies changes within the carbonate environments. Structural restoration was then performed on the interpreted seismic sections using sequential restoration techniques, and provided images different from those of horizon-flattening methods. This permits us to compensate for fault displacements, remove recent sediment layers, and finally restore the older rock units to their state prior to fault motion. It allows prediction of platform evolution as a response to faulting before and after carbonate deposition, and also exposes pitfalls of interpretation. Once updated, the reconstructions allow unravelling of unseen geological features underneath the carbonate platform, such as paleo-structures and paleo-topography, which in turn reflect the paleo-environment before deformation took place. Interestingly, section balancing and restoration revealed the late-phase (Late Oligocene-Early Miocene) rifting of the South China Sea, otherwise difficult to visualize on seismic sections. They also show that this carbonate platform possibly originated from two or more connected reef build-ups. The platform evolution, in terms of tectonic influences on carbonate growth and development, may serve as a case example for re-evaluating pre-Late Miocene structures as a new potential target for hydrocarbon exploration in Central Luconia Province. The techniques used in this study might also be of interest to oil and gas explorers in carbonate systems.

  20. Modelling the nonlinear behaviour of an underplatform damper test rig for turbine applications

    NASA Astrophysics Data System (ADS)

    Pesaresi, L.; Salles, L.; Jones, A.; Green, J. S.; Schwingshackl, C. W.

    2017-02-01

Underplatform dampers (UPDs) are commonly used in aircraft engines to mitigate the risk of high-cycle fatigue failure of turbine blades. The energy dissipated at the friction contact interface of the damper reduces the vibration amplitude significantly, and the coupling of the blades can also lead to significant shifts of the resonance frequencies of the bladed disk. The highly nonlinear behaviour of bladed discs constrained by UPDs requires an advanced modelling approach to ensure that the correct damper geometry is selected during the design of the turbine, and that no unexpected resonance frequencies and amplitudes occur in operation. Approaches based on an explicit model of the damper in combination with multi-harmonic balance solvers have emerged as a promising way to predict the nonlinear behaviour of UPDs correctly; however, rigorous experimental validation is required before approaches of this type can be used with confidence. In this study, a nonlinear analysis based on an updated explicit damper model with different levels of detail is performed, and the results are evaluated against a newly developed UPD test rig. Detailed linear finite element models are used as input for the nonlinear analysis, allowing the inclusion of damper flexibility and inertia effects. The nonlinear friction interface between the blades and the damper is described with a dense grid of 3D friction contact elements, which allows accurate capture of the underlying nonlinear mechanism that drives the global nonlinear behaviour. The explicit damper model showed a strong dependence on the contact pressure distribution; using an accurate, measurement-based distribution better matched the nonlinear dynamic behaviour of the test rig. Good agreement with the measured frequency response data could only be reached when the zero-harmonic (constant) term was included in the multi-harmonic expansion of the nonlinear problem, highlighting its importance when the contact interface experiences large normal load variation. The resulting numerical damper kinematics, with strong translational and rotational motion, and the global blade frequency response were fully validated experimentally, showing the accuracy of the suggested highly detailed explicit UPD modelling approach.

  1. Strong anticipation and long-range cross-correlation: Application of detrended cross-correlation analysis to human behavioral data

    NASA Astrophysics Data System (ADS)

    Delignières, Didier; Marmelat, Vivien

    2014-01-01

In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., corrections, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. The results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, the series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results depend strongly on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
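DCCA, as applied here, integrates both series into profiles, detrends each profile box by box, and averages the detrended covariance. A pure-Python sketch under common simplifying assumptions (non-overlapping boxes, linear detrending; function names are illustrative, not the authors' code):

```python
# Sketch of detrended cross-correlation analysis (DCCA) with non-overlapping
# boxes and linear detrending (an assumed, simplified variant of the method).
import random

def _linfit_residuals(series):
    """Least-squares linear detrend of a list; returns the residuals."""
    n = len(series)
    mx = (n - 1) / 2.0
    my = sum(series) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    sxy = sum((i - mx) * (v - my) for i, v in enumerate(series))
    slope = sxy / sxx
    return [v - (my + slope * (i - mx)) for i, v in enumerate(series)]

def dcca_fluctuation(x, y, box_size):
    """F_DCCA(n): detrended covariance of the integrated profiles of x and y."""
    mean_x = sum(x) / len(x)
    mean_y = sum(y) / len(y)
    # Integrated (cumulative, mean-subtracted) profiles
    X, Y, sx, sy = [], [], 0.0, 0.0
    for xi, yi in zip(x, y):
        sx += xi - mean_x
        sy += yi - mean_y
        X.append(sx)
        Y.append(sy)
    # Detrended covariance averaged over non-overlapping boxes
    f2 = []
    for start in range(0, len(X) - box_size + 1, box_size):
        rx = _linfit_residuals(X[start:start + box_size])
        ry = _linfit_residuals(Y[start:start + box_size])
        f2.append(sum(a * b for a, b in zip(rx, ry)) / box_size)
    return (sum(f2) / len(f2)) ** 0.5

# Two noisy copies of the same random walk are strongly cross-correlated
random.seed(1)
steps = [random.gauss(0, 1) for _ in range(512)]
a = [s + random.gauss(0, 0.1) for s in steps]
b = [s + random.gauss(0, 0.1) for s in steps]
print(dcca_fluctuation(a, b, 16) > 0)  # True
```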

  2. Merging Dietary Assessment with the Adolescent Lifestyle

    PubMed Central

    Schap, TusaRebecca E; Zhu, Fengqing M; Delp, Edward J; Boushey, Carol J

    2013-01-01

The use of image-based dietary assessment methods shows promise for improving dietary self-report among children. The Technology Assisted Dietary Assessment (TADA) food record application is a self-administered food record specifically designed to address the burden and human error associated with conventional methods of dietary assessment. Users take images of foods and beverages at all eating occasions using a mobile telephone or mobile device with an integrated camera (e.g., Apple iPhone, Google Nexus One, Apple iPod Touch). Once taken, the images are transferred to a back-end server for automated analysis. The first step in this process, image analysis (i.e., segmentation, feature extraction, and classification), allows for automated food identification. Portion size estimation is also automated via segmentation and geometric shape template modeling. The results of the automated food identification and volume estimation can be indexed with the Food and Nutrient Database for Dietary Studies (FNDDS) to provide a detailed diet analysis for use in epidemiologic or intervention studies. Data collected during controlled feeding studies in a camp-like setting have allowed for formative evaluation and validation of the TADA food record application. This review summarizes the system design and the evidence-based development of image-based methods for dietary assessment among children. PMID:23489518

  3. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries, rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance-voltage, and transition-voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in the literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter: the tunneling barrier height, or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and the energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.
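TVS extracts the transition voltage as the bias at the minimum of the Fowler-Nordheim plot, ln(I/V²) versus 1/V. A sketch using a synthetic I-V curve chosen so the minimum falls at a known bias (a toy model, not the authors' analysis pipeline):

```python
# Transition voltage spectroscopy: find the bias V minimizing ln(I/V^2).
# The synthetic curve I = a*V*exp(alpha*V) has its Fowler-Nordheim minimum
# analytically at V = 1/alpha (toy model for illustration only).
import math

def transition_voltage(voltages, currents):
    """Return the bias at which ln(I/V^2) is minimal (positive-bias branch)."""
    best_v, best_val = None, float("inf")
    for v, i in zip(voltages, currents):
        if v <= 0 or i <= 0:
            continue  # TVS is evaluated on one bias polarity
        val = math.log(i / v ** 2)
        if val < best_val:
            best_v, best_val = v, val
    return best_v

a_cond, alpha = 1e-8, 1.25        # toy parameters; expected minimum at 1/alpha = 0.8 V
vs = [0.01 * k for k in range(1, 201)]            # 0.01 .. 2.0 V
cs = [a_cond * v * math.exp(alpha * v) for v in vs]
print(round(transition_voltage(vs, cs), 2))  # 0.8
```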

  4. Computational and Statistical Analyses of Amino Acid Usage and Physico-Chemical Properties of the Twelve Late Embryogenesis Abundant Protein Classes

    PubMed Central

    Jaspard, Emmanuel; Macherel, David; Hunault, Gilles

    2012-01-01

Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure-function relationships because of the scarcity of 3-D structures for LEAPs. The previous building of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and amino acid usage by LEAPs have been computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which also differ in multiple structural and physico-chemical features. Although most LEAPs can be predicted as intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure-function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, this will allow development of software to predict proteins as LEAPs. PMID:22615859

  5. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    NASA Astrophysics Data System (ADS)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and the access tools for using it. Based on an analysis of requirements, we selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow it to be used in a broad range of applications, but they increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently described only in textual form; conformance in terms of semantics thus cannot be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model, and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.

  6. Sensory properties of Californian and imported extra virgin olive oils.

    PubMed

    Delgado, Claudia; Guinard, Jean-Xavier

    2011-04-01

Production and consumption of extra virgin olive oil have been increasing in the United States, particularly in California. The objective of this study was to compare the sensory characteristics of 22 extra virgin olive oils (EVOO) from California, Italy, Spain, Chile, and Australia using generic descriptive analysis. A total of 22 sensory attributes were identified and defined by the descriptive panel. With the exception of thick and citrus, all sensory attributes differed significantly among the oils. Canonical Variate Analysis (CVA) showed that California oils differed from some imported EVOOs, mainly by their absence of defects. A second analysis, of only those attributes included in the International Olive Council (IOC) official scorecard, provided a less detailed description of the samples and did not allow for a full characterization of the oils. While the IOC attributes allowed for faster classification in terms of clean versus defective EVOOs, the more comprehensive descriptive analysis provided both more information and a more refined classification of the samples. Variety and region of origin were important factors in the classification of both Californian and imported EVOOs. Measuring olive oil sensory quality using the IOC method (positive attributes of fruitiness, bitterness, and pungency, and defects including fusty, musty, winey, and rancid) allows for the certification of oils as extra virgin, but it provides limited information on the sensory characteristics of the oils. A full descriptive profile, on the other hand, provides information that can be used by producers in the processing and marketing of their oils, and is a useful tool in the education of consumers about the wide range of (positive) sensory attributes in EVOO and the various sensory styles of EVOO.

  7. Beam Spin Asymmetry Measurements for Two Pion Photoproduction at CLAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark D.

    2015-09-01

The overarching goal of this analysis, and many like it, is to develop our understanding of the strong force interactions within the nucleon by examining the nature of their excitation spectra. As the resonances of these spectra have very short lifetimes (τ ≈ 1 × 10^-23 s) and often have very similar masses, it is often impossible to directly observe resonances in the excitation spectra of nucleons. Polarization observables allow us to study the resonances by looking at how they affect the spin state of final-state particles. The beam asymmetry is a polarization observable that allows us to detect the sensitivity of these resonances, and other transition mechanisms, to the electric vector orientation of incident photons. Presented in this thesis are first measurements of the beam asymmetries in the resonant region for the reaction channel γp → pπ+π-, focusing on the intermediate mesonic states ρ0 and f0 and on the final-state pions. The analysis used data from the g8b experiment undertaken at the Thomas Jefferson National Accelerator Facility (JLab), the first experiment at JLab to use a linearly polarized photon beam. Using the coherent Bremsstrahlung facility and the CLAS detector of Hall B at JLab allowed many multi-channel reactions to be detected and enabled the first measurements of many polarization observables, including those presented here. A brief overview of the theoretical framework used to undertake this analysis is given, followed by a description of the experimental details of the facilities used, then a description of the calibration of the Bremsstrahlung tagging facility which the author undertook, and finally the analysis and the resulting measurements.

  8. Approximate Analysis for Interlaminar Stresses in Composite Structures with Thickness Discontinuities

    NASA Technical Reports Server (NTRS)

    Rose, Cheryl A.; Starnes, James H., Jr.

    1996-01-01

    An efficient, approximate analysis for calculating complete three-dimensional stress fields near regions of geometric discontinuities in laminated composite structures is presented. An approximate three-dimensional local analysis is used to determine the detailed local response due to far-field stresses obtained from a global two-dimensional analysis. The stress results from the global analysis are used as traction boundary conditions for the local analysis. A generalized plane deformation assumption is made in the local analysis to reduce the solution domain to two dimensions. This assumption allows out-of-plane deformation to occur. The local analysis is based on the principle of minimum complementary energy and uses statically admissible stress functions that have an assumed through-the-thickness distribution. Examples are presented to illustrate the accuracy and computational efficiency of the local analysis. Comparisons of the results of the present local analysis with the corresponding results obtained from a finite element analysis and from an elasticity solution are presented. These results indicate that the present local analysis predicts the stress field accurately. Computer execution-times are also presented. The demonstrated accuracy and computational efficiency of the analysis make it well suited for parametric and design studies.

  9. Microanalysis of dental caries using laser-scanned fluorescence

    NASA Astrophysics Data System (ADS)

    Barron, Joseph R.; Paton, Barry E.; Zakariasen, Kenneth L.

    1992-06-01

It is well known that enamel and dentin fluoresce when illuminated by short-wavelength optical radiation. Fluorescence emission from carious and non-carious regions of teeth has been studied using a new experimental scanning technique for fluorescence analysis of dental sections. Scanning in two dimensions allows surface maps of dental caries to be created. These surface images are then enhanced using conventional and newer image processing techniques. Carious regions can be readily identified, and contour maps can be used to graphically display the degree of damage on both surfaces and transverse sections. Numerous studies have shown that fluorescence from carious regions differs significantly from that of non-carious regions. The scanning laser fluorescence spectrometer focuses light from a 25 mW He-Cd laser at 442 nm through an objective lens onto a cross-sectional area as small as 3 micrometers in diameter. Microtome-prepared dental samples 100 micrometers thick are laid flat on an optical bench perpendicular to the incident beam. The sample is moved under computer control in X and Y with an absolute precision of 0.1 micrometers. The backscattered light is both spatially and wavelength filtered before being measured with a long-wavelength-sensitized photomultiplier tube. High-precision analysis of dental samples allows detailed maps of carious regions to be determined. Successive images allow time studies of caries growth and even the potential for remineralization studies of decalcified regions.

  10. Research in Theoretical High Energy Nuclear Physics at the University of Arizona

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rafelski, Johann

In the past decade (2004-2015) we addressed the quest to understand how quark confinement works, how it can be dissolved in a limited space-time domain, and what this means: i) for the paradigm of the present-day laws of physics; and ii) for our understanding of cosmology. The focus of our in-laboratory matter-formation work has been the understanding of the less frequently produced hadronic particles (e.g., strange antibaryons, charmed and beauty hadrons, massive resonances, charmonium, B_c). We have developed a public analysis tool, SHARE (Statistical HAdronization with REsonances), which allows a precise model description of experimental particle yield and fluctuation data. We have developed a charm recombination model to allow for off-equilibrium rates of charmonium production. We have developed methods and techniques which allowed us to study the evolution of hadron resonance yields by kinetic theory. We explored entropy, strangeness, and charm as signatures of QGP across the wide range of reaction energies of the AGS, SPS, RHIC, and LHC. In our analysis of experimental data, we obtained both statistical parameters and physical properties of the hadron source. The following pages list our primary writings on these questions. The abstracts are included in lieu of a more detailed discussion of our research accomplishments in each of the publications.

  11. Microscope-Integrated Optical Coherence Tomography Angiography in the Operating Room in Young Children With Retinal Vascular Disease.

    PubMed

    Chen, Xi; Viehland, Christian; Carrasco-Zevallos, Oscar M; Keller, Brenton; Vajzovic, Lejla; Izatt, Joseph A; Toth, Cynthia A

    2017-05-01

Intraoperative optical coherence tomography (OCT) has gained traction as an important adjunct for clinical decision making during vitreoretinal surgery, and OCT angiography (OCTA) has provided novel insights in the clinical evaluation of retinal diseases. To date, these two technologies have not been applied in combination to evaluate retinal vascular disease in the operating suite. The objective was to conduct microscope-integrated, swept-source OCTA (MIOCTA) in children with retinal vascular disease. In this case report analysis of OCT imaging in pediatric patients, MIOCTA images were obtained during examination under anesthesia from a young boy with a history of idiopathic vitreous hemorrhage and from a female infant with familial exudative vitreoretinopathy, with side-by-side comparison of research MIOCT angiograms and clinically indicated fluorescein angiograms. In these 2 young children with retinal vascular disease, the MIOCTA images showed more detailed vascular patterns than were visible on the fluorescein angiograms, although within a more posterior field of view. The MIOCTA system allowed visualization of small pathological retinal vessels in the retinal periphery that were obscured in the fluorescein angiograms by fluorescein staining from underlying, preexisting laser scars. This is the first report to date of the use of MIOCTA in the operating room for young children with retinal vascular disease. Further optimization of this system may allow noninvasive, detailed evaluation of the retinal vasculature during surgical procedures and in patients who cannot cooperate with in-office examinations.

  12. Isolation, electron microscopic imaging, and 3-D visualization of native cardiac thin myofilaments.

    PubMed

    Spiess, M; Steinmetz, M O; Mandinova, A; Wolpensinger, B; Aebi, U; Atar, D

    1999-06-15

    An increasing number of cardiac diseases are currently pinpointed to reside at the level of the thin myofilaments (e.g., cardiomyopathies, reperfusion injury). Hence the aim of our study was to develop a new method for the isolation of mammalian thin myofilaments suitable for subsequent high-resolution electron microscopic imaging. Native cardiac thin myofilaments were extracted from glycerinated porcine myocardial tissue in the presence of protease inhibitors. Separation of thick and thin myofilaments was achieved by addition of ATP and several centrifugation steps. Negative staining and subsequent conventional and scanning transmission electron microscopy (STEM) of thin myofilaments permitted visualization of molecular details; unlike conventional preparations of thin myofilaments, our method reveals the F-actin moiety and allows direct recognition of thin myofilament-associated porcine cardiac troponin complexes. They appear as "bulges" at regular intervals of approximately 36 nm along the actin filaments. Protein analysis using SDS-polyacrylamide gel electrophoresis revealed that only approximately 20% troponin I was lost during the isolation procedure. In a further step, 3-D helical reconstructions were calculated using STEM dark-field images. These 3-D reconstructions will allow further characterization of molecular details, and they will be useful for directly visualizing molecular alterations related to diseased cardiac thin myofilaments (e.g., reperfusion injury, alterations of Ca2+-mediated tropomyosin switch). Copyright 1999 Academic Press.

  13. iELM—a web server to explore short linear motif-mediated interactions

    PubMed Central

    Weatheritt, Robert J.; Jehl, Peter; Dinkel, Holger; Gibson, Toby J.

    2012-01-01

    The recent expansion in our knowledge of protein–protein interactions (PPIs) has allowed the annotation and prediction of hundreds of thousands of interactions. However, the function of many of these interactions remains elusive. The interactions of Eukaryotic Linear Motif (iELM) web server provides a resource for predicting the function and positional interface for a subset of interactions mediated by short linear motifs (SLiMs). The iELM prediction algorithm is based on the annotated SLiM classes from the Eukaryotic Linear Motif (ELM) resource and allows users to explore both annotated and user-generated PPI networks for SLiM-mediated interactions. By incorporating the annotated information from the ELM resource, iELM provides functional details of PPIs. This can be used in proteomic analysis, for example, to infer whether an interaction promotes complex formation or degradation. Furthermore, details of the molecular interface of the SLiM-mediated interactions are also predicted. This information is displayed in a fully searchable table, as well as graphically with the modular architecture of the participating proteins extracted from the UniProt and Phospho.ELM resources. A network figure is also presented to aid the interpretation of results. The iELM server supports single protein queries as well as large-scale proteomic submissions and is freely available at http://i.elm.eu.org. PMID:22638578

  14. Confounding adjustment in comparative effectiveness research conducted within distributed research networks.

    PubMed

    Toh, Sengwee; Gagne, Joshua J; Rassen, Jeremy A; Fireman, Bruce H; Kulldorff, Martin; Brown, Jeffrey S

    2013-08-01

    A distributed research network (DRN) of electronic health care databases, in which data reside behind the firewall of each data partner, can support a wide range of comparative effectiveness research (CER) activities. An essential component of a fully functional DRN is the capability to perform robust statistical analyses to produce valid, actionable evidence without compromising patient privacy, data security, or proprietary interests. We describe the strengths and limitations of different confounding adjustment approaches that can be considered in observational CER studies conducted within DRNs, and the theoretical and practical issues to consider when selecting among them in various study settings. Several methods can be used to adjust for multiple confounders simultaneously, either as individual covariates or as confounder summary scores (e.g., propensity scores and disease risk scores), including: (1) centralized analysis of patient-level data, (2) case-centered logistic regression of risk set data, (3) stratified or matched analysis of aggregated data, (4) distributed regression analysis, and (5) meta-analysis of site-specific effect estimates. These methods require that different granularities of information be shared across sites and afford investigators different levels of analytic flexibility. DRNs are growing in use, but sharing highly detailed patient-level information is not always feasible within them. Methods that incorporate confounder summary scores allow investigators to adjust for a large number of confounding factors without the need to transfer potentially identifiable information in DRNs. They have the potential to let investigators perform many analyses traditionally conducted through a centralized dataset with detailed patient-level information.
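    Of the approaches listed above, meta-analysis of site-specific effect estimates (method 5) is the simplest to sketch: each site fits its own confounder-adjusted model locally and shares only an effect estimate and standard error. A minimal fixed-effect (inverse-variance) pooling, with hypothetical site-level numbers:

```python
import math

def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance-weighted fixed-effect pooling of site-specific
    log effect estimates (e.g., log odds ratios); returns the pooled
    estimate, its standard error, and a 95% confidence interval."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical site-specific log odds ratios and standard errors
log_or = [0.25, 0.10, 0.32]
se = [0.12, 0.09, 0.20]
pooled, pooled_se, ci = fixed_effect_meta(log_or, se)
```

    Only three numbers per site cross the firewall here, which is what makes this approach attractive when patient-level transfer is infeasible; the trade-off is reduced analytic flexibility relative to centralized patient-level analysis.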

  15. Quantitative phase analysis and microstructure characterization of magnetite nanocrystals obtained by microwave assisted non-hydrolytic sol–gel synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze

    2015-02-15

    An innovative preparation procedure, based on microwave-assisted non-hydrolytic sol–gel synthesis, to obtain spherical magnetite nanoparticles is reported together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. The nanoparticle growth was analyzed as a function of the synthesis time and was described in terms of crystallization degree, employing the Rietveld method on the magnetic nanostructured system to determine the amorphous content using hematite as internal standard. Product crystallinity increases with the duration of the microwave thermal treatment and reaches very high percentages for synthesis times longer than 1 h. The microstructural evolution of the nanocrystals was followed by integral breadth methods to obtain information on the crystallite size–strain distribution. The results of the diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of transmission electron microscopy (TEM) images. A variation in both the average grain size and the distribution of the coherently diffracting domains is observed, suggesting a relationship between the two quantities. The traditional integral breadth methods proved valid for a rapid assessment of diffraction line broadening effects in the above-mentioned nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. - Highlights: • Fe₃O₄ nanocrystals were obtained by MW-assisted non-hydrolytic sol–gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinements is discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
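    The simplest size estimate in the integral-breadth family is the Scherrer relation, D = Kλ/(β cos θ). A minimal sketch with illustrative numbers (Cu Kα wavelength and a hypothetical line breadth near the magnetite (311) reflection); a real analysis would first correct β for instrumental broadening and separate size from strain contributions:

```python
import math

def scherrer_size(two_theta_deg, beta_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size in nm from the Scherrer equation D = K*lambda/(beta*cos(theta)).
    beta_deg: line breadth (FWHM or integral breadth) in degrees 2-theta,
    assumed already corrected for instrumental broadening."""
    theta = math.radians(two_theta_deg / 2.0)  # Bragg angle in radians
    beta = math.radians(beta_deg)              # breadth in radians
    return k * wavelength_nm / (beta * math.cos(theta))

# Illustrative input: a peak near 35.5 degrees 2-theta with 1.2 degree breadth
size_nm = scherrer_size(35.5, 1.2)
```

    With these hypothetical inputs the estimate lands near 7 nm, i.e., in the same few-nanometer range as the TEM-derived grain sizes quoted in the abstract.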

  16. TRU Waste Management Program. Cost/schedule optimization analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office Rockwell International (JIO/RI) during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions.

  17. NMR and MS Methods for Metabolomics.

    PubMed

    Amberg, Alexander; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Dieterle, Frank; Keck, Matthias

    2017-01-01

    Metabolomics, also often referred to as "metabolic profiling," is the systematic profiling of metabolites in biofluids or tissues of organisms and their temporal changes. In the last decade, metabolomics has become more and more popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabolomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabolomics, i.e., NMR, UPLC-MS, and GC-MS, have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabolomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation to determining the measurement details of all analytical platforms, and finally to discussing the corresponding specific steps of data analysis.

  18. NMR and MS methods for metabonomics.

    PubMed

    Dieterle, Frank; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Amberg, Alexander

    2011-01-01

    Metabonomics, also often referred to as "metabolomics" or "metabolic profiling," is the systematic profiling of metabolites in bio-fluids or tissues of organisms and their temporal changes. In the last decade, metabonomics has become increasingly popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabonomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabonomics, i.e., NMR, LC-MS, UPLC-MS, and GC-MS, have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabonomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation, to determining the measurement details of all analytical platforms, and finally, to discussing the corresponding specific steps of data analysis.

  19. In situ spectroradiometric calibration of EREP imagery and estuarine and coastal oceanography of Block Island sound and adjacent New York coastal waters. [Willcox, Arizona

    NASA Technical Reports Server (NTRS)

    Yost, E. F. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. The first part of the study resulted in photographic procedures for making multispectral positive images which greatly enhance the color differences in land detail using an additive color viewer. An additive color analysis of the geologic features near Willcox, Arizona using enhanced black and white multispectral positives allowed compilation of a significant number of unmapped geologic units which do not appear on geologic maps of the area. The second part demonstrated the feasibility of utilizing Skylab remote sensor data to monitor and manage the coastal environment by relating physical, chemical, and biological ship sampled data to S190A, S190B, and S192 image characteristics. Photographic reprocessing techniques were developed which greatly enhanced subtle low brightness water detail. Using these photographic contrast-stretch techniques, two water masses having an extinction coefficient difference of only 0.07 measured simultaneously with the acquisition of S190A data were readily differentiated.

  20. Thermal design of composite materials high temperature attachments

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The thermal aspects of using filamentary composite materials as primary airframe structures on advanced atmospheric entry spacecraft such as the space shuttle vehicle were investigated to identify and evaluate potential design approaches for maintaining composite structures within allowable temperature limits at thermal protection system (TPS) attachments and/or penetrations. The investigation included: (1) definition of thermophysical data for composite material structures; (2) parametric characterization and identification of the influence of the aerodynamic heating and attachment design parameters on composite material temperatures; (3) conceptual design, evaluation, and detailed thermal analyses of temperature limiting design concepts; and (4) the development of experimental data for assessment of the thermal design methodologies and data used for evaluation of the temperature-limiting design concepts. Temperature suppression attachment concepts were examined for relative merit. The simple isolator was identified as the most weight-effective concept and was selected for detail design, thermal analysis, and testing. Tests were performed on TPS standoff attachments to boron/aluminum, boron/polyimide and graphite/epoxy composite structures.
