Sample records for enabling detailed analysis

  1. ESR Analysis of Polymer Photo-Oxidation

    NASA Technical Reports Server (NTRS)

    Kim, Soon Sam; Liang, Ranty Hing; Tsay, Fun-Dow; Gupta, Amitave

    1987-01-01

    Electron-spin resonance identifies polymer-degradation reactions and their kinetics. New technique enables derivation of kinetic model of specific chemical reactions involved in degradation of particular polymer. Detailed information provided by new method enables prediction of aging characteristics long before manifestation of macroscopic mechanical properties.

  2. Acoustical Emission Source Location in Thin Rods Through Wavelet Detail Crosscorrelation

    DTIC Science & Technology

    1998-03-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Acoustical Emission Source Location in Thin Rods Through Wavelet Detail Crosscorrelation. Author: Jerauld, Joseph G. ... frequency characteristics of Wavelet Analysis. Software implementation now enables the exploration of the Wavelet Transform to identify the time of...
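    The idea behind the thesis can be sketched in a few lines of numpy: take single-level wavelet detail (high-pass) coefficients of the two sensor signals, cross-correlate them, and read the arrival-time difference off the correlation peak. This is a minimal illustration with a Haar wavelet and a synthetic burst, not the thesis's actual implementation.

```python
import numpy as np

def haar_detail(x):
    """Single-level Haar wavelet detail (high-pass) coefficients."""
    n = len(x) - len(x) % 2
    return (x[:n:2] - x[1:n:2]) / np.sqrt(2.0)

def lag_of_max_crosscorr(a, b):
    """Lag (in samples) by which a is delayed relative to b."""
    c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return int(np.argmax(c)) - (len(b) - 1)

def arrival_delay(s1, s2):
    """Arrival-time difference (in original samples) between two sensor
    signals, via cross-correlation of their wavelet detail coefficients."""
    # the detail sequences are decimated by 2, so scale the lag back up
    return 2 * lag_of_max_crosscorr(haar_detail(s2), haar_detail(s1))

# synthetic acoustic-emission burst seen by two sensors 40 samples apart
fs = 10_000.0
t = np.arange(0, 0.1, 1.0 / fs)
burst = np.exp(-((t - 0.02) / 0.002) ** 2) * np.sin(2 * np.pi * 1000 * t)
s1 = burst
s2 = np.roll(burst, 40)  # same burst, arriving 40 samples later
```

    Dividing the recovered delay by the sample rate and combining it with the rod's wave speed would then locate the source along the rod.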

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strout, Michelle

    Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs into full programs through the use of pragmas. These smaller, more restricted programming models enable orthogonal specification of many implementation details such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions in many scientific simulations. The ability to orthogonally manipulate the implementation for such computations will significantly ease performance programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.

  4. A Ballistic Limit Analysis Program for Shielding Against Micrometeoroids and Orbital Debris

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Erie

    2010-01-01

    A software program has been developed that enables the user to quickly and simply perform ballistic limit calculations for common spacecraft structures that are subject to hypervelocity impact of micrometeoroid and orbital debris (MMOD) projectiles. This analysis program consists of two core modules: design and performance. The design module enables a user to calculate preliminary dimensions of a shield configuration (e.g., thicknesses/areal densities, spacing, etc.) for a "design" particle (diameter, density, impact velocity, incidence). The performance module enables a more detailed shielding analysis, providing the performance of a user-defined shielding configuration over the range of relevant in-orbit impact conditions.
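    The design/performance split can be illustrated with a toy model. The constant and exponents below are invented for illustration and are not NASA's ballistic limit equations; they only show how a design module inverts the same relation a performance module evaluates.

```python
K = 0.4                  # illustrative shield constant (hypothetical)
ALPHA, BETA = 0.8, 2.0 / 3.0

def critical_diameter(t_wall, v_impact):
    """Performance module: largest particle diameter (cm) a wall of
    thickness t_wall (cm) defeats at impact velocity v_impact (km/s)."""
    return K * t_wall ** ALPHA / v_impact ** BETA

def design_thickness(d_particle, v_impact):
    """Design module: wall thickness that just defeats d_particle,
    obtained by inverting the performance relation above."""
    return (d_particle * v_impact ** BETA / K) ** (1.0 / ALPHA)
```

    The round trip is exact by construction: sizing a wall for a 1 mm particle at 7 km/s and feeding that thickness back to the performance module returns a 1 mm critical diameter.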

  5. Aircraft Engine Systems

    NASA Technical Reports Server (NTRS)

    Veres, Joseph

    2001-01-01

    This report outlines the detailed simulation of an aircraft turbofan engine. The objectives were to develop a detailed flow model of a full turbofan engine that runs on parallel workstation clusters overnight and to develop an integrated system of codes for combustor design and analysis to enable significant reduction in design time and cost. The model will initially simulate the 3-D flow in the primary flow path including the flow and chemistry in the combustor, and ultimately result in a multidisciplinary model of the engine. The overnight 3-D simulation capability of the primary flow path in a complete engine will enable significant reduction in the design and development time of gas turbine engines. In addition, the NPSS (Numerical Propulsion System Simulation) multidisciplinary integration and analysis are discussed.

  6. High Fidelity System Simulation of Multiple Components in Support of the UEET Program

    NASA Technical Reports Server (NTRS)

    Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton

    2006-01-01

    The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches of analysis zooming were employed to enable system simulation capability in NPSS. These approaches have certain benefits and applicability in terms of specific applications: "feedback" zooming allows the flow-up of information from high-fidelity analysis to be used to update the NPSS model results by forcing the NPSS solver to converge to high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in terms of enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.

  7. PEMNetwork: Barriers and Enablers to Collaboration and Multimedia Education in the Digital Age.

    PubMed

    Lumba-Brown, Angela; Tat, Sonny; Auerbach, Marc A; Kessler, David O; Alletag, Michelle; Grover, Purva; Schnadower, David; Macias, Charles G; Chang, Todd P

    2016-08-01

    In January 2005, PEMFellows.com was created to unify fellows in pediatric emergency medicine. Since then, the website has expanded, contracted, and focused to adapt to the interests of the pediatric emergency medicine practitioner during the internet boom. This review details the innovation of the PEMNetwork, from the inception of the initial website and its evolution into a needs-based, user-directed educational hub. Barriers and enablers to success are detailed with unique examples from descriptive analysis and metrics of PEMNetwork web traffic as well as examples from other online medical communities and digital education websites.

  8. Modern Methods of Rail Welding

    NASA Astrophysics Data System (ADS)

    Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.

    2017-10-01

    Existing methods of rail welding that make it possible to produce continuous welded rail track are reviewed in this article. Analysis of the existing welding methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, and also process technologies reducing the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. Analysis of the existing methods of rail welding makes it possible to identify a research direction for solving this problem.

  9. Vii. New Kr IV - VII Oscillator Strengths and an Improved Spectral Analysis of the Hot, Hydrogen-deficient Do-type White Dwarf RE 0503-289

    NASA Technical Reports Server (NTRS)

    Rauch, T.; Quinet, P.; Hoyer, D.; Werner, K.; Richter, P.; Kruk, J. W.; Demleitner, M.

    2016-01-01

    For the spectral analysis of high-resolution and high signal-to-noise (SN) spectra of hot stars, state-of-the-art non-local thermodynamic equilibrium (NLTE) model atmospheres are mandatory. These are strongly dependent on the reliability of the atomic data that is used for their calculation. Aims. New Kr IV-VII oscillator strengths for a large number of lines enable us to construct more detailed model atoms for our NLTE model-atmosphere calculations. This enables us to search for additional Kr lines in observed spectra and to improve Kr abundance determinations. Methods. We calculated Kr IV-VII oscillator strengths to consider radiative and collisional bound-bound transitions in detail in our NLTE stellar-atmosphere models for the analysis of Kr lines that are exhibited in high-resolution and high SN ultraviolet (UV) observations of the hot white dwarf RE 0503-289.

  10. Meta-Analysis of Multiple Simulation-Based Experiments

    DTIC Science & Technology

    2013-06-01

    Alberts et al., 2010), C2 Approaches differ on at least three major aspects: the allocation of decision rights (ADR), the pattern of interaction among...results obtained from the meta-analysis support the hypothesis that more network-enabled C2 Approaches are more agile (for details see Bernier et al...consult Bernier, Chan et al. (2013) for more details. [Figure 2: Mapping of all CiCs into each axis of the C2 Approach Space (axes: ADR, PoI, DoI).]

  11. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability, as well as a geometry and weight analysis for multibodied configurations.

  12. High speed cylindrical roller bearing analysis, SKF computer program CYBEAN. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Pirvics, J.

    1978-01-01

    The CYBEAN (Cylindrical Bearing Analysis) computer program was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady state and time transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions, was treated in the computation of raceway and flange contacts. Input and output architectures containing guidelines for use and a sample execution are detailed.

  13. Geometric Modelling of Tree Roots with Different Levels of Detail

    NASA Astrophysics Data System (ADS)

    Guerrero Iñiguez, J. I.

    2017-09-01

    This paper presents a geometric approach for modelling tree roots with different Levels of Detail, suitable for analysis of the tree anchoring, potentially occupied underground space, interaction with urban elements, and damage produced and sustained in the built environment. Three types of tree roots are considered to cover several species: tap root, heart shaped root and lateral roots. Shrubs and smaller plants are not considered; however, a similar approach can be applied if the information is available for individual species. The geometrical approach considers the difficulties of modelling the actual roots, which are dynamic and almost opaque to direct observation, proposing generalized versions. For each type of root, different geometric models are considered to capture the overall shape of the root, a simplified block model, and a planar or surface projected version. Lower-detail versions are intended as compatibility versions for 2D systems, while higher-detail models are suitable for 3D analysis and visualization. The proposed levels of detail are matched with CityGML Levels of Detail, enabling both analysis and aesthetic views for urban modelling.

  14. Image processing, analysis, and management tools for gusset plate connections in steel truss bridges.

    DOT National Transportation Integrated Search

    2016-10-01

    This report details the research undertaken and software tools that were developed that enable digital : images of gusset plates to be converted into orthophotos, establish physical dimensions, collect : geometric information from them, and conduct s...

  15. Analysis of transport eco-efficiency scenarios to support sustainability assessment: a study on Dhaka City, Bangladesh.

    PubMed

    Iqbal, Asif; Allan, Andrew; Afroze, Shirina

    2017-08-01

    The study assessed the level of efficiency (of both emissions and service quality) that can be achieved for the transport system in Dhaka City, Bangladesh. The assessment technique attempted to quantify the extent of eco-efficiency achievable for system modifications due to planning or strategy. The eco-efficiency analysis was facilitated with detailed survey data on the Dhaka City transport system, collected over 9 months in 2012-2013. Line source modelling (CALINE4) was incorporated to estimate the on-road emission concentration. The eco-efficiency of the transport systems was assessed with the 'multi-criteria analysis' (MCA) technique, which enabled the valuation of the systems' qualitative and quantitative parameters. As per the analysis, addressing driving indiscipline on the road can alone promise about 47% reductions in emissions; driving indiscipline, along with the number of private vehicles, was among the most important stressors restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system together with the potential transport system scenarios can offer a checklist to policy makers, enabling them to identify the possible actions that can offer greater services to the dwellers with lower emissions, which in turn can bring sustainability to the system.
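    A generic weighted-sum scoring is the simplest form of the multi-criteria analysis named above; the criteria, weights, and scenario scores below are invented purely for illustration, not taken from the study.

```python
def mca_score(scores, weights):
    """Weighted-sum multi-criteria score. Criterion scores are assumed
    normalized to 0-1, where higher means more eco-efficient."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[c] * scores[c] for c in weights)

# hypothetical criteria weights and scenario scores
weights = {"emissions": 0.4, "service_quality": 0.35, "cost": 0.25}
scenarios = {
    "business_as_usual": {"emissions": 0.3, "service_quality": 0.4, "cost": 0.7},
    "driving_discipline": {"emissions": 0.6, "service_quality": 0.5, "cost": 0.6},
    "fewer_private_cars": {"emissions": 0.7, "service_quality": 0.6, "cost": 0.4},
}
ranked = sorted(scenarios, key=lambda s: mca_score(scenarios[s], weights),
                reverse=True)
```

    Ranking the composite scores gives the checklist-style comparison of scenarios the abstract describes.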

  16. Abstractions for DNA circuit design.

    PubMed

    Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew

    2012-03-07

    DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.

  17. Hyperspectral data analysis procedures with reduced sensitivity to noise

    NASA Technical Reports Server (NTRS)

    Landgrebe, David A.

    1993-01-01

    Multispectral sensor systems have steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is in the process of taking a large leap forward, thus providing the possibility of enabling delivery of much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because reducing the fundamental limitations of spectral detail on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus being devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures which have reduced sensitivity to such effects. We discuss some of the fundamental principles which define analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure, including an example analysis of a data set, is described, illustrating this effect.

  18. Signal-processing analysis of the MC2823 radar fuze: an addendum concerning clutter effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelinek, D.A.

    1978-07-01

    A detailed analysis of the signal processing of the MC2823 radar fuze was published by Thompson in 1976 which enabled the computation of dud probability versus signal-to-noise ratio where the noise was receiver noise. An addendum to Thompson's work was published by Williams in 1978 that modified the weighting function used by Thompson. The analysis presented herein extends the work of Thompson to include the effects of clutter (the non-signal portion of the echo from a terrain) using the new weighting function. This extension enables computation of dud probability versus signal-to-total-noise ratio where total noise is the sum of the receiver-noise power and the clutter power.

  19. Numerical Propulsion System Simulation (NPSS) 1999 Industry Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin

    2000-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.

  20. X-Ray Your Data with Rasch

    ERIC Educational Resources Information Center

    Curtis, David D.; Boman, Peter

    2007-01-01

    By using the Rasch model, much detailed diagnostic information is available to developers of survey and assessment instruments and to the researchers who use them. We outline an approach to the analysis of data obtained from the administration of survey instruments that can enable researchers to recognise and diagnose difficulties with those…

  1. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.
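    The heart of such a simulator, convolving each spectral slice of an input data-cube with a PSF whose width varies with wavelength, can be sketched in numpy with a Gaussian stand-in for the real AO PSF. The function names and the sigma model are illustrative assumptions, not the HARMONI pipeline.

```python
import numpy as np

def gaussian_psf(n, sigma):
    """n x n Gaussian kernel, normalized to unit sum (flux-conserving)."""
    ax = np.arange(n) - (n - 1) / 2.0
    g = np.exp(-0.5 * (ax / sigma) ** 2)
    psf = np.outer(g, g)
    return psf / psf.sum()

def convolve_cube(cube, wavelengths, sigma_at):
    """FFT-convolve each spectral slice with its own PSF.
    cube: (n_wave, n, n) with square slices; sigma_at: wavelength -> sigma (pix)."""
    out = np.empty_like(cube)
    for i, wav in enumerate(wavelengths):
        psf = gaussian_psf(cube.shape[1], sigma_at(wav))
        # circular convolution via FFT; ifftshift centers the kernel at origin
        out[i] = np.real(np.fft.ifft2(np.fft.fft2(cube[i]) *
                                      np.fft.fft2(np.fft.ifftshift(psf))))
    return out

# demo: a point source observed at two wavelengths, PSF broader at the longer one
cube = np.zeros((2, 32, 32))
cube[:, 16, 16] = 1.0
blurred = convolve_cube(cube, [1.0, 2.0], lambda w: 1.0 + w)
```

    Because each kernel sums to one, the total flux in every slice is preserved, while the longer-wavelength slice comes out with a broader, lower-peaked image.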

  2. White Light Used to Enable Enhanced Surface Topography, Geometry, and Wear Characterization of Oil-Free Bearings

    NASA Technical Reports Server (NTRS)

    Lucero, John M.

    2003-01-01

    A new optically based measuring capability that characterizes surface topography, geometry, and wear has been employed by NASA Glenn Research Center's Tribology and Surface Science Branch. To characterize complex parts in more detail, we are using a three-dimensional surface structure analyzer, the NewView 5000 manufactured by Zygo Corporation (Middlefield, CT). This system provides graphical images and high-resolution numerical analyses to accurately characterize surfaces. Because of the inherent complexity of the various analyzed assemblies, the machine has been pushed to its limits. For example, special hardware fixtures and measuring techniques were developed to characterize Oil-Free thrust bearings specifically. We performed a more detailed wear analysis using scanning white light interferometry to image and measure the bearing structure and topography, enabling a further understanding of bearing failure causes.

  3. TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.

    PubMed

    Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo

    2018-06-15

    We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.

  4. Recording human cortical population spikes non-invasively--An EEG tutorial.

    PubMed

    Waterstraat, Gunnar; Fedele, Tommaso; Burghoff, Martin; Scheer, Hans-Jürgen; Curio, Gabriel

    2015-07-30

    Non-invasively recorded somatosensory high-frequency oscillations (sHFOs) evoked by electric nerve stimulation are markers of human cortical population spikes. Previously, their analysis was based on massive averaging of EEG responses. Advanced neurotechnology and optimized off-line analysis can enhance the signal-to-noise ratio of sHFOs, eventually enabling single-trial analysis. The rationale for developing dedicated low-noise EEG technology for sHFOs is unfolded. Detailed recording procedures and tailored analysis principles are explained step-by-step. Source codes in Matlab and Python are provided as supplementary material online. Combining synergistic hardware and analysis improvements, evoked sHFOs at around 600 Hz ('σ-bursts') can be studied in single trials. Additionally, optimized spatial filters increase the signal-to-noise ratio of components at about 1 kHz ('κ-bursts'), enabling their detection in non-invasive surface EEG. sHFOs offer a unique possibility to record evoked human cortical population spikes non-invasively. The experimental approaches and algorithms presented here enable even non-specialized EEG laboratories to combine measurements of conventional low-frequency EEG with the analysis of concomitant cortical population spike responses. Copyright © 2014 Elsevier B.V. All rights reserved.
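    A minimal numpy stand-in for the band-selection step (the paper's actual pipeline involves dedicated low-noise hardware and optimized spatial filters): isolate the ~600 Hz σ-burst band from a single synthetic trial by zeroing FFT bins outside it. The signal parameters are invented for illustration.

```python
import numpy as np

def fft_bandpass(x, fs, lo, hi):
    """Brick-wall bandpass: zero all FFT bins outside [lo, hi] Hz and invert."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

# one synthetic trial: slow EEG rhythm plus a tiny 600 Hz burst riding on it
fs = 5000.0
t = np.arange(0, 0.2, 1.0 / fs)
slow = np.sin(2 * np.pi * 10 * t)                                   # low-freq EEG
burst = 0.05 * np.sin(2 * np.pi * 600 * t) * np.exp(-((t - 0.1) / 0.01) ** 2)
trial = slow + burst
sigma_band = fft_bandpass(trial, fs, 450.0, 750.0)  # recovered σ-burst estimate
```

    The filtered trace closely tracks the embedded burst even though it is 20× smaller than the background rhythm, which is the essence of making single-trial sHFO analysis possible.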

  5. Methods for the evaluation of alternative disaster warning systems

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.

    1977-01-01

    For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.

  6. Mobility as a Fusion Enabler

    DTIC Science & Technology

    2008-07-01

    identification sensor is then VrT, where T is some reasonable time in comparison to the approach speed of a potentially hostile contact. In fact, if the...effective range of the identification sensor on the mobile platform is VrT = VrA/Vh. Further detail of such analysis is ongoing as part

  7. Improved numerical solutions for chaotic-cancer-model

    NASA Astrophysics Data System (ADS)

    Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair

    2017-01-01

    In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaoticity. The present work provides a detailed computational study of the cancer model by counterbalancing its sensitive dependence on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive-Over-Relaxation (SOR) method with proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time history maps and phase portraits with detailed analysis.
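    For reference, the SOR update the abstract names looks like this on a small linear, diagonally dominant system; the paper applies the same iteration to its discretized nonlinear equations, and the example system here is just a placeholder.

```python
def sor_solve(A, b, omega=1.25, tol=1e-10, max_iter=10_000):
    """Successive-Over-Relaxation for A x = b (plain Python lists).
    Each sweep blends the Gauss-Seidel update with the old value via omega."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        max_dx = 0.0
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new = (1.0 - omega) * x[i] + omega * (b[i] - s) / A[i][i]
            max_dx = max(max_dx, abs(x_new - x[i]))
            x[i] = x_new
        if max_dx < tol:
            break
    return x
```

    For 0 < omega < 2 and a symmetric positive-definite matrix, the iteration is guaranteed to converge; omega > 1 (over-relaxation) typically accelerates it.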

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.

  9. A Fully Non-Metallic Gas Turbine Engine Enabled by Additive Manufacturing Part I: System Analysis, Component Identification, Additive Manufacturing, and Testing of Polymer Composites

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.; Haller, William J.; Poinsatte, Philip E.; Halbig, Michael C.; Schnulo, Sydney L.; Singh, Mrityunjay; Weir, Don; Wali, Natalie; Vinup, Michael; Jones, Michael G.

    2015-01-01

    The research and development activities reported in this publication were carried out under the NASA Aeronautics Research Institute (NARI) funded project entitled "A Fully Nonmetallic Gas Turbine Engine Enabled by Additive Manufacturing." The objective of the project was to conduct evaluation of emerging materials and manufacturing technologies that will enable fully nonmetallic gas turbine engines. The results of the activities are described in a three-part report. The first part of the report contains the data and analysis of engine system trade studies, which were carried out to estimate the reduction in engine emissions and fuel burn enabled by advanced materials and manufacturing processes. A number of key engine components were identified in which advanced materials and additive manufacturing processes would provide the most significant benefits to engine operation. The technical scope of activities included an assessment of the feasibility of using additive manufacturing technologies to fabricate gas turbine engine components from polymer and ceramic matrix composites; this was accomplished by fabricating prototype engine components and testing them in simulated engine operating conditions. The manufacturing process parameters were developed and optimized for polymer and ceramic composites (described in detail in the second and third parts of the report). A number of prototype components (inlet guide vane (IGV), acoustic liners, engine access door) were additively manufactured using high temperature polymer materials. Ceramic matrix composite components included turbine nozzle components. In addition, IGVs and acoustic liners were tested in simulated engine conditions in test rigs. The test results are reported and discussed in detail.

  10. Procedure Enabling Simulation and In-Depth Analysis of Optical Effects in Camera-Based Time-of-Flight Sensors

    NASA Astrophysics Data System (ADS)

    Baumgart, M.; Druml, N.; Consani, M.

    2018-05-01

    This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We use a ray-tracing-based approach with the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray paths, translucent objects, and aberration effects (e.g., distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
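The core idea of treating the optical path length as the master parameter for depth can be illustrated with a minimal sketch. This is not the authors' Zemax OpticStudio/Python implementation; the 4-phase continuous-wave demodulation scheme, the modulation frequency, and all names below are illustrative assumptions:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F_MOD = 20e6        # assumed modulation frequency, Hz (unambiguous range c/(2*F_MOD) ~ 7.5 m)

def tof_depth_from_paths(path_lengths_m, amplitudes=None):
    """Estimate depth from per-ray optical path lengths (source -> scene -> pixel).

    Each ray contributes a sinusoid delayed by its time of flight; the pixel
    correlates the summed signal at four phase offsets (0, 90, 180, 270 deg).
    """
    paths = np.asarray(path_lengths_m, dtype=float)
    amps = np.ones_like(paths) if amplitudes is None else np.asarray(amplitudes, dtype=float)
    phase_per_ray = 2.0 * np.pi * F_MOD * (paths / C)   # phase delay of each ray
    # Four correlation samples of the superposed signal
    q = [np.sum(amps * np.cos(phase_per_ray - off))
         for off in (0.0, np.pi / 2, np.pi, 1.5 * np.pi)]
    phase = np.arctan2(q[1] - q[3], q[0] - q[2]) % (2.0 * np.pi)
    # Phase -> round-trip optical path -> depth (half the round trip)
    return (C * phase / (2.0 * np.pi * F_MOD)) / 2.0

# Single direct ray at 3 m depth -> 6 m round-trip optical path
print(tof_depth_from_paths([6.0]))  # ~ 3.0 m
```

Multiple ray paths (e.g., multi-object scattering) simply contribute additional delayed sinusoids, which is how such a model reproduces depth-distortion effects from indirect light.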

  11. Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems

    PubMed Central

    2013-01-01

    This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze the separate images with root-tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information on entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day-old maize root system, coupled with a spatial analysis of water uptake patterns. PMID:23286457

  12. Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems.

    PubMed

    Lobet, Guillaume; Draye, Xavier

    2013-01-04

    This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze the separate images with root-tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information on entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day-old maize root system, coupled with a spatial analysis of water uptake patterns.

  13. Experimental analysis of bruises in human volunteers using radiometric depth profiling and diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2015-07-01

    We combine pulsed photothermal radiometry (PPTR) depth profiling with diffuse reflectance spectroscopy (DRS) measurements for a comprehensive analysis of bruise evolution in vivo. While PPTR enables extraction of detailed depth distribution and concentration profiles of selected absorbers (e.g., melanin, hemoglobin), DRS provides information across a wide range of visible wavelengths and thus offers additional insight into the dynamics of the hemoglobin degradation products. Combining the two approaches enables us to quantitatively characterize bruise evolution dynamics. Our results indicate temporal variations of the bruise evolution parameters in the course of the bruise self-healing process. The obtained parameter values and trends represent a basis for the future development of an objective technique for bruise age determination.

  14. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  15. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
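As a concrete illustration of the univariable model the article introduces (y = b0 + b1*x + error), the following sketch fits a straight line by ordinary least squares; the data are made up for demonstration and are not from the article:

```python
import numpy as np

# Hypothetical measurements: predictor x and outcome y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least-squares fit
b0, b1 = beta                                  # intercept and slope

residuals = y - X @ beta
r2 = 1.0 - (residuals @ residuals) / np.sum((y - y.mean()) ** 2)

print(f"intercept={b0:.2f}, slope={b1:.2f}, R^2={r2:.3f}")
```

The slope b1 is interpreted as the expected change in y per unit change in x; the pitfalls the article discusses (e.g., extrapolation beyond the observed x range, omitted confounders) apply regardless of how cleanly the line fits.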

  16. Flow cytogenetics and chromosome sorting.

    PubMed

    Cram, L S

    1990-06-01

    This review of flow cytogenetics and chromosome sorting provides an overview of general information in the field and describes recent developments in more detail. From the early developments of chromosome analysis involving single parameter or one color analysis to the latest developments in slit scanning of single chromosomes in a flow stream, the field has progressed rapidly and most importantly has served as an important enabling technology for the human genome project. Technological innovations that advanced flow cytogenetics are described and referenced. Applications in basic cell biology, molecular biology, and clinical investigations are presented. The necessary characteristics for large number chromosome sorting are highlighted. References to recent review articles are provided as a starting point for locating individual references that provide more detail. Specific references are provided for recent developments.

  17. Computer Analysis Of High-Speed Roller Bearings

    NASA Technical Reports Server (NTRS)

    Coe, H.

    1988-01-01

    High-speed cylindrical roller-bearing analysis program (CYBEAN) developed to compute behavior of cylindrical rolling-element bearings at high speeds and with misaligned shafts. With program, accurate assessment of geometry-induced roller preload possible for variety of outer-ring and housing configurations and loading conditions. Enables detailed examination of bearing performance and permits exploration of causes and consequences of bearing skew. Provides general capability for assessment of designs of bearings supporting main shafts of engines. Written in FORTRAN IV.

  18. Many-body-theory study of lithium photoionization

    NASA Technical Reports Server (NTRS)

    Chang, T. N.; Poe, R. T.

    1975-01-01

    A detailed theoretical calculation is carried out for the photoionization of lithium at low energies within the framework of Brueckner-Goldstone perturbational approach. In this calculation extensive use is made of the recently developed multiple-basis-set technique. Through this technique all second-order perturbation terms, plus a number of important classes of terms to infinite order, have been taken into account. Analysis of the results enables one to resolve the discrepancies between two previous works on this subject. The detailed calculation also serves as a test on the convergence of the many-body perturbation-expansion approach.

  19. Optical Traps to Study Properties of Molecular Motors

    PubMed Central

    Spudich, James A.; Rice, Sarah E.; Rock, Ronald S.; Purcell, Thomas J.; Warrick, Hans M.

    2016-01-01

    In vitro motility assays enabled the analysis of coupling between ATP hydrolysis and movement of myosin along actin filaments or kinesin along microtubules. Single-molecule assays using laser trapping have been used to obtain more detailed information about kinesins, myosins, and processive DNA enzymes. The combination of in vitro motility assays with laser-trap measurements has revealed detailed dynamic structural changes associated with the ATPase cycle. This article describes the use of optical traps to study processive and nonprocessive molecular motor proteins, focusing on the design of the instrument and the assays to characterize motility. PMID:22046048

  20. Non-airborne conflicts: The causes and effects of runway transgressions

    NASA Technical Reports Server (NTRS)

    Tarrel, Richard J.

    1985-01-01

    The 1210 ASRS runway transgression reports are studied and expanded to yield descriptive statistics. Additionally, a one-of-three subset was studied in detail for purposes of evaluating the causes, risks, and consequences behind transgression events. Occurrences are subdivided by enabling factor and flight phase designations. It is concluded that a larger risk of collision is associated with controller-enabled departure transgressions than with all other categories. The influence of this type is especially evident during the period following the air traffic controllers' strike of 1981. Causal analysis indicates that, coincidentally, controller-enabled departure transgressions also show the strongest correlations between causal factors. It shows that departure errors occur more often when visibility is reduced and when multiple takeoff runways or intersection takeoffs are employed. In general, runway transgressions attributable to both pilot and controller errors arise from three problem areas: information transfer, awareness, and spatial judgement. Enhanced awareness by controllers will probably reduce controller-enabled incidents.

  1. Westinghouse programs in pulsed homopolar power supplies

    NASA Technical Reports Server (NTRS)

    Litz, D. C.; Mullan, E.

    1984-01-01

    This document details Westinghouse's ongoing study of homopolar machines since 1929, with the major effort occurring from the early 1970s to the present. The effort has enabled Westinghouse to develop expertise in the technology required for the design, fabrication and testing of such machines. This includes electrical design, electromagnetic analysis, current collection, mechanical design, advanced cooling, stress analysis, transient rotor performance, bearing analysis and seal technology. Westinghouse is using this capability to explore the use of homopolar machines as pulsed power supplies for future systems in both military and commercial applications.

  2. Investigation of H2 Diaphragm Compressors to Enable Low-Cost Long-Life Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, Aashish; Johnson, Kenneth I.

    2013-12-01

    This is a “short” annual report to the DOE Fuel Cell Technology Office describing the research on modeling and materials analysis of diaphragms in a diaphragm-type hydrogen compressor. The compressor design details and diaphragm materials were provided by PDC Machines, Inc., a commercial manufacturer of diaphragm-type hydrogen compressors that PNNL is partnering with in this project.

  3. A simple apparatus for quick qualitative analysis of CR39 nuclear track detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gautier, D. C.; Kline, J. L.; Flippo, K. A.

    2008-10-15

    Quantifying the ion pits in Columbia Resin 39 (CR39) nuclear track detectors from Thomson parabolas is a time-consuming and tedious process using conventional microscope-based techniques. A simple, inventive apparatus for fast screening and qualitative analysis of CR39 detectors has been developed, enabling efficient selection of data for a more detailed analysis. The system consists simply of a green He-Ne laser and a high-resolution digital single-lens reflex camera. The laser illuminates the edge of the CR39 at grazing incidence and couples into the plastic, acting as a light pipe. Subsequently, the laser illuminates all ion tracks on the surface. A high-resolution digital camera is used to photograph the scattered light from the ion tracks, enabling one to quickly determine charge states and energies measured by the Thomson parabola.

  4. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  5. Depth-resolved multilayer pigment identification in paintings: combined use of laser-induced breakdown spectroscopy (LIBS) and optical coherence tomography (OCT).

    PubMed

    Kaszewska, Ewa A; Sylwestrzak, Marcin; Marczak, Jan; Skrzeczanowski, Wojciech; Iwanicka, Magdalena; Szmit-Naud, Elżbieta; Anglos, Demetrios; Targowski, Piotr

    2013-08-01

    A detailed feasibility study on the combined use of laser-induced breakdown spectroscopy with optical coherence tomography (LIBS/OCT), aiming at a realistic depth-resolved elemental analysis of multilayer stratigraphies in paintings, is presented. Merging a high spectral resolution LIBS system with a high spatial resolution spectral OCT instrument significantly enhances the quality and accuracy of stratigraphic analysis. First, OCT mapping is employed prior to LIBS analysis in order to assist the selection of specific areas of interest on the painting surface to be examined in detail. Then, intertwined with LIBS, the OCT instrument is used as a precise profilometer for the online determination of the depth of the ablation crater formed by individual laser pulses during LIBS depth-profile analysis. This approach is novel and enables (i) the precise in-depth scaling of elemental concentration profiles, and (ii) the recognition of layer boundaries by estimating the corresponding differences in material ablation rate. Additionally, the latter is supported, within the transparency of the object, by analysis of the OCT cross-sectional views. The potential of this method is illustrated by presenting results on the detailed analysis of the structure of an historic painting on canvas performed to aid planned restoration of the artwork.

  6. Contingency Power Study for Short Haul Civil Tiltrotor

    NASA Technical Reports Server (NTRS)

    Eisenberg, Joseph D. (Technical Monitor); Wait, John

    2003-01-01

    AlliedSignal Engines (AE) defined a number of concepts that significantly increased the horsepower of a turboshaft engine to accommodate the loss of an engine and enable the safe landing of a twin-engined, 40-passenger, short haul civil tiltrotor. From these concepts, "Water/Methanol Injection," a "Better Power Turbine Than Required," and a "Secondary Combustor For Interturbine Reheat" were chosen, based on system safety and economics, for more detailed examination. Engine performance, mission, and cost analysis of these systems indicated contingency power levels of 26 to 70 percent greater than normal rated takeoff could be attained for short durations, thus enabling direct operating cost savings between 2 and 6 percent.

  7. High speed cylindrical roller bearing analysis. SKF computer program CYBEAN. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Dyba, G. J.; Kleckner, R. J.

    1981-01-01

    CYBEAN (CYlindrical BEaring ANalysis) was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer-ring geometries, and both steady-state and time-transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions, was treated in the computation of raceway and flange contacts. The practical and correct implementation of CYBEAN is discussed. The capability to execute the program at four different levels of complexity was included. In addition, the program was updated to properly direct roller-to-raceway contact load vectors automatically in those cases where roller or ring profiles have small radii of curvature. Input and output architectures containing guidelines for use and two sample executions are detailed.

  8. Satellite Imagery Analysis for Automated Global Food Security Forecasting

    NASA Astrophysics Data System (ADS)

    Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.

    2017-12-01

    The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high-rate and dimensionality of imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping, to yearly and monthly vegetative health change trends at the structural field level.
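As a hedged illustration of the kind of per-pixel computation underlying vegetative-health mapping (the abstract does not specify its algorithms), the classic NDVI index can be computed from red and near-infrared reflectance bands; the arrays below are synthetic stand-ins for satellite scenes:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

# Synthetic 2x2 reflectance tiles: healthy vegetation reflects strongly in NIR
red = np.array([[0.10, 0.20], [0.05, 0.30]])
nir = np.array([[0.50, 0.25], [0.45, 0.30]])
print(ndvi(red, nir))
```

Tracking such an index per field over monthly composites is one common way the "yearly and monthly vegetative health change trends" described above can be derived at scale.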

  9. Flow structure through pool-riffle sequences and a conceptual model for their sustainability in gravel-bed rivers

    Treesearch

    D. Caamano; P. Goodwin; J. M. Buffington

    2010-01-01

    Detailed field measurements and simulations of three-dimensional flow structure were used to develop a conceptual model to explain the sustainability of self-formed pool-riffle sequences in gravel-bed rivers. The analysis was conducted at the Red River Wildlife Management Area in Idaho, USA, and enabled characterization of the flow structure through two consecutive...

  10. Integrated Glass Coating Manufacturing Line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brophy, Brenor

    2015-09-30

    This project aims to enable US module manufacturers to coat glass with Enki's state-of-the-art tunable functionalized AR coatings at the lowest possible cost and highest possible performance by encapsulating Enki's coating process in an integrated tool that facilitates effective process improvement through metrology and data analysis for greater quality and performance while reducing footprint, operating and capital costs. The Phase 1 objective was a fully designed manufacturing line, including fully specified equipment ready for issue of purchase requisitions; a detailed economic justification based on market prices at the end of Phase 1 and projected manufacturing costs; and a detailed deployment plan for the equipment.

  11. Inhibited Shaped Charge Launcher Testing of Spacecraft Shield Designs

    NASA Technical Reports Server (NTRS)

    Grosch, Donald J.

    1996-01-01

    This report describes a test program in which several orbital debris shield designs were impact tested using the inhibited shaped charge launcher facility at Southwest Research Institute. This facility enables researchers to study the impact of one-gram aluminum projectiles on various shielding designs at velocities above 11 km/s. A total of twenty tests were conducted on targets provided by NASA-MSFC. This report discusses in detail the shield design, the projectile parameters and the test configuration used for each test. A brief discussion of the target damage is provided, as the detailed analysis of the target response will be done by NASA-MSFC.

  12. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    NASA Technical Reports Server (NTRS)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  13. Enabling Rapid and Robust Structural Analysis During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu

    2015-01-01

    This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.

  14. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  15. Morphology of the external genitalia of the adult male and female mice as an endpoint of sex differentiation

    PubMed Central

    Weiss, Dana A.; Rodriguez, Esequiel; Cunha, Tristan; Menshenina, Julia; Barcellos, Dale; Chan, Lok Yun; Risbridger, Gail; Baskin, Laurence; Cunha, Gerald

    2013-01-01

    Adult external genitalia (ExG) are the endpoints of normal sex differentiation. Detailed morphometric analysis and comparison of adult mouse ExG has revealed 10 homologous features distinguishing the penis and clitoris that define masculine vs. feminine sex differentiation. These features have enabled the construction of a simple metric to evaluate various intersex conditions in mutant or hormonally manipulated mice. This review focuses on the morphology of the adult mouse penis and clitoris through detailed analysis of histologic sections, scanning electron microscopy, and three-dimensional reconstruction. We also present previous results from evaluation of “non-traditional” mammals, such as the spotted hyena and wallaby to demonstrate the complex process of sex differentiation that involves not only androgen-dependent processes, but also estrogen-dependent and hormone-independent mechanisms. PMID:21893161

  16. Complications in proximal humeral fractures.

    PubMed

    Calori, Giorgio Maria; Colombo, Massimiliano; Bucci, Miguel Simon; Fadigati, Piero; Colombo, Alessandra Ines Maria; Mazzola, Simone; Cefalo, Vittorio; Mazza, Emilio

    2016-10-01

    Necrosis of the humeral head, infections and non-unions are among the most dangerous and difficult-to-treat complications of proximal humeral fractures. The aim of this work was to analyse in detail non-unions and post-traumatic bone defects and to suggest an algorithm of care. Treatment options are based not only on the radiological frame, but also according to a detailed analysis of the patient, who is classified using a risk factor analysis. This method enables the surgeon to choose the most suitable treatment for the patient, thereby facilitating return of function in the shortest possible time. The treatment of such serious complications requires the surgeon to be knowledgeable about the following possible solutions: increased mechanical stability; biological stimulation; and reconstructive techniques in two steps, with application of biotechnologies and prosthetic substitution. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Enabling Technologies for Unified Life-Cycle Engineering of Structural Components

    DTIC Science & Technology

    1991-03-22

    representations for entities in the ULCE system for unambiguous, reliable, and efficient retrieval, manipulation, and transfer of data. Develop a rapid analysis...approaches to these functions. It is reasonable to assume that program budgets for future systems will be more restrictive and that fixed-price contracting...enemy threats, economics, and politics. The requirements are voluminous and may stipulate firm fixed-price proposals with detailed schedules. At this

  18. COST FUNCTION STUDIES FOR POWER REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heestand, J.; Wos, L.T.

    1961-11-01

    A function to evaluate the cost of electricity produced by a nuclear power reactor was developed. The basic equation, revenue = capital charges + profit + operating expenses, was expanded in terms of various cost parameters to enable analysis of multiregion nuclear reactors with uranium and/or plutonium for fuel. A corresponding IBM 704 computer program, which will compute either the price of electricity or the value of plutonium, is presented in detail. (auth)
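The basic balance above can be sketched as a toy calculation. This is a modern illustration, not the original IBM 704 program; the function name and all dollar figures are invented:

```python
def electricity_price(capital_charges, profit, operating_expenses, energy_sold_kwh):
    """Price per kWh that makes revenue cover capital charges, profit, and O&M.

    Solves revenue = capital charges + profit + operating expenses for price,
    with revenue = price * energy sold.
    """
    required_revenue = capital_charges + profit + operating_expenses
    return required_revenue / energy_sold_kwh

# Illustrative figures: $30M capital charges, $5M profit, $15M operating
# expenses, 7 billion kWh sold per year
price = electricity_price(30e6, 5e6, 15e6, 7e9)
print(f"{price * 1000:.2f} mills/kWh")  # 1 mill = $0.001
```

The original program could also be run "in reverse" to compute the value of plutonium; in this sketch that would amount to fixing the price and solving the same balance for a fuel-credit term.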

  19. Mass Storage Performance Information System

    NASA Technical Reports Server (NTRS)

    Scheuermann, Peter

    2000-01-01

    The purpose of this task is to develop a data warehouse to enable system administrators and their managers to gather information by querying the data logs of the MDSDS. Currently detailed logs capture the activity of the MDSDS internal to the different systems. The elements to be included in the data warehouse are requirements analysis, data cleansing, database design, database population, hardware/software acquisition, data transformation, query and report generation, and data mining.

  20. Dynamics of land change in India: a fine-scale spatial analysis

    NASA Astrophysics Data System (ADS)

    Meiyappan, P.; Roy, P. S.; Sharma, Y.; Jain, A. K.; Ramachandran, R.; Joshi, P. K.

    2015-12-01

    Land is scarce in India: India occupies 2.4% of the world's land area, but supports over 1/6th of the world's human and livestock population. This high population-to-land ratio, combined with socioeconomic development and increasing consumption, has placed tremendous pressure on India's land resources for food, feed, and fuel. In this talk, we present contemporary (1985 to 2005) spatial estimates of land change in India using national-level analysis of Landsat imageries. Further, we investigate the causes of the spatial patterns of change using two complementary lines of evidence. First, we use statistical models estimated at macro-scale to understand the spatial relationships between land change patterns and their concomitant drivers. This analysis, using our newly compiled extensive socioeconomic database at village level (~630,000 units), is 100x higher in spatial resolution compared to existing datasets and covers over 200 variables. The detailed socioeconomic data enabled the fine-scale spatial analysis with Landsat data. Second, we synthesized information from over 130 survey-based case studies on land use drivers in India to complement our macro-scale analysis. The case studies are especially useful to identify unobserved variables (e.g., farmers' attitudes towards risk). Ours is the most detailed analysis of contemporary land change in India, both in terms of national extent and in the use of detailed spatial information on land change, socioeconomic factors, and synthesis of case studies.

  1. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    PubMed

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi-)automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.
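    The ordering constraints on service invocation described above amount to a dependency graph over workflow operations. The sketch below is an independent illustration, not the SemanticSCo implementation; the operation names and constraints are hypothetical stand-ins for a microarray workflow.

```python
# Illustrative sketch: ordering the operations of a composed gene-expression
# workflow so that "A must run before B" constraints are respected, via a
# topological sort. Operation names and constraints are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Each key depends on (must run after) the operations in its value set.
constraints = {
    "normalize":         {"load_raw_data"},
    "quality_control":   {"load_raw_data"},
    "differential_expr": {"normalize", "quality_control"},
    "annotate_genes":    {"differential_expr"},
}

order = list(TopologicalSorter(constraints).static_order())
print(order)  # a valid invocation order: load_raw_data first, annotate_genes last
```

Any order satisfying the constraints is acceptable; a real composition engine would additionally match service inputs and outputs via their semantic annotations.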

  2. Supporting tactical intelligence using collaborative environments and social networking

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur B.; Farry, Michael P.; Stark, Robert F.

    2013-05-01

    Modern military environments place an increased emphasis on the collection and analysis of intelligence at the tactical level. The deployment of analytical tools at the tactical level helps support the Warfighter's need for rapid collection, analysis, and dissemination of intelligence. However, given the lack of experience and staffing at the tactical level, most of the available intelligence is not exploited. Tactical environments are staffed by a new generation of intelligence analysts who are well-versed in modern collaboration environments and social networking. An opportunity exists to enhance tactical intelligence analysis by exploiting these personnel strengths, but it depends on appropriately designed information sharing technologies. Existing social information sharing technologies enable users to publish information quickly, but do not unite or organize information in a manner that effectively supports intelligence analysis. In this paper, we present an alternative approach to structuring and supporting tactical intelligence analysis that combines the benefits of existing concepts, and provide detail on a prototype system embodying that approach. Since this approach employs familiar collaboration support concepts from social media, it enables new-generation analysts to identify the decision-relevant data scattered among databases and the mental models of other personnel, increasing the timeliness of collaborative analysis. Also, the approach enables analysts to collaborate visually to associate heterogeneous and uncertain data within the intelligence analysis process, increasing the robustness of collaborative analyses. Utilizing this familiar dynamic collaboration environment, we hope to achieve a significant reduction of the time and skill required to glean actionable intelligence in these challenging operational environments.

  3. Base-By-Base: single nucleotide-level analysis of whole viral genome alignments.

    PubMed

    Brodie, Ryan; Smith, Alex J; Roper, Rachel L; Tcherepanov, Vasily; Upton, Chris

    2004-07-14

    With ever increasing numbers of closely related virus genomes being sequenced, it has become desirable to be able to compare two genomes at a level more detailed than gene content because two strains of an organism may share the same set of predicted genes but still differ in their pathogenicity profiles. For example, detailed comparison of multiple isolates of the smallpox virus genome (each approximately 200 kb, with 200 genes) is not feasible without new bioinformatics tools. A software package, Base-By-Base, has been developed that provides visualization tools to enable researchers to 1) rapidly identify and correct alignment errors in large, multiple genome alignments; and 2) generate tabular and graphical output of differences between the genomes at the nucleotide level. Base-By-Base uses detailed annotation information about the aligned genomes and can list each predicted gene with nucleotide differences, display whether variations occur within promoter regions or coding regions and whether these changes result in amino acid substitutions. Base-By-Base can connect to our mySQL database (Virus Orthologous Clusters; VOCs) to retrieve detailed annotation information about the aligned genomes or use information from text files. Base-By-Base enables users to quickly and easily compare large viral genomes; it highlights small differences that may be responsible for important phenotypic differences such as virulence. It is available via the Internet using Java Web Start and runs on Macintosh, PC and Linux operating systems with the Java 1.4 virtual machine.
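    The core operation described above, reporting nucleotide-level differences between aligned genomes, can be sketched in a few lines. This is a minimal illustration with invented sequences, not the Base-By-Base implementation itself (which is a Java application, as the abstract notes).

```python
# Minimal sketch of nucleotide-level comparison of two gapped, aligned
# sequences, in the spirit of Base-By-Base's tabular difference output.
# Sequences here are hypothetical; real inputs would be whole viral genomes.

def diff_alignment(ref, qry):
    """List 1-based alignment columns where two aligned sequences differ."""
    assert len(ref) == len(qry), "aligned sequences must be equal length"
    diffs = []
    for col, (r, q) in enumerate(zip(ref, qry), start=1):
        if r == q:
            continue
        if r == "-":
            kind = "insertion"     # base present only in the query
        elif q == "-":
            kind = "deletion"      # base present only in the reference
        else:
            kind = "substitution"
        diffs.append((col, r, q, kind))
    return diffs

ref = "ATGC-CGTA"
qry = "ATGAACGTT"
for col, r, q, kind in diff_alignment(ref, qry):
    print(col, r, q, kind)
# -> 4 C A substitution / 5 - A insertion / 9 A T substitution
```

A tool like Base-By-Base additionally cross-references each difference against gene and promoter annotations to flag amino acid substitutions; that annotation layer is omitted here.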

  4. Quasistatic Cavity Resonance for Ubiquitous Wireless Power Transfer.

    PubMed

    Chabalko, Matthew J; Shahmohammadi, Mohsen; Sample, Alanson P

    2017-01-01

    Wireless power delivery has the potential to seamlessly power our electrical devices as easily as data is transmitted through the air. However, existing solutions are limited to near contact distances and do not provide the geometric freedom to enable automatic and un-aided charging. We introduce quasistatic cavity resonance (QSCR), which can enable purpose-built structures, such as cabinets, rooms, and warehouses, to generate quasistatic magnetic fields that safely deliver kilowatts of power to mobile receivers contained nearly anywhere within. A theoretical model of a quasistatic cavity resonator is derived, and field distributions along with power transfer efficiency are validated against measured results. An experimental demonstration shows that a 54 m3 QSCR room can deliver power to small coil receivers in nearly any position with 40% to 95% efficiency. Finally, a detailed safety analysis shows that up to 1900 watts can be transmitted to a coil receiver enabling safe and ubiquitous wireless power.

  5. Quasistatic Cavity Resonance for Ubiquitous Wireless Power Transfer

    PubMed Central

    Shahmohammadi, Mohsen; Sample, Alanson P.

    2017-01-01

    Wireless power delivery has the potential to seamlessly power our electrical devices as easily as data is transmitted through the air. However, existing solutions are limited to near contact distances and do not provide the geometric freedom to enable automatic and un-aided charging. We introduce quasistatic cavity resonance (QSCR), which can enable purpose-built structures, such as cabinets, rooms, and warehouses, to generate quasistatic magnetic fields that safely deliver kilowatts of power to mobile receivers contained nearly anywhere within. A theoretical model of a quasistatic cavity resonator is derived, and field distributions along with power transfer efficiency are validated against measured results. An experimental demonstration shows that a 54 m3 QSCR room can deliver power to small coil receivers in nearly any position with 40% to 95% efficiency. Finally, a detailed safety analysis shows that up to 1900 watts can be transmitted to a coil receiver enabling safe and ubiquitous wireless power. PMID:28199321

  6. Enabling university teaching for Canadian academics with multiple sclerosis through problem-focused coping.

    PubMed

    Crooks, Valorie A; Stone, Sharon Dale; Owen, Michelle

    2011-02-01

    Research shows that sustained employment contributes to a higher quality of life for those with multiple sclerosis (MS). Occupational therapists can work to create therapeutic interventions that assist people with MS in maintaining employment. To detail the problem-focused coping strategies that academics with MS employ to enable them to teach in universities. Semi-structured interviews were conducted with 45 Canadian academics with MS. Thematic analysis was used to generate findings. While there is flexibility in research and service work tasks, teaching tasks are seemingly the most inflexible. This necessitated the development of problem-focused coping strategies to enable teaching. Three categories of strategies were employed: (1) organizational; (2) before/after teaching; and (3) during teaching. This brief report is intended to serve as a resource for occupational therapists and others wanting to gain a better understanding of the types of therapeutic interventions useful to those teaching in universities.

  7. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.

  8. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.

  9. Miniature near-infrared spectrometer for point-of-use chemical analysis

    NASA Astrophysics Data System (ADS)

    Friedrich, Donald M.; Hulse, Charles A.; von Gunten, Marc; Williamson, Eric P.; Pederson, Christopher G.; O'Brien, Nada A.

    2014-03-01

    Point-of-use chemical analysis holds tremendous promise for a number of industries, including agriculture, recycling, pharmaceuticals and homeland security. Near infrared (NIR) spectroscopy is an excellent candidate for these applications, with minimal sample preparation for real-time decision-making. We will detail the development of a golf-ball-sized NIR spectrometer developed specifically for this purpose. The instrument is based upon a thin-film dispersive element that is very stable over time and temperature, with less than 2 nm change expected over the operating temperature range and lifetime of the instrument. This filter is coupled with an uncooled InGaAs detector array in a small, rugged, environmentally stable optical bench ideally suited to unpredictable environments. The resulting instrument weighs less than 60 grams, includes onboard illumination and collection optics for diffuse reflectance applications in the 900-1700 nm wavelength range, and is USB-powered. It can be driven in the field by a laptop, tablet or even a smartphone. The software design includes the potential for both on-board and cloud-based storage, analysis and decision-making. The key attributes of the instrument and the underlying design tradeoffs will be discussed, focusing on miniaturization, ruggedization, power consumption and cost. The optical performance of the instrument, as well as its fitness for purpose, will be detailed. Finally, we will show that our manufacturing process has enabled us to build instruments with excellent unit-to-unit reproducibility, a key enabler for instrument-independent chemical analysis models and a requirement for mass point-of-use deployment.

  10. The analysis of a complex fire event using multispaceborne observations

    NASA Astrophysics Data System (ADS)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict zone of the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA-Interim Reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison between different observational datasets.

  11. Reconfigurable wavefront sensor for ultrashort pulses.

    PubMed

    Bock, Martin; Das, Susanta Kumar; Fischer, Carsten; Diehl, Michael; Börner, Peter; Grunwald, Ruediger

    2012-04-01

    A highly flexible Shack-Hartmann wavefront sensor for ultrashort pulse diagnostics is presented. The temporal system performance is studied in detail. Reflective operation is enabled by programming tilt-tolerant microaxicons into a liquid-crystal-on-silicon spatial light modulator. Nearly undistorted pulse transfer is obtained by generating nondiffracting needle beams as subbeams. Reproducible wavefront analysis and spatially resolved second-order autocorrelation are demonstrated at incident angles up to 50° and pulse durations down to 6 fs.

  12. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.

  13. Discriminating semiarid vegetation using airborne imaging spectrometer data - A preliminary assessment

    NASA Technical Reports Server (NTRS)

    Thomas, Randall W.; Ustin, Susan L.

    1987-01-01

    A preliminary assessment was made of Airborne Imaging Spectrometer (AIS) data for discriminating and characterizing vegetation in a semiarid environment. May and October AIS data sets were acquired over a large alluvial fan in eastern California, on which were found Great Basin desert shrub communities. Maximum likelihood classification of a principal components representation of the May AIS data enabled discrimination of subtle spatial detail in images relating to vegetation and soil characteristics. The spatial patterns in the May AIS classification were, however, too detailed for complete interpretation with existing ground data. A similar analysis of the October AIS data yielded poor results. Comparison of AIS results with a similar analysis of May Landsat Thematic Mapper data showed that the May AIS data contained approximately three to four times as much spectrally coherent information. When only two shortwave infrared TM bands were used, results were similar to those from AIS data acquired in October.

  14. Analyzing ion distributions around DNA: sequence-dependence of potassium ion distributions from microsecond molecular dynamics

    PubMed Central

    Pasi, Marco; Maddocks, John H.; Lavery, Richard

    2015-01-01

    Microsecond molecular dynamics simulations of B-DNA oligomers carried out in an aqueous environment with a physiological salt concentration enable us to perform a detailed analysis of how potassium ions interact with the double helix. The oligomers studied contain all 136 distinct tetranucleotides and we are thus able to make a comprehensive analysis of base sequence effects. Using a recently developed curvilinear helicoidal coordinate method we are able to analyze the details of ion populations and densities within the major and minor grooves and in the space surrounding DNA. The results show higher ion populations than have typically been observed in earlier studies and sequence effects that go beyond the nature of individual base pairs or base pair steps. We also show that, in some special cases, ion distributions converge very slowly and, on a microsecond timescale, do not reflect the symmetry of the corresponding base sequence. PMID:25662221

  15. Using Social Network Measures in Wildlife Disease Ecology, Epidemiology, and Management

    PubMed Central

    Silk, Matthew J.; Croft, Darren P.; Delahay, Richard J.; Hodgson, David J.; Boots, Mike; Weber, Nicola; McDonald, Robbie A.

    2017-01-01

    Contact networks, behavioral interactions, and shared use of space can all have important implications for the spread of disease in animals. Social networks enable the quantification of complex patterns of interactions; therefore, network analysis is becoming increasingly widespread in the study of infectious disease in animals, including wildlife. We present an introductory guide to using social-network-analytical approaches in wildlife disease ecology, epidemiology, and management. We focus on providing detailed practical guidance for the use of basic descriptive network measures by suggesting the research questions to which each technique is best suited and detailing the software available for each. We also discuss how network approaches can be applied beyond the study of social contacts and across a range of spatial and temporal scales. Finally, we integrate these approaches to examine how network analysis can be used to inform the implementation and monitoring of effective disease management strategies. PMID:28596616
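    Two of the basic descriptive measures the guide refers to, node degree and network density, can be computed directly from a contact edge list. The sketch below uses a small invented contact network; field studies would typically use a dedicated library such as networkx or igraph rather than hand-rolled code.

```python
# Dependency-free sketch of two descriptive network measures (degree and
# density) on a hypothetical wildlife contact network. Edge list is invented.
from itertools import combinations

contacts = [  # undirected contact edges between tagged individuals
    ("b1", "b2"), ("b1", "b3"), ("b2", "b3"), ("b3", "b4"), ("b4", "b5"),
]
nodes = sorted({n for edge in contacts for n in edge})

def degree(node):
    """Number of distinct contacts involving `node`."""
    return sum(node in edge for edge in contacts)

def density():
    """Observed edges as a fraction of all possible undirected edges."""
    possible = len(list(combinations(nodes, 2)))
    return len(contacts) / possible

degrees = {n: degree(n) for n in nodes}
print(degrees)               # b3 has the most contacts: a candidate superspreader
print(round(density(), 2))   # how saturated the network is with contacts
```

In an epidemiological setting, high-degree individuals are natural targets for surveillance or vaccination, which is one way such measures feed into the management strategies the review discusses.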

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Partridge Jr, William P.; Choi, Jae-Soon

    By directly resolving spatial and temporal species distributions within operating honeycomb monolith catalysts, spatially resolved capillary inlet mass spectrometry (SpaciMS) provides a uniquely enabling perspective for advancing automotive catalysis. Specifically, the ability to follow the spatiotemporal evolution of reactions throughout the catalyst is a significant advantage over inlet-and-effluent-limited analysis. Intracatalyst resolution elucidates numerous catalyst details including the network and sequence of reactions, clarifying reaction pathways; the relative rates of different reactions and impacts of operating conditions and catalyst state; and reaction dynamics and intermediate species that exist only within the catalyst. These details provide a better understanding of how the catalyst functions and have basic and practical benefits; e.g., catalyst system design; strategies for on-road catalyst state assessment, control, and on-board diagnostics; and creating robust and accurate predictive catalyst models. Moreover, such spatiotemporally distributed data provide for critical model assessment, and identification of improvement opportunities that might not be apparent from effluent assessment; i.e., while an incorrectly formulated model may provide correct effluent predictions, one that can accurately predict the spatiotemporal evolution of reactions along the catalyst channels will be more robust, accurate, and reliable. In such ways, intracatalyst diagnostics comprehensively enable improved design and development tools, and faster and lower-cost development of more efficient and durable automotive catalyst systems. Beyond these direct contributions, SpaciMS has spawned and been applied to enable other analytical techniques for resolving transient distributed intracatalyst performance. This chapter focuses on SpaciMS applications and associated catalyst insights and improvements, with specific sections related to lean NOx traps, selective catalytic reduction catalysts, oxidation catalysts, and particulate filters. The objective is to promote broader use and development of intracatalyst analytical methods, and thereby expand the insights resulting from this detailed perspective for advancing automotive catalyst technologies.

  17. Omics Profiling in Precision Oncology*

    PubMed Central

    Yu, Kun-Hsing; Snyder, Michael

    2016-01-01

    Cancer causes significant morbidity and mortality worldwide, and is the area most targeted in precision medicine. Recent development of high-throughput methods enables detailed omics analysis of the molecular mechanisms underpinning tumor biology. These studies have identified clinically actionable mutations, gene and protein expression patterns associated with prognosis, and provided further insights into the molecular mechanisms indicative of cancer biology and new therapeutic strategies such as immunotherapy. In this review, we summarize the techniques used for tumor omics analysis, recapitulate the key findings in cancer omics studies, and point to areas requiring further research on precision oncology. PMID:27099341

  18. High frequency ultrasound with color Doppler in dermatology*

    PubMed Central

    Barcaui, Elisa de Oliveira; Carvalho, Antonio Carlos Pires; Lopes, Flavia Paiva Proença Lobo; Piñeiro-Maceira, Juan; Barcaui, Carlos Baptista

    2016-01-01

    Ultrasonography is an imaging method that is classically used in dermatology to study changes in the hypodermis, such as nodules and infectious and inflammatory processes. The introduction of high-frequency, high-resolution equipment enabled the observation of superficial structures, allowing differentiation between skin layers and providing details for the analysis of the skin and its appendages. This paper aims to review the basic principles of high-frequency ultrasound and its applications in different areas of dermatology. PMID:27438191

  19. Smart energy management system

    NASA Astrophysics Data System (ADS)

    Desai, Aniruddha; Singh, Jugdutt

    2010-04-01

    Peak and average energy usage in domestic and industrial environments is growing rapidly, and the absence of detailed energy consumption metrics makes systematic reduction of energy usage very difficult. The smart energy management system aims to provide a cost-effective solution for managing soaring energy consumption and its impact on greenhouse gas emissions and climate change. The solution is based on seamless integration of existing wired and wireless communication technologies combined with smart context-aware software, which offers a complete solution for automation of energy measurement and device control. The persuasive software presents users with easy-to-assimilate visual cues identifying problem areas and time periods, and encourages a behavioural change to conserve energy. The system allows analysis of real-time and statistical consumption data with the ability to drill down into detailed analysis of power consumption, CO2 emissions and cost. The system generates intelligent projections and suggests potential methods (e.g. reducing standby, tuning heating/cooling temperature, etc.) of reducing energy consumption. The user interface is accessible using web-enabled devices such as PDAs, PCs, etc. or using SMS, email, and instant messaging. A successful real-world trial of the system has demonstrated the potential to save 20% to 30% of energy consumption on average. The low cost of deployment and the ability to easily manage consumption from various web-enabled devices give this system high penetration and impact capability, offering a sustainable solution to act on climate change today.
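    The drill-down from raw consumption to cost and CO2 described above is, at its core, a simple conversion over metered readings. The sketch below uses an assumed flat tariff and an assumed grid emission factor; both values and the readings are illustrative, not figures from the paper.

```python
# Sketch of the consumption drill-down: converting hourly kWh readings into
# cost and CO2 estimates and flagging the peak hour. Tariff and emission
# factor are assumptions for illustration only.

TARIFF = 0.20       # currency units per kWh (assumed flat tariff)
CO2_FACTOR = 0.4    # kg CO2 per kWh (assumed grid emission factor)

hourly_kwh = [0.8, 0.6, 0.5, 1.9, 2.4, 1.1]   # hypothetical meter readings

total = sum(hourly_kwh)
cost = total * TARIFF
co2 = total * CO2_FACTOR
# Index of the hour with the highest usage, the natural target for savings.
peak_hour = max(range(len(hourly_kwh)), key=hourly_kwh.__getitem__)

print(f"total: {total:.1f} kWh, cost: {cost:.2f}, CO2: {co2:.2f} kg")
print(f"highest usage in hour {peak_hour}")
```

A deployed system would replace the static list with live meter feeds and a time-of-use tariff, but the projection logic ("what do we save if the peak hour drops by X?") builds directly on arithmetic like this.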

  20. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  1. Final (Tier 1) environmental impact statement for the Galileo and Ulysses Missions

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Presented here is a Final (Tier 1) Environmental Impact Statement (EIS) addressing the potential environmental consequences associated with continuing the modifications of the Galileo and Ulysses spacecraft for launch using a booster/upper stage combination that is different from the one planned for use prior to the Challenger accident, while conducting the detailed safety and environmental analysis in order to preserve the October 1989 launch opportunity for Galileo and an October 1990 launch opportunity for Ulysses. While detailed safety and environmental analyses associated with the missions are underway, they currently are not complete. Nevertheless, sufficient information is available to enable a choice among the reconfiguration alternatives presented. Relevant assessments of the potential for environmental impacts are presented.

  2. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design stage of the product development process.

  3. Software For Drawing Design Details Concurrently

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.

  4. Time-course human urine proteomics in space-flight simulation experiments.

    PubMed

    Binder, Hans; Wirth, Henry; Arakelyan, Arsen; Lembcke, Kathrin; Tiys, Evgeny S; Ivanisenko, Vladimir A; Kolchanov, Nikolay A; Kononikhin, Alexey; Popov, Igor; Nikolaev, Evgeny N; Pastushkova, Lyudmila; Larina, Irina M

    2014-01-01

    Long-term space-travel simulation experiments have enabled the discovery of different aspects of human metabolism, such as the complexity of NaCl salt balance. Detailed proteomics data were collected during the Mars105 isolation experiment, enabling a deeper insight into the molecular processes involved. We studied the abundance of about two thousand proteins extracted from urine samples of six volunteers, collected weekly during a 105-day isolation experiment under controlled dietary conditions that included a progressive reduction of salt consumption. Machine learning using self-organizing maps (SOMs) in combination with different analysis tools was applied to describe the time trajectories of protein abundance in urine. The method enables a personalized and intuitive view of the physiological state of the volunteers. The abundance of more than half of the proteins measured clearly changes in the course of the experiment. The trajectory splits roughly into three time ranges: an early one (weeks 1-6), an intermediate one (weeks 7-11) and a late one (weeks 12-15). Regulatory modes associated with distinct biological processes were identified from previous knowledge by applying enrichment and pathway flow analysis. Early protein activation modes can be related to immune response and inflammatory processes, activation at intermediate times to developmental and proliferative processes, and late activations to stress and responses to chemicals. The protein abundance profiles support previous results about alternative mechanisms of salt storage in an osmotically inactive form. We hypothesize that reduced NaCl consumption of about 6 g/day will reduce or even prevent the activation of the inflammatory processes observed in the early time range of isolation. SOM machine learning, in combination with class-discovery and functional-annotation analysis methods, enables the straightforward analysis of complex proteomics data sets generated by mass spectrometry.
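    As a rough illustration of the SOM approach named above, here is a minimal one-dimensional self-organizing map in plain Python; the toy "abundance profiles", map size and training schedule are all made-up assumptions, not the study's configuration.

```python
import math, random

def train_som(data, n_nodes=4, epochs=200, seed=0):
    """Train a 1-D self-organizing map; each node holds a prototype profile."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)                     # decaying learning rate
        radius = max(1.0, (n_nodes / 2) * (1 - t / epochs))
        x = rng.choice(data)
        # Best-matching unit: the node closest to the sample.
        bmu = min(range(n_nodes),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
        # Pull the BMU and its neighbours towards the sample.
        for i, node in enumerate(nodes):
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            for j in range(dim):
                node[j] += lr * h * (x[j] - node[j])
    return nodes

def assign(nodes, x):
    """Index of the prototype closest to profile x."""
    return min(range(len(nodes)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))

# Toy weekly "abundance profiles": rising versus falling across the ranges.
profiles = [[0.1, 0.5, 0.9], [0.0, 0.4, 1.0],
            [0.9, 0.5, 0.1], [1.0, 0.6, 0.0]]
som = train_som(profiles)
print(assign(som, [0.0, 0.5, 1.0]))
```

    In the actual study the input profiles are far longer (15 weekly points per protein) and the map is two-dimensional, but the mechanics of prototype training and assignment are the same.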

  5. ARM Data File Standards Version: 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kehoe, Kenneth; Beus, Sherman; Cialella, Alice

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable the development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for the development of automated monitoring and data-health-status tools, and facilitate the development of future capabilities for delivering data on demand, tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.

  6. Mindtagger: A Demonstration of Data Labeling in Knowledge Base Construction.

    PubMed

    Shin, Jaeho; Ré, Christopher; Cafarella, Michael

    2015-08-01

    End-to-end knowledge base construction systems using statistical inference are enabling more people to automatically extract high-quality domain-specific information from unstructured data. As a result of deploying the DeepDive framework across several domains, we found new challenges in debugging and improving such end-to-end systems to construct high-quality knowledge bases. DeepDive has an iterative development cycle in which users improve the data. To help our users, we needed to develop principles for analyzing the system's errors as well as to provide tooling for inspecting and labeling the various data products of the system. We created guidelines for error analysis modeled after our colleagues' best practices, in which data labeling plays a critical role in every step of the analysis. To enable more productive and systematic data labeling, we created Mindtagger, a versatile tool that can be configured to support a wide range of tasks. In this demonstration, we show in detail which data labeling tasks are modeled in our error analysis guidelines and how each of them is performed using Mindtagger.

  7. Evaluation of variability in high-resolution protein structures by global distance scoring.

    PubMed

    Anzai, Risa; Asami, Yoshiki; Inoue, Waka; Ueno, Hina; Yamada, Koya; Okada, Tetsuji

    2018-01-01

    Systematic analysis of the statistical and dynamical properties of proteins is critical to understanding cellular events. Extraction of biologically relevant information from a set of high-resolution structures is important because it can provide mechanistic details behind the functional properties of protein families, enabling rational comparison between families. Most current structural comparisons are pairwise-based, which hampers global analysis of the growing contents of the Protein Data Bank. Additionally, pairing of protein structures introduces uncertainty with respect to reproducibility, because it is frequently accompanied by additional settings for superimposition. This study introduces intramolecular distance scoring for the global analysis of proteins, for each of which at least several high-resolution structures are available. As a pilot study, we tested 300 human proteins and showed that the method comprehensively surveys advances in each protein and protein family at the atomic level. This method, together with the interpretation of the model calculations, provides new criteria for understanding specific structural variation in a protein, enabling global comparison of the variability in proteins from different species.
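    The core idea of superposition-free, intramolecular distance comparison can be sketched as below; this is an illustrative toy with hypothetical 3-atom "structures", and the published scoring formula may differ.

```python
# Compare all within-structure atom-atom distances across several structures
# of the same protein.  No superposition is needed: a rigid translation or
# rotation leaves every intramolecular distance unchanged.
import itertools, math

def distance_matrix(coords):
    """All pairwise intramolecular distances for one structure."""
    return [math.dist(a, b) for a, b in itertools.combinations(coords, 2)]

def variability_score(structures):
    """Root-mean spread of corresponding distances across structures."""
    mats = [distance_matrix(s) for s in structures]
    n = len(mats[0])
    total = 0.0
    for k in range(n):
        ds = [m[k] for m in mats]
        mean = sum(ds) / len(ds)
        total += sum((d - mean) ** 2 for d in ds) / len(ds)
    return math.sqrt(total / n)

# Two hypothetical 3-atom "structures": identical geometry, one rigidly shifted.
s1 = [(0, 0, 0), (1, 0, 0), (0, 2, 0)]
s2 = [(5, 5, 5), (6, 5, 5), (5, 7, 5)]   # same shape, translated
print(variability_score([s1, s2]))        # rigid motion -> score 0
```

    A genuinely deformed structure, by contrast, changes at least one internal distance and yields a positive score, which is what makes the measure usable for surveying conformational variability across deposited structures.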

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palanisamy, Giri

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable the development of automated analysis and discovery tools for the ever-growing data volumes. They will enable consistent analysis of the multiyear data, allow for the development of automated monitoring and data-health-status tools, and allow future capabilities for delivering data on demand, tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  9. Applications of mass spectrometry for quantitative protein analysis in formalin-fixed paraffin-embedded tissues

    PubMed Central

    Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul

    2014-01-01

    Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery. PMID:24339433

  10. Hybrid rendering of the chest and virtual bronchoscopy [corrected].

    PubMed

    Seemann, M D; Seemann, O; Luboldt, W; Gebicke, K; Prime, G; Claussen, C D

    2000-10-30

    Thin-section spiral computed tomography was used to acquire the volume data sets of the thorax. The tracheobronchial system and pathological changes of the chest were visualized using a color-coded surface-rendering method. The structures of interest were then superimposed on a volume rendering of the other thoracic structures, thus producing a hybrid rendering. The hybrid rendering technique exploits the advantages of both rendering methods and enables virtual bronchoscopic examinations using different representation models. Virtual bronchoscopic examination with a transparent color-coded shaded-surface model enables the simultaneous visualization of both the airways and the adjacent structures behind the tracheobronchial wall and therefore offers a practical alternative to fiberoptic bronchoscopy. Hybrid rendering and virtual endoscopy obviate the need for time-consuming detailed analysis and presentation of axial source images.

  11. Development of an improved MATLAB GUI for the prediction of coefficients of restitution, and integration into LMS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baca, Renee Nicole; Congdon, Michael L.; Brake, Matthew Robert

    In 2012, a MATLAB GUI for the prediction of the coefficient of restitution was developed in order to enable the formulation of more accurate Finite Element Analysis (FEA) models of components. This report details the development of a new Rebound Dynamics GUI and how it differs from the previously developed program. The new GUI includes several new features, such as source and citation documentation for the material database, as well as a multiple-materials impact modeler for use with LMS Virtual.Lab Motion (LMS VLM), a rigid-body dynamics modeling package. The Rebound Dynamics GUI has been designed to work with LMS VLM to enable straightforward incorporation of velocity-dependent coefficients of restitution in rigid-body dynamics simulations.

  12. Analysis of Processed Foods Containing Oils and Fats by Time of Flight Mass Spectrometry with an APCI Direct Probe.

    PubMed

    Ito, Shihomi; Chikasou, Masato; Inohana, Shuichi; Fujita, Kazuhiro

    2016-01-01

    Discriminating vegetable oils and animal and milk fats by infrared absorption spectroscopy is difficult due to similarities in their spectral patterns. Therefore, a rapid and simple method for analyzing vegetable oils, animal fats, and milk fats using TOF/MS with an APCI direct probe ion source was developed. This method enabled discrimination of these oils and fats based on mass spectra and detailed analyses of the ions derived from sterols, even in samples consisting of only a few milligrams. Analyses of the mass spectra of processed foods containing oils and milk fats, such as butter, cheese, and chocolate, enabled confirmation of the raw material origin based on specific ions derived from the oils and fats used to produce the final product.

  13. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground-based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates the addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  14. Thermomagnetic instabilities in a vertical layer of ferrofluid: nonlinear analysis away from a critical point

    NASA Astrophysics Data System (ADS)

    Dey, Pinkee; Suslov, Sergey A.

    2016-12-01

    A finite-amplitude instability has been analysed to discover the exact mechanism leading to the appearance of stationary magnetoconvection patterns in a vertical layer of a non-conducting ferrofluid heated from the side and placed in an external magnetic field perpendicular to the walls. The physical results have been obtained using a version of weakly nonlinear analysis that is based on the disturbance amplitude expansion. It enables a low-dimensional reduction of the full nonlinear problem in supercritical regimes away from a bifurcation point. The details of the reduction are given in comparison with traditional small-parameter expansions. It is also demonstrated that Squire’s transformation can be introduced for higher-order nonlinear terms, thus reducing the full three-dimensional problem to its equivalent two-dimensional counterpart and enabling significant computational savings. The full three-dimensional instability patterns are subsequently recovered using the inverse transforms. The analysed stationary thermomagnetic instability is shown to occur as a result of a supercritical pitchfork bifurcation.

  15. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.

  16. Enabling Long-Duration Lunar Equatorial Operations With Thermal Wadi Infrastructure

    NASA Technical Reports Server (NTRS)

    Jones, Heather L.; Thornton, John P.; Balasubramaniam, Ramaswamy; Gokoglu, Suleyman, A.; Sacksteder, Kurt R.; Whittaker, William L.

    2011-01-01

    Long-duration missions on the Moon's equator must survive lunar nights. With 350 hr of cryogenic temperatures, lunar nights present a challenge to robotic survival. Insulation is imperfect, so it is not possible to passively contain enough heat to stay warm through the night. Components that enable mobility, environmental sensing and solar power generation must be exposed, and they leak heat. Small, lightweight rovers cannot store enough energy to warm components throughout the night without some external source of heat or power. Thermal wadis, however, can act as external heat sources to keep robots warm through the lunar night. Electrical power can also be provided to rovers during the night from batteries stored in the ground beside the wadis, where the buried batteries can be warmed by the wadi's heat. Results from an analysis of the interaction between a rover and a wadi are presented. A detailed three-dimensional (3D) thermal model and an easily configurable two-dimensional (2D) thermal model are used for the analysis.

  17. The Biomolecular Crystallization Database Version 4: expanded content and new features.

    PubMed

    Tung, Michael; Gallagher, D Travis

    2009-01-01

    The Biological Macromolecular Crystallization Database (BMCD) has been a publicly available resource since 1988, providing a curated archive of information on crystal growth for proteins and other biological macromolecules. The BMCD content has recently been expanded to include 14 372 crystal entries. The resource continues to be freely available at http://xpdb.nist.gov:8060/BMCD4. In addition, the software has been adapted to support the Java-based Lucene query language, enabling detailed searching over specific parameters, and explicit search of parameter ranges is offered for five numeric variables. Extensive tools have been developed for import and handling of data from the RCSB Protein Data Bank. The updated BMCD is called version 4.02 or BMCD4. BMCD4 entries have been expanded to include macromolecule sequence, enabling more elaborate analysis of relations among protein properties, crystal-growth conditions and the geometric and diffraction properties of the crystals. The BMCD version 4.02 contains greatly expanded content and enhanced search capabilities to facilitate scientific analysis and design of crystal-growth strategies.

  18. A new light on caloric test--what was disclosed by three dimensional analysis of caloric nystagmus?

    NASA Technical Reports Server (NTRS)

    Arai, Y.

    2001-01-01

    For a better understanding of caloric nystagmus, this phenomenon will be reviewed historically in three stages. 1) The first light on caloric nystagmus was thrown by Barany in 1906. Through direct observation of eye movements, Barany established the caloric test as an important tool for determining the side of lesion in vertigo. 2) The second light was shed by electrooculography (EOG) from the late 1950s. EOG enabled qualitative analysis of caloric nystagmus and proved Barany's convection theory, but resulted in neglect of vertical and roll eye movements. 3) The third light was gained by 3D recording of eye movements, which started in the late 1980s. 3D recordings of eye movements enabled us to analyze the spatial orientation of caloric nystagmus and to disclose the close correlation of the nystagmus components in the head-vertical and the space-vertical planes, suggesting a contribution of the velocity storage integrator. The 3D properties of caloric nystagmus will be explained in detail.

  19. Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study

    NASA Astrophysics Data System (ADS)

    Rehany, N.; Barsi, A.; Lovas, T.

    2017-11-01

    Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies, such as an arm scanner, a structured-light scanner, a terrestrial laser scanner, an object line-scanner, a DSLR camera, and a mobile phone camera. A palm-sized embossed sculpture reproduction was used as the test object; it was surveyed with all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as the reference. In addition to general statistics, the results were evaluated based both on 3D deviation maps and on 2D deviation graphs; the latter allow even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, in-house local-minimum maps were created that clearly visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the discussed techniques. Our results proved that even amateur sensors operated by amateur users can provide high-quality datasets that enable engineering analysis. Based on the results, the paper contains an outlook on potential future investigations in this field.
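    A deviation analysis of the kind described (test cloud versus reference cloud) can be sketched as follows; the brute-force nearest-neighbour search and the toy coordinates are illustrative assumptions, not the authors' pipeline.

```python
# For each point of a test cloud, find the distance to the nearest reference
# point, then summarize the deviations.  Real pipelines use spatial indexes
# (k-d trees) instead of this O(n*m) scan, but the idea is the same.
import math

def nearest_distance(p, reference):
    return min(math.dist(p, q) for q in reference)

def deviation_stats(test, reference):
    devs = sorted(nearest_distance(p, reference) for p in test)
    n = len(devs)
    return {"mean": sum(devs) / n,
            "median": devs[n // 2],
            "max": devs[-1]}

# Hypothetical reference (arm-scanner) and test (phone-camera) clouds.
reference = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
test      = [(0, 0, 0.1), (1, 0, 0), (0.5, 0.5, 0)]
stats = deviation_stats(test, reference)
print(stats["max"])
```

    Rendering the per-point deviations as a colour map over the mesh gives the 3D deviation maps mentioned in the abstract; sorting them along a profile gives the 2D deviation graphs.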

  20. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) hyperspectral data. It is part of the EarthServer project, which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, built mainly on the OpenLayers JavaScript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step and is tested by focusing on detecting the mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies are discussed further. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
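    A WCPS request of the kind PlanetServer issues can be sketched as a query string. The coverage name, axis labels, endpoint URL and output format below are all hypothetical, so the real queries will differ in detail.

```python
# Build an illustrative WCPS-style query selecting a spatial and spectral
# subset of a hyperspectral coverage, then wrap it in a KVP request URL.
from urllib.parse import urlencode

def spectral_subset_query(coverage, x0, x1, y0, y1, b0, b1):
    """Return a WCPS-style query string (axis names are assumptions)."""
    return (f"for c in ({coverage}) "
            f'return encode(c[x({x0}:{x1}), y({y0}:{y1}), band({b0}:{b1})], "csv")')

query = spectral_subset_query("FRT00003E12", 100, 200, 100, 200, 10, 20)
# A hypothetical endpoint; real deployments expose their own service URL.
url = "http://planetserver.example.org/wcps?" + urlencode({"query": query})
print(query)
```

    Queries like this are what let the client fetch a single pixel's spectrum (a band-range subset at one x, y) or a map of a spectral index without downloading the full observation.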

  1. Structurally detailed coarse-grained model for Sec-facilitated co-translational protein translocation and membrane integration

    PubMed Central

    Miller, Thomas F.

    2017-01-01

    We present a coarse-grained simulation model that is capable of simulating the minute-timescale dynamics of protein translocation and membrane integration via the Sec translocon, while retaining sufficient chemical and structural detail to capture many of the sequence-specific interactions that drive these processes. The model includes accurate geometric representations of the ribosome and Sec translocon, obtained directly from experimental structures, and interactions parameterized from nearly 200 μs of residue-based coarse-grained molecular dynamics simulations. A protocol for mapping amino-acid sequences to coarse-grained beads enables the direct simulation of trajectories for the co-translational insertion of arbitrary polypeptide sequences into the Sec translocon. The model reproduces experimentally observed features of membrane protein integration, including the efficiency with which polypeptide domains integrate into the membrane, the variation in integration efficiency upon single amino-acid mutations, and the orientation of transmembrane domains. The central advantage of the model is that it connects sequence-level protein features to biological observables and timescales, enabling direct simulation for the mechanistic analysis of co-translational integration and for the engineering of membrane proteins with enhanced membrane integration efficiency. PMID:28328943
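    The sequence-to-bead mapping protocol can be illustrated with a toy scheme; the three-residues-per-bead grouping and the Kyte-Doolittle-style hydrophobicity values below are assumptions for illustration, not the model's published parameterization.

```python
# Map an amino-acid sequence to coarse-grained beads, one bead per group of
# residues, with the bead property taken as the group's mean hydrophobicity.

# Hypothetical per-residue hydrophobicity scale (subset shown).
HYDROPHOBICITY = {"A": 1.8, "L": 3.8, "K": -3.9, "S": -0.8,
                  "G": -0.4, "V": 4.2, "D": -3.5}

def to_beads(sequence, residues_per_bead=3):
    """One bead per group of residues; bead value = mean hydrophobicity."""
    beads = []
    for i in range(0, len(sequence), residues_per_bead):
        group = sequence[i:i + residues_per_bead]
        beads.append(sum(HYDROPHOBICITY[r] for r in group) / len(group))
    return beads

beads = to_beads("ALKSGV")
print(len(beads))   # 6 residues grouped in threes -> 2 beads
```

    In the actual model each bead of course carries more than a single scalar (position, interaction parameters with the translocon and membrane), but the mapping step itself is this kind of deterministic grouping over the sequence.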

  2. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale, complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize the potential waste of time and resources. To study the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit-knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data-source consistency error validation and parameter-matching error validation.

  3. Students' meaning making in classroom discussions: the importance of peer interaction

    NASA Astrophysics Data System (ADS)

    Rudsberg, Karin; Östman, Leif; Aaro Östman, Elisabeth

    2017-09-01

    The aim is to investigate how encounters with peers affect an individual's meaning making in argumentation about socio-scientific issues, and how the individual's meaning making influences the argumentation at the collective level. The analysis is conducted using the analytical method "transactional argumentation analysis" (TAA) which enables in situ studies. TAA combines a transactional perspective on meaning making based on John Dewey's pragmatic philosophy with an argument analysis based on Toulmin's argument pattern. Here TAA is developed further to enable analysis that in detail clarifies the dynamic interplay between the individual and the collective—the intra- and the inter-personal dimensions—and the result of this interplay in terms of meaning making and learning. The empirical material in this study consists of a video-recorded lesson in a Swedish upper secondary school. The results show that the analysed student is influenced by peers when construing arguments, and thereby acts on others' reasoning when making meaning. Further, the results show that most of the additions made by the analysed student are taken further by peers in the subsequent discussion. This study shows how an individual's earlier experiences, knowledge and thinking contribute to the collective meaning making in the classroom.

  4. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    NASA Astrophysics Data System (ADS)

    Chakravarty, T.; Chowdhury, A.; Ghose, A.; Bhaumik, C.; Balamuralidhar, P.

    2014-03-01

    Telematics form an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, the signatures of vehicle vibration, along with location and time, are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in one of the office buses that ferry employees to the office and back. Data were collected from a 3-axis accelerometer and GPS (position, speed and time) for all journeys. In this paper, we present initial results of the above exercise by applying statistical methods to derive information through systematic analysis of the data collected over four months. It is demonstrated that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. This observation offers a method to predict future behaviour, where deviations from the prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between the speed of the vehicle and the median of the jerk-energy samples using regression analysis. These results offer an opportunity to develop a robust method for modelling road-vehicle interaction, thereby enabling the prediction of driving behaviour and supporting condition-based maintenance.
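    The regression step described above can be sketched with ordinary least squares in plain Python; the speed and jerk-energy numbers are fabricated for illustration only.

```python
# Summarize per-speed "jerk energy" samples by their median, then fit the
# medians against speed with ordinary least squares (y ~ a*x + b).

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def linear_fit(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

speeds = [10, 20, 30, 40]                     # km/h (hypothetical)
jerk_energy = {10: [1.9, 2.0, 2.1], 20: [3.9, 4.0, 4.1],
               30: [5.8, 6.0, 6.2], 40: [7.9, 8.0, 8.1]}
medians = [median(jerk_energy[s]) for s in speeds]
slope, intercept = linear_fit(speeds, medians)
print(round(slope, 3), round(intercept, 3))
```

    With such a fit in hand, a journey whose observed jerk-energy median falls far from the regression line can be flagged either as a context-dependent aberration or as progressive degradation, as the abstract suggests.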

  5. RAPID COMMUNICATION: Diffusion thermopower in graphene

    NASA Astrophysics Data System (ADS)

    Vaidya, R. G.; Kamatagi, M. D.; Sankeshwar, N. S.; Mulimani, B. G.

    2010-09-01

    The diffusion thermopower of graphene, Sd, is studied for 30 < T < 300 K, considering the electrons to be scattered by impurities, vacancies, surface roughness, and acoustic and optical phonons via deformation-potential couplings. Sd is found to increase almost linearly with temperature, determined mainly by vacancy and impurity scattering. A departure from linear behaviour due to optical phonons is noticed. As a function of carrier concentration, a change in the sign of Sd is observed. Our analysis of recent thermopower data yields a good fit. The limitations of the Mott formula are discussed. Detailed analysis of the data will enable a better understanding of the scattering mechanisms operative in graphene.
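    For reference, the Mott formula mentioned above relates the diffusion thermopower to the energy derivative of the conductivity at the Fermi level, S_d = (pi^2 k_B^2 T / 3|e|) (d ln sigma / dE)|_{E_F}, which is linear in T for an energy-independent logarithmic derivative. The sketch below uses a toy quadratic conductivity and a 0.1 eV Fermi energy; both are illustrative assumptions, not the paper's fitted values.

```python
# Mott diffusion thermopower: S_d = (pi^2 kB^2 T)/(3|e|) * d(ln sigma)/dE.
# For a toy conductivity sigma ~ E^2, d(ln sigma)/dE = 2/E_F.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
E_CHARGE = 1.602176634e-19  # elementary charge, C

def mott_thermopower(T, dlnsigma_dE):
    return (math.pi ** 2 * K_B ** 2 * T) / (3 * E_CHARGE) * dlnsigma_dE

E_F = 0.1 * E_CHARGE        # 0.1 eV Fermi energy (hypothetical)
dln = 2 / E_F               # logarithmic derivative for sigma ~ E^2

s100 = mott_thermopower(100, dln)
s300 = mott_thermopower(300, dln)
print(s300 / s100)          # linear in T, so the ratio should be ~3
```

    Deviations of measured Sd from this linear-in-T prediction, as the abstract notes for optical-phonon scattering, are one way the limitations of the Mott formula show up.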

  6. Frontal affinity chromatography: A unique research tool for biospecific interaction that promotes glycobiology

    PubMed Central

    KASAI, Kenichi

    2014-01-01

    Combination of bioaffinity and chromatography gave birth to affinity chromatography. A further combination with frontal analysis resulted in creation of frontal affinity chromatography (FAC). This new versatile research tool enabled detailed analysis of weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC now becomes the best method for the investigation of saccharide-binding proteins (lectins) from viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology. It opened a door leading to deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described. PMID:25169774

  7. WarpIV: In situ visualization and analysis of ion accelerator simulations

    DOE PAGES

    Rubel, Oliver; Loring, Burlen; Vay, Jean -Luc; ...

    2016-05-09

    The generation of short pulses of ion beams through the interaction of an intense laser with a plasma sheath offers the possibility of compact and cheaper ion sources for many applications--from fast ignition and radiography of dense targets to hadron therapy and injection into conventional accelerators. To enable the efficient analysis of large-scale, high-fidelity particle accelerator simulations using the Warp simulation suite, the authors introduce the Warp In situ Visualization Toolkit (WarpIV). WarpIV integrates state-of-the-art in situ visualization and analysis using VisIt with Warp, supports management and control of complex in situ visualization and analysis workflows, and implements integrated analytics to facilitate query- and feature-based data analytics and efficient large-scale data analysis. WarpIV enables for the first time distributed parallel, in situ visualization of the full simulation data using high-performance compute resources as the data is being generated by Warp. The authors describe the application of WarpIV to study and compare large 2D and 3D ion accelerator simulations, demonstrating significant differences in the acceleration process in 2D and 3D simulations. WarpIV is available to the public via https://bitbucket.org/berkeleylab/warpiv. The Warp In situ Visualization Toolkit (WarpIV) supports large-scale, parallel, in situ visualization and analysis and facilitates query- and feature-based analytics, enabling for the first time high-performance analysis of large-scale, high-fidelity particle accelerator simulations while the data is being generated by the Warp simulation suite. Furthermore, this supplemental material https://extras.computer.org/extra/mcg2016030022s1.pdf provides more details regarding the memory profiling and optimization and the Yee grid recentering optimization results discussed in the main article.

  8. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  9. Leading team learning: what makes interprofessional teams learn to work well?

    PubMed

    Chatalalsingh, Carole; Reeves, Scott

    2014-11-01

This article describes an ethnographic study focused on exploring leaders of team learning in well-established nephrology teams in an academic healthcare organization in Canada. Employing situational theory of leadership, the article provides details on how well-established team members advance as "learning leaders". Data were gathered by ethnographic methods over a 9-month period with the members of two nephrology teams. These teams, which learn while caring for the sick, involved over 30 regulated health professionals, such as physicians, nurses, social workers, pharmacists, dietitians and other healthcare practitioners, staff, students and trainees, all of whom were collectively managing obstacles and coordinating efforts. Analysis involved an inductive thematic analysis of observations, reflections, and interview transcripts. The study indicated how well-established members progress as team-learning leaders, and how they adapt to an interprofessional culture through the activities they employ to enable day-to-day learning. The article uses situational theory of leadership to generate a detailed illumination of the nature of leaders' interactions within an interprofessional context.

  10. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, particularly events that occur at the local scale, such as landslides. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained from a more common approach that uses basic census units as spatial units, which are the most disaggregated and detailed spatial data available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, which was used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology appears to be a reliable approach for obtaining a first, more detailed approximation of the number of exposed people. The approach based on dasymetric cartography allows the spatial resolution of population data over large areas to be increased and enables the use of detailed landslide susceptibility maps, which are valuable for improving the exposed population assessment.
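
    The core dasymetric step described above can be sketched in a few lines: each census unit's population is redistributed to its residential building footprints in proportion to footprint area, and the shares falling inside susceptible zones are summed. The data below are invented for illustration, not the study's Portuguese census figures.

```python
# Minimal sketch of dasymetric disaggregation: census-unit population is
# split across building footprints by area, then summed over footprints
# located in landslide-susceptible zones. All numbers are illustrative.
census_units = {
    "A": {"population": 300, "buildings": [("A1", 120.0, True), ("A2", 80.0, False)]},
    "B": {"population": 500, "buildings": [("B1", 200.0, False), ("B2", 300.0, True)]},
}  # building = (id, footprint_area_m2, inside_susceptible_zone)

def exposed_population(units):
    exposed = 0.0
    for unit in units.values():
        total_area = sum(area for _, area, _ in unit["buildings"])
        for _, area, in_zone in unit["buildings"]:
            share = unit["population"] * area / total_area
            if in_zone:
                exposed += share
    return exposed

print(exposed_population(census_units))  # 480.0
```

    A real implementation would derive the building/zone intersections from GIS layers; the proportional-allocation logic stays the same.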

  11. Microbial genomics, transcriptomics and proteomics: new discoveries in decomposition research using complementary methods.

    PubMed

    Baldrian, Petr; López-Mondéjar, Rubén

    2014-02-01

    Molecular methods for the analysis of biomolecules have undergone rapid technological development in the last decade. The advent of next-generation sequencing methods and improvements in instrumental resolution enabled the analysis of complex transcriptome, proteome and metabolome data, as well as a detailed annotation of microbial genomes. The mechanisms of decomposition by model fungi have been described in unprecedented detail by the combination of genome sequencing, transcriptomics and proteomics. The increasing number of available genomes for fungi and bacteria shows that the genetic potential for decomposition of organic matter is widespread among taxonomically diverse microbial taxa, while expression studies document the importance of the regulation of expression in decomposition efficiency. Importantly, high-throughput methods of nucleic acid analysis used for the analysis of metagenomes and metatranscriptomes indicate the high diversity of decomposer communities in natural habitats and their taxonomic composition. Today, the metaproteomics of natural habitats is of interest. In combination with advanced analytical techniques to explore the products of decomposition and the accumulation of information on the genomes of environmentally relevant microorganisms, advanced methods in microbial ecophysiology should increase our understanding of the complex processes of organic matter transformation.

  12. rCAD: A Novel Database Schema for the Comparative Analysis of RNA.

    PubMed

    Ozer, Stuart; Doshi, Kishore J; Xu, Weijia; Gutell, Robin R

    2011-12-31

Beyond its direct involvement in protein synthesis with mRNA, tRNA, and rRNA, RNA is now being appreciated for its significance in the overall metabolism and regulation of the cell. Comparative analysis has been very effective in the identification and characterization of RNA molecules, including the accurate prediction of their secondary structure. We are developing an integrative scalable data management and analysis system, the RNA Comparative Analysis Database (rCAD), implemented with SQL Server to support RNA comparative analysis. The platform-agnostic database schema of rCAD captures the essential relationships between the different dimensions of information for RNA comparative analysis datasets. The rCAD implementation enables a variety of comparative analysis manipulations with multiple integrated data dimensions for advanced RNA comparative analysis workflows. In this paper, we describe details of the rCAD schema design and illustrate its usefulness with two usage scenarios.
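
    To make the idea of a relational schema for comparative RNA analysis concrete, here is a hypothetical miniature built with Python's sqlite3. The table and column names are invented for illustration; the actual rCAD SQL Server schema is detailed in the paper.

```python
# Toy relational schema in the spirit of rCAD: sequences, alignments, and
# per-column alignment data as separate, joinable tables. Names are
# hypothetical, not the paper's schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Sequence (SeqID INTEGER PRIMARY KEY, Organism TEXT, Residues TEXT);
CREATE TABLE Alignment (AlnID INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE AlignmentColumn (
    AlnID INTEGER REFERENCES Alignment,
    SeqID INTEGER REFERENCES Sequence,
    ColumnIndex INTEGER,
    Residue TEXT
);
""")
con.execute("INSERT INTO Sequence VALUES (1, 'E. coli', 'GCAU')")
n = con.execute("SELECT COUNT(*) FROM Sequence").fetchone()[0]
print(n)  # 1
```

    Keeping alignment columns as rows, as sketched here, is one way a schema can support column-oriented comparative queries without parsing flat alignment files.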

  13. rCAD: A Novel Database Schema for the Comparative Analysis of RNA

    PubMed Central

    Ozer, Stuart; Doshi, Kishore J.; Xu, Weijia; Gutell, Robin R.

    2013-01-01

Beyond its direct involvement in protein synthesis with mRNA, tRNA, and rRNA, RNA is now being appreciated for its significance in the overall metabolism and regulation of the cell. Comparative analysis has been very effective in the identification and characterization of RNA molecules, including the accurate prediction of their secondary structure. We are developing an integrative scalable data management and analysis system, the RNA Comparative Analysis Database (rCAD), implemented with SQL Server to support RNA comparative analysis. The platform-agnostic database schema of rCAD captures the essential relationships between the different dimensions of information for RNA comparative analysis datasets. The rCAD implementation enables a variety of comparative analysis manipulations with multiple integrated data dimensions for advanced RNA comparative analysis workflows. In this paper, we describe details of the rCAD schema design and illustrate its usefulness with two usage scenarios. PMID:24772454

  14. Reducing adaptive optics latency using Xeon Phi many-core processors

    NASA Astrophysics Data System (ADS)

    Barr, David; Basden, Alastair; Dipper, Nigel; Schwartz, Noah

    2015-11-01

The next generation of Extremely Large Telescopes (ELTs) for astronomy will rely heavily on the performance of their adaptive optics (AO) systems. Real-time control is at the heart of the critical technologies that will enable telescopes to deliver the best possible science and will require a very significant extrapolation from the AO hardware currently in use on 4-10 m telescopes. Investigating novel real-time computing architectures and testing their eligibility against anticipated challenges is one of the main priorities of technology development for the ELTs. This paper investigates the suitability of the Intel Xeon Phi, which is a commercial off-the-shelf hardware accelerator. We focus on wavefront reconstruction performance, implementing a straightforward matrix-vector multiplication (MVM) algorithm. We present benchmarking results of the Xeon Phi on a real-time Linux platform, both as a standalone processor and integrated into an existing real-time controller (RTC). Performance of single and multiple Xeon Phis is investigated. We show that this technology has the potential to greatly reduce the mean latency and variations in execution time (jitter) of large AO systems. We present both a detailed performance analysis of the Xeon Phi for a typical E-ELT first-light instrument and a more general approach that enables us to extend to any AO system size. We show that systematic and detailed performance analysis is an essential part of testing novel real-time control hardware to guarantee optimal science results.
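
    The latency-critical operation benchmarked above is a single matrix-vector multiply: actuator commands = control matrix x slope vector. A hedged NumPy sketch of that step follows; the matrix dimensions are illustrative, not the actual E-ELT instrument's sizes, and real RTCs time this on dedicated hardware rather than with a Python timer.

```python
# Sketch of the MVM wavefront-reconstruction step: DM commands are obtained
# by multiplying a precomputed control matrix by the measured slope vector.
import time
import numpy as np

n_slopes, n_actuators = 4000, 2000  # hypothetical AO system size
rng = np.random.default_rng(0)
control_matrix = rng.standard_normal((n_actuators, n_slopes)).astype(np.float32)
slopes = rng.standard_normal(n_slopes).astype(np.float32)

t0 = time.perf_counter()
commands = control_matrix @ slopes  # the latency-critical MVM
latency_ms = (time.perf_counter() - t0) * 1e3
print(f"{commands.shape[0]} actuator commands in {latency_ms:.3f} ms")
```

    Because the matrix is fixed between reconfigurations, the whole real-time cost is this one dense MVM, which is why accelerators such as the Xeon Phi are natural candidates.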

  15. Tackling The Dragon: Investigating Lensed Galaxy Structure

    NASA Astrophysics Data System (ADS)

    Fortenberry, Alexander; Livermore, Rachael

    2018-01-01

Galaxies have been seen to undergo a rapid decrease in star formation beginning at a redshift of around 1-2 and continuing to the present day. To understand the processes underpinning this change, we need to observe the inner structure of galaxies and understand where and how the stellar mass builds up. However, at high redshifts our achievable resolution is limited, which hinders the accuracy of the data. The lack of resolution at high redshift can be counteracted with the use of gravitational lensing. The magnification provided by a gravitational lens between us and the galaxies in question enables us to see extreme detail within the galaxies. To begin fine-tuning this process, we used Hubble data of Abell 370, a galaxy cluster that lenses a galaxy known as "The Dragon" at z=0.725. With the increased detail provided by the gravitational lens, we provide a detailed analysis of the galaxy's spatially resolved star formation rate, stellar age, and masses.

  16. Monitoring Change Through Hierarchical Segmentation of Remotely Sensed Image Data

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Lawrence, William T.

    2005-01-01

    NASA's Goddard Space Flight Center has developed a fast and effective method for generating image segmentation hierarchies. These segmentation hierarchies organize image data in a manner that makes their information content more accessible for analysis. Image segmentation enables analysis through the examination of image regions rather than individual image pixels. In addition, the segmentation hierarchy provides additional analysis clues through the tracing of the behavior of image region characteristics at several levels of segmentation detail. The potential for extracting the information content from imagery data based on segmentation hierarchies has not been fully explored for the benefit of the Earth and space science communities. This paper explores the potential of exploiting these segmentation hierarchies for the analysis of multi-date data sets, and for the particular application of change monitoring.

  17. Effect of the statin therapy on biochemical laboratory tests--a chemometrics study.

    PubMed

    Durceková, Tatiana; Mocák, Ján; Boronová, Katarína; Balla, Ján

    2011-01-05

Statins are the first-line choice for lowering total and LDL cholesterol levels and are very important medicaments for reducing the risk of coronary artery disease. The aim of this study was therefore to assess the results of biochemical tests characterizing the condition of 172 patients before and after administration of statins. For this purpose, several chemometric tools, namely principal component analysis, cluster analysis, discriminant analysis, logistic regression, KNN classification, ROC analysis, descriptive statistics and ANOVA, were used. Mutual relations of 11 biochemical laboratory tests, the patient's age and gender were investigated in detail. The results enable evaluation of the effect of the statin treatment in each individual case. They may also help in monitoring the dynamic progression of the disease. Copyright © 2010 Elsevier B.V. All rights reserved.
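
    As a minimal sketch of the first chemometric tool listed above, here is principal component analysis via SVD on a tiny made-up patients-by-tests matrix (the study's actual data comprise 172 patients and 11 tests).

```python
# PCA on a made-up patients x laboratory-tests matrix: center the data,
# take the SVD, and read off component scores and explained variance.
import numpy as np

X = np.array([[5.2, 3.1, 1.0],
              [6.0, 2.9, 1.2],
              [4.8, 3.3, 0.9],
              [5.5, 3.0, 1.1]])  # 4 patients x 3 tests (illustrative)
Xc = X - X.mean(axis=0)                       # column-wise centering
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                 # principal component scores
explained = s**2 / (s**2).sum()                # variance fraction per component
print(scores.shape, round(float(explained[0]), 2))
```

    Plotting the first two score columns, colored by before/after-statin status, is the usual way such an analysis visualizes group separation.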

  18. SIMPLEX: Cloud-Enabled Pipeline for the Comprehensive Analysis of Exome Sequencing Data

    PubMed Central

    Fischer, Maria; Snajder, Rene; Pabinger, Stephan; Dander, Andreas; Schossig, Anna; Zschocke, Johannes; Trajanoski, Zlatko; Stocker, Gernot

    2012-01-01

    In recent studies, exome sequencing has proven to be a successful screening tool for the identification of candidate genes causing rare genetic diseases. Although underlying targeted sequencing methods are well established, necessary data handling and focused, structured analysis still remain demanding tasks. Here, we present a cloud-enabled autonomous analysis pipeline, which comprises the complete exome analysis workflow. The pipeline combines several in-house developed and published applications to perform the following steps: (a) initial quality control, (b) intelligent data filtering and pre-processing, (c) sequence alignment to a reference genome, (d) SNP and DIP detection, (e) functional annotation of variants using different approaches, and (f) detailed report generation during various stages of the workflow. The pipeline connects the selected analysis steps, exposes all available parameters for customized usage, performs required data handling, and distributes computationally expensive tasks either on a dedicated high-performance computing infrastructure or on the Amazon cloud environment (EC2). The presented application has already been used in several research projects including studies to elucidate the role of rare genetic diseases. The pipeline is continuously tested and is publicly available under the GPL as a VirtualBox or Cloud image at http://simplex.i-med.ac.at; additional supplementary data is provided at http://www.icbi.at/exome. PMID:22870267

  19. Description of a Portable Wireless Device for High-Frequency Body Temperature Acquisition and Analysis

    PubMed Central

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

We describe a device for dual channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility. PMID:22408473
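
    The stated specifications (two channels, 1 sample/min, 10 days of logging) imply a modest memory budget, which can be checked with quick arithmetic. The 16-bit storage format assumed below is hypothetical; the paper does not specify how samples are encoded.

```python
# Back-of-the-envelope capacity check for the logger, assuming each sample
# is stored as a 16-bit raw value at 0.01 degC resolution (an assumption,
# not a documented detail of the device).
CHANNELS = 2
SAMPLES_PER_MIN = 1
DAYS = 10
BYTES_PER_SAMPLE = 2  # assumed 16-bit encoding

samples = CHANNELS * SAMPLES_PER_MIN * 60 * 24 * DAYS
memory_kib = samples * BYTES_PER_SAMPLE / 1024
print(samples, memory_kib)  # 28800 56.25
```

    Under that assumption, ten days of dual-channel logging fits in well under 64 KiB, consistent with a small embedded memory.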

  20. Description of a portable wireless device for high-frequency body temperature acquisition and analysis.

    PubMed

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

We describe a device for dual channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility.

  2. A Microfluidic Technique to Probe Cell Deformability

    PubMed Central

    Hoelzle, David J.; Varghese, Bino A.; Chan, Clara K.; Rowat, Amy C.

    2014-01-01

Here we detail the design, fabrication, and use of a microfluidic device to evaluate the deformability of a large number of individual cells in an efficient manner. Typically, data for ~10² cells can be acquired within a 1 hr experiment. An automated image analysis program enables efficient post-experiment analysis of image data, enabling processing to be complete within a few hours. Our device geometry is unique in that cells must deform through a series of micron-scale constrictions, thereby enabling the initial deformation and time-dependent relaxation of individual cells to be assayed. The applicability of this method to human promyelocytic leukemia (HL-60) cells is demonstrated. Driving cells to deform through micron-scale constrictions using pressure-driven flow, we observe that human promyelocytic (HL-60) cells momentarily occlude the first constriction for a median time of 9.3 msec before passaging more quickly through the subsequent constrictions with a median transit time of 4.0 msec per constriction. By contrast, all-trans retinoic acid-treated (neutrophil-type) HL-60 cells occlude the first constriction for only 4.3 msec before passaging through the subsequent constrictions with a median transit time of 3.3 msec. This method can provide insight into the viscoelastic nature of cells, and ultimately reveal the molecular origins of this behavior. PMID:25226269
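
    The per-cell metrics quoted above (occlusion time at the first constriction, median transit time through the rest) reduce to simple differences of frame timestamps. The sketch below illustrates that post-processing step on made-up numbers; the authors' automated image analysis extracts such timestamps from video.

```python
# Given the times (ms) at which one tracked cell exits each constriction,
# derive its occlusion time at the first constriction and the median
# transit time through the subsequent ones. Timestamps are invented.
from statistics import median

entry_ms = 0.0
exit_times_ms = [9.3, 13.5, 17.2, 21.6, 25.4]  # exits of constrictions 1..5

occlusion_ms = exit_times_ms[0] - entry_ms
transits = [b - a for a, b in zip(exit_times_ms, exit_times_ms[1:])]
print(occlusion_ms, median(transits))
```

    Pooling these per-cell values across ~10² cells and taking medians gives population statistics like those reported for HL-60 cells.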

  3. Development of an integrated aeroservoelastic analysis program and correlation with test data

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.; Brenner, M. J.; Voelker, L. S.

    1991-01-01

This paper presents the details and results of extending the general-purpose finite element STructural Analysis RoutineS (STARS) program to perform a complete linear aeroelastic and aeroservoelastic analysis. The earlier version of the STARS computer program enabled effective finite element modeling as well as static, vibration, buckling, and dynamic response analysis of damped and undamped systems, including those with pre-stressed and spinning structures. Additions to the STARS program include aeroelastic modeling for flutter and divergence solutions, and hybrid control system augmentation for aeroservoelastic analysis. Numerical results for the X-29A aircraft pertaining to vibration, flutter-divergence, and open- and closed-loop aeroservoelastic controls analysis are compared with ground vibration, wind-tunnel, and flight-test results. The open- and closed-loop aeroservoelastic control analyses are based on a hybrid formulation representing the interaction of structural, aerodynamic, and flight-control dynamics.
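
    At its core, the stability question a flutter or aeroservoelastic solution answers is whether any eigenvalue of the coupled state matrix crosses into the right half-plane. The two-state matrix below is made up purely to illustrate that check, not taken from STARS or the X-29A model.

```python
# Eigenvalue stability check on a tiny, hypothetical aeroelastic mode:
# a positive real part would flag an unstable (flutter) mode.
import numpy as np

A = np.array([[0.0, 1.0],
              [-25.0, -0.4]])  # made-up 1-DOF mode: stiffness 25, damping 0.4
eig = np.linalg.eigvals(A)
unstable = any(e.real > 0 for e in eig)
print(eig.real.max() < 0, unstable)  # True False
```

    In practice the state matrix aggregates structural, aerodynamic, and control-law dynamics, and the analysis sweeps airspeed or dynamic pressure looking for the first eigenvalue crossing.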

  4. Distilling the Antecedents and Enabling Dynamics of Leader Moral Courage: A Framework to Guide Action.

    PubMed

    Hutchinson, Marie; Jackson, Debra; Daly, John; Usher, Kim

    2015-05-01

    Intelligent, robust and courageous nursing leadership is essential in all areas of nursing, including mental health. However, in the nursing leadership literature, the theoretical discourse regarding how leaders recognise the need for action and make the choice to act with moral purpose is currently limited. Little has been written about the cognitions, capabilities and contextual factors that enable leader courage. In particular, the interplay between leader values and actions that are characterised as good or moral remains underexplored in the nursing leadership literature. In this article, through a discursive literature synthesis we seek to distill a more detailed understanding of leader moral courage; specifically, what factors contribute to leaders' ability to act with moral courage, what factors impede such action, and what factors do leaders need to foster within themselves and others to enable action that is driven by moral courage. From the analysis, we distilled a multi-level framework that identifies a range of individual characteristics and capabilities, and enabling contextual factors that underpin leader moral courage. The framework suggests leader moral courage is more complex than often posited in theories of leadership, as it comprises elements that shape moral thought and conduct. Given the complexity and challenges of nursing work, the framework for moral action derived from our analysis provides insight and suggestions for strengthening individual and group capacity to assist nurse leaders and mental health nurses to act with integrity and courage.

  5. Rotationally resolved fluorescence spectroscopy of molecular iodine

    NASA Astrophysics Data System (ADS)

    Lemon, Christopher; Canagaratna, Sebastian; Gray, Jeffrey

    2008-03-01

    Vibration-electronic spectroscopy of I2 vapor is a common, important experiment in physical chemistry lab courses. We use narrow bandwidth diode-pumped solid state (DPSS) lasers to excite specific rotational levels; these lasers are surprisingly stable and are now available at low cost. We also use efficient miniature fiber-optic spectrometers to resolve rotational fluorescence patterns in a vibrational progression. The resolution enables thorough and accurate analysis of spectroscopic constants for the ground electronic state. The high signal-to-noise ratio, which is easily achieved, also enables students to precisely measure fluorescence band intensities, providing further insight into vibrational wavefunctions and the molecular potential function. We will provide a detailed list of parts for the apparatus as well as modeling algorithms with statistical evaluation to facilitate widespread adoption of these experimental improvements by instructors of intermediate and advanced lab courses.
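
    The ground-state analysis this resolution enables follows the standard anharmonic expansion G(v) = ωe(v+1/2) - ωexe(v+1/2)², with the spacing between adjacent fluorescence bands shrinking linearly with v (the Birge-Sponer relation). The sketch below uses approximate literature constants for the X state of I2; precise values are what students extract from the measured progression.

```python
# Vibrational term values and band spacings for the I2 ground state, using
# approximate literature constants (cm^-1).
we, wexe = 214.50, 0.614

def G(v):
    """Anharmonic term value G(v) = we*(v+1/2) - wexe*(v+1/2)**2, in cm^-1."""
    return we * (v + 0.5) - wexe * (v + 0.5) ** 2

# Birge-Sponer: DeltaG(v+1/2) = we - 2*wexe*(v+1), decreasing linearly in v
spacings = [G(v + 1) - G(v) for v in range(5)]
print([round(s, 2) for s in spacings])  # [213.27, 212.04, 210.82, 209.59, 208.36]
```

    Fitting measured band spacings to this linear trend yields ωe and ωexe, and extrapolating the spacing to zero gives an estimate of the dissociation limit.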

  6. A spatial database for landslides in northern Bavaria: A methodological approach

    NASA Astrophysics Data System (ADS)

    Jäger, Daniel; Kreuzer, Thomas; Wilde, Martina; Bemm, Stefan; Terhorst, Birgit

    2018-04-01

Landslide databases provide essential information for hazard modeling, damage to buildings and infrastructure, mitigation, and research needs. This study presents the development of a landslide database system named WISL (Würzburg Information System on Landslides), currently storing detailed landslide data for northern Bavaria, Germany, in order to enable scientific queries as well as comparisons with other regional landslide inventories. WISL is based on free open source software solutions (PostgreSQL, PostGIS), ensuring good interoperability among the various software components and enabling further extensions through adaptations of self-developed software. Beyond that, WISL was designed for easy communication with other databases. As a central prerequisite for standardized, homogeneous data acquisition in the field, a customized data sheet for landslide description was compiled. This sheet also serves as an input mask for all data registration procedures in WISL. A variety of "in-database" solutions for landslide analysis provides the necessary scalability for the database, enabling operations on the local server. In its current state, WISL already enables extensive analysis and queries. This paper presents an example analysis of landslides in Oxfordian Limestones in the northeastern Franconian Alb, northern Bavaria. The results reveal widely differing landslides in terms of geometry and size. Further queries related to landslide activity classify the majority of the landslides as currently inactive; however, they clearly possess a certain potential for remobilization. Along with some active mass movements, a significant percentage of landslides potentially endangers residential areas or infrastructure. Future enhancements of the WISL database will focus on extending the data to increase research possibilities, as well as on transferring the system to other regions and countries.

  7. Regolith Gardening Caused by Recent Lunar Impacts Observed by the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.

    2016-12-01

Temporal observations by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) enable us to map and measure the spatial distribution of ejecta as well as quantify faint distal zones that may be the result of early stage jetting caused by meteoroid impacts. These detailed before and after observations enable the examination of surface reflectance changes as well as the analysis of nearby features (i.e., highly degraded craters, secondary craters, and new or spatially shifted boulders). In addition, NAC temporal pairs reveal numerous areas where the regolith has been churned and modified. These features, which we refer to as splotches, are most likely caused by small secondary impacts, given their high population near recent impact events [Robinson et al., 2015]. Using over 14,000 NAC temporal pairs, we identified over 47,000 splotches and quantified their spatial coverage and rate of formation. Based on the observed size frequency distribution, our models indicate that 99% of the entire lunar surface is modified by splotches 1 m in diameter and larger over a period of 8.1x10^4 years. These splotches have the potential to churn the upper few cm of regolith, which influences the local surface roughness and ultimately the surface reflectance observed from orbit. This new churning rate estimate is consistent with previous analysis of regolith properties within drive core samples acquired during the Apollo missions; these cores reveal that the upper 2 cm was rapidly and continuously modified over periods of <=10^5 years [Fruchter et al., 1977]. Overall, the examination of LROC NAC temporal pairs enables detailed studies of the impact process on a scale that exceeds laboratory experiments. Continued collection of NAC temporal pairs during the LRO Cornerstone Mission and future extended missions will aid in the discovery of new, larger impact craters and other contemporary surface changes. References: Fruchter et al. 1977. Proc. Lunar Planet Sci. Conf. 8th, pp. 3595-3605. Robinson et al. 2015. Icarus 252, 229-235.
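
    One simple way to relate a constant splotch-formation rate to the quoted cumulative coverage is a Poisson (random-placement) model, in which the unmodified fraction decays exponentially. This is an illustration of the scaling anchored to the 99%-in-8.1x10^4-yr figure from the abstract, not necessarily the authors' exact model.

```python
# Poisson coverage sketch: if splotches land at random locations at a
# constant areal rate, uncovered fraction = exp(-rate * t). The rate is
# calibrated (as an assumption) to 99% coverage in 8.1e4 years.
import math

T99 = 8.1e4                    # yr to reach 99% coverage (from the abstract)
rate = -math.log(0.01) / T99   # effective fractional coverage rate per yr

def covered_fraction(t_years):
    return 1.0 - math.exp(-rate * t_years)

print(round(covered_fraction(8.1e4), 2))  # 0.99
print(round(covered_fraction(1.0e4), 2))  # 0.43
```

    Under this model roughly 43% of the surface would already be touched after 10 kyr, illustrating how quickly small secondaries can garden the uppermost regolith.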

  8. Analysis of active volcanoes from the Earth Observing System

    NASA Technical Reports Server (NTRS)

    Mouginis-Mark, Peter; Rowland, Scott; Crisp, Joy; Glaze, Lori; Jones, Kenneth; Kahle, Anne; Pieri, David; Zebker, Howard; Krueger, Arlin; Walter, Lou

    1991-01-01

    The Earth Observing System (EOS) scheduled for launch in 1997 and 1999 is briefly described, and the EOS volcanology investigation objectives are discussed. The volcanology investigation will include long- and short-term monitoring of selected volcanoes, the detection of precursor activity associated with unanticipated eruptions, and a detailed study of on-going eruptions. A variety of instruments on the EOS platforms will enable the study of local- and regional-scale thermal and deformational features of volcanoes, and the chemical and structural features of volcanic eruption plumes and aerosols.

  9. Identification of Phosphorylated Proteins on a Global Scale.

    PubMed

    Iliuk, Anton

    2018-05-31

Liquid chromatography (LC) coupled with tandem mass spectrometry (MS/MS) has enabled researchers to analyze complex biological samples with unprecedented depth. It facilitates the identification and quantification of modifications within thousands of proteins in a single large-scale proteomic experiment. Analysis of phosphorylation, one of the most common and important post-translational modifications, has particularly benefited from such progress in the field. Here, detailed protocols are provided for a few well-regarded, common sample preparation methods for an effective phosphoproteomic experiment. Copyright © 2018 John Wiley & Sons, Inc.

  10. Torus Breakdown and Homoclinic Chaos in a Glow Discharge Tube

    NASA Astrophysics Data System (ADS)

    Ginoux, Jean-Marc; Meucci, Riccardo; Euzzor, Stefano

    2017-12-01

Starting from historical research, we used, like Van der Pol and Le Corbeiller, a cubic function to model the current-voltage characteristic of a direct current low-pressure plasma discharge tube, i.e. a neon tube. This led us to propose a new four-dimensional autonomous dynamical system that describes the experimentally observed phenomenon. Mathematical analysis and detailed numerical investigations of this fourth-order torus circuit then enabled us to highlight bifurcation routes from torus breakdown to homoclinic chaos following the Newhouse-Ruelle-Takens scenario.

  11. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  12. Single-cell analysis and sorting using droplet-based microfluidics.

    PubMed

    Mazutis, Linas; Gilbert, John; Ung, W Lloyd; Weitz, David A; Griffiths, Andrew D; Heyman, John A

    2013-05-01

    We present a droplet-based microfluidics protocol for high-throughput analysis and sorting of single cells. Compartmentalization of single cells in droplets enables the analysis of proteins released from or secreted by cells, thereby overcoming one of the major limitations of traditional flow cytometry and fluorescence-activated cell sorting. As an example of this approach, we detail a binding assay for detecting antibodies secreted from single mouse hybridoma cells. Secreted antibodies are detected after only 15 min by co-compartmentalizing single mouse hybridoma cells, a fluorescent probe and single beads coated with anti-mouse IgG antibodies in 50-pl droplets. The beads capture the secreted antibodies and, when the captured antibodies bind to the probe, the fluorescence becomes localized on the beads, generating a clearly distinguishable fluorescence signal that enables droplet sorting at ∼200 Hz as well as cell enrichment. The microfluidic system described is easily adapted for screening other intracellular, cell-surface or secreted proteins and for quantifying catalytic or regulatory activities. In order to screen ∼1 million cells, the microfluidic operations require 2-6 h; the entire process, including preparation of microfluidic devices and mammalian cells, requires 5-7 d.
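
    The sorting decision itself can be caricatured in a few lines. This is only a schematic stand-in; the threshold value and the signals are invented, not taken from the protocol:

    ```python
    # Schematic droplet-sorting rule: keep a droplet when its bead-localized
    # fluorescence exceeds a threshold (normalized units; value assumed).
    def sort_droplets(signals, threshold=0.6):
        keep, waste = [], []
        for s in signals:
            (keep if s >= threshold else waste).append(s)
        return keep, waste

    keep, waste = sort_droplets([0.1, 0.8, 0.55, 0.9])
    print(len(keep), len(waste))  # → 2 2
    ```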

  13. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  14. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
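
    The combinatoric half of the modular approach can be illustrated with a toy static fault tree; the structure and failure probabilities below are invented, and the Markov (dynamic) half is not shown.

    ```python
    # Toy static fault-tree evaluation: leaves are independent failure
    # probabilities, gates are ('AND', ...) or ('OR', ...) nested tuples.
    def evaluate(node):
        if isinstance(node, float):
            return node
        gate, *children = node
        probs = [evaluate(c) for c in children]
        if gate == 'AND':            # all children must fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        p = 1.0                      # 'OR': any child failing suffices
        for q in probs:
            p *= 1.0 - q
        return 1.0 - p

    # Redundant pair of processors (AND) in series with a shared bus (OR):
    tree = ('OR', ('AND', 0.01, 0.01), 0.001)
    print(round(evaluate(tree), 7))  # → 0.0010999
    ```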

  15. Single-cell analysis and sorting using droplet-based microfluidics

    PubMed Central

    Mazutis, Linas; Gilbert, John; Ung, W Lloyd; Weitz, David A; Griffiths, Andrew D; Heyman, John A

    2014-01-01

    We present a droplet-based microfluidics protocol for high-throughput analysis and sorting of single cells. Compartmentalization of single cells in droplets enables the analysis of proteins released from or secreted by cells, thereby overcoming one of the major limitations of traditional flow cytometry and fluorescence-activated cell sorting. As an example of this approach, we detail a binding assay for detecting antibodies secreted from single mouse hybridoma cells. Secreted antibodies are detected after only 15 min by co-compartmentalizing single mouse hybridoma cells, a fluorescent probe and single beads coated with anti-mouse IgG antibodies in 50-pl droplets. The beads capture the secreted antibodies and, when the captured antibodies bind to the probe, the fluorescence becomes localized on the beads, generating a clearly distinguishable fluorescence signal that enables droplet sorting at ~200 Hz as well as cell enrichment. The microfluidic system described is easily adapted for screening other intracellular, cell-surface or secreted proteins and for quantifying catalytic or regulatory activities. In order to screen ~1 million cells, the microfluidic operations require 2–6 h; the entire process, including preparation of microfluidic devices and mammalian cells, requires 5–7 d. PMID:23558786

  16. Statistical analysis for understanding and predicting battery degradations in real-life electric vehicle use

    NASA Astrophysics Data System (ADS)

    Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine

    2014-01-01

    This paper describes the statistical analysis of data parameters recorded during electric vehicle use to characterize battery ageing. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during the experiments. Such a study enables us to identify the main ageing factors. Detailed statistical dependency explorations then reveal the factors responsible for battery ageing phenomena, and predictive battery ageing models are built from this approach. The results thereby demonstrate and quantify the relationship between the measured variables and global observations of battery ageing, and also enable accurate battery ageing diagnosis through the predictive models.
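
    As a cartoon of the predictive-model idea, a least-squares line relating capacity fade to cycle count can be fitted and extrapolated. The measurements below are synthetic, not data from the study.

    ```python
    # Ordinary least-squares fit of capacity (% of nominal) against cycle count.
    def fit_line(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        slope = sxy / sxx
        return slope, my - slope * mx

    cycles = [0, 100, 200, 300, 400]
    capacity = [100.0, 98.9, 98.1, 96.8, 96.0]   # synthetic measurements
    slope, intercept = fit_line(cycles, capacity)
    # Fade rate and predicted capacity after 1000 cycles:
    print(round(slope, 4), round(intercept + slope * 1000, 2))  # → -0.0101 89.88
    ```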

  17. Technology-enabled academic detailing: computer-mediated education between pharmacists and physicians for evidence-based prescribing.

    PubMed

    Ho, Kendall; Nguyen, Anne; Jarvis-Selinger, Sandra; Novak Lauscher, Helen; Cressman, Céline; Zibrik, Lindsay

    2013-09-01

    Academic detailing (AD) is the practice of specially trained pharmacists with detailed medication knowledge meeting with physicians to share best practices of prescribing. AD has demonstrated efficacy in positively influencing physicians' prescribing behavior. Nevertheless, a key challenge has been that physicians in rural and remote locations, or physicians who are time challenged, have limited ability to participate in face-to-face meetings with academic detailers, as these specially trained academic detailers are primarily urban-based and limited in numbers. To determine the feasibility of using information technologies to facilitate communication between academic detailers and physicians (known as Technology-Enabled Academic Detailing or TEAD) through a comparison to traditional face-to-face academic detailing (AD). Specifically, TEAD is compared to AD in terms of the ability to aid physicians in acquiring evidence-informed prescribing information on diabetes-related medications, measured in terms of time efficiency, satisfaction of both physicians and pharmacists, and quality of knowledge exchange. General Practitioner Physicians (n=105) and pharmacists (n=12) were recruited from across British Columbia. Pharmacists were trained to be academic detailers on diabetes medication usage. Physicians were assigned to one of four intervention groups to receive four academic detailing sessions from trained pharmacists. Intervention groups included: (1) AD only, (2) TEAD only, (3) TEAD crossed over to AD at midpoint, and (4) AD crossed over to TEAD at midpoint. Evaluation included physician-completed surveys before and after each session, pharmacist logs after each detailing session, interviews and focus groups with physicians and pharmacists at study completion, as well as a technical support log to record all phone calls and emails from physicians and pharmacists regarding any technical challenges during the TEAD sessions, or usage of the web portal. 
Because recruitment was very low for the cross-over groups, we analyzed the results in two groups instead: AD only and TEAD only. A total of 354 sessions were conducted (AD=161, TEAD=193). Of these, complete data were available for 300 sessions, which were included in the analysis (AD=133, TEAD=167). On average, TEAD sessions were 49 min long and AD sessions 81 min long. Overall, physicians enjoyed both modalities of academic detailing (AD and TEAD) because they received information that both reinforced their existing diabetes knowledge and provided new prescribing insights and approaches. The results suggest that TEAD is an acceptable alternative to AD for providing physicians with advice about prescribing. TEAD is more time efficient, facilitates effective knowledge exchange and interprofessional collaboration, and can reach physicians virtually where face-to-face AD is not possible or practical. Due to logistics, physicians were allocated, rather than randomized, to receive AD and/or TEAD. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Misbehaving Faults: The Expanding Role of Geodetic Imaging in Unraveling Unexpected Fault Slip Behavior

    NASA Astrophysics Data System (ADS)

    Barnhart, W. D.; Briggs, R.

    2015-12-01

    Geodetic imaging techniques enable researchers to "see" details of fault rupture that cannot be captured by complementary tools such as seismology and field studies, thus providing increasingly detailed information about surface strain, slip kinematics, and how an earthquake may be transcribed into the geological record. For example, the recent Haiti, Sierra El Mayor, and Nepal earthquakes illustrate the fundamental role of geodetic observations in recording blind ruptures where purely geological and seismological studies provided incomplete views of rupture kinematics. Traditional earthquake hazard analyses typically rely on sparse paleoseismic observations and incomplete mapping, simple assumptions of slip kinematics from Andersonian faulting, and earthquake analogs to characterize the probabilities of forthcoming ruptures and the severity of ground accelerations. Spatially dense geodetic observations in turn help to identify where these prevailing assumptions regarding fault behavior break down and highlight new and unexpected kinematic slip behavior. Here, we focus on three key contributions of space geodetic observations to the analysis of co-seismic deformation: identifying near-surface co-seismic slip where no easily recognized fault rupture exists; discerning non-Andersonian faulting styles; and quantifying distributed, off-fault deformation. The 2013 Balochistan strike-slip earthquake in Pakistan illuminates how space geodesy precisely images non-Andersonian behavior and off-fault deformation. Through analysis of high-resolution optical imagery and DEMs, evidence emerges that a single fault may slip as both a strike-slip and a dip-slip fault across multiple seismic cycles. These observations likewise enable us to quantify on-fault deformation, which accounts for ~72% of the displacements in this earthquake. 
Nonetheless, the spatial distribution of on- and off-fault deformation in this event is highly variable, a complicating factor for comparisons of geologic and geodetic slip rates. As such, detailed studies like this one will play a continuing, vital role in the accurate assessment of short- and long-term fault slip kinematics.

  19. Partial differential equation techniques for analysing animal movement: A comparison of different methods.

    PubMed

    Wang, Yi-Shan; Potts, Jonathan R

    2017-03-07

    Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
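
    The flavor of these approximations can be shown with the standard moment computation: the coefficients of the limiting advection-diffusion equation come from the first two moments of the step kernel. The discrete kernel below is invented for illustration, and the closure shown uses the central second moment; Patlak-type derivations differ in exactly this term, which is part of what the paper investigates.

    ```python
    # Drift and diffusion coefficients of a PDE limit u_t = -(a*u)_x + (D*u)_xx
    # from the moments of a discrete step kernel (step length 1, time step tau).
    def kernel_moments(steps, probs):
        m1 = sum(s * p for s, p in zip(steps, probs))
        m2 = sum(s * s * p for s, p in zip(steps, probs))
        return m1, m2

    tau = 1.0
    steps = [-1.0, 0.0, 1.0]
    probs = [0.2, 0.3, 0.5]              # right-biased random walk
    m1, m2 = kernel_moments(steps, probs)
    a = m1 / tau                          # advection (drift) coefficient
    D = (m2 - m1 * m1) / (2.0 * tau)      # diffusion coefficient (central moment)
    print(round(a, 3), round(D, 3))  # → 0.3 0.305
    ```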

  20. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  1. Neural classifier in the estimation process of maturity of selected varieties of apples

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.

    2015-07-01

    This paper presents methods of neural image analysis aimed at estimating the maturity state of selected varieties of apples which are popular in Poland. Identification of the degree of maturity of the selected apple varieties was conducted on the basis of information encoded in graphical form in digital photographs. The process applies the BBCH scale, which is used to determine the maturity of apples, is widely used in the EU, and has been developed for many species of monocotyledonous and dicotyledonous plants. It is also worth noting that the scale enables detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected varieties of apples, supported by the use of image analysis methods and classification techniques represented by artificial neural networks. The analysis of representative graphical features extracted through image analysis enabled assessment of apple maturity. For the utilitarian purpose, the "JabVis 1.1" neural IT system was created in accordance with the requirements of software engineering, dedicated to supporting decision-making processes in the broadly understood production and processing of apples.
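
    As a stand-in illustration of classifying maturity from image features (this is not the neural network or the "JabVis 1.1" software), a nearest-centroid rule on mean-colour features separates two maturity classes; all numbers are invented.

    ```python
    # Nearest-centroid classifier on synthetic (R, G, B) mean-colour features.
    training = {
        "unripe": [(60, 160, 70), (55, 150, 80)],
        "ripe":   [(180, 60, 50), (170, 70, 60)],
    }
    centroids = {label: tuple(sum(v) / len(pts) for v in zip(*pts))
                 for label, pts in training.items()}

    def classify(rgb):
        def dist2(label):
            return sum((a - b) ** 2 for a, b in zip(rgb, centroids[label]))
        return min(centroids, key=dist2)

    print(classify((175, 65, 55)))  # → ripe
    ```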

  2. High-Resolution Water Footprints of Production of the United States

    NASA Astrophysics Data System (ADS)

    Marston, Landon; Ao, Yufei; Konar, Megan; Mekonnen, Mesfin M.; Hoekstra, Arjen Y.

    2018-03-01

    The United States is the largest producer of goods and services in the world. Rainfall, surface water supplies, and groundwater aquifers represent a fundamental input to economic production. Despite the importance of water resources to economic activity, we do not have consistent information on water use for specific locations and economic sectors. A national, spatially detailed database of water use by sector would provide insight into U.S. utilization and dependence on water resources for economic production. To this end, we calculate the water footprint of over 500 food, energy, mining, services, and manufacturing industries and goods produced in the United States. To do this, we employ a data intensive approach that integrates water footprint and input-output techniques into a novel methodological framework. This approach enables us to present the most detailed and comprehensive water footprint analysis of any country to date. This study broadly contributes to our understanding of water in the U.S. economy, enables supply chain managers to assess direct and indirect water dependencies, and provides opportunities to reduce water use through benchmarking. In fact, we find that 94% of U.S. industries could reduce their total water footprint more by sourcing from more water-efficient suppliers in their supply chain than they could by converting their own operations to be more water-efficient.
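
    The input-output half of the framework can be sketched for a toy two-sector economy; the requirements matrix and direct-use vector below are invented numbers, not the study's data. Total water-footprint intensities f satisfy f(I − A) = d, where d holds direct water use per dollar of output and A is the inter-industry requirements matrix.

    ```python
    # Closed-form 2x2 Leontief inversion: f = d (I - A)^(-1).
    def total_intensities(A, d):
        a, b = 1 - A[0][0], -A[0][1]
        c, e = -A[1][0], 1 - A[1][1]
        det = a * e - b * c
        f1 = (d[0] * e - d[1] * c) / det   # row vector times inverse matrix
        f2 = (d[1] * a - d[0] * b) / det
        return f1, f2

    A = [[0.1, 0.2],   # dollars of input i per dollar of output j (invented)
         [0.3, 0.1]]
    d = [5.0, 2.0]     # direct water use, litres per dollar (invented)
    f1, f2 = total_intensities(A, d)
    print(round(f1, 2), round(f2, 2))  # → 6.8 3.73
    ```

    The indirect footprint is the gap between f and d: here sector 1's total intensity (6.8) exceeds its direct use (5.0) because of water embedded in its purchased inputs.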

  3. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher-resolution imagery, the ability to derive better elevation information. Combining these geospatial data to create land cover and usage maps helps inform catastrophe modelling systems. From Landsat's 30 m resolution to 2.44 m QuickBird multispectral imagery, and 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly be joined by a twin satellite enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn, affecting over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload local drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model able to simulate such an event requires much more accurate source data than satellite or radar can provide. Because these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to model them better. 
Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest digital airborne sensors, both optical and lidar, to produce the input layer for surface-water flood modelling. A national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphics processing power. This product will prove particularly valuable for risk-assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high-resolution aerial photography and colour infrared imagery, a high-definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surfaces. This will allow even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. For some risks, however, an even more detailed survey may be justified. To achieve this, Infoterra is testing new 360° mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, capturing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold levels quickly and accurately. These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. 
This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new resolution of detail to aid urban flood modelling.

  4. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA), known as the rapid analytical-FEA technique (RAFT). The RAFT's ability to predict frequency response orders of magnitude faster than conventional simulation methods, while providing insights into device design not possible with other types of analysis, is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies, with good agreement; these include resonators targeting beam-extension, disk-flexure, and Lamé beam modes. An example scaling analysis is presented, and other applications enabled by the technique are discussed. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
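
    For context only, and not the RAFT itself: near a single resonance, the harmonic response such tools predict reduces to the textbook one-degree-of-freedom magnitude below, where f0 and Q are placeholder values.

    ```python
    import math

    # Normalized magnitude of a damped harmonic resonator's frequency response;
    # at resonance (f = f0) the magnitude equals the quality factor Q.
    def response(f, f0=1.0e6, Q=1000.0):
        x = f / f0
        return 1.0 / math.sqrt((1.0 - x * x) ** 2 + (x / Q) ** 2)

    print(round(response(1.0e6)))  # → 1000
    ```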

  5. Diversification of transcription factor-DNA interactions and the evolution of gene regulatory networks.

    PubMed

    Rogers, Julia M; Bulyk, Martha L

    2018-04-25

    Sequence-specific transcription factors (TFs) bind short DNA sequences in the genome to regulate the expression of target genes. In the last decade, numerous technical advances have enabled the determination of the DNA-binding specificities of many of these factors. Large-scale screens of many TFs enabled the creation of databases of TF DNA-binding specificities, typically represented as position weight matrices (PWMs). Although great progress has been made in determining and predicting binding specificities systematically, there are still many surprises to be found when studying a particular TF's interactions with DNA in detail. Paralogous TFs' binding specificities can differ in subtle ways, in a manner that is not immediately apparent from looking at their PWMs. These differences affect gene regulatory outputs and enable TFs to rewire transcriptional networks over evolutionary time. This review discusses recent observations made in the study of TF-DNA interactions that highlight the importance of continued in-depth analysis of TF-DNA interactions and their inherent complexity. This article is categorized under: Biological Mechanisms > Regulatory Biology. © 2018 Wiley Periodicals, Inc.
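
    A PWM reduces to a table of per-position log-odds scores. The sketch below builds one from a handful of made-up aligned binding sites (add-one pseudocounts, uniform background); it is illustrative only, not a real TF's specificity.

    ```python
    import math

    # Build a position weight matrix from toy aligned sites and score sequences.
    sites = ["ACG", "ACG", "ATG", "ACG"]
    BASES = "ACGT"
    pwm = []
    for i in range(len(sites[0])):
        column = [s[i] for s in sites]
        pwm.append({b: math.log2(((column.count(b) + 1) / (len(sites) + 4)) / 0.25)
                    for b in BASES})   # +1 pseudocount, 0.25 uniform background

    def score(seq):
        return sum(pwm[i][b] for i, b in enumerate(seq))

    print(score("ACG") > score("ATG") > score("TTT"))  # → True
    ```

    Subtle specificity differences between paralogs show up exactly here: two PWMs can share a consensus yet rank near-consensus sites like "ATG" differently.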

  6. Best practices for germ-free derivation and gnotobiotic zebrafish husbandry

    PubMed Central

    Melancon, E.; De La Torre Canny, S. Gomez; Sichel, S.; Kelly, M.; Wiles, T.J.; Rawls, J.F.; Eisen, J.S.; Guillemin, K.

    2017-01-01

    All animals are ecosystems with resident microbial communities, referred to as microbiota, which play profound roles in host development, physiology, and evolution. Enabled by new DNA sequencing technologies, there is a burgeoning interest in animal–microbiota interactions, but dissecting the specific impacts of microbes on their hosts is experimentally challenging. Gnotobiology, the study of biological systems in which all members are known, enables precise experimental analysis of the necessity and sufficiency of microbes in animal biology by deriving animals germ-free (GF) and inoculating them with defined microbial lineages. Mammalian host models have long dominated gnotobiology, but we have recently adapted gnotobiotic approaches to the zebrafish (Danio rerio), an important aquatic model. Zebrafish offer several experimental attributes that enable rapid, large-scale gnotobiotic experimentation with high replication rates and exquisite optical resolution. Here we describe detailed protocols for three procedures that form the foundation of zebrafish gnotobiology: derivation of GF embryos, microbial association of GF animals, and long-term, GF husbandry. Our aim is to provide sufficient guidance in zebrafish gnotobiotic methodology to expand and enrich this exciting field of research. PMID:28129860

  7. Factors that enable and hinder the implementation of projects in the alcohol and other drug field.

    PubMed

    MacLean, Sarah; Berends, Lynda; Hunter, Barbara; Roberts, Bridget; Mugavin, Janette

    2012-02-01

    Few studies systematically explore elements of successful project implementation across a range of alcohol and other drug (AOD) activities. This paper provides an evidence base to inform project implementation in the AOD field. We accessed records for 127 completed projects funded by the Alcohol, Education and Rehabilitation Foundation from 2002 to 2008. An adapted realist synthesis methodology enabled us to develop categories of enablers and barriers to successful project implementation, and to identify factors statistically associated with successful project implementation, defined as meeting all funding objectives. Thematic analysis of eight case study projects allowed detailed exploration of findings. Nine enabler and 10 barrier categories were identified. Those most frequently reported as both barriers and enablers concerned partnerships with external agencies and communities, staffing and project design. Achieving supportive relationships with partner agencies and communities, employing skilled staff and implementing consumer or participant input mechanisms were statistically associated with successful project implementation. The framework described here will support development of evidence-based project funding guidelines and project performance indicators. The study provides evidence that investing project hours and resources to develop robust relationships with project partners and communities, implementing mechanisms for consumer or participant input and attracting skilled staff are legitimate and important activities, not just in themselves but because they potentially influence achievement of project funding objectives. © 2012 The Authors. ANZJPH © 2012 Public Health Association of Australia.

  8. Analysis of the Genetic Basis of Disease in the Context of Worldwide Human Relationships and Migration

    PubMed Central

    Corona, Erik; Chen, Rong; Sikora, Martin; Morgan, Alexander A.; Patel, Chirag J.; Ramesh, Aditya; Bustamante, Carlos D.; Butte, Atul J.

    2013-01-01

    Genetic diversity across different human populations can enhance understanding of the genetic basis of disease. We calculated the genetic risk of 102 diseases in 1,043 unrelated individuals across 51 populations of the Human Genome Diversity Panel. We found that genetic risk for type 2 diabetes and pancreatic cancer decreased as humans migrated toward East Asia. In addition, biliary liver cirrhosis, alopecia areata, bladder cancer, inflammatory bowel disease, membranous nephropathy, systemic lupus erythematosus, systemic sclerosis, ulcerative colitis, and vitiligo have undergone genetic risk differentiation. This analysis represents a large-scale attempt to characterize genetic risk differentiation in the context of migration. We anticipate that our findings will enable detailed analysis pertaining to the driving forces behind genetic risk differentiation. PMID:23717210
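
    A common form of cumulative risk score, shown here purely as a sketch with invented odds ratios rather than the study's 102-disease panel, multiplies per-variant odds ratios raised to the risk-allele count (equivalently, sums log odds).

    ```python
    import math

    # Cumulative genetic risk relative to baseline: product of per-SNP odds
    # ratios, one factor per copy of the risk allele (all values invented).
    genotype = [  # (risk-allele copies: 0, 1, or 2; per-allele odds ratio)
        (2, 1.3),
        (1, 1.1),
        (0, 0.9),
    ]
    log_score = sum(n * math.log(orr) for n, orr in genotype)
    relative_risk = math.exp(log_score)
    print(round(relative_risk, 3))  # → 1.859
    ```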

  9. Map of low-frequency electromagnetic noise in the sky

    NASA Astrophysics Data System (ADS)

    Füllekrug, Martin; Mezentsev, Andrew; Watson, Robert; Gaffet, Stéphane; Astin, Ivan; Smith, Nathan; Evans, Adrian

    2015-06-01

    The Earth's natural electromagnetic environment is disturbed by anthropogenic electromagnetic noise. Here we report the first results from an electromagnetic noise survey of the sky. The locations of electromagnetic noise sources are mapped on the hemisphere above a distributed array of wideband receivers that operate in a small aperture configuration. It is found that the noise sources can be localized at elevation angles up to ~60° in the sky, well above the horizon. The sky also exhibits zones with little or no noise that are found toward the local zenith and the southwest of the array. These results are obtained by a rigorous analysis of the residuals from the classic dispersion relation for electromagnetic waves using an array analysis of electric field measurements in the frequency range from ~20 to 250 kHz. The observed locations of the noise sources enable detailed observations of ionospheric modification, for example, caused by particle precipitation and lightning discharges, while the observed exclusion zones enable the detection of weak natural electromagnetic emissions, for example, from streamers in transient luminous events above thunderclouds.
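
    The geometry behind localizing sources above the horizon can be seen in a two-receiver toy version of the array analysis: a plane wave arriving from elevation angle θ crosses a baseline of length d with delay dt = d·cos(θ)/c, so a measured delay shorter than the horizon-grazing value d/c implies an elevated source. The baseline and delay below are invented, and a real array fits many baselines at once.

    ```python
    import math

    C = 299_792_458.0  # speed of light, m/s

    # Elevation angle of a plane-wave source from the arrival-time delay
    # across a two-receiver baseline of length d (metres), delay dt (seconds).
    def elevation_deg(d, dt):
        return math.degrees(math.acos(C * dt / d))

    d = 10_000.0          # 10 km baseline (invented)
    dt = d / (2.0 * C)    # half the horizon-grazing delay d/c
    print(round(elevation_deg(d, dt), 1))  # → 60.0
    ```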

  10. DataHub: Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  11. DataHub - Science data management in support of interactive exploratory analysis

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Rubin, Mark R.

    1993-01-01

    DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.

  12. Strategies for the profiling, characterisation and detailed structural analysis of N-linked oligosaccharides.

    PubMed

    Tharmalingam, Tharmala; Adamczyk, Barbara; Doherty, Margaret A; Royle, Louise; Rudd, Pauline M

    2013-02-01

    Many post-translational modifications, including glycosylation, are pivotal for the structural integrity, location and functional activity of glycoproteins. Sub-populations of proteins that are relocated or functionally changed by such modifications can change resting proteins into active ones, mediating specific effector functions, as in the case of monoclonal antibodies. To ensure safe and efficacious drugs it is essential to employ appropriate robust, quantitative analytical strategies that can (i) perform detailed glycan structural analysis, (ii) characterise specific subsets of glycans to assess known critical features of therapeutic activities, and (iii) rapidly profile glycan pools for at-line monitoring or high-level batch-to-batch screening. Here we focus on these aspects of glycan analysis, showing how state-of-the-art technologies are required at all stages during the production of recombinant glycotherapeutics. These data can provide insights into processing pathways and suggest markers for intervention at critical control points in bioprocessing and also critical decision points in disease and drug monitoring in patients. Importantly, these tools are now enabling the first glycome/genome studies in large populations, allowing the integration of glycomics into other 'omics platforms in a systems biology context.

  13. STARS: An integrated general-purpose finite element structural, aeroelastic, and aeroservoelastic analysis computer program

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.

    1991-01-01

    The details of an integrated general-purpose finite element structural analysis computer program, which is also capable of solving complex multidisciplinary problems, are presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines, supporting a number of terminals, are also available.

  14. Thermal Deformation and RF Performance Analyses for the SWOT Large Deployable Ka-Band Reflectarray

    NASA Technical Reports Server (NTRS)

    Fang, H.; Sunada, E.; Chaubell, J.; Esteban-Fernandez, D.; Thomson, M.; Nicaise, F.

    2010-01-01

    A large deployable antenna technology for the NASA Surface Water and Ocean Topography (SWOT) Mission is currently being developed by JPL in response to NRC Earth Science Tier 2 Decadal Survey recommendations. This technology is required to enable the SWOT mission because no currently available antenna is capable of meeting SWOT's demanding Ka-Band remote sensing requirements. One of the key aspects of this antenna development is to minimize the effect of on-orbit thermal distortion on the antenna RF performance. An analysis process has been developed to support this antenna technology; it includes: 1) on-orbit thermal analysis to obtain the temperature distribution; 2) structural deformation analysis to obtain the geometry of the antenna surface; and 3) RF performance analysis with the given deformed antenna surface. The detailed analysis process and some analysis results are presented and discussed in this paper.

  15. Telemetric Intracranial Pressure Monitoring with the Raumedic Neurovent P-tel.

    PubMed

    Antes, Sebastian; Tschan, Christoph A; Heckelmann, Michael; Breuskin, David; Oertel, Joachim

    2016-07-01

    Devices enabling long-term intracranial pressure monitoring have been demanded for some time. The first solutions using telemetry were proposed in 1967. Since then, many other wireless systems have followed, but some technical restrictions have led to unacceptable measurement uncertainties. In 2009, a completely revised telemetric pressure device called Neurovent P-tel was introduced to the market. This report reviews technical aspects, handling, possibilities of data analysis, and the efficiency of the probe in clinical routine. The telemetric device consists of 3 main parts: the passive implant, the active antenna, and the storage monitor. The implant with its parenchymal pressure transducer is inserted via a frontal burr hole. Pressure values can be registered at a frequency of 1 Hz or 5 Hz. Telemetrically gathered data can be viewed on the storage monitor or saved on a computer for detailed analyses. A total of 247 patients with suspected (n = 123) or known (n = 124) intracranial pressure disorders underwent insertion of the telemetric pressure probe. A detailed analysis of the long-term intracranial pressure profile including mean values, maximum and negative peaks, pathologic slow waves, and pulse pressure amplitudes is feasible using the 5 Hz sampling rate. This enables the verification of suspected diagnoses such as normal-pressure hydrocephalus, benign intracranial hypertension, shunt malfunction, or shunt overdrainage. Long-term application also facilitates postoperative surveillance and supports valve adjustments of shunt-treated patients. The presented telemetric measurement system is a valuable and effective diagnostic tool in selected cases. Copyright © 2016 Elsevier Inc. All rights reserved.
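    The long-term profile analysis the abstract names (mean values, maxima, negative peaks, and pulse pressure amplitudes from a 5 Hz recording) can be sketched generically. The synthetic waveform and the negative-peak threshold below are illustrative assumptions, not the device's actual processing.

```python
# Summary statistics for a telemetric ICP trace sampled at 5 Hz.
# The synthetic trace and the -5 mmHg "negative peak" threshold
# are assumptions for illustration, not Raumedic's algorithm.
import math

FS_HZ = 5  # sampling frequency of the implant's high-rate mode

def icp_summary(trace_mmhg, negative_threshold=-5.0):
    mean = sum(trace_mmhg) / len(trace_mmhg)
    peak = max(trace_mmhg)
    negative_peaks = sum(1 for p in trace_mmhg if p < negative_threshold)
    # Pulse pressure amplitude per 1 s window (FS_HZ samples)
    amps = []
    for i in range(0, len(trace_mmhg) - FS_HZ + 1, FS_HZ):
        window = trace_mmhg[i:i + FS_HZ]
        amps.append(max(window) - min(window))
    return {"mean": mean, "max": peak,
            "negative_peaks": negative_peaks,
            "mean_pulse_amplitude": sum(amps) / len(amps)}

# Synthetic 10 s trace: 12 mmHg baseline with a 1 Hz cardiac-like ripple
trace = [12 + 2 * math.sin(2 * math.pi * t / FS_HZ)
         for t in range(10 * FS_HZ)]
print(icp_summary(trace))
```

    Slow-wave detection (not sketched here) would additionally require frequency-domain analysis of much longer windows.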

  16. Differentiating Pseudomonas sp. strain ADP cells in suspensions and biofilms using Raman spectroscopy and scanning electron microscopy.

    PubMed

    Henry, Victoria A; Jessop, Julie L P; Peeples, Tonya L

    2017-02-01

    High-quality spectra of Pseudomonas sp. strain ADP in the planktonic and biofilm state were obtained using Raman microspectroscopy. These spectra enabled the identification of key differences between free and biofilm cells in the fingerprint region of Raman spectra in the nucleic acid, carbohydrate, and protein regions. Scanning electron microscopy (SEM) enabled detailed visualization of ADP biofilm with confirmation of associated extracellular matrix structure. Following extraction and Raman analysis of extracellular polymeric substances, Raman spectral differences between free and biofilm cells were largely attributed to the contribution of extracellular matrix components produced in mature biofilms. Raman spectroscopy complemented with SEM proves to be useful in distinguishing physiological properties among cells of the same species.

  17. Spatiotemporal polarization modulation microscopy with a microretarder array

    NASA Astrophysics Data System (ADS)

    Ding, Changqin; Ulcickas, James R. W.; Simpson, Garth J.

    2018-02-01

    A patterned microretarder array positioned in the rear conjugate plane of a microscope enables rapid polarization-dependent nonlinear optical microscopy. The pattern introduced to the array results in periodic modulation of the polarization-state of the incident light as a function of position within the field of view with no moving parts or active control. Introduction of a single stationary optical element and a fixed polarizer into the beam of a nonlinear optical microscope enabled nonlinear optical tensor recovery, which informs on local structure and orientation. Excellent agreement was observed between the measured and predicted second harmonic generation (SHG) of z-cut quartz, selected as a test system with well-established nonlinear optical properties. Subsequent studies of spatially varying samples further support the general applicability of this relatively simple strategy for detailed polarization analysis in both conventional and nonlinear optical imaging of structurally diverse samples.

  18. Polaron spin echo envelope modulations in an organic semiconducting polymer

    DOE PAGES

    Mkhitaryan, V. V.; Dobrovitski, V. V.

    2017-06-01

    Here, we present a theoretical analysis of the electron spin echo envelope modulation (ESEEM) spectra of polarons in semiconducting π-conjugated polymers. We show that the contact hyperfine coupling and the dipolar interaction between the polaron and the proton spins give rise to different features in the ESEEM spectra. Our theory enables a direct, selective probe of the different groups of nuclear spins that affect the polaron spin dynamics. Namely, we demonstrate how the signal from the distant protons (coupled to the polaron spin via dipolar interactions) can be distinguished from the signal coming from the protons residing on the polaron site (coupled to the polaron spin via the contact hyperfine interaction). We propose a method for directly probing the contact hyperfine interaction that would enable detailed study of the polaron orbital state and its immediate environment. Lastly, we also analyze the decay of the spin echo modulation, and its connection to polaron transport.

  19. CPTAC Assay Portal: a repository of targeted proteomic assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.

    2014-06-27

    To address these issues, the Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as a public repository of well-characterized quantitative, MS-based, targeted proteomic assays. The purpose of the CPTAC Assay Portal is to facilitate widespread adoption of targeted MS assays by disseminating SOPs, reagents, and assay characterization data for highly characterized assays. A primary aim of the NCI-supported portal is to bring together clinicians or biologists and analytical chemists to answer hypothesis-driven questions using targeted, MS-based assays. Assay content is easily accessed through queries and filters, enabling investigators to find assays for proteins relevant to their areas of interest. Detailed characterization data are available for each assay, enabling researchers to evaluate assay performance prior to launching the assay in their own laboratories.

  20. Spectral analysis of pharmaceutical formulations prepared according to ancient recipes in comparison with old museum remains.

    PubMed

    Gamberini, M Cristina; Baraldi, C; Freguglia, G; Baraldi, P

    2011-10-01

    A study of the composition of the remains of ancient ointments from museums was undertaken to enable understanding of the preparation techniques. Comparison of ancient recipes from different historical periods and spectroscopic characteristics of inorganic and/or organic remains recovered in museum vessels enabled preparation of ancient pharmaceutical-cosmetic formulations. Farmacopea Augustana by Occo was one of the most important books studied for the 14 formulations prepared in the laboratory. Three formulations are discussed in detail, and raw materials and new preparations were proposed for ozone ageing. The most important micro-Raman results are discussed. The spectra of the raw materials lipids, beeswax, and resins are discussed; beeswax and pig suet (axŭngia) Raman spectra were found to be similar, but different from those of the aged oils. SERS was applied to ancient ointments and galbanum, and the Raman spectra are reported and discussed for the first time.

  1. Transgenic mouse models enabling photolabeling of individual neurons in vivo.

    PubMed

    Peter, Manuel; Bathellier, Brice; Fontinha, Bruno; Pliota, Pinelopi; Haubensak, Wulf; Rumpel, Simon

    2013-01-01

    One of the biggest tasks in neuroscience is to explain activity patterns of individual neurons during behavior by their cellular characteristics and their connectivity within the neuronal network. To greatly facilitate linking in vivo experiments with a more detailed molecular or physiological analysis in vitro, we have generated and characterized genetically modified mice expressing photoactivatable GFP (PA-GFP) that allow conditional photolabeling of individual neurons. Repeated photolabeling at the soma reveals basic morphological features due to diffusion of activated PA-GFP into the dendrites. Neurons photolabeled in vivo can be re-identified in acute brain slices and targeted for electrophysiological recordings. We demonstrate the advantages of PA-GFP expressing mice by the correlation of in vivo firing rates of individual neurons with their expression levels of the immediate early gene c-fos. Generally, the mouse models described in this study enable the combination of various analytical approaches to characterize living cells, also beyond the neurosciences.

  2. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. This computational framework has thus been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
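    The simulation-based, derivative-free optimization that FOQUS wraps can be illustrated with a generic coordinate-search DFO loop over a black-box model. This is a hedged sketch of the technique class only; the quadratic "simulation" and the search routine are stand-ins, not FOQUS's actual algorithms or API.

```python
# Generic derivative-free coordinate search over a black-box
# function, illustrating the class of DFO methods used for
# simulation-based optimization. The quadratic cost below is a
# hypothetical stand-in for an expensive process simulation.

def black_box_cost(x):
    # Placeholder for a process-model evaluation (no gradients)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (+step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:          # accept any improving poll point
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5              # shrink the poll step and retry
            if step < tol:
                break
    return x, fx

best_x, best_f = coordinate_search(black_box_cost, [0.0, 0.0])
print(best_x, best_f)
```

    In FOQUS the analogous loop is distributed: each cost evaluation is a full process simulation dispatched through the Turbine Science Gateway.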

  3. SURVIAC Bulletin: AFRL Research Audit Trail Viewer (ATV). Volume 19, Issue 1, 2003

    DTIC Science & Technology

    2003-01-01

    Trail Viewer, the analyst obtained a close-up view of the detailed aircraft model using the Orbit View, enabled the SkyBox, enabled fictional ter...trails and element projections, several simulated terrain types and SkyBox environments to help the user maintain perspective, file based

  4. Investigation of hindwing folding in ladybird beetles by artificial elytron transplantation and microcomputed tomography.

    PubMed

    Saito, Kazuya; Nomura, Shuhei; Yamamoto, Shuhei; Niiyama, Ryuma; Okabe, Yoji

    2017-05-30

    Ladybird beetles are high-mobility insects and explore broad areas by switching between walking and flying. Their excellent wing transformation systems enabling this lifestyle are expected to provide large potential for engineering applications. However, the mechanism behind the folding of their hindwings remains unclear. The reason is that ladybird beetles close the elytra ahead of wing folding, preventing the observation of detailed processes occurring under the elytra. In the present study, artificial transparent elytra were transplanted on living ladybird beetles, thereby enabling us to observe the detailed wing-folding processes. The result revealed that in addition to the abdominal movements mentioned in previous studies, the edge and ventral surface of the elytra, as well as characteristic shaped veins, play important roles in wing folding. The structures of the wing frames enabling this folding process and detailed 3D shape of the hindwing were investigated using microcomputed tomography. The results showed that the tape spring-like elastic frame plays an important role in the wing transformation mechanism. Compared with other beetles, hindwings in ladybird beetles are characterized by two seemingly incompatible properties: (i) the wing rigidity with relatively thick veins and (ii) the compactness in stored shapes with complex crease patterns. The detailed wing-folding process revealed in this study is expected to facilitate understanding of the naturally optimized system in this excellent deployable structure.

  5. Investigation of hindwing folding in ladybird beetles by artificial elytron transplantation and microcomputed tomography

    PubMed Central

    Nomura, Shuhei; Yamamoto, Shuhei; Niiyama, Ryuma; Okabe, Yoji

    2017-01-01

    Ladybird beetles are high-mobility insects and explore broad areas by switching between walking and flying. Their excellent wing transformation systems enabling this lifestyle are expected to provide large potential for engineering applications. However, the mechanism behind the folding of their hindwings remains unclear. The reason is that ladybird beetles close the elytra ahead of wing folding, preventing the observation of detailed processes occurring under the elytra. In the present study, artificial transparent elytra were transplanted on living ladybird beetles, thereby enabling us to observe the detailed wing-folding processes. The result revealed that in addition to the abdominal movements mentioned in previous studies, the edge and ventral surface of the elytra, as well as characteristic shaped veins, play important roles in wing folding. The structures of the wing frames enabling this folding process and detailed 3D shape of the hindwing were investigated using microcomputed tomography. The results showed that the tape spring-like elastic frame plays an important role in the wing transformation mechanism. Compared with other beetles, hindwings in ladybird beetles are characterized by two seemingly incompatible properties: (i) the wing rigidity with relatively thick veins and (ii) the compactness in stored shapes with complex crease patterns. The detailed wing-folding process revealed in this study is expected to facilitate understanding of the naturally optimized system in this excellent deployable structure. PMID:28507159

  6. A New Simulation Framework for the Electron-Ion Collider

    NASA Astrophysics Data System (ADS)

    Arrington, John

    2017-09-01

    Last year, a collaboration between Physics Division and High-Energy Physics at Argonne was formed to enable significantly broader contributions to the development of the Electron-Ion Collider. This includes efforts in accelerator R&D, theory, simulations, and detector R&D. I will give a brief overview of the status of these efforts, with emphasis on the aspects aimed at enabling the community to more easily become involved in evaluation of physics, detectors, and details of spectrometer designs. We have put together a new, easy-to-use simulation framework using flexible software tools. The goal is to enable detailed simulations to evaluate detector performance and compare detector designs. In addition, a common framework capable of providing detailed simulations of different spectrometer designs will allow for fully consistent evaluations of the physics reach of different spectrometer designs or detector systems for a variety of physics channels. Finally, new theory efforts will provide self-consistent models of GPDs (including QCD evolution) and TMDs in nucleons and light nuclei, as well as providing more detailed physics input for the evaluation of some new observables. This material is based upon work supported by Laboratory Directed Research and Development (LDRD) funding from Argonne National Laboratory, provided by the Director, Office of Science, of the U.S. Department of Energy under Contract DE-AC02-06CH11357.

  7. Exploring charge density analysis in crystals at high pressure: data collection, data analysis and advanced modelling.

    PubMed

    Casati, Nicola; Genoni, Alessandro; Meyer, Benjamin; Krawczuk, Anna; Macchi, Piero

    2017-08-01

    The possibility to determine electron-density distribution in crystals has been an enormous breakthrough, stimulated by a favourable combination of equipment for X-ray and neutron diffraction at low temperature, by the development of simplified, though accurate, electron-density models refined from the experimental data and by the progress in charge density analysis often in combination with theoretical work. Many years after the first successful charge density determination and analysis, scientists face new challenges, for example: (i) determination of the finer details of the electron-density distribution in the atomic cores, (ii) simultaneous refinement of electron charge and spin density or (iii) measuring crystals under perturbation. In this context, the possibility of obtaining experimental charge density at high pressure has recently been demonstrated [Casati et al. (2016). Nat. Commun. 7, 10901]. This paper reports on the necessities and pitfalls of this new challenge, focusing on the species syn-1,6:8,13-biscarbonyl[14]annulene. The experimental requirements, the expected data quality and data corrections are discussed in detail, including warnings about possible shortcomings. At the same time, new modelling techniques are proposed, which could enable specific information to be extracted from the limited and less accurate observations, such as the degree of localization of double bonds, which is fundamental to the scientific case under examination.

  8. Low-cost inflatable lighter-than-air surveillance system for civilian applications

    NASA Astrophysics Data System (ADS)

    Kiddy, Jason S.; Chen, Peter C.; Niemczuk, John B.

    2002-08-01

    Today's society places an extremely high price on the value of human life and injury. Whenever possible, police and paramilitary actions are directed towards saving as many lives as possible, whether of the officer, the perpetrator, or innocent civilians. Recently, the advent of robotic systems has enabled law enforcement agencies to perform many of the most dangerous aspects of their jobs from relative safety. This is especially true for bomb disposal units, but it is also gaining acceptance in other areas. An area where small, remotely operated machines may prove effective is local aerial surveillance. Currently, the only aerial surveillance assets generally available to law enforcement agencies are costly helicopters. Unfortunately, most of the recently developed unmanned air vehicles (UAVs) are directed towards military applications and have limited civilian use. Systems Planning and Analysis, Inc. (SPA) has conceived and performed a preliminary analysis of a low-cost, inflatable, lighter-than-air surveillance system that may be used in a number of military and law enforcement surveillance situations. The preliminary analysis includes the concept definition, a detailed trade study to determine the optimal configuration of the surveillance system, high-pressure inflation tests, and a control analysis. This paper provides details of the design in these areas and offers insight into the feasibility of such a system.

  9. Novel spectral imaging system combining spectroscopy with imaging applications for biology

    NASA Astrophysics Data System (ADS)

    Malik, Zvi; Cabib, Dario; Buckwald, Robert A.; Garini, Yuval; Soenksen, Dirk G.

    1995-02-01

    A novel analytical spectral-imaging system and its results in the examination of biological specimens are presented. The SpectraCube 1000 system measures the transmission, absorbance, or fluorescence spectra of images studied by light microscopy. The system is based on an interferometer combined with a CCD camera, enabling measurement of the interferogram for each pixel of the image. Fourier transformation of the interferograms derives pixel-by-pixel spectra for 170 X 170 pixels of the image. A special `similarity mapping' program has been developed, enabling spectral algorithms to be applied to all the spatial and spectral information measured by the system in the image. By comparing the spectrum of each pixel in the specimen with a selected reference spectrum (similarity mapping), the system depicts the spatial distribution of macromolecules possessing the characteristics of the reference spectrum. The system has been applied to analyses of bone marrow blood cells as well as fluorescent specimens, and has revealed information which could not be unveiled by other techniques. Similarity mapping has enabled visualization of fine details of chromatin packing in the nucleus of cells and other cytoplasmic compartments. Fluorescence analysis by the system has enabled the determination of porphyrin concentrations and distribution in cytoplasmic organelles of living cells.
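    The similarity-mapping step — scoring each pixel's measured spectrum against a chosen reference spectrum — can be sketched generically. The cosine-similarity metric and the toy data below are illustrative assumptions, not the SpectraCube's actual algorithm.

```python
# Per-pixel spectral similarity map: every pixel of a spectral
# image cube holds a spectrum, scored against a reference
# spectrum with normalized correlation (cosine similarity).
# Toy 2x2 image with 3 spectral bands; values are hypothetical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def similarity_map(cube, reference):
    """cube: rows x cols grid of spectra (lists of band intensities)."""
    return [[cosine_similarity(pixel, reference) for pixel in row]
            for row in cube]

reference = [1.0, 0.5, 0.1]          # hypothetical reference spectrum
cube = [[[1.0, 0.5, 0.1], [0.1, 0.5, 1.0]],
        [[2.0, 1.0, 0.2], [0.0, 1.0, 0.0]]]
smap = similarity_map(cube, reference)
print(smap)
```

    Pixels whose spectra match the reference (even when scaled in intensity, as in the lower-left pixel) score 1.0; thresholding the map then highlights where the reference material is distributed.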

  10. An application of the geophysical methods and ALS DTM for the identification of the geological structure in the Kraśnik region - Lublin Upland, Poland

    NASA Astrophysics Data System (ADS)

    Kamiński, Mirosław

    2017-11-01

    The purpose of the study was the assessment of the viability of selected geophysical methods and of Airborne Laser Scanning (ALS) for the identification and interpretation of the geological structure. The studied area is covered with a dense forest. For this reason, the ALS numerical terrain model was applied for the analysis of the topography. Three geophysical methods were used: gravimetry, in the form of a semi-detailed gravimetric survey, Vertical Electrical Sounding (VES), and Electrical Resistivity Tomography (ERT). The numerical terrain model enabled the identification of Jurassic limestone outcrops and interpretation of the directions of the faults network. The geological interpretation of the digitally processed gravimetric data enabled the determination of the spatial orientation of the syncline and anticline axes and of the course directions of main faults. Vertical Electrical Sounding carried out along the section line perpendicular to the Gościeradów anticline axis enabled the interpretation of the lithology of this structure and identification of its complex tectonic structure. The shallow geophysical surveys using the ERT method enabled the estimation of the thickness of Quaternary formations deposited unconformably on the highly eroded Jurassic limestone outcrop. The lithology of Quaternary, Cretaceous and Jurassic rocks was also interpreted.

  11. [Big data approaches in psychiatry: examples in depression research].

    PubMed

    Bzdok, D; Karrer, T M; Habel, U; Schneider, F

    2017-11-29

    The study and treatment of depression are complicated by heterogeneous etiological mechanisms and various comorbidities. With the growing trend towards big data in psychiatry, research and therapy can increasingly target the individual patient. This novel objective requires special methods of analysis. The possibilities and challenges of the application of big data approaches in depression are examined in closer detail. Examples are given to illustrate the possibilities of big data approaches in depression research. Modern machine learning methods are compared to traditional statistical methods in terms of their potential in applications to depression. Big data approaches are particularly suited to the analysis of detailed observational data, the prediction of single data points or several clinical variables and the identification of endophenotypes. A current challenge lies in the transfer of results into the clinical treatment of patients with depression. Big data approaches enable biological subtypes in depression to be identified and predictions in individual patients to be made. They have enormous potential for prevention, early diagnosis, treatment choice and prognosis of depression as well as for treatment development.

  12. Revealing the structural nature of the Cd isotopes

    NASA Astrophysics Data System (ADS)

    Garrett, P. E.; Diaz Varela, A.; Green, K. L.; Jamieson, D. S.; Jigmeddorj, B.; Wood, J. L.; Yates, S. W.

    2015-10-01

    The even-even Cd isotopes have provided fertile ground for the investigation of collectivity in nuclei. Soon after the development of the Bohr model, the stable Cd isotopes were identified as nearly harmonic vibrators based on their excitation energy patterns. The measurements of enhanced B(E2) values appeared to support this interpretation. Shape co-existing rotational-like intruder bands were discovered, and mixing between the configurations was invoked to explain the deviation of the decay pattern of multiphonon vibrational states. Very recently, a detailed analysis of the low-lying levels of 110Cd, combining results of the (n,n'γ) reaction and high-statistics β decay, provided strong evidence that the mixing between configurations is weak, except for the ground-state band and "Kπ = 0+" intruder band. The analysis of the levels in 110Cd has now been extended to 3 MeV, and combined with data for 112Cd and previous Coulomb excitation data for 114Cd, enables a detailed map of the E2 collectivity in these nuclei, demanding a complete re-interpretation of the structure of the stable Cd isotopes.

  13. Guidelines for reporting and using prediction tools for genetic variation analysis.

    PubMed

    Vihinen, Mauno

    2013-02-01

    Computational prediction methods are widely used for the analysis of human genome sequence variants and their effects on gene/protein function, splice site aberration, pathogenicity, and disease risk. New methods are frequently developed. We believe that guidelines are essential for those writing articles about new prediction methods, as well as for those applying these tools in their research, so that the necessary details are reported. This will enable readers to gain the full picture of technical information, performance, and interpretation of results, and to facilitate comparisons of related methods. Here, we provide instructions on how to describe new methods, report datasets, and assess the performance of predictive tools. We also discuss what details of predictor implementation are essential for authors to understand. Similarly, these guidelines for the use of predictors provide instructions on what needs to be delineated in the text, as well as how researchers can avoid unwarranted conclusions. They are applicable to most prediction methods currently utilized. By applying these guidelines, authors will help reviewers, editors, and readers to more fully comprehend prediction methods and their use. © 2012 Wiley Periodicals, Inc.

  14. Modeling Progressive Failure of Bonded Joints Using a Single Joint Finite Element

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.; Bednarcyk, Brett A.

    2010-01-01

    Enhanced finite elements are elements with an embedded analytical solution that can capture detailed local fields, enabling more efficient, mesh-independent finite element analysis. In the present study, an enhanced finite element is applied to generate a general framework capable of modeling an array of joint types. The joint field equations are derived using the principle of minimum potential energy, and the resulting solutions for the displacement fields are used to generate shape functions and a stiffness matrix for a single joint finite element. This single finite element thus captures the detailed stress and strain fields within the bonded joint, but it can function within a broader structural finite element model. The costs associated with a fine mesh of the joint can thus be avoided while still obtaining a detailed solution for the joint. Additionally, the capability to model nonlinear adhesive constitutive behavior has been included within the method, and progressive failure of the adhesive can be modeled by using a strain-based failure criterion and re-sizing the joint as the adhesive fails. Results of the model compare favorably with experimental and finite element results.

  15. Requirements' Role in Mobilizing and Enabling Design Conversation

    NASA Astrophysics Data System (ADS)

    Bergman, Mark

    Requirements play a critical role in the design conversation around systems and products. Product and system design exists at the crossroads of problems, solutions, and requirements. Requirements contextualize problems and solutions, pointing the way to feasible outcomes. These are captured with models and detailed specifications. Still, stakeholders need to be able to understand one another using shared design representations in order to mobilize bias and transform knowledge towards legitimized, desired results. Many modern modeling languages, including UML, as well as detailed, logic-based specifications, are beyond the comprehension of key stakeholders. Hence, they inhibit, rather than promote, design conversation. Improved design boundary objects (DBOs), especially design requirements boundary objects (DRBOs), need to be created and refined to improve communication between principals. Four key features of design boundary objects that improve and promote design conversation are discussed in detail. A systems analysis and design case study is presented which demonstrates these features in action. It describes how a small team of analysts worked with key stakeholders to mobilize and guide a complex system design discussion towards an unexpected, yet desired, outcome within a short time frame.

  16. Enabling Rapid Naval Architecture Design Space Exploration

    NASA Technical Reports Server (NTRS)

    Mueller, Michael A.; Dufresne, Stephane; Balestrini-Robinson, Santiago; Mavris, Dimitri

    2011-01-01

    Well-accepted conceptual ship design tools can be used to explore a design space, but more precise results can be found using detailed models in full-featured computer-aided design programs. However, defining a detailed model can be a time-intensive task, and hence there is an incentive for time-sensitive projects to use conceptual design tools to explore the design space. In this project, the combination of advanced aerospace systems design methods and an accepted conceptual design tool facilitates the creation of a tool that enables the user not only to visualize ship geometry but also to determine design feasibility and estimate the performance of a design.

  17. Jahn-Teller versus quantum effects in the spin-orbital material LuVO3

    DOE PAGES

    Skoulatos, M.; Toth, S.; Roessli, B.; ...

    2015-04-13

    In this article, we report combined neutron and resonant x-ray scattering results identifying the nature of the spin-orbital ground state and magnetic excitations in LuVO3 as driven by the orbital parameter. In particular, we distinguish between models based on orbital-Peierls dimerization, taken as a signature of quantum effects in orbitals, and Jahn-Teller distortions, in favor of the latter. To solve this long-standing puzzle, polarized neutron beams were employed to resolve details of the magnetic structure, which allowed quantitative intensity analysis of extended magnetic-excitation data sets. The results of this detailed study enabled us to draw definite conclusions about the classical versus quantum behavior of orbitals in this system and to discard previous claims that quantum effects dominate the orbital physics of LuVO3 and similar systems.

  18. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  19. Analysis of the predictive qualities of betting odds and FIFA World Ranking: evidence from the 2006, 2010 and 2014 Football World Cups.

    PubMed

    Wunderlich, Fabian; Memmert, Daniel

    2016-12-01

    The present study aims to investigate a new framework that enables more detailed model-based predictions to be derived from ranking systems. These predictions were compared to predictions from the betting market, using data from the World Cups 2006, 2010, and 2014. The results revealed that the FIFA World Ranking has substantially improved its predictive qualities relative to the betting market since its mode of calculation was changed in 2006. While both predictors were useful for obtaining accurate predictions in general, the world ranking was able to outperform the betting market significantly for the World Cup 2014 and when the data from the World Cups 2010 and 2014 were pooled. Our new framework can be extended in future research to more detailed prediction tasks (i.e., predicting the final scores of a match or the tournament progress of a team).
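
    A natural building block for comparing rankings against the betting market is converting bookmaker odds into implied probabilities. The sketch below is illustrative only (invented odds values, not the authors' framework): decimal odds are inverted, then normalized to remove the bookmaker's margin (the "overround").

```python
def implied_probabilities(decimal_odds):
    """Convert decimal betting odds to normalized implied probabilities."""
    raw = [1.0 / o for o in decimal_odds]   # raw inverse odds
    overround = sum(raw)                    # > 1 because of the bookmaker margin
    return [r / overround for r in raw]

# hypothetical home/draw/away odds for a single match
probs = implied_probabilities([1.80, 3.60, 4.50])
print([round(p, 3) for p in probs])  # -> [0.526, 0.263, 0.211]
```

    Normalizing by the overround is necessary because the raw inverse odds deliberately sum to more than 1; without it, the implied "probabilities" would overstate every outcome.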

  20. LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience

    NASA Astrophysics Data System (ADS)

    Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.

    2016-12-01

    CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible; it also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/~lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
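
    To make "machine-readable metadata" concrete, a LiPD-style record pairs every data column with its own descriptive metadata, so software can locate variables by name rather than prompting the user. The record below is a hypothetical, heavily simplified sketch; the field names echo but do not reproduce the exact LiPD schema.

```python
# hypothetical, simplified LiPD-style record (illustrative field names)
record = {
    "dataSetName": "ExampleLake.Core1",
    "archiveType": "lake sediment",
    "paleoData": {
        "columns": [
            {"variableName": "depth", "units": "cm", "values": [0.5, 1.0, 1.5]},
            {"variableName": "d18O", "units": "permil", "values": [-3.1, -2.8, -3.4]},
        ]
    },
}

def get_column(rec, name):
    """Locate a variable by name instead of guessing column order."""
    for col in rec["paleoData"]["columns"]:
        if col["variableName"] == name:
            return col
    return None

print(get_column(record, "d18O")["units"])  # -> permil
```

    Because units and variable names travel with the data, an importer never has to ask the user what each column means.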

  1. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  2. Observation model and parameter partials for the JPL geodetic (GPS) modeling software 'GPSOMC'

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.

    1990-01-01

    The physical models employed in GPSOMC, the modeling module of the GIPSY software system developed at JPL for analysis of geodetic Global Positioning Satellite (GPS) measurements are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities with their counterparts in the computer programs. The present version is the second revision of the original document which it supersedes. The modeling is expanded to provide the option of using Cartesian station coordinates; parameters for the time rates of change of universal time and polar motion are also introduced.

  3. Global optimization of small bimetallic Pd-Co binary nanoalloy clusters: a genetic algorithm approach at the DFT level.

    PubMed

    Aslan, Mikail; Davis, Jack B A; Johnston, Roy L

    2016-03-07

    The global optimisation of small bimetallic Pd-Co binary nanoalloys is systematically investigated using the Birmingham Cluster Genetic Algorithm (BCGA). The effect of size and composition on the structures, stability, and magnetic and electronic properties, including the binding energies, second finite difference energies, and mixing energies of Pd-Co binary nanoalloys, is discussed. A detailed analysis of Pd-Co structural motifs and segregation effects is also presented. The maximal mixing energy corresponds to Pd atom compositions for which the number of mixed Pd-Co bonds is maximised. Global minimum clusters are distinguished from transition states by vibrational frequency analysis. HOMO-LUMO gap, electric dipole moment, and vibrational frequency analyses are made to enable correlation with future experiments.

  4. Phage diabody repertoires for selection of large numbers of bispecific antibody fragments.

    PubMed

    McGuinness, B T; Walter, G; FitzGerald, K; Schuler, P; Mahoney, W; Duncan, A R; Hoogenboom, H R

    1996-09-01

    Methods for the generation of large numbers of different bispecific antibodies are presented. Cloning strategies are detailed to create repertoires of bispecific diabody molecules with variability on one or both of the antigen binding sites. This diabody format, when combined with the power of phage display technology, allows the generation and analysis of thousands of different bispecific molecules. Selection for binding presumably also selects for more stable diabodies. Phage diabody libraries enable screening or selection of the best combination bispecific molecule with regards to affinity of binding, epitope recognition and pairing before manufacture of the best candidate.

  5. Teaching practice of the course of Laser Principle and Application based on PBL mode

    NASA Astrophysics Data System (ADS)

    Li, Yongliang; Lv, Beibei; Wang, Siqi

    2017-08-01

    The primary task of university education is to stimulate students' autonomous learning and cultivate students' creative thinking. This paper applies the problem-based learning (PBL) teaching mode, with the goal of enabling students to master knowledge flexibly, and gives a detailed analysis of the implementation methods and concrete measures of PBL teaching reform in the course of Laser Principle and Application, comparing them with the former teaching methods. From student feedback and teaching experience, we obtained a good teaching effect and demonstrated the feasibility of the PBL teaching mode in practice.

  6. ADDJUST-A View of the First 25 Years

    NASA Technical Reports Server (NTRS)

    Nieberding, Joe; Williams, Craig H.

    2015-01-01

    Various technologies and innovative launch operations were developed during the 50 years of the Centaur upper stage, the first launch vehicle to use high-performing liquid hydrogen fuel. One innovation was “ADDJUST”, which enabled the successful negotiation of upper-level winds measured only hours before launch. The initial causes for its creation, its development, and its operation during countdown are detailed. Problem definition, the wind measuring/monitoring process, pitch and yaw steering coefficient generation, loads analysis, angle of attack, major risks/concerns, and anecdotal recollections are provided. Launch availability improved from as low as 55 percent to 95 percent due to ADDJUST, which is still in use.

  7. Multielement mapping of alpha-SiC by scanning Auger microscopy

    NASA Technical Reports Server (NTRS)

    Browning, Ray; Smialek, James L.; Jacobson, Nathan S.

    1987-01-01

    Fine second-phase particles, numerous in sintered alpha-SiC, were analyzed by scanning Auger microscopy and conventional techniques. The Auger analysis utilized computer-controlled data acquisition, multielement correlation diagrams, and a high spatial resolution of 100 nm. This procedure enabled construction of false color maps and the detection of fine compositional details within these particles. Carbon, silicon oxide, and boron-rich particles (qualitatively as BN or B4C) predominated. The BN particles, sometimes having a carbon core, are believed to result from reaction between B4C additives and nitrogen sintering atmospheres.

  8. Feasibility study of a novel miniaturized spectral imaging system architecture in UAV surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Shuyang; Zhou, Tao; Jia, Xiaodong; Cui, Hushan; Huang, Chengjun

    2016-01-01

    Spectral imaging technology can analyze the spectral and spatial geometric character of a target at the same time. To break through the limitations imposed by the size, weight, and cost of traditional spectral imaging instruments, a novel miniaturized spectral imager based on CMOS processing has been introduced to the market. This technology has opened the possibility of applying spectral imaging on UAV platforms. In this paper, the relevant technology and related possible applications are presented, aimed at implementing a quick, flexible, and more detailed remote sensing system.

  9. Using 3D Spectroscopy to Probe the Orbital Structure of Composite Bulges

    NASA Astrophysics Data System (ADS)

    Erwin, Peter; Saglia, Roberto; Thomas, Jens; Fabricius, Maximilian; Bender, Ralf; Rusli, Stephanie; Nowak, Nina; Beckman, John E.; Vega Beltrán, Juan Carlos

    2015-02-01

    Detailed imaging and spectroscopic analysis of the centers of nearby S0 and spiral galaxies shows the existence of "composite bulges", where both classical bulges and disky pseudobulges coexist in the same galaxy. As part of a search for supermassive black holes in nearby galaxy nuclei, we obtained VLT-SINFONI observations in adaptive-optics mode of several of these galaxies. Schwarzschild dynamical modeling enables us to disentangle the stellar orbital structure of the different central components, and to distinguish the differing contributions of kinematically hot (classical bulge) and kinematically cool (pseudobulge) components in the same galaxy.

  10. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis, and comparisons with existing multiple access techniques. This technique enables multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection in the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.

  11. Clustering and Network Analysis of Reverse Phase Protein Array Data.

    PubMed

    Byron, Adam

    2017-01-01

    Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
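
    The first of the two approaches, hierarchical cluster analysis, groups samples whose expression profiles are most similar. The toy single-linkage sketch below uses invented profiles and a brute-force merge loop; real RPPA workflows would use dedicated statistical software rather than this illustration.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(profiles, k):
    """Agglomeratively merge the closest clusters until k remain."""
    clusters = [[name] for name in profiles]  # each sample starts alone
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the closest members
                d = min(euclidean(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# invented profiles (rows: samples, columns: antibody signals)
profiles = {
    "s1": [1.0, 0.9, 0.1],
    "s2": [0.9, 1.1, 0.2],
    "s3": [0.1, 0.2, 1.0],
    "s4": [0.2, 0.1, 0.9],
}
print(sorted(sorted(c) for c in single_linkage(profiles, 2)))
# -> [['s1', 's2'], ['s3', 's4']]
```

    The dendrogram produced by a full implementation simply records the order and distance of these merges, which is what reveals the "similar patterns of protein expression" described above.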

  12. Cognitive task analysis of network analysts and managers for network situational awareness

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn

    2010-01-01

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.

  13. Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel

    NASA Astrophysics Data System (ADS)

    Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying

    2018-05-01

    Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.

  14. Integral equation and discontinuous Galerkin methods for the analysis of light-matter interaction

    NASA Astrophysics Data System (ADS)

    Baczewski, Andrew David

    Light-matter interaction is among the most enduring interests of the physical sciences. The understanding and control of this physics is of paramount importance to the design of myriad technologies ranging from stained glass, to molecular sensing and characterization techniques, to quantum computers. The development of complex engineered systems that exploit this physics is predicated at least partially upon in silico design and optimization that properly capture the light-matter coupling. In this thesis, the details of computational frameworks that enable this type of analysis, based upon both Integral Equation and Discontinuous Galerkin formulations will be explored. There will be a primary focus on the development of efficient and accurate software, with results corroborating both. The secondary focus will be on the use of these tools in the analysis of a number of exemplary systems.

  15. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large-format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available sources (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.

  16. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    PubMed

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during the sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function, based on sequence read quality (Phred) scores, was applied to public whole human genome sequencing data, and we confirmed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. The software will therefore be especially useful for controlling the quality of variant calls in low-population cells, e.g., cancers, in samples affected by technical errors of the sequencing procedures.
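
    The read-quality scores behind such filtering are typically Phred scores stored one ASCII character per base (Phred+33 in current FASTQ files). As a simplified sketch of quality-based cleaning (not SUGAR's actual implementation), reads can be decoded and screened by mean quality:

```python
def phred_scores(quality_string, offset=33):
    """Decode an ASCII-encoded FASTQ quality string into Phred scores."""
    return [ord(c) - offset for c in quality_string]

def passes_filter(quality_string, min_mean=20):
    """Keep a read only if its mean Phred score reaches the threshold."""
    scores = phred_scores(quality_string)
    return sum(scores) / len(scores) >= min_mean

print(passes_filter("IIIIIIII"))  # 'I' = Phred 40 -> True
print(passes_filter("!!!!!!!!"))  # '!' = Phred 0  -> False
```

    A Phred score Q corresponds to an error probability of 10^(-Q/10), so a threshold of 20 keeps reads whose bases have, on average, at most a 1% chance of being miscalled.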

  17. 78 FR 64925 - Request for Comments on Proposed Elimination of Patents Search Templates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... is a detailed, collaborative, and dynamic system that will enable patent examiners and the public to... launched in January 2013. CPC is a detailed, dynamic classification system that is based on the IPC and... updating. Further, the USPTO launched a new classification system, the Cooperative Patent Classification...

  18. 8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN TRACKS RESIDUAL CHLORINE; INDICATES AMOUNT OF SUNLIGHT WHICH ENABLES OPERATOR TO ESTIMATE NEEDED CHLORINE; CENTER SCREEN SHOWS TURNOUT STRUCTURES; RIGHT SCREEN SHOWS INDICATORS OF ALUMINUM SULFATE TANK FARM. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goltz, G.; Kaiser, L.M.; Weiner, H.

    A major mission of the U.S. Coast Guard is the task of providing and maintaining Maritime Aids to Navigation. These aids are located on and near the coastline and inland waters of the United States and its possessions. A computer program, Design Synthesis and Performance Analysis (DSPA), has been developed by the Jet Propulsion Laboratory to demonstrate the feasibility of low-cost solar array/battery power systems for use on flashing lamp buoys. To provide detailed, realistic temperature, wind, and solar insolation data for analysis of the flashing lamp buoy power systems, two DSPA support computer program sets, MERGE and STAT, were developed. A general description of these two packages is presented in this program summary report. The MERGE program set enables the Coast Guard to combine temperature and wind velocity data (NOAA TDF-14 tapes) with solar insolation data (NOAA DECK-280 tapes) onto a single sequential MERGE file containing up to 12 years of hourly observations. This MERGE file can then be used as direct input to the DSPA program. The STAT program set enables a statistical analysis of the MERGE data to be performed, producing high, low, or mean profiles of the data and/or a worst-case analysis. The STAT output file consists of a one-year set of hourly statistical weather data which can be used as input to the DSPA program.
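
    The MERGE/STAT pipeline described above can be sketched in miniature: join two hourly record streams on their timestamps, then compute summary statistics over the merged file. The values below are invented, and the code is an illustration of the idea rather than the original programs:

```python
# hour -> (temperature in deg C, wind speed in m/s); invented sample values
temp_wind = {
    "1975-06-01T00": (15.0, 3.2),
    "1975-06-01T01": (14.5, 2.8),
}
# hour -> solar insolation in W/m^2; invented sample values
insolation = {
    "1975-06-01T00": 0.0,
    "1975-06-01T01": 0.0,
}

# MERGE step: one record per hour holding all three observations
merged = {h: temp_wind[h] + (insolation[h],)
          for h in temp_wind if h in insolation}

# STAT step: mean profile of each field across the merged file
n = len(merged)
means = tuple(sum(rec[i] for rec in merged.values()) / n for i in range(3))
print(means)
```

    Keying both streams on the same hourly timestamp is what makes the join trivial; the same loop could instead collect minima or maxima to produce the high and low profiles mentioned above.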

  20. Broadening the horizon – level 2.5 of the HUPO-PSI format for molecular interactions

    PubMed Central

    Kerrien, Samuel; Orchard, Sandra; Montecchi-Palazzi, Luisa; Aranda, Bruno; Quinn, Antony F; Vinod, Nisha; Bader, Gary D; Xenarios, Ioannis; Wojcik, Jérôme; Sherman, David; Tyers, Mike; Salama, John J; Moore, Susan; Ceol, Arnaud; Chatr-aryamontri, Andrew; Oesterheld, Matthias; Stümpflen, Volker; Salwinski, Lukasz; Nerothin, Jason; Cerami, Ethan; Cusick, Michael E; Vidal, Marc; Gilson, Michael; Armstrong, John; Woollard, Peter; Hogue, Christopher; Eisenberg, David; Cesareni, Gianni; Apweiler, Rolf; Hermjakob, Henning

    2007-01-01

    Background Molecular interaction information is a key resource in modern biomedical research. Publicly available data have previously been provided in a broad array of diverse formats, making access to them very difficult. The publication and wide implementation of the Human Proteome Organisation Proteomics Standards Initiative Molecular Interactions (HUPO PSI-MI) format in 2004 was a major step towards the establishment of a single, unified format by which molecular interactions should be presented, but it focused purely on protein-protein interactions. Results The HUPO-PSI has further developed the PSI-MI XML schema to enable the description of interactions between a wider range of molecular types, for example nucleic acids, chemical entities, and molecular complexes. Extensive details about each supported molecular interaction can now be captured, including the biological role of each molecule within that interaction, detailed description of interacting domains, and the kinetic parameters of the interaction. The format is supported by data management and analysis tools and has been adopted by major interaction data providers. Additionally, a simpler, tab-delimited format, MITAB2.5, has been developed for the benefit of users who require only minimal information in an easy-to-access configuration. Conclusion The PSI-MI XML2.5 and MITAB2.5 formats have been jointly developed by interaction data producers and providers from both the academic and commercial sectors, and are already widely implemented and well supported by an active development community. PSI-MI XML2.5 enables the description of highly detailed molecular interaction data and facilitates data exchange between databases and users without loss of information. MITAB2.5 is a simpler format appropriate for fast Perl parsing or loading into Microsoft Excel. PMID:17925023

  1. In Situ Resource Utilization For Mobility In Mars Exploration

    NASA Astrophysics Data System (ADS)

    Hartman, Leo

    There has been considerable interest in the unmanned exploration of Mars for quite some time but the current generation of rovers can explore only a small portion of the total planetary surface. One approach to addressing this deficiency is to consider a rover that has greater range and that is cheaper so that it can be deployed in greater numbers. The option explored in this paper uses the wind to propel a rover platform, trading off precise navigation for greater range. The capabilities of such a rover lie between the global perspective of orbiting satellites and the detailed local analysis of current-generation rovers. In particular, the design includes two inflatable wheels with an unspun payload platform suspended between them. Slightly deflating one of the wheels enables steering away from the direction of the wind and sufficiently deflating both wheels will allow the rover to stop. Current activities revolve around the development of a prototype with a wheel cross-sectional area that is scaled by 1/100 to enable terrestrial trials to provide meaningful insight into the performance and behavior of a full-sized rover on Mars. The paper will discuss the design and its capabilities in more detail as well as current efforts to build a prototype suitable for deployment at a Mars analogue site such as Devon Island in the Canadian arctic.
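
    The wind-propulsion idea rests on matching aerodynamic drag between the two atmospheres. A back-of-the-envelope check (the density values are typical textbook figures assumed here, not taken from the paper) shows why an area scaling on the order of 1/100 can make terrestrial trials representative:

```python
# Back-of-the-envelope check of the area scaling for terrestrial trials.
# Density values are assumed typical figures, not from the paper.
RHO_MARS = 0.020    # kg/m^3, typical Mars surface air density
RHO_EARTH = 1.225   # kg/m^3, Earth sea-level air density

def drag_force(rho, area, speed, cd=1.2):
    """Aerodynamic drag F = 1/2 * rho * Cd * A * v^2."""
    return 0.5 * rho * cd * area * speed ** 2

# For equal drag at equal wind speed, the terrestrial prototype's
# cross-sectional area must shrink by the density ratio, which lands
# in the same order of magnitude as the 1/100 scaling quoted above.
area_scale = RHO_MARS / RHO_EARTH
print(f"area scale ~ 1/{1 / area_scale:.0f}")
```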

  2. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  3. Dendrites and Pits: Untangling the Complex Behavior of Lithium Metal Anodes through Operando Video Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Kevin N.; Kazyak, Eric; Chadwick, Alexander F.

    Enabling ultra-high energy density rechargeable Li batteries would have widespread impact on society. However, the critical challenges of Li metal anodes (most notably cycle life and safety) remain unsolved. This is attributed to the evolution of Li metal morphology during cycling, which leads to dendrite growth and surface pitting. Herein, we present a comprehensive understanding of the voltage variations observed during Li metal cycling, which is directly correlated to morphology evolution through the use of operando video microscopy. A custom-designed visualization cell was developed to enable operando synchronized observation of Li metal electrode morphology and electrochemical behavior during cycling. A mechanistic understanding of the complex behavior of these electrodes is gained through correlation with continuum-scale modeling, which provides insight into the dominant surface kinetics. Our work provides a detailed explanation of (1) when dendrite nucleation occurs, (2) how those dendrites evolve as a function of time, (3) when surface pitting occurs during Li electrodissolution, (4) kinetic parameters that dictate overpotential as the electrode morphology evolves, and (5) how this understanding can be applied to evaluate electrode performance in a variety of electrolytes. Our results provide detailed insight into the interplay between morphology and the dominant electrochemical processes occurring on the Li electrode surface through an improved understanding of changes in cell voltage, which represents a powerful new platform for analysis.

  4. Dendrites and Pits: Untangling the Complex Behavior of Lithium Metal Anodes through Operando Video Microscopy

    DOE PAGES

    Wood, Kevin N.; Kazyak, Eric; Chadwick, Alexander F.; ...

    2015-10-14

    Enabling ultra-high energy density rechargeable Li batteries would have widespread impact on society. However, the critical challenges of Li metal anodes (most notably cycle life and safety) remain unsolved. This is attributed to the evolution of Li metal morphology during cycling, which leads to dendrite growth and surface pitting. Herein, we present a comprehensive understanding of the voltage variations observed during Li metal cycling, which is directly correlated to morphology evolution through the use of operando video microscopy. A custom-designed visualization cell was developed to enable operando synchronized observation of Li metal electrode morphology and electrochemical behavior during cycling. A mechanistic understanding of the complex behavior of these electrodes is gained through correlation with continuum-scale modeling, which provides insight into the dominant surface kinetics. Our work provides a detailed explanation of (1) when dendrite nucleation occurs, (2) how those dendrites evolve as a function of time, (3) when surface pitting occurs during Li electrodissolution, (4) kinetic parameters that dictate overpotential as the electrode morphology evolves, and (5) how this understanding can be applied to evaluate electrode performance in a variety of electrolytes. Our results provide detailed insight into the interplay between morphology and the dominant electrochemical processes occurring on the Li electrode surface through an improved understanding of changes in cell voltage, which represents a powerful new platform for analysis.

  5. Enabling Collaboration and Video Assessment: Exposing Trends in Science Preservice Teachers' Assessments

    ERIC Educational Resources Information Center

    Borowczak, Mike; Burrows, Andrea C.

    2016-01-01

    This article details a new, free resource for continuous video assessment named YouDemo. The tool enables real time rating of uploaded YouTube videos for use in science, technology, engineering, and mathematics (STEM) education and beyond. The authors discuss trends of preservice science teachers' assessments of self- and peer-created videos using…

  6. Medium-throughput processing of whole mount in situ hybridisation experiments into gene expression domains.

    PubMed

    Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes

    2012-01-01

    Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
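
    The boundary-determination step can be sketched under stated assumptions: a synthetic sigmoid profile stands in for a real antero-posterior expression profile, and a smoothed half-maximum crossing stands in for the paper's spline-based boundary fit.

```python
import numpy as np

# Synthetic antero-posterior expression profile: a sigmoid-like domain
# edge at 40% embryo length plus noise (illustrative data only; real
# profiles come from segmented embryo images).
x = np.linspace(0.0, 100.0, 201)              # position, % embryo length
rng = np.random.default_rng(0)
profile = 1.0 / (1.0 + np.exp((x - 40.0) / 2.0)) + rng.normal(0, 0.02, x.size)

def boundary_position(x, y, window=9):
    """Return the position where the smoothed profile falls to half-max."""
    smooth = np.convolve(y, np.ones(window) / window, mode="same")
    half = 0.5 * smooth.max()
    # first sample past the boundary (skip the convolution edge region)
    i = window + np.nonzero(smooth[window:] < half)[0][0]
    # linear interpolation between the two bracketing samples
    return x[i - 1] + (half - smooth[i - 1]) * (x[i] - x[i - 1]) / (smooth[i] - smooth[i - 1])

pos = boundary_position(x, profile)
print(f"domain boundary near {pos:.1f}% embryo length")
```

    Sorting such boundary estimates by gene and time point then yields the time series the pipeline feeds into network inference.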

  7. A stereotaxic, population-averaged T1w ovine brain atlas including cerebral morphology and tissue volumes

    PubMed Central

    Nitzsche, Björn; Frey, Stephen; Collins, Louis D.; Seeger, Johannes; Lobsien, Donald; Dreyer, Antje; Kirsten, Holger; Stoffel, Michael H.; Fonov, Vladimir S.; Boltze, Johannes

    2015-01-01

    Standard stereotaxic reference systems play a key role in human brain studies. Stereotaxic coordinate systems have also been developed for experimental animals including non-human primates, dogs, and rodents. However, they are lacking for other species that are relevant in experimental neuroscience, including sheep. Here, we present a spatial, unbiased ovine brain template with tissue probability maps (TPM) that offer a detailed stereotaxic reference frame for anatomical features and localization of brain areas, thereby enabling inter-individual and cross-study comparability. Three-dimensional data sets from healthy adult Merino sheep (Ovis orientalis aries, 12 ewes and 26 neutered rams) were acquired on a 1.5 T Philips MRI using a T1w sequence. Data were averaged by linear and non-linear registration algorithms. Moreover, animals were subjected to detailed brain volume analysis including examinations with respect to body weight (BW), age, and sex. The created T1w brain template provides an appropriate population-averaged ovine brain anatomy in a spatial standard coordinate system. Additionally, TPM for gray (GM) and white (WM) matter as well as cerebrospinal fluid (CSF) classification enabled automatic prior-based tissue segmentation using statistical parametric mapping (SPM). Overall, a positive correlation of GM volume and BW explained about 15% of the variance of GM while a positive correlation between WM and age was found. Absolute tissue volume differences were not detected; however, ewes showed significantly more GM per body weight than neutered rams. The created framework including spatial brain template and TPM represent a useful tool for unbiased automatic image preprocessing and morphological characterization in sheep. Therefore, the reported results may serve as a starting point for further experimental and/or translational research aiming at in vivo analysis in this species. PMID:26089780

  8. Web design and development for centralize area radiation monitoring system in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Ibrahim, Maslina Mohd; Yussup, Nolida; Haris, Mohd Fauzi; Soh @ Shaari, Syirrazie Che; Azman, Azraf; Razalim, Faizal Azrin B. Abdul; Yapp, Raymond; Hasim, Harzawardi; Aslan, Mohd Dzul Aiman

    2017-01-01

    One of the applications for radiation detectors is area monitoring, which is crucial for safety especially at places where radiation sources are involved. An environmental radiation monitoring system is a professional system that combines flexibility and ease of use for data collection and monitoring. Nowadays, with the growth of technology, devices and equipment can be connected to the network and Internet to enable online data acquisition. This technology enables data from the area monitoring devices to be transmitted to any place and location directly and faster. In Nuclear Malaysia, area radiation monitor devices are located at several selected locations such as laboratories and radiation facilities. This system utilizes Ethernet as a communication medium for data acquisition of the area radiation levels from radiation detectors and stores the data at a server for recording and analysis. This paper discusses the design and development of a website that enables all users in Nuclear Malaysia to access and monitor the radiation level of each radiation detector online in real time. The web design also includes a query feature for historical data from various locations. The communication between the server's software and the web server is discussed in detail in this paper.
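
    The store-and-query path that such a website relies on can be sketched with an in-memory database. The table and field names below are invented for illustration, not taken from the Nuclear Malaysia system.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical server-side store for area-monitor readings; the schema
# (station, timestamp, dose rate) is an assumed simplification.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (station TEXT, ts TEXT, dose_usv_h REAL)")

def record(station, dose_usv_h, ts=None):
    """Store one reading pushed by an area monitor over the network."""
    ts = ts or datetime.now(timezone.utc).isoformat()
    db.execute("INSERT INTO readings VALUES (?, ?, ?)", (station, ts, dose_usv_h))

def history(station, limit=10):
    """History query of the kind backing a per-location web view."""
    cur = db.execute(
        "SELECT ts, dose_usv_h FROM readings WHERE station = ? "
        "ORDER BY ts DESC LIMIT ?", (station, limit))
    return cur.fetchall()

record("LAB-A", 0.12, "2017-01-01T10:00:00")
record("LAB-A", 0.95, "2017-01-01T10:01:00")   # elevated reading
record("LAB-B", 0.10, "2017-01-01T10:00:30")
latest = history("LAB-A", limit=1)[0]
print(latest)   # most recent LAB-A reading
```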

  9. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  10. Specimen preparation, imaging, and analysis protocols for knife-edge scanning microscopy.

    PubMed

    Choe, Yoonsuck; Mayerich, David; Kwon, Jaerock; Miller, Daniel E; Sung, Chul; Chung, Ji Ryang; Huffman, Todd; Keyser, John; Abbott, Louise C

    2011-12-09

    Major advances in high-throughput, high-resolution, 3D microscopy techniques have enabled the acquisition of large volumes of neuroanatomical data at submicrometer resolution. One of the first such instruments producing whole-brain-scale data is the Knife-Edge Scanning Microscope (KESM), developed and hosted in the authors' lab. KESM has been used to section and image whole mouse brains at submicrometer resolution, revealing the intricate details of the neuronal networks (Golgi), vascular networks (India ink), and cell body distribution (Nissl). The use of KESM is restricted neither to the mouse nor to the brain. We have successfully imaged the octopus brain, mouse lung, and rat brain. We are currently working on whole zebrafish embryos. Data like these can greatly contribute to connectomics research; to microcirculation and hemodynamic research; and to stereology research by providing an exact ground truth. In this article, we will describe the pipeline, including specimen preparation (fixing, staining, and embedding), KESM configuration and setup, sectioning and imaging with the KESM, image processing, data preparation, and data visualization and analysis. The emphasis will be on specimen preparation and visualization/analysis of obtained KESM data. We expect the detailed protocol presented in this article to help broaden the access to KESM and increase its utilization.

  11. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell B Loss of offsite power fault transient.
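
    The kind of step-by-step data exchange such a linkage program coordinates can be illustrated with a toy explicit coupling loop. The two "codes" below are invented one-line stand-ins with made-up feedback coefficients, not RELAP5 or PANTHER physics.

```python
# Toy explicit coupling loop in the spirit of a code-linkage program:
# at each step the neutronics stub receives fuel temperature and
# returns power, which drives the next thermal-hydraulics update.
# All coefficients are invented for illustration.
def neutronics_step(fuel_temp, power):
    # Doppler-like feedback: power relaxes toward a temperature-set level
    target = 100.0 - 0.05 * (fuel_temp - 300.0)
    return power + 0.5 * (target - power)

def thermal_hydraulics_step(fuel_temp, power):
    # fuel temperature relaxes toward a power-dependent equilibrium
    target = 300.0 + 4.0 * power
    return fuel_temp + 0.2 * (target - fuel_temp)

power, fuel_temp = 100.0, 300.0
for _ in range(200):                     # explicit time marching
    power = neutronics_step(fuel_temp, power)
    fuel_temp = thermal_hydraulics_step(fuel_temp, power)
print(f"steady state: power={power:.1f}, fuel_temp={fuel_temp:.1f}")
```

    The coupled system settles where both feedback relations are simultaneously satisfied, which is exactly the consistency an interface code must maintain between its component models.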

  12. Situational Awareness Issues in the Implementation of Datalink: Shared Situational Awareness in the Joint Flight Deck-ATC Aviation System

    NASA Technical Reports Server (NTRS)

    Hansman, Robert John, Jr.

    1999-01-01

    MIT has investigated Situational Awareness issues relating to the implementation of Datalink in the Air Traffic Control environment for a number of years under this grant activity. This work has investigated: 1) The Effect of "Party Line" Information. 2) The Effect of Datalink-Enabled Automated Flight Management Systems (FMS) on Flight Crew Situational Awareness. 3) The Effect of Cockpit Display of Traffic Information (CDTI) on Situational Awareness During Close Parallel Approaches. 4) Analysis of Flight Path Management Functions in Current and Future ATM Environments. 5) Human Performance Models in Advanced ATC Automation: Flight Crew and Air Traffic Controllers. 6) CDTI of Datalink-Based Intent Information in Advanced ATC Environments. 7) Shared Situational Awareness between the Flight Deck and ATC in Datalink-Enabled Environments. 8) Analysis of Pilot and Controller Shared SA Requirements & Issues. 9) Development of Robust Scenario Generation and Distributed Simulation Techniques for Flight Deck ATC Simulation. 10) Methods of Testing Situation Awareness Using Testable Response Techniques. The work is detailed in specific technical reports that are listed in the following bibliography, and are attached as an appendix to the master final technical report.

  13. Online analysis: Deeper insights into water quality dynamics in spring water.

    PubMed

    Page, Rebecca M; Besmer, Michael D; Epting, Jannis; Sigrist, Jürg A; Hammes, Frederik; Huggenberger, Peter

    2017-12-01

    We have studied the dynamics of water quality in three karst springs taking advantage of new technological developments that enable high-resolution measurements of bacterial load (total cell concentration: TCC) as well as online measurements of abiotic parameters. We developed a novel data analysis approach, using self-organizing maps and non-linear projection methods, to approximate the TCC dynamics using the multivariate data sets of abiotic parameter time-series, thus providing a method that could be implemented in an online water quality management system for water suppliers. The TCC data, obtained over several months, provided a good basis to study the microbiological dynamics in detail. Alongside the TCC measurements, online abiotic parameter time-series, including spring discharge, turbidity, spectral absorption coefficient at 254 nm (SAC254) and electrical conductivity, were obtained. High-density sampling over an extended period of time, i.e. every 45 min for 3 months, allowed a detailed analysis of the dynamics in karst spring water quality. Substantial increases in both the TCC and the abiotic parameters followed precipitation events in the catchment area. Differences between the parameter fluctuations were only apparent when analyzed at a high temporal scale. Spring discharge was always the first to react to precipitation events in the catchment area. Lag times between the onset of precipitation and a change in discharge varied between 0.2 and 6.7 h, depending on the spring and event. TCC mostly reacted second or approximately concurrent with turbidity and SAC254, whereby the fastest observed reaction in the TCC time series occurred after 2.3 h. The methodological approach described here enables a better understanding of bacterial dynamics in karst springs, which can be used to estimate risks and management options to avoid contamination of the drinking water. Copyright © 2017 Elsevier B.V. All rights reserved.
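
    The lag-time estimation between a driver and a responding parameter can be sketched by cross-correlation on synthetic 45-min series. The event timing, delay, and amplitudes below are invented; this is a stand-in for the study's analysis, not its actual method.

```python
import numpy as np

# Synthetic 45-min time series: discharge responds to one precipitation
# pulse with a known delay, which we recover by lagged correlation.
STEP_H = 0.75                      # sampling interval: 45 minutes
n = 400
precip = np.zeros(n)
precip[100:104] = 1.0              # one rain event (invented)
lag_steps = 6                      # true delay: 6 steps = 4.5 h
discharge = np.roll(precip, lag_steps) * 3.0 + 0.1   # delayed, scaled response

def estimate_lag_hours(driver, response, step_h=STEP_H, max_lag=40):
    """Lag (hours) at which `response` best correlates with past `driver`."""
    d = driver - driver.mean()
    r = response - response.mean()
    corrs = [float(np.dot(d[: d.size - k], r[k:])) for k in range(max_lag)]
    return int(np.argmax(corrs)) * step_h

print(estimate_lag_hours(precip, discharge))
```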

  14. Observation model and parameter partials for the JPL geodetic GPS modeling software GPSOMC

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.; Border, J. S.

    1988-01-01

    The physical models employed in GPSOMC and the modeling module of the GIPSY software system developed at JPL for analysis of geodetic Global Positioning Satellite (GPS) measurements are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities in the current report with their counterparts in the computer programs. There are no basic model revisions, with the exceptions of an improved ocean loading model and some new options for handling clock parametrization. Such misprints as were discovered were corrected. Further revisions include modeling improvements and assurances that the model description is in accord with the current software.
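
    The range observable and its position partial take the standard one-way form shown below. This is textbook notation assumed for illustration; GPSOMC's full model adds the media, clock, and Earth-orientation terms detailed in the report.

```latex
% One-way range observable (textbook form) and its station-position partial:
\rho = \left\lVert \mathbf{r}^{s}(t-\tau) - \mathbf{r}_{r}(t) \right\rVert
     + c\,\bigl(\delta t_{r} - \delta t^{s}\bigr)
     + \Delta\rho_{\mathrm{trop}} + \Delta\rho_{\mathrm{iono}},
\qquad
\frac{\partial \rho}{\partial \mathbf{r}_{r}} = -\,\hat{\mathbf{u}},
\quad
\hat{\mathbf{u}} =
  \frac{\mathbf{r}^{s}-\mathbf{r}_{r}}{\lVert \mathbf{r}^{s}-\mathbf{r}_{r}\rVert}.
```

    Here \(\mathbf{r}^{s}\) and \(\mathbf{r}_{r}\) are satellite and receiver positions, \(\tau\) the light time, and \(\delta t_{r}, \delta t^{s}\) the receiver and satellite clock offsets; the clock partial is simply \(\partial\rho/\partial(\delta t_{r}) = c\).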

  15. 3-D characterization of weathered building limestones by high resolution synchrotron X-ray microtomography.

    PubMed

    Rozenbaum, O

    2011-04-15

    Understanding the weathering processes of building stones and more generally of their transfer properties requires detailed knowledge of the porosity characteristics. This study aims at analyzing three-dimensional images obtained by X-ray microtomography of building stones. In order to validate these new results a weathered limestone previously characterised (Rozenbaum et al., 2007) by two-dimensional image analysis was selected. The 3-D images were analysed by a set of mathematical tools that enable the description of the pore and solid phase distribution. Results show that 3-D image analysis is a powerful technique to characterise the morphological, structural and topological differences due to weathering. The paper also discusses criteria for mathematically determining whether a stone is weathered or not. Copyright © 2011 Elsevier B.V. All rights reserved.
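
    The kind of 3-D pore-space descriptors such an analysis computes can be sketched on a toy binary volume. Random voxels stand in for a segmented microtomography reconstruction, and the two measures below are deliberately simple proxies for the paper's morphological and topological tools.

```python
import numpy as np

# Toy 3-D binary volume (1 = pore, 0 = solid); a real dataset would be
# a segmented X-ray microtomography reconstruction, not random noise.
rng = np.random.default_rng(1)
volume = (rng.random((40, 40, 40)) < 0.25).astype(np.uint8)

def porosity(vol):
    """Pore volume fraction, the most basic descriptor of the pore phase."""
    return float(vol.mean())

def specific_surface(vol):
    """Crude pore-solid interface estimate: 6-neighbour pore/solid face
    pairs per unit volume (voxel units)."""
    faces = 0
    for axis in range(3):
        a = np.swapaxes(vol, 0, axis)
        faces += int(np.count_nonzero(a[1:] != a[:-1]))
    return faces / vol.size

p = porosity(volume)
s = specific_surface(volume)
print(f"porosity={p:.3f}, specific surface={s:.3f}")
```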

  16. A multi-phase network situational awareness cognitive task analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understand what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.

  17. cyvcf2: fast, flexible variant analysis with Python.

    PubMed

    Pedersen, Brent S; Quinlan, Aaron R

    2017-06-15

    Variant call format (VCF) files document the genetic variation observed after DNA sequencing, alignment and variant calling of a sample cohort. Given the complexity of the VCF format as well as the diverse variant annotations and genotype metadata, there is a need for fast, flexible methods enabling intuitive analysis of the variant data within VCF and BCF files. We introduce cyvcf2, a Python library and software package for fast parsing and querying of VCF and BCF files and illustrate its speed, simplicity and utility. bpederse@gmail.com or aaronquinlan@gmail.com. cyvcf2 is available from https://github.com/brentp/cyvcf2 under the MIT license and from common Python package managers. Detailed documentation is available at http://brentp.github.io/cyvcf2/. © The Author 2017. Published by Oxford University Press.
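
    The kind of filter-and-extract query cyvcf2 is built to accelerate can be sketched in plain Python over a tiny in-memory VCF. The records below are invented; cyvcf2 itself exposes each row as a typed Variant object rather than raw text.

```python
# Plain-Python sketch of a VCF filtering query. The three records are
# invented placeholder data following the VCF column layout.
VCF_TEXT = """\
##fileformat=VCFv4.2
#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO
1\t10177\trs0000001\tA\tAC\t100\tPASS\tDP=250
1\t10352\t.\tT\tTA\t12\tq10\tDP=30
2\t45895\trs0000002\tG\tC\t99\tPASS\tDP=180
"""

def passing_variants(text, min_qual=50.0):
    """Yield (chrom, pos, ref, alt) for PASS records above a QUAL cutoff."""
    for line in text.splitlines():
        if line.startswith("#"):
            continue                          # skip meta and header lines
        chrom, pos, _id, ref, alt, qual, flt, _info = line.split("\t")
        if flt == "PASS" and float(qual) >= min_qual:
            yield chrom, int(pos), ref, alt

hits = list(passing_variants(VCF_TEXT))
print(hits)   # the two PASS records survive the filter
```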

  18. Measurement Protocols for In situ Analysis of Organic Compounds at Mars and Comets

    NASA Technical Reports Server (NTRS)

    Mahaffy, P. R.; Brinckerhoff, W. B.; Buch, A.; Cabane, M.; Coll, P.; Demick, J.; Glavin, D. P.; Navarro-Gonzalez, R.

    2005-01-01

    The determination of the abundance and chemical and isotopic composition of organic molecules in comets and those that might be found in protected environments at Mars is a first step toward understanding prebiotic chemistries on these solar system bodies. While future sample return missions from Mars and comets will enable detailed chemical and isotopic analysis with a wide range of analytical techniques, precursor in situ investigations can complement these missions and facilitate the identification of optimal sites for sample return. Robust automated experiments that make efficient use of limited spacecraft power, mass, and data volume resources are required for use by in situ missions. Within these constraints we continue to explore a range of instrument techniques and measurement protocols that can maximize the return from such in situ investigations.

  20. A topological multilayer model of the human body.

    PubMed

    Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João

    2015-11-04

    Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information. Their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that allows an interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique of internal anatomical structures. The expanded identification function and the new neighbourhood analysis function are the tools provided in this model.

  1. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
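
    The core idea of treating an analysis step as a database query can be sketched with an in-memory relational store. The schema below is invented for illustration; the system described uses a much richer schema over fMRI time series.

```python
import sqlite3

# Hypothetical minimal schema: one row per BOLD time-series sample,
# keyed by subject and region, so that an analysis step becomes a
# declarative SQL query rather than ad hoc file parsing.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE bold (subject TEXT, region TEXT, t INTEGER, signal REAL)")
rows = [("s01", "STG", t, 100 + t % 3) for t in range(6)] + \
       [("s02", "STG", t, 110 + t % 3) for t in range(6)]
db.executemany("INSERT INTO bold VALUES (?, ?, ?, ?)", rows)

# "Analysis as a query": per-subject mean signal in one region
means = db.execute(
    "SELECT subject, AVG(signal) FROM bold "
    "WHERE region = ? GROUP BY subject ORDER BY subject", ("STG",)).fetchall()
print(means)
```

    The same query runs unchanged whether the table holds a dozen rows or millions, which is the scalability argument the article makes for relational storage over binary or text files.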

  2. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  3. A reference Pelton turbine - High speed visualization in the rotating frame

    NASA Astrophysics Data System (ADS)

    Solemslie, Bjørn W.; Dahlhaug, Ole G.

    2016-11-01

    To enable a detailed study of the flow mechanisms affecting the flow within the reference Pelton runner designed at the Waterpower Laboratory (NTNU), a flow visualization system has been developed. The system enables high speed filming of the hydraulic surface of a single bucket in the rotating frame of reference. It is built with an angular borescope adapter entering the turbine along the rotational axis and a borescope embedded within a bucket. A stationary high speed camera located outside the turbine housing has been connected to the optical arrangement by a non-contact coupling. The viewpoint of the system includes the whole hydraulic surface of one half of a bucket. The system has been designed to minimize the amount of vibrations and to ensure that the vibrations felt by the borescope are the same as those affecting the camera. The preliminary results captured with the system are promising and enable a detailed study of the flow within the turbine.

  4. Health Monitoring System Technology Assessments: Cost Benefits Analysis

    NASA Technical Reports Server (NTRS)

    Kent, Renee M.; Murphy, Dennis A.

    2000-01-01

    The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structure health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.

  5. Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry.

    PubMed

    Ejsing, Christer S; Sampaio, Julio L; Surendranath, Vineeth; Duchoslav, Eva; Ekroos, Kim; Klemm, Robin W; Simons, Kai; Shevchenko, Andrej

    2009-02-17

    Although the transcriptome, proteome, and interactome of several eukaryotic model organisms have been described in detail, lipidomes remain relatively uncharacterized. Using Saccharomyces cerevisiae as an example, we demonstrate that automated shotgun lipidomics analysis enabled lipidome-wide absolute quantification of individual molecular lipid species by streamlined processing of a single sample of only 2 million yeast cells. By comparative lipidomics, we achieved the absolute quantification of 250 molecular lipid species covering 21 major lipid classes. This analysis provided approximately 95% coverage of the yeast lipidome, achieved with a 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome. This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches.
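Absolute quantification in shotgun lipidomics typically rests on spiked-in internal standards; a minimal sketch of that intensity-ratio arithmetic follows, with all intensities and amounts invented for illustration:

```python
# Minimal sketch of the internal-standard idea behind absolute quantification:
# each lipid class carries a spiked-in standard of known amount, and analyte
# amounts follow from intensity ratios to that standard. All numbers invented.
def quantify_pmol(analyte_intensity, standard_intensity, standard_pmol):
    """Absolute amount (pmol) from the intensity ratio to the class standard."""
    return analyte_intensity / standard_intensity * standard_pmol

# e.g. a lipid species detected at twice the intensity of 10 pmol of standard:
amount = quantify_pmol(2.0e6, 1.0e6, 10.0)
print(amount, "pmol")  # 20.0 pmol
```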

  6. Analysis of context dependence in social interaction networks of a massively multiplayer online role-playing game.

    PubMed

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in the designing of more realistic future virtual arenas as well as provide novel insights to the science of collective human behavior.
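The multilayer view of interaction networks can be illustrated with a toy example: the same players form different networks per interaction context, and layer-level statistics such as edge density differ across contexts. All players and edges below are invented:

```python
from itertools import combinations

# Illustrative sketch of context dependence: one set of players, two
# interaction layers (e.g. trade vs. combat), and a per-layer edge density.
layers = {
    "trade":  {("a", "b"), ("b", "c"), ("a", "c")},
    "combat": {("a", "b")},
}
players = {"a", "b", "c"}
possible = sum(1 for _ in combinations(players, 2))  # undirected player pairs

density = {context: len(edges) / possible for context, edges in layers.items()}
print(density)  # trade is fully connected here; combat is sparse
```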

  7. Deformation analysis of MEMS structures by modified digital moiré methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwei; Lou, Xinhao; Gao, Jianxin

    2010-11-01

    Quantitative deformation analysis of micro-fabricated electromechanical systems is of importance for the design and functional control of microsystems. In this paper, two modified digital moiré processing methods, a Gaussian blurring algorithm combined with digital phase shifting, and a geometrical phase analysis (GPA) technique based on the digital moiré method, are developed to quantitatively analyse the deformation behaviour of micro-electro-mechanical system (MEMS) structures. Measuring principles and experimental procedures of the two methods are described in detail. A digital moiré fringe pattern is generated by superimposing a specimen grating etched directly on a microstructure surface with a digital reference grating (DRG). Most of the grating noise is removed from the digital moiré fringes, which enables the phase distribution of the moiré fringes to be obtained directly. Strain measurement results for a MEMS structure demonstrate the feasibility of the two methods.

  8. Analysis System for Self-Efficacy Training (ASSET). Assessing treatment fidelity of self-management interventions.

    PubMed

    Zinken, Katarzyna M; Cradock, Sue; Skinner, T Chas

    2008-08-01

    The paper presents the development of a coding tool for self-efficacy orientated interventions in diabetes self-management programmes (Analysis System for Self-Efficacy Training, ASSET) and explores its construct validity and clinical utility. Based on four sources of self-efficacy (i.e., mastery experience, role modelling, verbal persuasion and physiological and affective states), published self-efficacy based interventions for diabetes care were analysed in order to identify specific verbal behavioural techniques. Video-recorded facilitating behaviours were evaluated using ASSET. The reliability between four coders was high (K=0.71). ASSET enabled assessment of both self-efficacy based techniques and participants' response to those techniques. Individual patterns of delivery and shifts over time across facilitators were found. In the presented intervention we observed that self-efficacy utterances were followed by longer patient verbal responses than non-self-efficacy utterances. These detailed analyses with ASSET provide rich data and give the researcher an insight into the underlying mechanism of the intervention process. By providing a detailed description of self-efficacy strategies ASSET can be used by health care professionals to guide reflective practice and support training programmes.
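The reported inter-coder reliability is an agreement coefficient; the paper reports agreement across four coders, but the two-coder case (Cohen's kappa) sketched below conveys the underlying computation. The category labels are illustrative ASSET-style codes, not the instrument's actual coding scheme:

```python
from collections import Counter

# Cohen's kappa: observed agreement corrected for chance agreement.
def cohens_kappa(codes_a, codes_b):
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - chance) / (1 - chance)

a = ["mastery", "mastery", "modelling", "persuasion", "mastery", "modelling"]
b = ["mastery", "modelling", "modelling", "persuasion", "mastery", "modelling"]
print(round(cohens_kappa(a, b), 2))  # 0.74
```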

  9. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. By having a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed phased approach Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex high performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the overall best system performance, without the need for fabricating components to stringent tolerance levels that often can be outside of a fabricator's manufacturing capabilities. A good performance simulation of as built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid the evaluation of whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
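The cost-versus-performance tradeoff described above can be sketched as a toy Monte Carlo tolerance analysis: sample element perturbations from tolerance distributions, optionally apply a compensator adjustment, and estimate the yield at a performance spec. The error model and every number below are invented placeholders, not the paper's analysis:

```python
import random

random.seed(1)

def as_built_error(tol_mm, compensate):
    d1 = random.gauss(0.0, tol_mm)                 # element 1 decenter
    d2 = random.gauss(0.0, tol_mm)                 # element 2 decenter
    focus = -(d1 + d2) / 2 if compensate else 0.0  # crude focus compensator
    # pseudo wavefront error: a compensable term plus a residual term
    return abs(d1 + d2 + 2 * focus) + 0.3 * abs(d1 - d2)

def yield_at_spec(spec, tol_mm, compensate, n=5000):
    """Fraction of simulated builds meeting the performance spec."""
    return sum(as_built_error(tol_mm, compensate) < spec for _ in range(n)) / n

comp = yield_at_spec(spec=0.05, tol_mm=0.05, compensate=True)
no_comp = yield_at_spec(spec=0.05, tol_mm=0.05, compensate=False)
print(comp, no_comp)  # compensation raises the fraction of passing builds
```

Comparing the two yields quantifies how much a compensation strategy relaxes the fabrication tolerances needed to hit a given performance level.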

  10. Rotating permanent magnet excitation for blood flow measurement.

    PubMed

    Nair, Sarath S; Vinodkumar, V; Sreedevi, V; Nagesh, D S

    2015-11-01

    A compact, portable and improved blood flow measurement system for an extracorporeal circuit having a rotating permanent magnetic excitation scheme is described in this paper. The system consists of a set of permanent magnets rotating near blood or any conductive fluid to create a high-intensity alternating magnetic field in it, inducing a sinusoidally varying voltage across the column of fluid. The induced voltage signal is acquired, conditioned and processed to determine its flow rate. Performance analysis shows that a sensitivity of more than 250 mV/lpm can be obtained, which is more than five times higher than that of conventional flow measurement systems. The choice of a rotating permanent magnet instead of an electromagnetic core generates an alternating magnetic field of smooth sinusoidal nature, which in turn reduces switching and interference noises. This greatly reduces the complexity of the electronic circuitry required for processing the signal and enables the flow measuring device to be much less costly, portable, and lightweight. The signal remains steady even with changes in environmental conditions and has an accuracy greater than 95%. This paper also describes the construction details of the prototype, the factors affecting sensitivity and a detailed performance analysis at various operating conditions.
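The sensing principle is the electromagnetic flowmeter relation: the voltage induced across a conducting fluid column is roughly V = B·d·v (field strength times electrode spacing times mean velocity). A back-of-envelope sketch follows; all parameter values are illustrative and not taken from the paper's prototype (which also includes amplification and signal conditioning):

```python
import math

def induced_voltage_mV(flow_lpm, B_tesla, tube_diameter_m):
    area = math.pi * (tube_diameter_m / 2) ** 2      # tube cross-section, m^2
    velocity = (flow_lpm / 1000 / 60) / area         # mean flow velocity, m/s
    return B_tesla * tube_diameter_m * velocity * 1000  # volts -> millivolts

# Raw (pre-amplification) per-lpm voltage for a 0.3 T field, 9.5 mm tube:
sensitivity = induced_voltage_mV(1.0, 0.3, 0.0095)
print(round(sensitivity, 2), "mV/lpm")
```

Because the induced voltage scales linearly with flow rate, the conditioned output can be calibrated directly in lpm.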

  11. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  12. Nanostructured Electrode Materials for Electrochemical Capacitor Applications.

    PubMed

    Choi, Hojin; Yoon, Hyeonseok

    2015-06-02

    The advent of novel organic and inorganic nanomaterials in recent years, particularly nanostructured carbons, conducting polymers, and metal oxides, has enabled the fabrication of various energy devices with enhanced performance. In this paper, we review in detail different nanomaterials used in the fabrication of electrochemical capacitor electrodes and also give a brief overview of electric double-layer capacitors, pseudocapacitors, and hybrid capacitors. From a materials point of view, the latest trends in electrochemical capacitor research are also discussed through extensive analysis of the literature and by highlighting notable research examples (published mostly since 2013). Finally, a perspective on next-generation capacitor technology is also given, including the challenges that lie ahead.

  13. The influence of carrier dynamics on double-state lasing in quantum dot lasers at variable temperature

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Omelchenko, A. V.; Maximov, M. V.

    2014-12-01

    It is shown in analytical form that carrier capture from the matrix, as well as carrier dynamics in the quantum dots, plays an important role in the double-state lasing phenomenon. In particular, the de-synchronization of hole and electron capture allows one to describe the recently observed quenching of ground-state lasing, which takes place in quantum dot lasers operating in the double-state lasing regime at high injection. On the other hand, the detailed analysis of charge carrier dynamics in a single quantum dot enables one to describe the observed light-current characteristics and key temperature dependences.

  14. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
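The convergence testing described above can be sketched generically: evaluate an exposure-like quantity with an increasing number of discrete rays until doubling the ray count no longer changes the result. The "shield thickness" along each ray below is a toy function, not 3DHZETRN or an actual vehicle geometry:

```python
import math

def thickness(theta):
    return 10.0 + 5.0 * math.cos(theta)   # areal density along theta, g/cm^2

def dose_proxy(n_rays):
    # average attenuation exp(-t/L) over n_rays equally spaced directions
    L = 20.0
    return sum(math.exp(-thickness(2 * math.pi * i / n_rays) / L)
               for i in range(n_rays)) / n_rays

prev, n = None, 8
while True:
    cur = dose_proxy(n)
    if prev is not None and abs(cur - prev) < 1e-6:
        break   # doubling the ray count no longer changes the result
    prev, n = cur, n * 2
print(n, cur)
```

A real geometry converges far more slowly than this smooth toy model, which is why systematically increasing the ray count matters in practice.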

  15. Using the CPTAC Assay Portal to identify and implement highly characterized targeted proteomics assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.

    2016-02-12

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The position of the peptide analytes for which there are available assays are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and post-translational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.

  16. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning.

    PubMed

    Davatzikos, Christos

    2016-10-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. Copyright © 2016. Published by Elsevier B.V.

  17. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.

  18. Modern Focused-Ion-Beam-Based Site-Specific Specimen Preparation for Atom Probe Tomography.

    PubMed

    Prosa, Ty J; Larson, David J

    2017-04-01

    Approximately 30 years after the first use of focused ion beam (FIB) instruments to prepare atom probe tomography specimens, this technique has grown to be used by hundreds of researchers around the world. This past decade has seen tremendous advances in atom probe applications, enabled by the continued development of FIB-based specimen preparation methodologies. In this work, we provide a short review of the origin of the FIB method and the standard methods used today for lift-out and sharpening, using the annular milling method as applied to atom probe tomography specimens. Key steps for enabling correlative analysis with transmission electron-beam backscatter diffraction, transmission electron microscopy, and atom probe tomography are presented, and strategies for preparing specimens for modern microelectronic device structures are reviewed and discussed in detail. Examples are used for discussion of the steps for each of these methods. We conclude with examples of the challenges presented by complex topologies such as nanowires, nanoparticles, and organic materials.

  19. The Near Earth Object (NEO) Scout Spacecraft: A Low-cost Approach to In-situ Characterization of the NEO Population

    NASA Technical Reports Server (NTRS)

    Woeppel, Eric A.; Balsamo, James M.; Fischer, Karl J.; East, Matthew J.; Styborski, Jeremy A.; Roche, Christopher A.; Ott, Mackenzie D.; Scorza, Matthew J.; Doherty, Christopher D.; Trovato, Andrew J.

    2014-01-01

    This paper describes a microsatellite spacecraft with supporting mission profile and architecture, designed to enable preliminary in-situ characterization of a significant number of Near Earth Objects (NEOs) at reasonably low cost. The spacecraft will be referred to as the NEO-Scout. NEO-Scout spacecraft are to be placed in Geosynchronous Equatorial Orbit (GEO), cis-lunar space, or on Earth escape trajectories as secondary payloads on launch vehicles headed for GEO or beyond, and will begin their mission after deployment from the launcher. A key distinguishing feature of the NEO-Scout system is to design the spacecraft and mission timeline so as to enable rendezvous with and landing on the target NEO during NEO close approach (<0.3 AU) to the Earth-Moon system using low-thrust/high-impulse propulsion systems. Mission durations are on the order of 100 to 400 days. Mission feasibility and preliminary design analysis are presented, along with detailed trajectory calculations.

  20. The UCSC Genome Browser: What Every Molecular Biologist Should Know

    PubMed Central

    Mangan, Mary E.; Williams, Jennifer M.; Kuhn, Robert M.; Lathe, Warren C.

    2014-01-01

    Electronic data resources can enable molecular biologists to quickly get information from around the world that a decade ago would have been buried in papers scattered throughout the library. The ability to access, query, and display these data make benchwork much more efficient and drive new discoveries. Increasingly, mastery of software resources and corresponding data repositories is required to fully explore the volume of data generated in biomedical and agricultural research, because only small amounts of data are actually found in traditional publications. The UCSC Genome Browser provides a wealth of data and tools that advance understanding of genomic context for many species, enable detailed analysis of data, and provide the ability to interrogate regions of interest across disparate data sets from a wide variety of sources. Researchers can also supplement the standard display with their own data to query and share this with others. Effective use of these resources has become crucial to biological research today, and this unit describes some practical applications of the UCSC Genome Browser. PMID:24984850

  1. BEST: barcode enabled sequencing of tetrads.

    PubMed

    Scott, Adrian C; Ludlow, Catherine L; Cromie, Gareth A; Dudley, Aimée M

    2014-05-01

    Tetrad analysis is a valuable tool for yeast genetics, but the laborious manual nature of the process has hindered its application on large scales. Barcode Enabled Sequencing of Tetrads (BEST) replaces the manual processes of isolating, disrupting and spacing tetrads. BEST isolates tetrads by virtue of a sporulation-specific GFP fusion protein that permits fluorescence-activated cell sorting of tetrads directly onto agar plates, where the ascus is enzymatically digested and the spores are disrupted and randomly arrayed by glass bead plating. The haploid colonies are then assigned sister spore relationships, i.e. information about which spores originated from the same tetrad, using molecular barcodes read during genotyping. By removing the bottleneck of manual dissection, hundreds or even thousands of tetrads can be isolated in minutes. Here we present a detailed description of the experimental procedures required to perform BEST in the yeast Saccharomyces cerevisiae, starting with a heterozygous diploid strain through the isolation of colonies derived from the haploid meiotic progeny.
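The sister-spore assignment step reduces to grouping: each haploid colony carries a molecular barcode identifying its tetrad of origin, so grouping colonies by barcode reconstructs tetrads. A minimal sketch, with invented colony names and barcode sequences:

```python
from collections import defaultdict

# Group sequenced colonies by their tetrad-of-origin barcode.
colonies = [
    ("colony_01", "ACGT"), ("colony_02", "ACGT"),
    ("colony_03", "ACGT"), ("colony_04", "ACGT"),
    ("colony_05", "TTGA"), ("colony_06", "TTGA"),
]
tetrads = defaultdict(list)
for name, barcode in colonies:
    tetrads[barcode].append(name)

# Keep only tetrads for which all four sister spores were recovered.
complete = {bc: members for bc, members in tetrads.items() if len(members) == 4}
print(complete)  # only the ACGT group forms a complete 4-spore tetrad here
```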

  2. Using the CPTAC Assay Portal to Identify and Implement Highly Characterized Targeted Proteomics Assays.

    PubMed

    Whiteaker, Jeffrey R; Halusa, Goran N; Hoofnagle, Andrew N; Sharma, Vagisha; MacLean, Brendan; Yan, Ping; Wrobel, John A; Kennedy, Jacob; Mani, D R; Zimmerman, Lisa J; Meyer, Matthew R; Mesri, Mehdi; Boja, Emily; Carr, Steven A; Chan, Daniel W; Chen, Xian; Chen, Jing; Davies, Sherri R; Ellis, Matthew J C; Fenyö, David; Hiltke, Tara; Ketchum, Karen A; Kinsinger, Chris; Kuhn, Eric; Liebler, Daniel C; Liu, Tao; Loss, Michael; MacCoss, Michael J; Qian, Wei-Jun; Rivers, Robert; Rodland, Karin D; Ruggles, Kelly V; Scott, Mitchell G; Smith, Richard D; Thomas, Stefani; Townsend, R Reid; Whiteley, Gordon; Wu, Chaochao; Zhang, Hui; Zhang, Zhen; Rodriguez, Henry; Paulovich, Amanda G

    2016-01-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The position of the peptide analytes for which there are available assays are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and posttranslational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.

  3. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning

    PubMed Central

    Davatzikos, Christos

    2017-01-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. PMID:27514582

  4. Effectiveness of orthodontic miniscrew implants in anchorage reinforcement during en-masse retraction: A systematic review and meta-analysis.

    PubMed

    Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata

    2017-03-01

    The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants-temporary intraoral skeletal anchorage devices (TISADs)-in anchorage reinforcement during en-masse retraction in relation to conventional methods of anchorage. A search of PubMed, Embase, Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines and the data extracted for statistical analysis. A meta-analysis of raw mean differences concerning anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally resulted in 14 articles including 616 patients (451 female, 165 male) for detailed analysis. Quality of the included studies was assessed as moderate. Meta-analysis showed that use of TISADs facilitates better anchorage reinforcement compared with conventional methods. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods (P <0.001). The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable drawing more reliable conclusions. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
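A meta-analysis of raw mean differences of this kind is typically an inverse-variance pooling; a hedged fixed-effect sketch follows. The study values below are invented, not the review's data:

```python
import math

def pooled_mean_difference(studies):
    """Fixed-effect pooling. studies: list of (mean_difference_mm, standard_error_mm)."""
    weights = [1 / se ** 2 for _, se in studies]          # inverse-variance weights
    md = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))                      # standard error of the pooled MD
    return md, se

studies = [(1.5, 0.4), (2.1, 0.3), (1.9, 0.5)]
md, se = pooled_mean_difference(studies)
print(round(md, 2), "+/-", round(se, 2), "mm")
```

More precise studies receive larger weights, which is why study quality assessment matters so much for the pooled estimate.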

  5. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663

  6. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
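Cost projection for a cloud pipeline of this kind decomposes into storage, compute, and I/O terms; a sketch of that arithmetic follows. All unit prices and sample sizes are invented placeholders, not actual AWS rates or the paper's figures:

```python
def project_cost(samples, gb_per_sample, hours_per_sample,
                 storage_per_gb_month=0.10, compute_per_hour=0.50,
                 io_per_gb=0.05, months=1):
    """Split projected cost into storage, compute, and I/O components."""
    storage = samples * gb_per_sample * storage_per_gb_month * months
    compute = samples * hours_per_sample * compute_per_hour
    io = samples * gb_per_sample * io_per_gb * 2   # transfer in + out
    return {"storage": storage, "compute": compute, "io": io,
            "total": storage + compute + io}

costs = project_cost(samples=100, gb_per_sample=8, hours_per_sample=3)
print(costs)
```

Varying the inputs shows which term dominates at scale, the kind of projection the authors provide for whole exome capture data.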

  7. Enabling High-performance Interactive Geoscience Data Analysis Through Data Placement and Movement Optimization

    NASA Astrophysics Data System (ADS)

    Zhu, F.; Yu, H.; Rilee, M. L.; Kuo, K. S.; Yu, L.; Pan, Y.; Jiang, H.

    2017-12-01

    Since the establishment of data archive centers and the standardization of file formats, scientists have been required to search metadata catalogs for the data they need and download the data files to their local machines to carry out analysis. This approach has facilitated data discovery and access for decades, but it inevitably leads to data transfer from archive centers to scientists' computers through low-bandwidth Internet connections, and data transfer becomes a major performance bottleneck. Combined with generally constrained local compute and storage resources, this limits the extent of scientists' studies and deprives them of timely outcomes. Thus, the conventional approach does not scale with respect to either the volume or the variety of geoscience data. A much more viable solution is to couple analysis and storage systems to minimize data transfer. In our study, we compare loosely coupled approaches (exemplified by Spark and Hadoop) and tightly coupled approaches (exemplified by parallel distributed database management systems, e.g., SciDB). In particular, we investigate the optimization of data placement and movement to effectively tackle the variety challenge, and broaden the use of parallelization to address the volume challenge. Our goal is to enable high-performance interactive analysis for a large portion of geoscience data analysis exercises. We show that tightly coupled approaches can concentrate data traffic between local storage systems and compute units, and thereby optimize bandwidth utilization to achieve better throughput. Based on our observations, we develop a geoscience data analysis system that tightly couples analysis engines with storage systems, giving the engines direct access to a detailed map of data partition locations. Through an innovative data partitioning and distribution scheme, our system has demonstrated scalable and interactive performance in real-world geoscience data analysis applications.

  8. Fast interactive exploration of 4D MRI flow data

    NASA Astrophysics Data System (ADS)

    Hennemuth, A.; Friman, O.; Schumann, C.; Bock, J.; Drexl, J.; Huellebrand, M.; Markl, M.; Peitgen, H.-O.

    2011-03-01

    1- or 2-directional MRI blood flow mapping sequences are an integral part of standard MR protocols for diagnosis and therapy control in heart diseases. Recent progress in rapid MRI has made it possible to acquire volumetric, 3-directional cine images in reasonable scan time. In addition to flow and velocity measurements relative to arbitrarily oriented image planes, the analysis of 3-dimensional trajectories enables the visualization of flow patterns, local features of flow trajectories, or possible paths into specific regions. The anatomical and functional information allows for advanced hemodynamic analysis in different application areas such as stroke risk assessment, congenital and acquired heart disease, aneurysms, abdominal collaterals, and cranial blood flow. The complexity of 4D MRI flow datasets and the flow-related image analysis tasks make the development of fast, comprehensive data exploration software for advanced flow analysis a challenging task. Most existing tools address only individual aspects of the analysis pipeline, such as pre-processing, quantification or visualization, or are difficult for clinicians to use. The goal of the presented work is to provide a software solution that supports the whole image analysis pipeline and enables data exploration with fast, intuitive interaction and visualization methods. The implemented methods facilitate the segmentation and inspection of different vascular systems. Arbitrary 2- or 3-dimensional regions for quantitative analysis and particle tracing can be defined interactively. Synchronized views of animated 3D path lines, 2D velocity or flow overlays, and flow curves offer detailed insight into local hemodynamics. The application of the analysis pipeline is shown for 6 cases from clinical practice, illustrating its usefulness for different clinical questions. Initial user tests show that the software is intuitive to learn and even inexperienced users achieve good results within reasonable processing times.

  9. Advanced Sea Base Enabler (ASE) Capstone Design Project

    DTIC Science & Technology

    2009-09-21

    Additionally, a study that examines a potential fleet architecture, which looks at a combination of sea base enabler platforms in order to close current... This change in premise spawned a post-Cold War naval intellectual renaissance, reflected in several Department of the Navy (DON) “white papers... information collected regarding the various systems is reliable. 3. Primary Areas of Focus Detailed engineering analyses, naval architecture or other

  10. Productization and Commercialization of IT-Enabled Higher Education in Computer Science: A Systematic Literature Review

    ERIC Educational Resources Information Center

    Kankaanpää, Irja; Isomäki, Hannakaisa

    2013-01-01

    This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…

  11. A biomechanical modeling guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2017-03-01

    Four-dimensional (4D) cone-beam computed tomography (CBCT) enables motion tracking of anatomical structures and removes artifacts introduced by motion. However, the imaging time/dose of 4D-CBCT is substantially longer/higher than that of traditional 3D-CBCT. We previously developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm to reconstruct high-quality 4D-CBCT from a limited number of projections, reducing the imaging time/dose. However, the accuracy of SMEIR is limited in reconstructing low-contrast regions with fine structural details. In this study, we incorporate biomechanical modeling into the SMEIR algorithm (SMEIR-Bio) to improve the reconstruction accuracy in low-contrast regions with fine details. The efficacy of SMEIR-Bio is evaluated using 11 lung patient cases and compared to that of the original SMEIR algorithm. Qualitative and quantitative comparisons show that SMEIR-Bio greatly enhances the accuracy of the reconstructed 4D-CBCT volume in low-contrast regions, which can potentially benefit multiple clinical applications, including treatment outcome analysis.

  12. Realized volatility and absolute return volatility: a comparison indicating market risk.

    PubMed

    Zheng, Zeyu; Qiao, Zhi; Takaishi, Tetsuya; Stanley, H Eugene; Li, Baowen

    2014-01-01

    Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector, the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility better captures short-term dynamics, allowing predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods.
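    The two measures compared in this record can be computed side by side. A minimal sketch with simulated intraday returns (illustrative only; the paper's data and estimator details are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate 250 trading days of 78 five-minute log returns each.
    intraday = rng.normal(0.0, 0.001, size=(250, 78))

    # Realized volatility: square root of the sum of squared
    # intraday returns within each day.
    realized_vol = np.sqrt((intraday ** 2).sum(axis=1))

    # Absolute return volatility: absolute value of the daily return
    # (the sum of that day's intraday log returns).
    absolute_vol = np.abs(intraday.sum(axis=1))

    print(realized_vol.mean(), absolute_vol.mean())
    ```

    Realized volatility uses the full intraday sample each day, while absolute return volatility needs only the daily close-to-close return, which is why the latter is easier to compute.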

  13. Realized Volatility and Absolute Return Volatility: A Comparison Indicating Market Risk

    PubMed Central

    Takaishi, Tetsuya; Stanley, H. Eugene; Li, Baowen

    2014-01-01

    Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector, the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility better captures short-term dynamics, allowing predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods. PMID:25054439

  14. Process Design and Economics for the Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels. Thermochemical Research Pathways with In Situ and Ex Situ Upgrading of Fast Pyrolysis Vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Abhijit; Sahir, Asad; Tan, Eric

    This report was developed as part of the U.S. Department of Energy’s Bioenergy Technologies Office’s efforts to enable the development of technologies for the production of infrastructure-compatible, cost-competitive liquid hydrocarbon fuels from biomass. Specifically, this report details two conceptual designs based on projected product yields and quality improvements via catalyst development and process integration. It is expected that these research improvements will be made within the 2022 timeframe. The two conversion pathways detailed are (1) in situ and (2) ex situ upgrading of vapors produced from the fast pyrolysis of biomass. While the base case conceptual designs and underlying assumptions outline performance metrics for feasibility, it should be noted that these are only two of many possibilities in this area of research. Other promising process design options emerging from the research will be considered for future techno-economic analysis.

  15. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution and enabling finer details within a region of interest of a sample larger than the field of view to be revealed than by using conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649

  16. SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubarak, M.; Ding, P.; Aliaga, L.

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center’s analysis workflows and corresponding to around 71,000 HEP jobs, as input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular, we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore-based data archive facilities, has been analyzed to develop radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  17. Nanomanipulation-Coupled Matrix-Assisted Laser Desorption/ Ionization-Direct Organelle Mass Spectrometry: A Technique for the Detailed Analysis of Single Organelles

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.

    2016-02-01

    We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for a rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized for both extraction of organelle content and precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique as it enabled longer mass analysis time, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS capabilities of localized lipid content in comparison to NSI-DOMS. This method provides selective organellar resolution, which complements current biochemical analyses and prompts for subsequent subcellular studies to be performed where limited samples and analyte volume are of concern.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

    BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses that are developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to directly integrate their analyses with OpenMSI's web-based viewing infrastructure without having to know OpenMSI. BASTet also provides numerous helper classes and tools that assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.

  19. Launch vehicle design and GNC sizing with ASTOS

    NASA Astrophysics Data System (ADS)

    Cremaschi, Francesco; Winter, Sebastian; Rossi, Valerio; Wiegand, Andreas

    2018-03-01

    The European Space Agency (ESA) is currently involved in several activities related to launch vehicle designs (Future Launcher Preparatory Program, Ariane 6, VEGA evolutions, etc.). Within these activities, ESA has identified the importance of developing a simulation infrastructure capable of supporting the multi-disciplinary design and preliminary guidance navigation and control (GNC) design of different launch vehicle configurations. Astos Solutions has developed the multi-disciplinary optimization and launcher GNC simulation and sizing tool (LGSST) under ESA contract. The functionality is integrated in the Analysis, Simulation and Trajectory Optimization Software for space applications (ASTOS) and is intended to be used from the early design phases up to phase B1 activities. ASTOS shall enable the user to perform detailed vehicle design tasks and assessment of GNC systems, covering all aspects of rapid configuration and scenario management, sizing of stages, trajectory-dependent estimation of structural masses, rigid and flexible body dynamics, navigation, guidance and control, worst case analysis, launch safety analysis, performance analysis, and reporting.

  20. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition and the causality method.

  1. Computational statistics using the Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
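    Tempered MCMC schemes of the kind this record emphasizes can be illustrated on a toy bimodal posterior. This is a minimal parallel-tempering sketch under assumed settings (two temperatures, a made-up double-well log-posterior), not the BIE's actual implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def log_post(x):
        # Toy bimodal log-posterior with modes near x = -1 and x = +1,
        # separated by a barrier a single cold chain rarely crosses.
        return -((x ** 2 - 1.0) ** 2) / 0.05

    betas = [1.0, 0.2]      # cold (target) chain and hot (tempered) chain
    x = [1.0, 1.0]          # both chains start in the right-hand mode
    samples = []

    for step in range(20000):
        # Metropolis update within each chain at its own temperature.
        for i, beta in enumerate(betas):
            prop = x[i] + rng.normal(0.0, 0.5)
            if np.log(rng.uniform()) < beta * (log_post(prop) - log_post(x[i])):
                x[i] = prop
        # Propose swapping the states of the two chains.
        log_ratio = (betas[0] - betas[1]) * (log_post(x[1]) - log_post(x[0]))
        if np.log(rng.uniform()) < log_ratio:
            x[0], x[1] = x[1], x[0]
        samples.append(x[0])

    samples = np.array(samples)
    # The hot chain crosses the barrier freely and swaps feed both
    # modes into the cold chain.
    print((samples < -0.5).mean(), (samples > 0.5).mean())
    ```

    Without the hot chain and swap moves, the cold chain would stay stuck in the mode it started in, which is the failure mode tempering is designed to avoid.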

  2. Mission Operations with an Autonomous Agent

    NASA Technical Reports Server (NTRS)

    Pell, Barney; Sawyer, Scott R.; Muscettola, Nicola; Smith, Benjamin; Bernard, Douglas E.

    1998-01-01

    The Remote Agent (RA) is an Artificial Intelligence (AI) system which automates some of the tasks normally reserved for human mission operators and performs these tasks autonomously on board the spacecraft. These tasks include activity generation, sequencing, spacecraft analysis, and failure recovery. The RA will be demonstrated as a flight experiment on Deep Space One (DS1), the first deep space mission of NASA's New Millennium Program (NMP). As we moved from prototyping into actual flight code development and teamed with ground operators, we made several major extensions to the RA architecture to address the broader operational context in which the RA would be used. These extensions support ground operators and the RA sharing a long-range mission profile with facilities for asynchronous ground updates; support ground operators monitoring and commanding the spacecraft at multiple levels of detail simultaneously; and enable ground operators to provide additional knowledge to the RA, such as parameter updates, model updates, and diagnostic information, without interfering with the activities of the RA or leaving the system in an inconsistent state. The resulting architecture supports incremental autonomy, in which a basic agent can be delivered early and then used in an increasingly autonomous manner over the lifetime of the mission. It also supports variable autonomy, as it enables ground operators to benefit from autonomy when they want it, but does not inhibit them from obtaining a detailed understanding and exercising tighter control when necessary. These issues are critical to the successful development and operation of autonomous spacecraft.

  3. Detailed analysis of the effects of stencil spatial variations with arbitrary high-order finite-difference Maxwell solver

    DOE PAGES

    Vincenti, H.; Vay, J. -L.

    2015-11-22

    Due to discretization effects and truncation to finite domains, many electromagnetic simulations present non-physical modifications of Maxwell's equations in space that may generate spurious signals affecting the overall accuracy of the result. Such modifications for instance occur when Perfectly Matched Layers (PMLs) are used at simulation domain boundaries to simulate open media. Another example is the use of an arbitrary-order Maxwell solver with a domain decomposition technique, which may under some conditions involve stencil truncations at subdomain boundaries, resulting in small spurious errors that eventually build up. In each case, a careful evaluation of the characteristics and magnitude of the errors resulting from these approximations, and their impact at any frequency and angle, requires detailed analytical and numerical studies. To this end, we present a general analytical approach that enables the evaluation of numerical discretization errors of fully three-dimensional, arbitrary-order finite-difference Maxwell solvers, with arbitrary modification of the local stencil in the simulation domain. The analytical model is validated against simulations of the domain decomposition technique and PMLs, when these are used with very high-order Maxwell solvers, as well as in the infinite-order limit of pseudo-spectral solvers. Results confirm that the new analytical approach enables exact predictions in each case. It also confirms that the domain decomposition technique can be used with very high-order Maxwell solvers and a reasonably low number of guard cells with negligible effects on the overall accuracy of the simulation.
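    The discretization-error evaluation this record describes can be illustrated in one dimension with the standard modified-wavenumber analysis of central-difference stencils: applying a stencil to a plane wave exp(ikx) yields i·k* exp(ikx), and the gap between k* and k is the stencil's dispersion error. A minimal sketch using textbook central-difference coefficients (not the paper's full 3D, arbitrary-stencil analysis):

    ```python
    import numpy as np

    # Antisymmetric central-difference coefficients c_j (j = 1..p/2)
    # for the first derivative at orders 2, 4, and 6.
    COEFFS = {
        2: [1 / 2],
        4: [2 / 3, -1 / 12],
        6: [3 / 4, -3 / 20, 1 / 60],
    }

    def modified_wavenumber(k, h, order):
        """Effective wavenumber seen by the stencil on grid spacing h:
        k* = (2/h) * sum_j c_j sin(j k h)."""
        c = COEFFS[order]
        return (2.0 / h) * sum(cj * np.sin((j + 1) * k * h)
                               for j, cj in enumerate(c))

    h = 1.0
    for k in (0.1, 0.5, 1.0):
        errs = {p: abs(modified_wavenumber(k, h, p) - k) for p in COEFFS}
        print(k, errs)  # error shrinks as the stencil order grows
    ```

    Extending this idea to truncated or locally modified stencils, and to all three dimensions and propagation angles, is the kind of evaluation the analytical approach above generalizes.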

  4. Clarifying details on a 1930s-era pine-hardwood stand in Arkansas

    Treesearch

    Don C.  Bragg

    2015-01-01

    Data from recently discovered daily-work logs of US Forest Service (USFS) researcher Russell R. Reynolds enabled me to clarify a study I published a decade ago on a 1930s-vintage unmanaged, second-growth Pinus (pine)–hardwood stand in southeastern Arkansas. Though still too vague to reveal every detail, Reynolds’ work logs confirmed a number of...

  5. The Natural History and Treatment Outcomes of Perineural Spread of Malignancy within the Head and Neck.

    PubMed

    Warren, Timothy A; Nagle, Christina M; Bowman, James; Panizza, Benedict J

    2016-04-01

    Understanding the natural history of a disease enables the clinician to better diagnose and treat patients. Perineural spread of head and neck cancers is poorly understood, and diagnosis is often delayed, resulting in poorer outcomes and more debilitating treatments. This article reviews a large personal series of head and neck malignancy presenting with perineural spread almost exclusively along the trigeminal and/or facial nerves. A detailed analysis of squamous cell carcinoma of cutaneous origin is presented, including an analysis of likely primaries, which most often occurred months to years prior. The importance of early detection is reinforced by the highly significant (p < 0.0001) differences in disease-specific survival that occur depending on how far along a cranial nerve the disease has been allowed to spread.

  6. The Natural History and Treatment Outcomes of Perineural Spread of Malignancy within the Head and Neck

    PubMed Central

    Warren, Timothy A.; Nagle, Christina M.; Bowman, James; Panizza, Benedict J.

    2016-01-01

    Understanding the natural history of a disease enables the clinician to better diagnose and treat patients. Perineural spread of head and neck cancers is poorly understood, and diagnosis is often delayed, resulting in poorer outcomes and more debilitating treatments. This article reviews a large personal series of head and neck malignancy presenting with perineural spread almost exclusively along the trigeminal and/or facial nerves. A detailed analysis of squamous cell carcinoma of cutaneous origin is presented, including an analysis of likely primaries, which most often occurred months to years prior. The importance of early detection is reinforced by the highly significant (p < 0.0001) differences in disease-specific survival that occur depending on how far along a cranial nerve the disease has been allowed to spread. PMID:27123386

  7. Diagnosis of Fanconi Anemia: Chromosomal Breakage Analysis

    PubMed Central

    Oostra, Anneke B.; Nieuwint, Aggie W. M.; Joenje, Hans; de Winter, Johan P.

    2012-01-01

    Fanconi anemia (FA) is a rare inherited syndrome with diverse clinical symptoms including developmental defects, short stature, bone marrow failure, and a high risk of malignancies. Fifteen genetic subtypes have been distinguished so far. The mode of inheritance for all subtypes is autosomal recessive, except for FA-B, which is X-linked. Cells derived from FA patients are—by definition—hypersensitive to DNA cross-linking agents, such as mitomycin C, diepoxybutane, or cisplatinum, which becomes manifest as excessive growth inhibition, cell cycle arrest, and chromosomal breakage upon cellular exposure to these drugs. Here we provide a detailed laboratory protocol for the accurate assessment of the FA diagnosis as based on mitomycin C-induced chromosomal breakage analysis in whole-blood cultures. The method also enables a quantitative estimate of the degree of mosaicism in the lymphocyte compartment of the patient. PMID:22693659

  8. Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter

    PubMed Central

    Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis

    2014-01-01

    Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology the mechanism underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, considering the fact that productivity correlates with pellet size. PMID:24561666

  9. Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2002-01-01

    State-of-the-art computer-aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.

  10. An equation-free approach to agent-based computation: Bifurcation analysis and control of stationary states

    NASA Astrophysics Data System (ADS)

    Siettos, C. I.; Gear, C. W.; Kevrekidis, I. G.

    2012-08-01

    We show how the equation-free approach can be exploited to enable agent-based simulators to perform system-level computations such as bifurcation, stability analysis and controller design. We illustrate these tasks through an event-driven agent-based model describing the dynamic behaviour of many interacting investors in the presence of mimesis. Using short bursts of appropriately initialized runs of the detailed, agent-based simulator, we construct the coarse-grained bifurcation diagram of the (expected) density of agents and investigate the stability of its multiple solution branches. When the mimetic coupling between agents becomes strong enough, the stable stationary state loses its stability at a coarse turning point bifurcation. We also demonstrate how the framework can be used to design a wash-out dynamic controller that stabilizes open-loop unstable stationary states even under model uncertainty.
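    The coarse computations this record describes can be sketched on a toy mimetic model: N agents hold a binary opinion, the coarse variable is the fraction ρ holding opinion 1, and a Newton iteration wrapped around short bursts of the microscopic simulator locates a coarse stationary state that forward simulation alone would never settle on. The model below (agents adopting the majority opinion with a sigmoidal mimesis rule) is a hypothetical illustration, not the paper's investor model:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 20000            # agents per realization
    COUPLING = 6.0       # mimetic coupling strength (bistable regime)

    def micro_burst(rho, replicas=50):
        """Lift a coarse density rho to agent states, run one microscopic
        step (each agent adopts opinion 1 with a mimesis-driven
        probability), and restrict back to the ensemble-averaged density."""
        p = 1.0 / (1.0 + np.exp(-COUPLING * (rho - 0.5)))
        states = rng.uniform(size=(replicas, N)) < p
        return states.mean()

    # Damped Newton iteration on F(rho) = Phi(rho) - rho, with a
    # finite-difference estimate of F' from two extra bursts.
    rho, eps = 0.45, 0.05
    for _ in range(20):
        f = micro_burst(rho) - rho
        fp = (micro_burst(rho + eps) - (rho + eps) - f) / eps
        rho -= 0.5 * f / fp

    # By symmetry this coarse map has an unstable stationary state at
    # rho = 0.5; Newton converges to it even though direct simulation
    # would fall into one of the two stable branches.
    print(rho)
    ```

    Sweeping COUPLING and repeating the fixed-point computation traces out the coarse bifurcation diagram, including the unstable branch, which is the system-level task the equation-free framework enables.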

  11. Analysis of neuronal cells of dissociated primary culture on high-density CMOS electrode array

    PubMed Central

    Matsuda, Eiko; Mita, Takeshi; Hubert, Julien; Bakkum, Douglas; Frey, Urs; Hierlemann, Andreas; Takahashi, Hirokazu; Ikegami, Takashi

    2017-01-01

    Spontaneous development of neuronal cells was recorded over 4–34 days in vitro (DIV) with a high-density CMOS array, which enables detailed study of the spatio-temporal activity of a neuronal culture. We used the CMOS array to characterize the evolution of the inter-spike interval (ISI) distribution from putative single neurons, and to estimate the network structure based on transfer entropy analysis, where each node corresponds to a single neuron. We observed that the ISI distributions gradually obeyed the power law with maturation of the network. The amount of information transferred between neurons increased at the early stage of development, but decreased as the network matured. These results suggest that both ISI and transfer entropy were very useful for characterizing the dynamic development of cultured neural cells over a few weeks. PMID:24109870
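
    The two measures used above, ISI distributions and pairwise transfer entropy, can be sketched for binary spike trains. This is a minimal plug-in estimator with one step of history; the study's estimator and discretization may differ.

```python
import numpy as np

def interspike_intervals(spike_times):
    """ISIs from an array of spike times."""
    return np.diff(np.sort(np.asarray(spike_times)))

def transfer_entropy(x, y):
    """Binned transfer entropy TE(X -> Y) in bits for two binary
    time series, conditioning on one step of the target's past."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    yf, yp, xp = y[1:], y[:-1], x[:-1]      # target future/past, source past
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                p_abc = np.mean((yf == a) & (yp == b) & (xp == c))
                if p_abc == 0:
                    continue
                p_bc = np.mean((yp == b) & (xp == c))
                p_ab = np.mean((yf == a) & (yp == b))
                p_b = np.mean(yp == b)
                # p(a|b,c) versus p(a|b): extra predictability from the source
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te
```

    For a target that simply copies the source with a one-step lag, TE in the source-to-target direction approaches one bit while the reverse direction stays near zero.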

  12. Decision Analysis for Remediation Technologies (DART) user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sebo, D.

    1997-09-01

    This user's manual is an introduction to the use of the Decision Analysis for Remediation Technologies (DART) Report Generator. DART provides a user interface to a database containing site data (e.g., contaminants, waste depth, area) for sites within the Subsurface Contaminant Focus Area (SCFA). The database also contains SCFA requirements, needs, and technology information. The manual is arranged in two major sections. The first section describes loading DART onto a user system. The second section describes DART operation, which is organized into sections by the user interface forms. For each form, user input (both optional and required), DART capabilities, and the results of user selections are covered in sufficient detail to enable the user to understand DART capabilities and determine how to use DART to meet specific needs.

  13. Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2003-01-01

    This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.

  14. Basic as well as detailed neurosonograms can be performed by offline analysis of three-dimensional fetal brain volumes.

    PubMed

    Bornstein, E; Monteagudo, A; Santos, R; Strock, I; Tsymbal, T; Lenchner, E; Timor-Tritsch, I E

    2010-07-01

    To evaluate the feasibility and the processing time of offline analysis of three-dimensional (3D) brain volumes to perform a basic, as well as a detailed, targeted, fetal neurosonogram. 3D fetal brain volumes were obtained in 103 consecutive healthy fetuses that underwent routine anatomical survey at 20-23 postmenstrual weeks. Transabdominal gray-scale and power Doppler volumes of the fetal brain were acquired by one of three experienced sonographers (an average of seven volumes per fetus). Acquisition was first attempted in the sagittal and coronal planes. When the fetal position did not enable easy and rapid access to these planes, axial acquisition at the level of the biparietal diameter was performed. Offline analysis of each volume was performed by two of the authors in a blinded manner. A systematic technique of 'volume manipulation' was used to identify a list of 25 brain dimensions/structures comprising a complete basic evaluation, intracranial biometry and a detailed targeted fetal neurosonogram. The feasibility and reproducibility of obtaining diagnostic-quality images of the different structures was evaluated, and processing times were recorded, by the two examiners. Diagnostic-quality visualization was feasible in all of the 25 structures, with an excellent visualization rate (85-100%) reported in 18 structures, a good visualization rate (69-97%) reported in five structures and a low visualization rate (38-54%) reported in two structures, by the two examiners. An average of 4.3 and 5.4 volumes were used to complete the examination by the two examiners, with a mean processing time of 7.2 and 8.8 minutes, respectively. The overall agreement rate for diagnostic visualization of the different brain structures between the two examiners was 89.9%, with a kappa coefficient of 0.5 (P < 0.001). 
In experienced hands, offline analysis of 3D brain volumes is a reproducible modality that can identify all structures necessary to complete both a basic and a detailed second-trimester fetal neurosonogram. Copyright 2010 ISUOG. Published by John Wiley & Sons, Ltd.
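
    The reported agreement statistics follow Cohen's formula kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e chance agreement. A minimal sketch for per-structure visualized/not-visualized calls (illustrative data only; the statistic is undefined when p_e = 1):

```python
import numpy as np

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical ratings
    (here: visualized / not visualized per structure)."""
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                         # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)   # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)
```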

  15. The Effect of Laminar Flow on Rotor Hover Performance

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Martin, Preston B.

    2017-01-01

    The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element momentum method coupled to an airfoil analysis method, which includes the full e(sup N) transition model. The analysis results compared well with the measured hover performance including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even up to high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
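
    The disk-loading trend noted above follows from momentum theory: ideal induced power in hover is P = T * sqrt(DL / (2 rho)), so power loading T/P falls as sqrt(2 rho / DL). A sketch in imperial units, assuming sea-level density and folding all losses into a figure of merit rather than reproducing the paper's full blade-element analysis:

```python
import math

RHO_SL = 0.002377  # sea-level air density, slug/ft^3

def hover_power_loading(disk_loading_psf, figure_of_merit=1.0):
    """Momentum-theory power loading (lb/hp) versus disk loading
    (lb/ft^2) in hover: PL = 550 * FM * sqrt(2 * rho / DL).
    FM = 1 is the ideal (zero profile drag) limit; 550 converts
    ft-lb/s to horsepower."""
    return 550.0 * figure_of_merit * math.sqrt(2.0 * RHO_SL / disk_loading_psf)
```

    At 10 psf the ideal limit is roughly 12 lb/hp, and halving the disk loading raises power loading by sqrt(2), which is why laminar-flow gains matter most at low disk loading.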

  16. Supply Chain Sustainability Analysis of Whole Algae Hydrothermal Liquefaction and Upgrading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pegallapati, Ambica Koushik; Dunn, Jennifer B.; Frank, Edward D.

    2015-04-01

    The Department of Energy's Bioenergy Technologies Office (BETO) collaborates with a wide range of institutions towards the development and deployment of biofuels and bioproducts. To facilitate this effort, BETO and its partner national laboratories develop detailed techno-economic assessments (TEA) of biofuel production technologies as part of the development of design cases and state of technology (SOT) analyses. A design case is a TEA that outlines a target case for a particular biofuel pathway. It enables preliminary identification of data gaps and research and development needs and provides goals and targets against which technology progress is assessed. An SOT analysis, on the other hand, assesses progress within and across relevant technology areas based on actual experimental results relative to technical targets and cost goals from design cases, and includes technical, economic, and environmental criteria as available.

  17. Precise Tuning of Facile One-Pot Gelatin Methacryloyl (GelMA) Synthesis

    NASA Astrophysics Data System (ADS)

    Shirahama, Hitomi; Lee, Bae Hoon; Tan, Lay Poh; Cho, Nam-Joon

    2016-08-01

    Gelatin-methacryloyl (GelMA) is one of the most commonly used photopolymerizable biomaterials in bio-applications. However, GelMA synthesis remains suboptimal, as its reaction parameters have not been fully investigated. The goal of this study is to establish an optimal route for effective and controllable GelMA synthesis by systematically examining reaction parameters including carbonate-bicarbonate (CB) buffer molarity, initial pH adjustment, MAA concentration, gelatin concentration, reaction temperature, and reaction time. We employed several analytical techniques in order to determine the degree of substitution (DS) and conducted detailed structural analysis of the synthesized polymer. The results enabled us to optimize GelMA synthesis, showing the optimal conditions to balance the deprotonation of amino groups with minimizing MAA hydrolysis, which led to nearly complete substitution. The optimized conditions (low feed ratio of MAA to gelatin (0.1 mL/g), 0.25 M CB buffer at pH 9, and a gelatin concentration of 10-20%) enable a simplified reaction scheme that produces GelMA with high substitution with just one-step addition of MAA in one pot. Looking forward, these optimal conditions not only enable facile one-pot GelMA synthesis but can also guide researchers to explore the efficient, high methacrylation of other biomacromolecules.

  18. Precise Tuning of Facile One-Pot Gelatin Methacryloyl (GelMA) Synthesis

    PubMed Central

    Shirahama, Hitomi; Lee, Bae Hoon; Tan, Lay Poh; Cho, Nam-Joon

    2016-01-01

    Gelatin-methacryloyl (GelMA) is one of the most commonly used photopolymerizable biomaterials in bio-applications. However, GelMA synthesis remains suboptimal, as its reaction parameters have not been fully investigated. The goal of this study is to establish an optimal route for effective and controllable GelMA synthesis by systematically examining reaction parameters including carbonate-bicarbonate (CB) buffer molarity, initial pH adjustment, MAA concentration, gelatin concentration, reaction temperature, and reaction time. We employed several analytical techniques in order to determine the degree of substitution (DS) and conducted detailed structural analysis of the synthesized polymer. The results enabled us to optimize GelMA synthesis, showing the optimal conditions to balance the deprotonation of amino groups with minimizing MAA hydrolysis, which led to nearly complete substitution. The optimized conditions (low feed ratio of MAA to gelatin (0.1 mL/g), 0.25 M CB buffer at pH 9, and a gelatin concentration of 10–20%) enable a simplified reaction scheme that produces GelMA with high substitution with just one-step addition of MAA in one pot. Looking forward, these optimal conditions not only enable facile one-pot GelMA synthesis but can also guide researchers to explore the efficient, high methacrylation of other biomacromolecules. PMID:27503340

  19. The Use of the STAGS Finite Element Code in Stitched Structures Development

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C.; Lovejoy, Andrew E.

    2014-01-01

    In the last 30 years NASA has worked in collaboration with industry to develop enabling technologies needed to make aircraft more fuel-efficient and more affordable. The focus on the airframe has been to reduce weight, improve damage tolerance and better understand structural behavior under realistic flight and ground loading conditions. Stitched structure is a technology that can address the weight savings, cost reduction, and damage tolerance goals, but only if it is supported by accurate analytical techniques. Development of stitched technology began in the 1990s as a partnership between NASA and Boeing (McDonnell Douglas at the time) under the Advanced Composites Technology Program and has continued under various titles and programs and into the Environmentally Responsible Aviation Project today. These programs contained development efforts involving manufacturing development, design, detailed analysis, and testing. Each phase of development, from coupons to large aircraft components, was supported by detailed analysis to prove that the behavior of these structures was well-understood and predictable. The Structural Analysis of General Shells (STAGS) computer code was a critical tool used in the development of many stitched structures. As a key developer of STAGS, Charles Rankin's contribution to the programs was quite significant. Key features of STAGS used in these analyses and discussed in this paper include its accurate nonlinear and post-buckling capabilities, its ability to predict damage growth, and the use of Lagrange constraints and follower forces.

  20. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
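
    Step (vi), item analysis, conventionally rests on item difficulty (proportion answering correctly) and discrimination (corrected item-total correlation). A minimal sketch for 0/1-scored responses; acceptance thresholds and method choices vary by field:

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis for a 0/1-scored questionnaire.
    responses: (n_respondents, n_items) array. Returns per-item
    difficulty (proportion correct) and discrimination (corrected
    item-total correlation: item vs. total score excluding that item)."""
    r = np.asarray(responses, float)
    difficulty = r.mean(axis=0)
    total = r.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(r[:, j], total - r[:, j])[0, 1]
        for j in range(r.shape[1])
    ])
    return difficulty, discrimination
```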

  1. An Adjunct Galilean Satellite Orbiter Using a Small Radioisotope Power Source

    NASA Technical Reports Server (NTRS)

    Abelson, Robert Dean; Randolph, J.; Alkalai, L.; Collins, D.; Moore, W.

    2005-01-01

    This is a conceptual mission study intended to demonstrate the range of possible missions and applications that could be enabled were a new generation of Small Radioisotope Power Systems to be developed by NASA and DOE. While such systems are currently being considered by NASA and DOE, they do not currently exist. This study is one of several small RPS-enabled mission concepts that were studied and presented in the NASA/JPL document "Enabling Exploration with Small Radioisotope Power Systems" available at: http://solarsystem.nasa.gov/multimedia/download-detail.cfm?DL_ID=82

  2. [Quality management in emergency departments: Lack of uniform standards for fact-based controlling].

    PubMed

    Ries, M; Christ, M

    2015-11-01

    The generally high occupancy of emergency departments during the winter months of 2014/2015 revealed deficits in health policy. Whether at the regional, state, or federal level, verifiable and accepted figures that would enable in-depth analysis and fact-based controlling of emergency care systems are lacking. As a first step, reasons for the current situation are outlined in order to develop concrete recommendations for individual hospitals. This work is based on a selective literature search focused on quality management, ratio-driven management, and process management within emergency departments, as well as on personal experience with implementation of a key ratio system in a German maximum care hospital. The insufficient integration of emergencies into the DRG system, the role as gatekeeper between the inpatient and outpatient care sectors, the decentralized organization of emergency departments in many hospitals, and the inconsistent representation within the medical societies can be cited as reasons for the lack of key ratio systems. In addition to their important role within treatment procedures, emergency departments also have immense economic importance. Consequently, the management of individual hospitals should promote implementation of key ratio systems to enable controlling of emergency care processes. The perspectives of finance, employees, and processes, as well as partners and patients, should thereby be considered equally. Within the process perspective, milestones could be used to enable detailed controlling of treatment procedures. An implementation of key ratio systems without IT support is not feasible; thus, existing digital data should be used, and future data analysis should already be considered during implementation of new IT systems.

  3. CHAD-Master

    EPA Pesticide Factsheets

    Detailed data on human behavior from 19 studies has been compiled into the Consolidated Human Activity Database (CHAD) , enabling researchers to examine specific population groups for unique activity patterns that influence overall exposure to chemicals.

  4. CHAD-2000

    EPA Pesticide Factsheets

    Detailed data on human behavior from 19 studies has been compiled into the Consolidated Human Activity Database (CHAD), enabling researchers to examine specific population groups for unique activity patterns that influence overall exposure to chemicals.

  5. Cleanups in My Community Data

    EPA Pesticide Factsheets

    Cleanups in My Community (CIMC) enables you to map and list hazardous waste cleanup locations and grant areas, and drill down to details about those cleanups and grants and other, related information.

  6. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education.

    PubMed

    Bland, Andrew J; Tobbell, Jane

    2015-11-01

    Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to pre-constructed categories imposed by some questionnaire and interview methods may only provide partial understanding. This is problematic in understanding the mechanisms of learning in simulation-based education as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures interaction of individuals within the simulation experience which can be analysed through multiple lenses, including context and through the lens of both researcher and learner. The study utilised a grounded theory approach involving 31 under-graduate third year student nurses. Data was collected and analysed through non-participant observation, digital recordings of simulation activity and focus group deconstruction of their recorded simulation by the participants and researcher. Focus group interviews enabled further clarification. The method revealed multiple levels of dynamic data, concluding that in order to better understand how students learn in social and active learning strategies, dynamic data is required enabling researchers and participants to unpack what is happening as it unfolds in action. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Large-scale gene function analysis with the PANTHER classification system.

    PubMed

    Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D

    2013-08-01

    The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
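
    The statistical tests referred to are typically overrepresentation tests. A standard-library hypergeometric sketch of that idea (PANTHER's own implementation offers Fisher's exact and binomial tests and is not reproduced here):

```python
from math import comb

def overrep_pvalue(n_genome, n_annotated, n_list, n_hit):
    """One-sided hypergeometric p-value for term overrepresentation:
    the probability of observing >= n_hit annotated genes in a list of
    n_list genes drawn without replacement from a genome of n_genome
    genes, of which n_annotated carry the annotation."""
    total = comb(n_genome, n_list)
    p = sum(comb(n_annotated, k) * comb(n_genome - n_annotated, n_list - k)
            for k in range(n_hit, min(n_list, n_annotated) + 1))
    return p / total
```

    In practice the resulting p-values are corrected for multiple testing across the many terms tested.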

  8. SimPhospho: a software tool enabling confident phosphosite assignment.

    PubMed

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  9. Quantifying EV battery end-of-life through analysis of travel needs with vehicle powertrain models

    NASA Astrophysics Data System (ADS)

    Saxena, Samveg; Le Floch, Caroline; MacDonald, Jason; Moura, Scott

    2015-05-01

    Electric vehicles enable clean and efficient transportation; however, concerns about range anxiety and battery degradation hinder EV adoption. The common definition for battery end-of-life is when 70-80% of original energy capacity remains; however, little analysis is available to support this retirement threshold. By applying detailed physics-based models of EVs with data on how drivers use their cars, we show that EV batteries continue to meet daily travel needs of drivers well beyond capacity fade of 80% remaining energy storage capacity. Further, we show that EV batteries with substantial energy capacity fade continue to provide sufficient buffer charge for unexpected trips with long distances. We show that enabling charging in more locations, even if only with 120 V wall outlets, prolongs useful life of EV batteries. Battery power fade is also examined and we show EVs meet performance requirements even down to 30% remaining power capacity. Our findings show that defining battery retirement at 70-80% remaining capacity is inaccurate. Battery retirement should instead be governed by when batteries no longer satisfy daily travel needs of a driver. Using this alternative retirement metric, we present results on the fraction of EV batteries that may be retired with different levels of energy capacity fade.
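
    The alternative retirement metric, whether the faded battery still covers a day's driving, can be sketched as follows, under the simplifying assumption (not from the paper) that range scales linearly with remaining capacity:

```python
import numpy as np

def meets_daily_needs(daily_miles, rated_range_miles, capacity_fraction,
                      buffer_fraction=0.0):
    """Fraction of days a faded battery still covers the day's driving.
    Hypothetical illustration: usable range is the rated range scaled
    by remaining capacity, optionally holding back a reserve buffer."""
    usable = rated_range_miles * capacity_fraction * (1.0 - buffer_fraction)
    return float(np.mean(np.asarray(daily_miles, float) <= usable))
```

    Under this metric a battery would be retired only when the covered fraction of travel days drops below a chosen service level, rather than at a fixed 70-80% capacity threshold.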

  10. Quantifying EV battery end-of-life through analysis of travel needs with vehicle powertrain models

    DOE PAGES

    Saxena, Samveg; Le Floch, Caroline; MacDonald, Jason; ...

    2015-05-15

    Electric vehicles enable clean and efficient transportation; however, concerns about range anxiety and battery degradation hinder EV adoption. The common definition for battery end-of-life is when 70-80% of original energy capacity remains; however, little analysis is available to support this retirement threshold. By applying detailed physics-based models of EVs with data on how drivers use their cars, we show that EV batteries continue to meet daily travel needs of drivers well beyond capacity fade of 80% remaining energy storage capacity. Further, we show that EV batteries with substantial energy capacity fade continue to provide sufficient buffer charge for unexpected trips with long distances. We show that enabling charging in more locations, even if only with 120 V wall outlets, prolongs useful life of EV batteries. Battery power fade is also examined and we show EVs meet performance requirements even down to 30% remaining power capacity. Our findings show that defining battery retirement at 70-80% remaining capacity is inaccurate. Battery retirement should instead be governed by when batteries no longer satisfy daily travel needs of a driver. Using this alternative retirement metric, we present results on the fraction of EV batteries that may be retired with different levels of energy capacity fade.

  11. Quantifying EV battery end-of-life through analysis of travel needs with vehicle powertrain models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saxena, Samveg; Le Floch, Caroline; MacDonald, Jason

    Electric vehicles enable clean and efficient transportation; however, concerns about range anxiety and battery degradation hinder EV adoption. The common definition for battery end-of-life is when 70-80% of original energy capacity remains; however, little analysis is available to support this retirement threshold. By applying detailed physics-based models of EVs with data on how drivers use their cars, we show that EV batteries continue to meet daily travel needs of drivers well beyond capacity fade of 80% remaining energy storage capacity. Further, we show that EV batteries with substantial energy capacity fade continue to provide sufficient buffer charge for unexpected trips with long distances. We show that enabling charging in more locations, even if only with 120 V wall outlets, prolongs useful life of EV batteries. Battery power fade is also examined and we show EVs meet performance requirements even down to 30% remaining power capacity. Our findings show that defining battery retirement at 70-80% remaining capacity is inaccurate. Battery retirement should instead be governed by when batteries no longer satisfy daily travel needs of a driver. Using this alternative retirement metric, we present results on the fraction of EV batteries that may be retired with different levels of energy capacity fade.

  12. Discovery of novel representatives of bilaterian neuropeptide families and reconstruction of neuropeptide precursor evolution in ophiuroid echinoderms

    PubMed Central

    Abylkassimova, Nikara; Hugall, Andrew F.; O'Hara, Timothy D.; Elphick, Maurice R.

    2017-01-01

    Neuropeptides are a diverse class of intercellular signalling molecules that mediate neuronal regulation of many physiological and behavioural processes. Recent advances in genome/transcriptome sequencing are enabling identification of neuropeptide precursor proteins in species from a growing variety of animal taxa, providing new insights into the evolution of neuropeptide signalling. Here, detailed analysis of transcriptome sequence data from three brittle star species, Ophionotus victoriae, Amphiura filiformis and Ophiopsila aranea, has enabled the first comprehensive identification of neuropeptide precursors in the class Ophiuroidea of the phylum Echinodermata. Representatives of over 30 bilaterian neuropeptide precursor families were identified, some of which occur as paralogues. Furthermore, homologues of endothelin/CCHamide, eclosion hormone, neuropeptide-F/Y and nucleobinin/nesfatin were discovered here in a deuterostome/echinoderm for the first time. The majority of ophiuroid neuropeptide precursors contain a single copy of a neuropeptide, but several precursors comprise multiple copies of identical or non-identical, but structurally related, neuropeptides. Here, we performed an unprecedented investigation of the evolution of neuropeptide copy number over a period of approximately 270 Myr by analysing sequence data from over 50 ophiuroid species, with reference to a robust phylogeny. Our analysis indicates that the composition of neuropeptide ‘cocktails’ is functionally important, but with plasticity over long evolutionary time scales. PMID:28878039

  13. Pacing of deep marine sedimentation in the middle Eocene synorogenic Ainsa Basin, Spanish Pyrenees: deconvolving a 6 Myr record of tectonic and climate controls

    NASA Astrophysics Data System (ADS)

    Mac Niocaill, C.; Cantalejo, B.; Pickering, K. T.; Grant, M.; Johansen, K.

    2016-12-01

    The Middle Eocene thrust-top Ainsa Basin of Northern Spain preserves world-class exposures of deep-marine submarine fan and related deposits. Detailed paleomagnetic, micropaleontologic, and time-series analyses enable us to deconvolve, for the first time in any ancient deep-marine basin worldwide, the pacing of deposition of both the fine-grained interfan sediments and the main sandbodies (submarine fans) through the history of the deep-marine basin. Our magnetostratigraphy and faunal constraints provide a chronological framework for sedimentation in the basin. We use time-series analysis of a range of geochemical and sedimentologic data to identify likely climatic signals in the sedimentary archive. This has enabled us to test the likely importance of climate versus tectonics in controlling deposition. We show that the fine-grained interfan sedimentation preserves a dominant Milankovitch-like cyclicity, whereas the sandbodies (fans) reflect a complex interplay of controls such as tectonics and climate in the sediment source area, including shallow-marine staging areas for sediment redeposition into deeper water. These results not only provide critical information about the timing of substantial coarse clastic delivery into the Ainsa Basin but also give constraints on sediment flux over a 6 Myr window.
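
    Detecting a dominant cyclicity of the kind described rests on spectral analysis of the proxy series; a minimal FFT-periodogram sketch (the study's actual time-series methods may differ):

```python
import numpy as np

def dominant_period(series, dt=1.0):
    """Dominant cycle period via an FFT periodogram (mean removed).
    A generic sketch of the kind of spectral estimate used to spot
    Milankovitch-band cyclicity in an evenly sampled proxy record."""
    x = np.asarray(series, float)
    x = x - x.mean()                   # drop the zero-frequency component
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = 1 + np.argmax(power[1:])       # strongest non-zero frequency bin
    return 1.0 / freqs[k]
```

    Real stratigraphic series are rarely evenly sampled in time, so depth-to-time conversion and methods tolerant of uneven spacing (e.g. Lomb-Scargle) are usually needed before such an estimate is meaningful.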

  14. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics has the additional potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that in principle, without much effort, they could readily adopt a set of well organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of and access to analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select those codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  15. A survey of current practices for genomic sequencing test interpretation and reporting processes in US laboratories.

    PubMed

    O'Daniel, Julianne M; McLaughlin, Heather M; Amendola, Laura M; Bale, Sherri J; Berg, Jonathan S; Bick, David; Bowling, Kevin M; Chao, Elizabeth C; Chung, Wendy K; Conlin, Laura K; Cooper, Gregory M; Das, Soma; Deignan, Joshua L; Dorschner, Michael O; Evans, James P; Ghazani, Arezou A; Goddard, Katrina A; Gornick, Michele; Farwell Hagman, Kelly D; Hambuch, Tina; Hegde, Madhuri; Hindorff, Lucia A; Holm, Ingrid A; Jarvik, Gail P; Knight Johnson, Amy; Mighion, Lindsey; Morra, Massimo; Plon, Sharon E; Punj, Sumit; Richards, C Sue; Santani, Avni; Shirts, Brian H; Spinner, Nancy B; Tang, Sha; Weck, Karen E; Wolf, Susan M; Yang, Yaping; Rehm, Heidi L

    2017-05-01

    As the diagnostic success of genomic sequencing expands, the complexity of this testing should not be overlooked. Numerous laboratory processes are required to support the identification, interpretation, and reporting of clinically significant variants. This study aimed to examine the workflow and reporting procedures among US laboratories to highlight shared practices and identify areas in need of standardization. Surveys and follow-up interviews were conducted with laboratories offering exome and/or genome sequencing to support a research program or for routine clinical services. The 73-item survey elicited multiple-choice and free-text responses that were later clarified with phone interviews. Twenty-one laboratories participated. Practices highly concordant across all groups included consent documentation, multiperson case review, and enabling patient opt-out of incidental or secondary findings analysis. Noted divergence included use of phenotypic data to inform case analysis and interpretation, and reporting of case-specific quality metrics and methods. Few laboratory policies detailed procedures for data reanalysis, data sharing, or patient access to data. This study provides an overview of practices and policies of experienced exome and genome sequencing laboratories. The results enable broader consideration of which practices are becoming standard approaches, where divergence remains, and areas where development of best-practice guidelines may be helpful. Genet Med advance online publication 3 November 2016.

  16. OBIS-USA and Ocean Acidification: Chemical and Biological Observation Data, Integrated for Discovery and Applications

    NASA Astrophysics Data System (ADS)

    Fornwall, M.; Jewett, L.; Yates, K.; Goldstein, P.

    2012-12-01

    OBIS-USA (usgs.gov/obis-usa), a program of USGS Core Science, Analytics and Synthesis, is the US Regional node of the International Ocean Biogeographic Information System (iobis.org). OBIS data records observations of biological occurrences - identifiable species - at known time and coordinates. Within US research and operational communities, OBIS-USA serves an expanding range of applications by capturing details to accompany each observation: information to understand record quality and suitability for applications, details about observation circumstances such as sampling method and sampling conditions, and biological details such as sex, life stage, behavior and other characteristics. The NOAA Ocean Acidification Program and its associated data management effort (led by the National Oceanographic Data Center) aim to enable users to locate, understand, and use marine data from multiple sources and of multiple types to address questions related to ocean acidification and its impacts on marine ecosystems. By the nature of ocean acidification research, data-driven applications require users to find and apply datasets that represent different disciplines as well as different researchers, organizations, agencies, funding models, data management practices and formats, and survey and observation methods. We refer to any collection(s) of data having diverse characteristics on these and other dimensions as "heterogeneous data". However, data management and Internet technologies enable the data itself and many of its diverse characteristics to be discoverable and understandable enough for users to build effective models, applications, and solutions. While it may not be simple to make heterogeneous data uniform or "seamless", current technologies enable at least the data characteristics to be sufficiently well understood that users can consume data and accommodate its diverse characteristics in their process of generating outputs.
Via this abstract and accompanying poster presentation, OBIS-USA and the NOAA Ocean Acidification Program describe proposed methods for obtaining diverse data, such as both chemical observations (those necessary to derive calcium carbonate saturation state) and biological marine observations (species occurrence, abundance), in order to use these sources of information in combined analysis for current and future research on ocean acidification and its relation to observed biology. Current OBIS-USA biological observations represent in-situ observations of marine taxa, and in the context of Ocean Acidification and this poster presentation, OBIS-USA shows a path toward including experimental biology observations as well as in-situ.

  17. Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.

    2018-02-01

    The method called "PVI" (Partial Variance of Increments) has been increasingly used in analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading to the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper will summarize key features of the method and provide a synopsis of the main results obtained by various groups using the method. This will enable new users, or those considering methods of this type, to find details and background collected in one place.
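
    The PVI normalization itself is compact enough to sketch. Below is a minimal Python illustration, assuming a synthetic three-component field with one embedded discontinuity; the lag choice, noise level, and jump amplitude are invented for the demo and are not from the review.

```python
import numpy as np

def pvi(b, lag=1):
    """Partial Variance of Increments for a vector time series.

    b   : array of shape (N, 3), e.g. magnetic field components.
    lag : increment lag in samples.
    Returns PVI(t) = |db| / sqrt(<|db|^2>), with db = b(t+lag) - b(t).
    """
    db = b[lag:] - b[:-lag]                 # vector increments
    mag = np.linalg.norm(db, axis=1)        # |db|
    return mag / np.sqrt(np.mean(mag ** 2)) # normalized increment magnitude

# Usage: a random-walk field with one sharp, current-sheet-like jump
rng = np.random.default_rng(0)
b = 0.1 * np.cumsum(rng.normal(size=(1000, 3)), axis=0)
b[500:] += np.array([5.0, 0.0, 0.0])        # embedded discontinuity
series = pvi(b)
print(series.argmax())                      # the jump produces the PVI peak
```

    Thresholding such a series (e.g. PVI > 3) is how candidate coherent structures are typically flagged for further study.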

  18. Sensors and filters based on nano- and microchannel membranes for biomedical technologies

    NASA Astrophysics Data System (ADS)

    Romanov, S. I.; Pyshnyi, D. V.; Laktionov, P. P.

    2012-02-01

    A new technology is presented in concise form which enables silicon membranes to be produced over a wide range of channel dimensions, from a few nanometers to tens of micrometers. There is good reason to believe that this method, based on rather simple technical processing, is competitive with other technologies for fabricating nanofluidic analysis systems. Some of the completed developments involving microchannel membranes, namely the optical DNA sensor and the human cell separation system, are demonstrated without going into detail. The other applications of micro- and nanochannel membranes, namely the electrical sensor and electrokinetic filters for detecting and separating liquids and biomolecules, are shown with first results and are still in progress.

  19. Nanostructured Electrode Materials for Electrochemical Capacitor Applications

    PubMed Central

    Choi, Hojin; Yoon, Hyeonseok

    2015-01-01

    The advent of novel organic and inorganic nanomaterials in recent years, particularly nanostructured carbons, conducting polymers, and metal oxides, has enabled the fabrication of various energy devices with enhanced performance. In this paper, we review in detail different nanomaterials used in the fabrication of electrochemical capacitor electrodes and also give a brief overview of electric double-layer capacitors, pseudocapacitors, and hybrid capacitors. From a materials point of view, the latest trends in electrochemical capacitor research are also discussed through extensive analysis of the literature and by highlighting notable research examples (published mostly since 2013). Finally, a perspective on next-generation capacitor technology is also given, including the challenges that lie ahead. PMID:28347044

  20. Forest Classification Accuracy as Influenced by Multispectral Scanner Spatial Resolution. [Sam Houston National Forest, Texas

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Sadowski, F. E.; Sarno, J. E.

    1976-01-01

    The author has identified the following significant results. A supervised classification within two separate ground areas of the Sam Houston National Forest was carried out for MSS data with a spatial resolution of two square meters. The data were progressively coarsened to simulate five additional spatial resolutions ranging up to 64 square meters. Similar processing and analysis at all spatial resolutions enabled evaluation of the effect of spatial resolution on classification accuracy at various levels of detail, and of the effects on area-proportion estimation for very general forest features. For the very coarse resolutions, a subset of spectral channels simulating the proposed Thematic Mapper channels was used to study classification accuracy.

  1. Neutrinos from Hell: the Dawn of Neutrino Geophysics

    ScienceCinema

    Gratta, Giorgio

    2018-02-26

    Seismic waves have long been the only messenger reporting on the conditions deep inside the Earth. While global seismology provides amazing details about the structure of our planet, it is only sensitive to the mechanical properties of rocks and not to their chemical composition. In the last 5 years KamLAND and Borexino have started measuring anti-neutrinos produced by uranium and thorium inside the Earth. Such "geoneutrinos" double the number of tools available to study the Earth's interior, enabling a sort of global chemical analysis of the planet, albeit for two elements only. I will discuss the results of these new measurements and put them in the context of the Earth sciences.

  2. INVENTORY ANALYSIS AND COST ACCOUNTING OF FACILITY MAINTENANCE IN WASTE INCINERATION

    NASA Astrophysics Data System (ADS)

    Morioka, Tohru; Ozaki, Taira; Kitazume, Keiichi; Yamamoto, Tsukasa

    A solid waste incineration plant consists of so many facilities and mechanical parts that it requires periodic, careful maintenance for stable solid waste management. The current research investigates in detail the maintenance costs of stoker-type, continuously fired incineration plants and develops an accounting model for their maintenance. This model is able to distinguish among the costs of inspection, repair, and renewal by plant, with seven process flows and three common factors. Parameters based on real data collected by questionnaire surveys give appropriate results in comparison with other plants and enable the model to be applied to plants that incinerate 500-600 tons of solid waste per day.

  3. The computational neurobiology of learning and reward.

    PubMed

    Daw, Nathaniel D; Doya, Kenji

    2006-04-01

    Following the suggestion that midbrain dopaminergic neurons encode a signal, known as a 'reward prediction error', used by artificial intelligence algorithms for learning to choose advantageous actions, the study of the neural substrates for reward-based learning has been strongly influenced by computational theories. In recent work, such theories have been increasingly integrated into experimental design and analysis. Such hybrid approaches have offered detailed new insights into the function of a number of brain areas, especially the cortex and basal ganglia. In part this is because these approaches enable the study of neural correlates of subjective factors (such as a participant's beliefs about the reward to be received for performing some action) that the computational theories purport to quantify.

  4. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.

  5. Citizen Science on Your Smartphone: An ELSI Research Agenda.

    PubMed

    Rothstein, Mark A; Wilbanks, John T; Brothers, Kyle B

    2015-01-01

    The prospect of newly-emerging, technology-enabled, unregulated citizen science health research poses a substantial challenge for traditional research ethics. Unquestionably, a significant amount of research ethics study is needed to prepare for the inevitable, widespread introduction of citizen science health research. Using the case study of mobile health (mHealth) research, this article provides an ethical, legal, and social implications (ELSI) research agenda for citizen science health research conducted outside conventional research institutions. The issues for detailed analysis include the role of IRBs, recruitment, inclusion and exclusion criteria, informed consent, confidentiality and security, vulnerable participants, incidental findings, and publication and data sharing. © 2015 American Society of Law, Medicine & Ethics, Inc.

  6. A semiconductor radiation imaging pixel detector for space radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Kroupa, Martin; Bahadori, Amir; Campbell-Ricketts, Thomas; Empl, Anton; Hoang, Son Minh; Idarraga-Munoz, John; Rios, Ryan; Semones, Edward; Stoffle, Nicholas; Tlustos, Lukas; Turecek, Daniel; Pinsky, Lawrence

    2015-07-01

    Progress in the development of high-performance semiconductor radiation imaging pixel detectors based on technologies developed for use in high-energy physics applications has enabled the development of a completely new generation of compact low-power active dosimeters and area monitors for use in space radiation environments. Such detectors can provide real-time information concerning radiation exposure, along with detailed analysis of the individual particles incident on the active medium. Recent results from the deployment of detectors based on the Timepix from the CERN-based Medipix2 Collaboration on the International Space Station (ISS) are reviewed, along with a glimpse of developments to come. Preliminary results from Orion MPCV Exploration Flight Test 1 are also presented.

  7. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  8. Using selection bias to explain the observed structure of Internet diffusions

    PubMed Central

    Golub, Benjamin; Jackson, Matthew O.

    2010-01-01

    Recently, large datasets stored on the Internet have enabled the analysis of processes, such as large-scale diffusions of information, at new levels of detail. In a recent study, Liben-Nowell and Kleinberg [(2008) Proc Natl Acad Sci USA 105:4633–4638] observed that the flow of information on the Internet exhibits surprising patterns whereby a chain letter reaches its typical recipient through long paths of hundreds of intermediaries. We show that a basic Galton–Watson epidemic model combined with the selection bias of observing only large diffusions suffices to explain these patterns. Thus, selection biases of which data we observe can radically change the estimation of classical diffusion processes. PMID:20534439
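
    The paper's mechanism, a Galton-Watson branching process observed only when the diffusion happens to be large, can be sketched with a toy simulation. The offspring law (geometric with mean 1, i.e. critical), its parameter, and the size threshold below are illustrative choices, not the authors' calibration.

```python
import random

def galton_watson(p_forward, max_nodes=100_000):
    """Simulate one chain-letter diffusion as a Galton-Watson tree.

    Each recipient forwards to a geometric number of others with mean
    p_forward / (1 - p_forward); p_forward = 0.5 gives the critical case.
    Returns (size, depth) of the resulting tree (size capped at max_nodes).
    """
    frontier = [0]            # depths of the current generation's nodes
    size, depth = 1, 0
    while frontier and size < max_nodes:
        nxt = []
        for d in frontier:
            k = 0
            while random.random() < p_forward:   # geometric offspring count
                k += 1
            nxt.extend([d + 1] * k)
        size += len(nxt)
        if nxt:
            depth = max(nxt)
        frontier = nxt
    return size, depth

random.seed(1)
runs = [galton_watson(0.5) for _ in range(2000)]     # critical regime
big = [(s, d) for s, d in runs if s >= 100]          # selection: only large
small = [(s, d) for s, d in runs if s < 100]         # diffusions are observed
avg_depth_big = sum(d for _, d in big) / len(big)
avg_depth_small = sum(d for _, d in small) / len(small)
print(avg_depth_big, avg_depth_small)
```

    Conditioning on large size selects precisely the rare, deep realizations, so the observed sample looks like long chains of intermediaries even though the underlying process is an ordinary epidemic model.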

  9. Peptide neuromodulation in invertebrate model systems

    PubMed Central

    Taghert, Paul H.; Nitabach, Michael N.

    2012-01-01

    Neuropeptides modulate neural circuits controlling adaptive animal behaviors and physiological processes, such as feeding/metabolism, reproductive behaviors, circadian rhythms, central pattern generation, and sensorimotor integration. Invertebrate model systems have enabled detailed experimental analysis using combined genetic, behavioral, and physiological approaches. Here we review selected examples of neuropeptide modulation in crustaceans, mollusks, insects, and nematodes, with a particular emphasis on the genetic model organisms Drosophila melanogaster and Caenorhabditis elegans, where remarkable progress has been made. On the basis of this survey, we provide several integrating conceptual principles for understanding how neuropeptides modulate circuit function, and also propose that continued progress in this area requires increased emphasis on the development of richer, more sophisticated behavioral paradigms. PMID:23040808

  10. What do we do with all this video? Better understanding public engagement for image and video annotation

    NASA Astrophysics Data System (ADS)

    Wiener, C.; Miller, A.; Zykov, V.

    2016-12-01

    Advanced robotic vehicles are increasingly being used by oceanographic research vessels to enable more efficient and widespread exploration of the ocean, particularly the deep ocean. With cutting-edge capabilities mounted onto robotic vehicles, data at high resolutions are being generated more than ever before, enabling enhanced data collection and the potential for broader participation. For example, high resolution camera technology not only improves visualization of the ocean environment, but also expands the capacity to engage participants remotely through increased use of telepresence and virtual reality techniques. Schmidt Ocean Institute is a private, non-profit operating foundation established to advance the understanding of the world's oceans through technological advancement, intelligent observation and analysis, and open sharing of information. Telepresence-enabled research is an important component of Schmidt Ocean Institute's science research cruises, which this presentation will highlight. Schmidt Ocean Institute is one of the few research programs that makes its entire underwater vehicle dive series available online, creating a collection of video that enables anyone to follow deep sea research in real time. We encourage students, educators and the general public to take advantage of freely available dive videos. Additionally, other SOI-supported internet platforms have engaged the public in image and video annotation activities. Examples of these new online platforms, which utilize citizen scientists to annotate scientific image and video data, will be provided. This presentation will include an introduction to SOI-supported video and image tagging citizen science projects, real-time robot tracking, live ship-to-shore communications, and an array of outreach activities that enable scientists to interact with the public and explore the ocean in fascinating detail.

  11. Mars Blueberry fields for ever

    NASA Astrophysics Data System (ADS)

    Moore, Jeffrey M.

    2004-04-01

    The Mars saga continues. The latest finds -- wide areas covered in balls of haematite, or 'blueberries', and large sulphate deposits in rocks -- enable us to draw in more details of the planet's past climate.

  12. Development and validation of a universal interface for compound-specific stable isotope analysis of chlorine (37Cl/35Cl) by GC-high-temperature conversion (HTC)-MS/IRMS.

    PubMed

    Renpenning, Julian; Hitzfeld, Kristina L; Gilevska, Tetyana; Nijenhuis, Ivonne; Gehre, Matthias; Richnow, Hans-Hermann

    2015-03-03

    A universal application of compound-specific isotope analysis of chlorine has thus far been limited by the availability of suitable analysis techniques. In this study, gas chromatography in combination with a high-temperature conversion interface (GC-HTC), converting organic chlorine in the presence of H2 to gaseous HCl, was coupled to a dual-detection system combining an ion trap mass spectrometer (MS) and an isotope-ratio mass spectrometer (IRMS). The combined MS/IRMS detection enabled detailed characterization, optimization, and online monitoring of the high-temperature conversion process via the ion trap MS, as well as simultaneous chlorine isotope analysis by the IRMS. Using GC-HTC-MS/IRMS, chlorine isotope analysis at optimized conversion conditions yielded very accurate isotope values (δ(37)Cl(SMOC)) for measured reference material of known isotope composition, including chlorinated ethylenes, chloromethane, hexachlorocyclohexane, and trichloroacetic acid methyl ester. Respective detection limits were determined to be <15 nmol Cl on column, with an achieved precision of <0.3‰.
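
    The δ(37)Cl values reported above follow standard per-mil delta notation relative to SMOC (Standard Mean Ocean Chloride). A small sketch of the conversion; the SMOC reference ratio used below is an approximate literature value, not a number taken from this study:

```python
def delta37cl(r_sample, r_smoc=0.3196):
    """Chlorine isotope ratio in per-mil delta notation relative to SMOC.

    r_sample : measured 37Cl/35Cl ratio of the sample (e.g. from IRMS)
    r_smoc   : 37Cl/35Cl ratio of Standard Mean Ocean Chloride
               (approximate reference value, assumed here)
    """
    return (r_sample / r_smoc - 1.0) * 1000.0

# A sample enriched in 37Cl by 0.1% relative to the standard:
print(round(delta37cl(0.3196 * 1.001), 2))   # -> 1.0 (per mil)
```

    The quoted precision of <0.3‰ thus corresponds to resolving relative ratio differences of about 3 parts in 10,000.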

  13. Methods for the visualization and analysis of extracellular matrix protein structure and degradation.

    PubMed

    Leonard, Annemarie K; Loughran, Elizabeth A; Klymenko, Yuliya; Liu, Yueying; Kim, Oleg; Asem, Marwa; McAbee, Kevin; Ravosa, Matthew J; Stack, M Sharon

    2018-01-01

    This chapter highlights methods for visualization and analysis of extracellular matrix (ECM) proteins, with particular emphasis on collagen type I, the most abundant protein in mammals. Protocols described range from advanced imaging of complex in vivo matrices to simple biochemical analysis of individual ECM proteins. The first section of this chapter describes common methods to image ECM components and includes protocols for second harmonic generation, scanning electron microscopy, and several histological methods of ECM localization and degradation analysis, including immunohistochemistry, Trichrome staining, and in situ zymography. The second section of this chapter details both a common transwell invasion assay and a novel live imaging method to investigate cellular behavior with respect to collagen and other ECM proteins of interest. The final section consists of common electrophoresis-based biochemical methods that are used in analysis of ECM proteins. Use of the methods described herein will enable researchers to gain a greater understanding of the role of ECM structure and degradation in development and matrix-related diseases such as cancer and connective tissue disorders. © 2018 Elsevier Inc. All rights reserved.

  14. A new predictive multi-zone model for HCCI engine combustion

    DOE PAGES

    Bissoli, Mattia; Frassoldati, Alessio; Cuoci, Alberto; ...

    2016-06-30

    Here, this work introduces a new predictive multi-zone model for the description of combustion in Homogeneous Charge Compression Ignition (HCCI) engines. The model exploits the existing OpenSMOKE++ computational suite to handle detailed kinetic mechanisms, providing reliable predictions of the in-cylinder auto-ignition processes. All the elements with a significant impact on combustion performance and emissions, such as turbulence, heat and mass exchanges, crevices, residual burned gases, and thermal and feed stratification, are taken into account. Compared to other computational approaches, this model improves the description of mixture stratification phenomena by coupling a wall heat transfer model derived from CFD applications with a proper turbulence model. Furthermore, the calibration of this multi-zone model requires only three parameters, which can be derived from a non-reactive CFD simulation: these adaptive variables depend only on the engine geometry and remain fixed across a wide range of operating conditions, allowing the prediction of auto-ignition, pressure traces, and pollutants. This computational framework enables the use of detailed kinetic mechanisms, as well as Rate of Production Analysis (RoPA) and Sensitivity Analysis (SA), to investigate the complex chemistry involved in the auto-ignition and pollutant formation processes. In the final sections of the paper, these capabilities are demonstrated through comparison with experimental data.

  15. SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations

    DOE PAGES

    Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.

    2016-02-25

    Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and describes in detail its implementation. This work explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods and also compares several sensitivity methods in terms of computational efficiency and memory requirements.
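
    A sensitivity coefficient of the kind described above is a normalized derivative: the fractional change in a response per fractional change in a parameter. The sketch below illustrates that definition with a plain central-difference estimate on a toy infinite-medium multiplication factor; it is not the CLUTCH or IFP method (those are Monte Carlo importance-based estimators), and the formula and cross-section values are illustrative assumptions.

```python
def sensitivity(f, x, i, rel=1e-4):
    """Central-difference sensitivity S_i = (x_i / f) * (df / dx_i),
    i.e. the fractional change in response f per fractional change
    in parameter x[i]."""
    xp, xm = list(x), list(x)
    xp[i] *= 1.0 + rel                      # perturb parameter i up
    xm[i] *= 1.0 - rel                      # and down
    return (f(xp) - f(xm)) / (2.0 * rel * f(x))

# Toy response: infinite-medium k = nu*Sigma_f / Sigma_a
k = lambda p: p[0] / p[1]
params = [0.04, 0.03]                       # [nu*Sigma_f, Sigma_a], illustrative
print(sensitivity(k, params, 0))            # ~ +1: k scales with fission
print(sensitivity(k, params, 1))            # ~ -1: k scales inversely with absorption
```

    For a Monte Carlo eigenvalue code, such brute-force perturbation is impractical (each derivative would need separate converged runs), which is exactly why adjoint-weighted estimators like CLUTCH and IFP compute the coefficients within a single forward calculation.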

  16. Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae.

    PubMed

    Toma, Milan; Bloodworth, Charles H; Pierce, Eric L; Einstein, Daniel R; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2017-03-01

    The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations.

  17. Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae

    PubMed Central

    Toma, Milan; Bloodworth, Charles H.; Pierce, Eric L.; Einstein, Daniel R.; Cochran, Richard P.; Yoganathan, Ajit P.; Kunzelman, Karyn S.

    2016-01-01

    The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations. PMID:27624659

  18. Guinea pig models of asthma.

    PubMed

    McGovern, Alice E; Mazzone, Stuart B

    2014-12-01

    Described in this unit are methods for establishing guinea pig models of asthma. Sufficient detail is provided to enable investigators to study bronchoconstriction, cough, airway hyperresponsiveness, inflammation, and remodeling. Copyright © 2014 John Wiley & Sons, Inc.

  19. 23 CFR 450.312 - Metropolitan planning area boundaries.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... area. (d) MPA boundaries may be established to coincide with the geography of regional economic... descriptions shall be submitted either as a geo-spatial database or described in sufficient detail to enable...

  20. 23 CFR 450.312 - Metropolitan planning area boundaries.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... area. (d) MPA boundaries may be established to coincide with the geography of regional economic... descriptions shall be submitted either as a geo-spatial database or described in sufficient detail to enable...

  1. 23 CFR 450.312 - Metropolitan planning area boundaries.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... area. (d) MPA boundaries may be established to coincide with the geography of regional economic... descriptions shall be submitted either as a geo-spatial database or described in sufficient detail to enable...

  2. A second-generation constrained reaction volume shock tube

    NASA Astrophysics Data System (ADS)

    Campbell, M. F.; Tulgestke, A. M.; Davidson, D. F.; Hanson, R. K.

    2014-05-01

    We have developed a shock tube that features a sliding gate valve in order to mechanically constrain the reactive test gas mixture to an area close to the shock tube endwall, separating it from a specially formulated non-reactive buffer gas mixture. This second-generation Constrained Reaction Volume (CRV) strategy enables near-constant-pressure shock tube test conditions for reactive experiments behind reflected shocks, thereby enabling improved modeling of the reactive flow field. Here we provide details of the design and operation of the new shock tube. In addition, we detail special buffer gas tailoring procedures, analyze the buffer/test gas interactions that occur on gate valve opening, and outline the size range of fuels that can be studied using the CRV technique in this facility. Finally, we present example low-temperature ignition delay time data to illustrate the CRV shock tube's performance.

  3. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  4. Quantitative proteomics and systems analysis of cultured H9C2 cardiomyoblasts during differentiation over time supports a 'function follows form' model of differentiation.

    PubMed

    Kankeu, Cynthia; Clarke, Kylie; Van Haver, Delphi; Gevaert, Kris; Impens, Francis; Dittrich, Anna; Roderick, H Llewelyn; Passante, Egle; Huber, Heinrich J

    2018-05-17

    The rat cardiomyoblast cell line H9C2 has emerged as a valuable tool for studying cardiac development, mechanisms of disease and toxicology. We present here a rigorous proteomic analysis that monitored the changes in protein expression during differentiation of H9C2 cells into cardiomyocyte-like cells over time. Quantitative mass spectrometry followed by gene ontology (GO) enrichment analysis revealed that early changes in H9C2 differentiation are related to protein pathways of cardiac muscle morphogenesis and sphingolipid synthesis. These changes in the proteome were followed later in the differentiation time-course by alterations in the expression of proteins involved in cation transport and beta-oxidation. Studying the temporal profile of the H9C2 proteome during differentiation in further detail revealed eight clusters of co-regulated proteins that can be associated with early, late, continuous and transient up- and downregulation. Subsequent reactome pathway analysis based on these eight clusters further corroborated and detailed the results of the GO analysis. Specifically, this analysis confirmed that proteins related to pathways in muscle contraction are upregulated early and transiently, and proteins relevant to extracellular matrix organization are downregulated early. In contrast, upregulation of proteins related to cardiac metabolism occurs at later time points. Finally, independent validation of the proteomics results by immunoblotting confirmed hitherto unknown regulators of cardiac structure and ionic metabolism. Our results are consistent with a 'function follows form' model of differentiation, whereby early and transient alterations of structural proteins enable subsequent changes that are relevant to the characteristic physiology of cardiomyocytes.
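
    The grouping of temporal expression profiles into regulation clusters, as described above, can be sketched as follows. The protein names, fold-change values, and threshold are invented for illustration and are not taken from the study:

```python
import numpy as np

# Illustrative grouping of protein temporal profiles into the kinds of
# clusters the abstract describes (early, late, continuous regulation).
# Profiles are invented log2 fold-changes over a differentiation course.
profiles = {
    "myh7_like":  np.array([2.0, 2.2, 1.9, 2.1]),    # early + sustained up
    "metab_like": np.array([0.1, 0.3, 1.5, 2.4]),    # late up
    "ecm_like":   np.array([-1.8, -2.0, -2.1, -1.9]),# early down
}

def classify(p, thresh=1.0):
    """Assign a profile to a coarse regulation cluster by comparing
    its mean fold-change in the early vs. late half of the course."""
    early, late = p[: len(p) // 2].mean(), p[len(p) // 2 :].mean()
    if early > thresh and late > thresh:
        return "early-sustained-up"
    if early <= thresh < late:
        return "late-up"
    if early < -thresh:
        return "down"
    return "transient"

clusters = {name: classify(p) for name, p in profiles.items()}
```

    Real analyses use many time points and data-driven clustering (e.g. k-means on z-scored profiles); this two-window rule only illustrates the idea of co-regulation classes.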

  5. Analysis of Co-spatial UV-Optical STIS Spectra of Planetary Nebulae From HST Cycle 19 GO 12600

    NASA Astrophysics Data System (ADS)

    Miller, Timothy R.; Henry, Richard B. C.; Dufour, Reginald J.; Kwitter, Karen B.; Shaw, Richard A.; Balick, Bruce; Corradi, Romano

    2015-01-01

    We present an analysis of five spatially resolved planetary nebulae (PNe), NGC 5315, NGC 5882, NGC 7662, IC 2165, and IC 3568, from observations in the Cycle 19 program GO 12600 using HST STIS. Details of the observations and data are presented in the poster by Dufour et al. in this session. These five observations cover the wavelength range 1150-10,270 Å with 0.2 and 0.5 arcsec wide slits, and are co-spatial to 0.1 arcsec along a 25 arcsec length across each nebula. This unprecedented resolution in both wavelength and spatial coverage enabled detailed studies of physical conditions and abundances from UV line ion emissions (compared to optical lines). We first analyzed the low- and moderate-resolution UV emission lines of carbon using the resolved lines of C III] 1906.68 and 1908.73, which yielded a direct measurement of the density within the volume occupied by doubly-ionized carbon and other similar co-spatial ions. Next, each PN spectrum was divided into spatial sub-regions in order to assess inferred density variations among the sub-regions along the entire slit. Variations in electron temperature and chemical abundances were also probed. Lastly, these nebulae were modeled in detail with the photoionization code CLOUDY. This modeling tested different density profiles in order to reproduce the observed density variations and temperature fluctuations, and constrain central star parameters. We gratefully acknowledge generous support from NASA through grants related to the Cycle 19 program GO 12600, as well as from the University of Oklahoma.

  6. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis.

    PubMed

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A; Davis, Ronald W; Javey, Ali

    2016-01-28

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual's state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications.
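
    A minimal sketch of the temperature-calibration step described above, assuming a linear signal dependence on skin temperature. The ~3%/°C coefficient, reference temperature, and readings are illustrative assumptions, not values from the paper:

```python
# Enzymatic sensor output drifts with skin temperature, so raw readings
# are corrected back to a reference temperature before interpretation.
TEMP_COEFF = 0.03       # fractional signal change per degC (assumed)
T_REF = 33.0            # reference skin temperature in degC (assumed)

def calibrate(raw_signal, skin_temp):
    """Remove the assumed linear temperature dependence from a reading."""
    return raw_signal / (1.0 + TEMP_COEFF * (skin_temp - T_REF))

# The same sweat analyte level measured at two skin temperatures should
# calibrate to (nearly) the same value.
g_cool = calibrate(100.0, 33.0)   # at reference temperature: unchanged
g_warm = calibrate(112.0, 37.0)   # warmer skin inflates the raw signal
```

    This is why the array measures skin temperature alongside the chemical sensors: without it, a change in temperature would be indistinguishable from a change in analyte concentration.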

  7. Analysis of In-Space Assembly of Modular Systems

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; VanLaak, James; Johnson, Spencer L.; Chytka, Trina M.; Reeves, John D.; Todd, B. Keith; Moe, Rud V.; Stambolian, Damon B.

    2005-01-01

    Early system-level life cycle assessments facilitate cost-effective optimization of system architectures to enable implementation of both modularity and in-space assembly, two key Exploration Systems Research & Technology (ESR&T) Strategic Challenges. Experiences with the International Space Station (ISS) demonstrate that the absence of this rigorous analysis can result in increased cost and operational risk. An effort is underway, called Analysis of In-Space Assembly of Modular Systems, to produce an innovative analytical methodology, including an evolved analysis toolset and proven processes in a collaborative engineering environment, to support the design and evaluation of proposed concepts. The unique aspect of this work is that it will produce the toolset, techniques and initial products to analyze and compare the detailed, life cycle costs and performance of different implementations of modularity for in-space assembly. A multi-Center team consisting of experienced personnel from the Langley Research Center, Johnson Space Center, Kennedy Space Center, and the Goddard Space Flight Center has been formed to bring their resources and experience to this development. At the end of this 30-month effort, the toolset will be ready to support the Exploration Program with an integrated assessment strategy that embodies all life-cycle aspects of the mission from design and manufacturing through operations to enable early and timely selection of an optimum solution among many competing alternatives. Already there are many different designs for crewed missions to the Moon that present competing views of modularity requiring some in-space assembly. The purpose of this paper is to highlight the approach for scoring competing designs.

  8. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M.; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A.; Davis, Ronald W.; Javey, Ali

    2016-01-01

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual’s state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications.

  9. Array analysis of electromagnetic radiation from radio transmitters for submarine communication

    NASA Astrophysics Data System (ADS)

    Füllekrug, Martin; Mezentsev, Andrew; Watson, Robert; Gaffet, Stéphane; Astin, Ivan; Evans, Adrian

    2014-12-01

    The array analyses used for seismic and infrasound research are adapted and applied here to the electromagnetic radiation from radio transmitters for submarine communication. It is found that the array analysis enables a determination of the slowness and the arrival azimuth of the wave number vectors associated with the electromagnetic radiation. The array analysis is applied to measurements of ˜20-24 kHz radio waves from transmitters for submarine communication with an array of 10 radio receivers distributed over an area of ˜1 km × 1 km. The slowness of the observed wave number vectors ranges from ˜2.7 ns/m to ˜4.1 ns/m, and the deviations between the expected and observed arrival azimuths range from ˜-9.7° to ˜14.5°. The experimental results suggest that it is possible to determine the locations of radio sources from transient luminous events above thunderclouds with an array of radio receivers, enabling detailed investigations of the electromagnetic radiation from sprites.
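
    The plane-wave fit underlying this kind of array analysis can be sketched as follows. Receiver coordinates and the source slowness are synthetic, and the azimuth here is measured from the x-axis rather than from geographic north:

```python
import numpy as np

# Hypothetical receiver coordinates (metres) spread over ~1 km x 1 km,
# mirroring the array geometry described in the abstract.
rng = np.random.default_rng(0)
receivers = rng.uniform(0, 1000, size=(10, 2))

# A plane wave with slowness vector s (s/m) arrives; its delay at each
# receiver is the dot product of s with the receiver position.
s_true = np.array([3.3e-9 * np.cos(np.radians(40)),
                   3.3e-9 * np.sin(np.radians(40))])  # ~3.3 ns/m at 40 deg
t0 = 1.0e-3
delays = t0 + receivers @ s_true

# Least-squares fit of t_i = t0 + s . r_i recovers the slowness vector
# and hence the arrival azimuth, as in seismic/infrasound processing.
A = np.column_stack([np.ones(len(receivers)), receivers])
coef, *_ = np.linalg.lstsq(A, delays, rcond=None)
s_est = coef[1:]
slowness_ns_per_m = float(np.linalg.norm(s_est) * 1e9)
azimuth_deg = float(np.degrees(np.arctan2(s_est[1], s_est[0])))
```

    With noise-free synthetic delays the fit recovers the slowness and azimuth essentially exactly; real data add timing noise, so the fit quality also bounds the azimuth deviations quoted above.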

  10. GPS survey of the western Tien Shan

    NASA Technical Reports Server (NTRS)

    Hager, Bradford H.; Molnar, Peter H.; Hamburger, Michael W.; Reilinger, Robert E.

    1995-01-01

    There were two major developments in 1994 in our collaborative GPS experiment in the Tien Shan of the Former Soviet Union (FSU). Both were motivated by our expectation that we will ultimately obtain better science at lower cost if we involve our colleagues in the FSU more deeply in (1) the collection and (2) the analysis of data. As an experimental test of the concept of having our local collaborators carry out the field work semi-autonomously, we sent 6 MIT receivers to the Tien Shan for a period of 3 months. To enable our collaborators to have the capability for data analysis, we provided computers for two data analysis centers and organized a two-week training session. This report emphasizes the rationale for deeper involvement of FSU scientists, describes the training sessions, discusses the data collection, and presents the results. We also discuss future plans. More detailed discussion of background, general scientific objectives, discussions with collaborators, and results for the campaigns in 1992 and 1993 have been given in previous reports.

  11. Logistics Modeling for Lunar Exploration Systems

    NASA Technical Reports Server (NTRS)

    Andraschko, Mark R.; Merrill, R. Gabe; Earle, Kevin D.

    2008-01-01

    The extensive logistics required to support extended crewed operations in space make effective modeling of logistics requirements and deployment critical to predicting the behavior of human lunar exploration systems. This paper discusses the software that has been developed as part of the Campaign Manifest Analysis Tool in support of strategic analysis activities under the Constellation Architecture Team - Lunar. The described logistics module enables definition of logistics requirements across multiple surface locations and allows for the transfer of logistics between those locations. A key feature of the module is the loading algorithm that is used to efficiently load logistics by type into carriers and then onto landers. Attention is given to the capabilities and limitations of this loading algorithm, particularly with regard to surface transfers. These capabilities are described within the context of the object-oriented software implementation, with details provided on the applicability of using this approach to model other human exploration scenarios. Some challenges of incorporating probabilistics into this type of logistics analysis model are discussed at a high level.
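
    The abstract does not publish the loading algorithm itself, so the following is only an illustrative first-fit-decreasing packer of the general kind described (loading items by type into capacity-limited carriers); the item names, masses, and capacity are assumptions:

```python
# Illustrative first-fit-decreasing loader; the actual Campaign Manifest
# Analysis Tool algorithm is not given in the abstract, so this is a
# generic bin-packing sketch of the same flavor.
def load_items(items, capacity):
    """Pack (name, mass) items into carriers of fixed mass capacity,
    heaviest items first, each into the first carrier with room."""
    carriers = []   # each carrier is a list of loaded items
    loads = []      # running mass per carrier
    for name, mass in sorted(items, key=lambda it: -it[1]):
        for i, used in enumerate(loads):
            if used + mass <= capacity:   # first carrier with room
                carriers[i].append((name, mass))
                loads[i] += mass
                break
        else:                             # no room anywhere: new carrier
            carriers.append([(name, mass)])
            loads.append(mass)
    return carriers

logistics = [("water", 400), ("food", 300), ("spares", 250),
             ("science", 200), ("suits", 150)]
carriers = load_items(logistics, capacity=600)
```

    Loaded carriers would then be assigned to landers by a second pass of the same kind, which is where the surface-transfer constraints mentioned above complicate the picture.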

  12. Asynchronous data acquisition and on-the-fly analysis of dose fractionated cryoEM images by UCSFImage

    PubMed Central

    Li, Xueming; Zheng, Shawn; Agard, David A.; Cheng, Yifan

    2015-01-01

    Newly developed direct electron detection cameras have a high image output frame rate that enables recording dose fractionated image stacks of frozen hydrated biological samples by electron cryomicroscopy (cryoEM). Such novel image acquisition schemes provide opportunities to analyze cryoEM data in ways that were previously impossible. The file size of a dose fractionated image stack is 20-60 times larger than that of a single image. Thus, efficient data acquisition and on-the-fly analysis of a large number of dose-fractionated image stacks become a serious challenge to any cryoEM data acquisition system. We have developed a computer-assisted system, named UCSFImage4, for semi-automated cryoEM image acquisition that implements an asynchronous data acquisition scheme. This facilitates efficient acquisition, on-the-fly motion correction, and CTF analysis of dose fractionated image stacks with a total time of ~60 seconds/exposure. Here we report the technical details and configuration of this system. PMID:26370395
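
    The asynchronous scheme can be sketched with an acquisition loop that enqueues stacks while a worker thread analyzes them on the fly. The "stacks" and the summing stand-in for motion correction are simplifications for illustration, not UCSFImage4 internals:

```python
import queue
import threading

# Acquisition keeps recording stacks while a worker thread performs
# the analysis concurrently, so exposures never wait on processing.
stack_queue = queue.Queue()
results = []

def analyze():
    # Drain the queue, "processing" each dose-fractionated stack by
    # summing its frames (a stand-in for alignment + CTF fitting).
    while True:
        stack = stack_queue.get()
        if stack is None:          # sentinel: acquisition finished
            break
        results.append(sum(stack))

worker = threading.Thread(target=analyze)
worker.start()

# Acquisition loop: each exposure yields a stack and returns at once,
# without waiting for analysis of the previous stacks to finish.
for exposure in range(5):
    stack_queue.put([exposure] * 4)   # 4 identical "frames" per stack
stack_queue.put(None)                 # signal end of acquisition
worker.join()
```

    A single FIFO worker preserves exposure order; in practice the analysis side can be scaled with more workers as long as results are re-ordered afterwards.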

  13. Latent semantic analysis.

    PubMed

    Evangelopoulos, Nicholas E

    2013-11-01

    This article reviews latent semantic analysis (LSA), a theory of meaning as well as a method for extracting that meaning from passages of text, based on statistical computations over a collection of documents. LSA as a theory of meaning defines a latent semantic space where documents and individual words are represented as vectors. LSA as a computational technique uses linear algebra to extract dimensions that represent that space. This representation enables the computation of similarity among terms and documents, categorization of terms and documents, and summarization of large collections of documents using automated procedures that mimic the way humans perform similar cognitive tasks. We present some technical details, various illustrative examples, and discuss a number of applications from linguistics, psychology, cognitive science, education, information science, and analysis of textual data in general. WIREs Cogn Sci 2013, 4:683-692. doi: 10.1002/wcs.1254 CONFLICT OF INTEREST: The author has declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. © 2013 John Wiley & Sons, Ltd.
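
    A minimal numpy sketch of the LSA pipeline the article describes: build a term-document matrix, truncate its SVD to a low-rank latent space, and compare documents by cosine similarity. The toy corpus is invented:

```python
import numpy as np

# Toy corpus: four short "documents".
docs = [
    "human machine interface",
    "user interface system",
    "graph tree minors",
    "tree graph system",
]

# Term-document count matrix (rows = terms, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for d in docs] for w in vocab],
             dtype=float)

# SVD factors X into term and document vectors; truncating to k
# dimensions yields the latent semantic space described in the article.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents 2 and 3 share "graph"/"tree" vocabulary, so they should lie
# closer in the latent space than documents 0 and 2, which share nothing.
sim_related = cosine(doc_vecs[2], doc_vecs[3])
sim_unrelated = cosine(doc_vecs[0], doc_vecs[2])
```

    Real applications use weighted (e.g. tf-idf) matrices over thousands of documents and a few hundred dimensions, but the algebra is the same.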

  14. A high-resolution 7-Tesla fMRI dataset from complex natural stimulation with an audio movie.

    PubMed

    Hanke, Michael; Baumgartner, Florian J; Ibe, Pierre; Kaule, Falko R; Pollmann, Stefan; Speck, Oliver; Zinke, Wolf; Stadler, Jörg

    2014-01-01

    Here we present a high-resolution functional magnetic resonance (fMRI) dataset - 20 participants recorded at high field strength (7 Tesla) during prolonged stimulation with an auditory feature film ("Forrest Gump"). In addition, a comprehensive set of auxiliary data (T1w, T2w, DTI, susceptibility-weighted image, angiography) as well as measurements to assess technical and physiological noise components have been acquired. An initial analysis confirms that these data can be used to study common and idiosyncratic brain response patterns to complex auditory stimulation. Among the potential uses of this dataset are the study of auditory attention and cognition, language and music perception, and social perception. The auxiliary measurements enable a large variety of additional analysis strategies that relate functional response patterns to structural properties of the brain. Alongside the acquired data, we provide source code and detailed information on all employed procedures - from stimulus creation to data analysis. In order to facilitate replicative and derived works, only free and open-source software was utilized.

  15. Analysis of Context Dependence in Social Interaction Networks of a Massively Multiplayer Online Role-Playing Game

    PubMed Central

    Son, Seokshin; Kang, Ah Reum; Kim, Hyun-chul; Kwon, Taekyoung; Park, Juyong; Kim, Huy Kang

    2012-01-01

    Rapid advances in modern computing and information technology have enabled millions of people to interact online via various social network and gaming services. The widespread adoption of such online services has made possible the analysis of large-scale archival data containing detailed human interactions, presenting a very promising opportunity to understand the rich and complex human behavior. In collaboration with a leading global provider of Massively Multiplayer Online Role-Playing Games (MMORPGs), here we present a network science-based analysis of the interplay between distinct types of user interaction networks in the virtual world. We find that their properties depend critically on the nature of the context-interdependence of the interactions, highlighting the complex and multilayered nature of human interactions, a robust understanding of which we believe may prove instrumental in the design of more realistic future virtual arenas as well as provide novel insights to the science of collective human behavior. PMID:22496771

  16. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  17. Improving surveillance of sexually transmitted infections using mandatory electronic clinical reporting: the genitourinary medicine clinic activity dataset, England, 2009 to 2013.

    PubMed

    Savage, E J; Mohammed, H; Leong, G; Duffell, S; Hughes, G

    2014-12-04

    A new electronic surveillance system for sexually transmitted infections (STIs) was introduced in England in 2009. The genitourinary medicine clinic activity dataset (GUMCAD) is a mandatory, disaggregated, pseudo-anonymised data return submitted by all STI clinics across England. The dataset includes information on all STI diagnoses made and services provided alongside demographic characteristics for every patient attendance at a clinic. The new system enables the timely analysis and publication of routine STI data, detailed analyses of risk groups and longitudinal analyses of clinic attendees. The system offers flexibility so new codes can be introduced to help monitor outbreaks or unusual STI activity. From January 2009 to December 2013 inclusive, over twenty-five million records from a total of 6,668,648 patients of STI clinics have been submitted. This article describes the successful implementation of this new surveillance system and the types of epidemiological outputs and analyses that GUMCAD enables. The challenges faced are discussed and forthcoming developments in STI surveillance in England are described.
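
    The kind of longitudinal query a disaggregated, pseudo-anonymised return enables can be sketched as below. The records are invented; the diagnosis codes ('C4' for chlamydia, 'B' for gonorrhoea) follow common GUMCAD conventions but should be checked against the current code list:

```python
from collections import Counter

# Each row is one clinic attendance, keyed by a pseudo-anonymised
# patient identifier, as in the GUMCAD return (records invented).
records = [
    {"patient": "a1", "date": "2009-03-01", "code": "C4"},
    {"patient": "a1", "date": "2010-07-15", "code": "C4"},
    {"patient": "b2", "date": "2011-01-20", "code": "B"},
    {"patient": "c3", "date": "2012-05-02", "code": "C4"},
]

# Because every attendance carries the same pseudo-identifier, repeat
# attendees can be followed over time - the longitudinal analysis that
# aggregate returns could not support.
attendances = Counter(r["patient"] for r in records)
repeat_attendees = sorted(p for p, n in attendances.items() if n > 1)
chlamydia_diagnoses = sum(1 for r in records if r["code"] == "C4")
```

    The same disaggregated structure is what lets new codes be added for outbreak monitoring without redesigning the return.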

  18. High resolution production water footprints of the United States

    NASA Astrophysics Data System (ADS)

    Marston, L.; Yufei, A.; Konar, M.; Mekonnen, M.; Hoekstra, A. Y.

    2017-12-01

    The United States is the largest producer and consumer of goods and services in the world. Rainfall, surface water supplies, and groundwater aquifers represent a fundamental input to this economic production. Despite the importance of water resources to economic activity, we do not have consistent information on water use for specific locations and economic sectors. A national, high-resolution database of water use by sector would provide insight into US utilization and dependence on water resources for economic production. To this end, we calculate the water footprint of over 500 food, energy, mining, services, and manufacturing industries and goods produced in the US. To do this, we employ a data intensive approach that integrates water footprint and input-output techniques into a novel methodological framework. This approach enables us to present the most detailed and comprehensive water footprint analysis of any country to date. This study broadly contributes to our understanding of water in the US economy, enables supply chain managers to assess direct and indirect water dependencies, and provides opportunities to reduce water use through benchmarking.
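
    The input-output step that links direct water use to supply-chain (indirect) use can be sketched with a Leontief inverse; the two-sector technical coefficients, water intensities, and demands here are invented for illustration:

```python
import numpy as np

# Toy two-sector economy (agriculture, manufacturing). The real study
# covers 500+ industries with empirically estimated coefficients.
A = np.array([[0.20, 0.10],    # inter-industry inputs per $ of output
              [0.30, 0.25]])
direct_water = np.array([50.0, 5.0])    # litres of water per $ output
final_demand = np.array([100.0, 200.0]) # $ of final demand per sector

# The Leontief inverse (I - A)^-1 propagates indirect supply-chain
# requirements - the input-output step the abstract combines with
# water-footprint accounting.
L = np.linalg.inv(np.eye(2) - A)
total_intensity = direct_water @ L      # direct + indirect litres per $
footprint = total_intensity * final_demand
```

    Total intensity always exceeds direct intensity because every sector draws, indirectly, on water embedded in its inputs; this gap is what lets supply-chain managers assess indirect water dependencies.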

  19. Gaining a Doctorate in Nursing in Chile: a path not without its difficulties

    PubMed Central

    Valenzuela-Suazo, Sandra; Sanhueza-Alvarado, Olivia

    2015-01-01

    OBJECTIVE: to analyze in detail the current situation of doctoral training in Nursing in Chile. METHODOLOGY: a historical and contextual analysis of the background to the development of postgraduate education in Nursing, especially at the doctoral level. RESULTS: aspects that limit development were identified in the national institutional framework for the sciences as well as in higher education and health institutions, especially the limited value placed on nursing as an area of knowledge in this country, the lack of clear institutional policies for postgraduate studies, and the difficulty of re-integrating graduates into academic and care settings, where access to national research funds remains difficult. FINAL CONSIDERATIONS: access to grants and funds, together with recognition of nursing as an area of knowledge in academic programmes and especially in health institutions, are the main challenges to consolidation. One path that would enable more rapid progress is national and international inter-institutional agreements, which pool potential and provide access to funds for studies and for academic and student internships, enabling joint research to go ahead. PMID:26312629

  20. Capturing commemoration: Using mobile recordings within memory research

    PubMed Central

    Birdsall, Carolyn; Drozdzewski, Danielle

    2017-01-01

    This paper details the contribution of mobile devices to capturing commemoration in action. It investigates the incorporation of audio and sound recording devices, observation, and note-taking into a mobile (auto)ethnographic research methodology, to research a large-scale commemorative event in Amsterdam, the Netherlands. On May 4, 2016, the sounds of a Silent March—through the streets of Amsterdam to Dam Square—were recorded and complemented by video grabs of the march’s participants and onlookers. We discuss how the mixed method enabled a multilevel analysis across visual, textual, and aural layers of the commemorative atmosphere. Our visual data aided in our evaluation of the construction of collective spectacle, while the audio data necessitated that we venture into new analytic territory. Using Sonic Visualiser, we uncovered alternative methods of “reading” landscape by identifying different sound signatures in the acoustic environment. Together, this aural and visual representation of the May 4 events enabled the identification of spatial markers and the temporal unfolding of the Silent March and the national 2 minutes’ silence in Amsterdam’s Dam Square. PMID:29780585
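
    The "sound signature" idea can be sketched by comparing the dominant spectral peaks of two synthetic signals; the frequencies and source labels are invented, and this stands in for interactive spectrogram inspection in Sonic Visualiser:

```python
import numpy as np

# Two synthetic one-second tones standing in for distinct acoustic
# sources in a commemorative soundscape; their dominant spectral peaks
# are the "sound signatures" that separate them.
fs = 8000                        # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)

def dominant_freq(signal):
    """Return the frequency (Hz) of the largest peak in the spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return float(freqs[np.argmax(spectrum)])

low_rumble = np.sin(2 * np.pi * 120 * t)   # e.g. a traffic-like hum
bell = np.sin(2 * np.pi * 880 * t)         # e.g. a bell-like tone

f_low = dominant_freq(low_rumble)
f_bell = dominant_freq(bell)
```

    Inspecting how such signatures appear and vanish over time in a spectrogram is what lets a recording mark the onset of the 2 minutes' silence.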

  1. CMIS: Crime Map Information System for Safety Environment

    NASA Astrophysics Data System (ADS)

    Kasim, Shahreen; Hafit, Hanayanti; Yee, Ng Peng; Hashim, Rathiah; Ruslai, Husni; Jahidin, Kamaruzzaman; Syafwan Arshad, Mohammad

    2016-11-01

    Crime Map is an online web-based geographical information system that helps the public visualize crime activities geographically. It acts as a platform for communities to share crime activities they have encountered. Crime and violence plague the communities we live in, and as part of the community, crime prevention is everyone's responsibility. The purpose of Crime Map is to provide insights into the crimes occurring around Malaysia and raise the public's awareness of crime activities in their neighbourhood. To that end, Crime Map visualizes crime activities on geographical heat maps generated from geospatial data, and analyses data obtained from crime reports to generate useful information on crime trends. At the end of the development, users should be able to use the system to access details of reported crimes, view crime analyses, and report crime activities. Crime Map also enables the public to gain insights into crime activities in their area, and thus to work together with law enforcement to prevent and fight crime.
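
    The heat-map aggregation behind such a system can be sketched by snapping report coordinates to grid cells and counting reports per cell; the coordinates and cell size below are illustrative assumptions, not Crime Map's actual data or grid:

```python
from collections import Counter

# Invented crime-report coordinates (lat, lon): three clustered reports
# and one outlier.
reports = [
    (3.139, 101.687), (3.141, 101.689), (3.140, 101.688),  # one cluster
    (1.493, 103.741),                                      # an outlier
]

def cell(lat, lon, size=0.1):
    """Snap a coordinate to its grid cell (one heat-map pixel,
    roughly 11 km square at this assumed cell size)."""
    return (int(lat // size), int(lon // size))

# Report density per cell is the heat-map intensity.
heat = Counter(cell(lat, lon) for lat, lon in reports)
hottest_cell, hottest_count = heat.most_common(1)[0]
```

    A real deployment would render these counts as a smoothed overlay on a base map and use much finer cells, but the aggregation step is the same.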

  2. Towards validated chemistry at extreme conditions: reactive MD simulations of shocked Polyvinyl Nitrate and Nitromethane

    NASA Astrophysics Data System (ADS)

    Islam, Md Mahbubul; Strachan, Alejandro

    A detailed atomistic-level understanding of the ultrafast chemistry of detonation processes in high-energy materials is crucial to understanding their performance and safety. Recent advances in laser shocks and ultrafast spectroscopy are yielding the first direct experimental evidence of chemistry at extreme conditions. At the same time, reactive molecular dynamics (MD) on current high-performance computing platforms enables an atomic description of shock-induced chemistry with length and time scales approaching those of experiments. We use MD simulations with the reactive force field ReaxFF to investigate the shock-induced chemical decomposition mechanisms of polyvinyl nitrate (PVN) and nitromethane (NM). The effect of shock pressure on the chemical reaction mechanisms and kinetics of both materials is investigated. For direct comparison of our simulation results with experimentally derived IR absorption data, we performed spectral analysis using atomistic velocities at various shock conditions. The combination of reactive MD simulations and ultrafast spectroscopy both enables the validation of ReaxFF at extreme conditions and contributes to the interpretation of the experimental data, relating changes in spectral features to atomic processes. Office of Naval Research MURI program.
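    Spectral analysis from atomistic velocities, as mentioned above, is commonly done by Fourier-transforming the velocity autocorrelation function (Wiener-Khinchin theorem). A schematic numpy version, not the authors' code; the array shapes and the single-mode demo signal are illustrative:

```python
import numpy as np

def vacf_spectrum(velocities, dt):
    """Vibrational spectrum from the velocity autocorrelation function:
    spectrum(f) ~ |FFT of <v(0)·v(t)>| (Wiener-Khinchin theorem).
    velocities: array of shape (n_steps, n_atoms, 3)."""
    n = velocities.shape[0]
    v = velocities.reshape(n, -1)
    # Zero-padded FFT gives the linear (non-circular) autocorrelation
    f = np.fft.rfft(v, n=2 * n, axis=0)
    acf = np.fft.irfft(f * f.conj(), axis=0)[:n].mean(axis=1)
    acf /= acf[0]
    spectrum = np.abs(np.fft.rfft(acf))
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spectrum

# Demo: a single 50 Hz harmonic mode should produce a peak at 50 Hz
dt, n = 0.001, 2000
t = np.arange(n) * dt
v = np.cos(2 * np.pi * 50 * t).reshape(n, 1, 1)
freqs, spectrum = vacf_spectrum(v, dt)
```

In a shocked-material simulation the same transform applied to all atomic velocities yields the vibrational density of states, whose peak shifts and broadenings can be compared against IR absorption features.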

  3. The UCSC Genome Browser: What Every Molecular Biologist Should Know.

    PubMed

    Mangan, Mary E; Williams, Jennifer M; Kuhn, Robert M; Lathe, Warren C

    2014-07-01

    Electronic data resources can enable molecular biologists to quickly get information from around the world that a decade ago would have been buried in papers scattered throughout the library. The ability to access, query, and display these data makes benchwork much more efficient and drives new discoveries. Increasingly, mastery of software resources and corresponding data repositories is required to fully explore the volume of data generated in biomedical and agricultural research, because only small amounts of data are actually found in traditional publications. The UCSC Genome Browser provides a wealth of data and tools that advance understanding of genomic context for many species, enable detailed analysis of data, and provide the ability to interrogate regions of interest across disparate data sets from a wide variety of sources. Researchers can also supplement the standard display with their own data to query and share this with others. Effective use of these resources has become crucial to biological research today, and this unit describes some practical applications of the UCSC Genome Browser. Copyright © 2014 John Wiley & Sons, Inc.

  4. Upright Imaging of Drosophila Egg Chambers

    PubMed Central

    Manning, Lathiena; Starz-Gaiano, Michelle

    2015-01-01

    Drosophila melanogaster oogenesis provides an ideal context for studying varied developmental processes since the ovary is relatively simple in architecture, is well-characterized, and is amenable to genetic analysis. Each egg chamber consists of germ-line cells surrounded by a single epithelial layer of somatic follicle cells. Subsets of follicle cells undergo differentiation during specific stages to become several different cell types. Standard techniques primarily allow for a lateral view of egg chambers, and therefore a limited view of follicle cell organization and identity. The upright imaging protocol describes a mounting technique that enables a novel, vertical view of egg chambers with a standard confocal microscope. Samples are first mounted between two layers of glycerin jelly in a lateral (horizontal) position on a glass microscope slide. The jelly with encased egg chambers is then cut into blocks, transferred to a coverslip, and flipped to position egg chambers upright. Mounted egg chambers can be imaged on either an upright or an inverted confocal microscope. This technique enables the study of follicle cell specification, organization, molecular markers, and egg development with new detail and from a new perspective. PMID:25867882

  5. Micro- and macrostructural characterization of polyvinylpirrolidone rotary-spun fibers.

    PubMed

    Sebe, István; Kállai-Szabó, Barnabás; Kovács, Krisztián Norbert; Szabadi, Enikő; Zelkó, Romána

    2015-01-01

    The application of high-speed rotary spinning can offer a useful means for either the preparation of fibrous intermediates for conventional dosage forms or drug delivery systems. Polyvinylpyrrolidone (PVP) and poly(vinylpyrrolidone-vinylacetate) (PVP VA) micro- and nanofibers of different polymer concentrations and solvent ratios were prepared with a high-speed rotary spinning technique. In order to study the influence of the parameters that enable successful fiber production from viscous polymer solutions, a complex micro- and macrostructural screening method was implemented. The obtained fiber mats were subjected to detailed morphological analysis using scanning electron microscopy (SEM) and rheological measurements, while the microstructural changes of the fiber samples, based on free volume changes, were analyzed by positron annihilation lifetime spectroscopy (PALS) and compared with their mechanical characteristics. The plasticizing effect of water was tracked through ortho-positronium lifetime changes in relation to the mechanical properties of the fibers. A concentration range of polyvinylpyrrolidone solutions was defined for the preparation of fibers of optimum morphology and mechanical properties. The method enabled the formulation of fibers with advantageous functionality-related properties for the further formulation of solid dosage forms.

  6. Perturbation Experiments: Approaches for Metabolic Pathway Analysis in Bioreactors.

    PubMed

    Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk

    2016-01-01

    In the last decades, targeted metabolic engineering of microbial cells has become one of the major tools in bioprocess design and optimization. For successful application, detailed knowledge is necessary about the relevant metabolic pathways and their regulation inside the cells. Since in vitro experiments cannot reproduce process conditions and behavior properly, process data about the cells' metabolic state have to be collected in vivo, and special techniques and methods are necessary for this purpose. Most techniques enabling in vivo characterization of metabolic pathways therefore rely on perturbation experiments, which can be divided into dynamic and steady-state approaches. To avoid any process disturbance, approaches that enable perturbation of cell metabolism in parallel to the continuing production process are preferable. Furthermore, the fast dynamics of microbial production processes amplifies the need for parallelized data generation. These points motivate the development of a parallelized approach for multiple metabolic perturbation experiments outside the operating production reactor. An appropriate approach for in vivo characterization of metabolic pathways is presented and applied exemplarily to a microbial L-phenylalanine production process on a 15 L scale.

  7. WE-G-9A-01: Radiation Oncology Outcomes Informatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, C; Miller, R; Sloan, J

    2014-06-15

    The construction of databases and support software to enable routine and systematic aggregation, analysis, and reporting of patient outcomes data is emerging as an important area. “How have results for our patients been affected by the improvements we have made in our practice and in the technologies we use?” To answer this type of fundamental question about the overall pattern of efficacy observed, it is necessary to systematically gather and analyze data on all patients treated within a clinic. Clinical trials answer, in great depth and detail, questions about outcomes for the subsets of patients enrolled in a given trial. However, routine aggregation and analysis of key treatment parameter data and outcomes information for all patients is necessary to recognize emergent patterns that would be of interest from a public health or practice perspective and could better inform the design of clinical trials or the evolution of best practice principles. To address these questions, Radiation Oncology outcomes databases need to be constructed to enable the combination of essential data from a broad group of data types, including diagnosis and staging, dose-volume histogram metrics, patient-reported outcomes, toxicity metrics, performance status, treatment plan parameters, demographics, and DICOM data. Developing viable solutions to automate aggregation and analysis of these data requires multidisciplinary efforts to define nomenclatures, modify clinical processes, and develop software and database tools, and demands a detailed understanding of both clinical and technical issues. This session will cover the developing area of Radiation Oncology Outcomes Informatics. Learning Objectives: The audience will be able to speak to the technical requirements (software, database, web services) which must be considered in designing an outcomes database. The audience will be able to understand the content and the role of patient-reported outcomes as compared to traditional toxicity measures. The audience will understand the approaches, clinical process changes, consensus-building efforts, and standardizations which must be addressed to succeed in a multi-disciplinary effort to aggregate data for all patients. The audience will be able to discuss technical and process issues related to pooling data among institutions in the context of collaborative studies among the presenting institutions.
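    Of the data types listed, dose-volume histogram metrics are the most directly computable. A toy sketch of two common DVH summary values; the function name, threshold, and voxel doses are hypothetical and not tied to any particular outcomes database schema:

```python
import numpy as np

def dvh_metrics(voxel_doses_gy, threshold_gy=20.0):
    """Cumulative-DVH summary metrics for one structure.

    voxel_doses_gy: 1D array, dose received by each voxel of the structure.
    Returns (mean dose in Gy, V_threshold as % of structure volume)."""
    d = np.asarray(voxel_doses_gy, dtype=float)
    v_thresh = 100.0 * np.mean(d >= threshold_gy)  # fraction of voxels at/above threshold
    return d.mean(), v_thresh

# Five hypothetical voxel doses: three of five are at or above 20 Gy
doses = np.array([5.0, 10.0, 20.0, 30.0, 35.0])
mean_dose, v20 = dvh_metrics(doses)
```

Storing a handful of such scalar metrics per structure per plan is what makes DVH data aggregable across an entire clinic's patient population.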

  8. Speciated Elemental and Isotopic Characterization of Atmospheric Aerosols - Recent Advances

    NASA Astrophysics Data System (ADS)

    Shafer, M.; Majestic, B.; Schauer, J.

    2007-12-01

    Detailed elemental, isotopic, and chemical speciation analysis of aerosol particulate matter (PM) can provide valuable information on PM sources, atmospheric processing, and climate forcing. Certain PM sources may best be resolved using trace metal signatures, and elemental and isotopic fingerprints can supplement and enhance molecular marker analysis of PM for source apportionment modeling. In the search for toxicologically relevant components of PM, health studies are increasingly demanding more comprehensive characterization schemes. It is also clear that total metal analysis is at best a poor surrogate for the bioavailable component, and analytical techniques that address the labile component or specific chemical species are needed. Recent sampling and analytical developments advanced by the project team have facilitated comprehensive characterization of even very small masses of atmospheric PM. Historically, this level of detail was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. These advances have enabled the coupling of advanced chemical characterization to vital field sampling approaches that typically supply only very limited PM mass, e.g., (1) particle size-resolved sampling; (2) personal sampler collections; and (3) fine temporal scale sampling. The analytical tools that our research group is applying include (1) sector field (high-resolution, HR) ICP-MS, (2) liquid waveguide long-path spectrophotometry (LWG-LPS), and (3) synchrotron x-ray absorption spectroscopy (sXAS). When coupled with an efficient and validated solubilization method, HR-ICP-MS can provide quantitative elemental information on over 50 elements in microgram quantities of PM. The high mass resolution and enhanced signal-to-noise of HR-ICP-MS significantly advance data quality and quantity over that possible with traditional quadrupole ICP-MS. The LWG-LPS system enables an assessment of the soluble/labile components of PM while simultaneously providing critical oxidation state speciation data. Importantly, the LWG-LPS can be deployed in a semi-real-time configuration to probe fine temporal scale variations in atmospheric processing or sources of PM. The sXAS provides complementary oxidation state speciation of bulk PM. Using examples from our research, we will illustrate the capabilities and applications of these new methods.

  9. Development and Validation of a New Blade Element Momentum Skewed-Wake Model within AeroDyn: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, S. A.; Hayman, G.; Damiani, R.

    Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.
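    The blade element momentum balance reviewed in the paper couples momentum theory and blade-element forces through the inflow angle and is commonly solved by fixed-point iteration. A deliberately stripped-down sketch, not AeroDyn's implementation: drag, tip loss, and high-induction corrections are omitted, a small-angle lift curve is assumed, and all parameter values below are illustrative:

```python
import math

def bem_element(tsr_local, solidity, twist_rad,
                cl_alpha=2 * math.pi, tol=1e-8, max_iter=500):
    """Fixed-point iteration for the axial (a) and tangential (ap) induction
    factors of one blade element (no drag, no tip loss, linear lift curve)."""
    a, ap = 0.0, 0.0
    for _ in range(max_iter):
        phi = math.atan2(1.0 - a, (1.0 + ap) * tsr_local)  # inflow angle
        cl = cl_alpha * (phi - twist_rad)                  # lift coefficient
        cn = cl * math.cos(phi)                            # normal force coeff.
        ct = cl * math.sin(phi)                            # tangential force coeff.
        a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (solidity * cn) + 1.0)
        ap_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (solidity * ct) - 1.0)
        a_next, ap_next = 0.5 * (a + a_new), 0.5 * (ap + ap_new)  # under-relax
        if abs(a_next - a) < tol and abs(ap_next - ap) < tol:
            return a_next, ap_next
        a, ap = a_next, ap_next
    return a, ap

# Illustrative element: local tip-speed ratio 5, solidity 0.05, ~2.9 deg twist
a, ap = bem_element(tsr_local=5.0, solidity=0.05, twist_rad=0.05)
```

Skewed-wake models of the kind the paper compares replace the uniform axial induction here with an azimuthally varying one; the iteration structure is otherwise similar.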

  10. Respiratory analysis of coupled mitochondria in cryopreserved liver biopsies.

    PubMed

    García-Roche, Mercedes; Casal, Alberto; Carriquiry, Mariana; Radi, Rafael; Quijano, Celia; Cassina, Adriana

    2018-07-01

    The aim of this work was to develop a cryopreservation method of small liver biopsies for in situ mitochondrial function assessment. Herein we describe a detailed protocol for tissue collection, cryopreservation, high-resolution respirometry using complex I and II substrates, calculation and interpretation of respiratory parameters. Liver biopsies from cow and rat were sequentially frozen in a medium containing dimethylsulfoxide as cryoprotectant and stored for up to 3 months at -80 °C. Oxygen consumption rate studies of fresh and cryopreserved samples revealed that most respiratory parameters remained unchanged. Additionally, outer mitochondrial membrane integrity was assessed adding cytochrome c, proving that our cryopreservation method does not harm mitochondrial structure. In sum, we present a reliable way to cryopreserve small liver biopsies without affecting mitochondrial function. Our protocol will enable the transport and storage of samples, extending and facilitating mitochondrial function analysis of liver biopsies. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
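    Among the respiratory parameters derived from such oxygen-consumption traces, the respiratory control ratio is the classic index of coupling. A trivial sketch with hypothetical flux values (the protocol in the paper reports a richer parameter set):

```python
def respiratory_control_ratio(state3_jo2, state4_jo2):
    """RCR = ADP-stimulated (state 3) / resting (state 4) O2 flux;
    values well above 1 indicate well-coupled mitochondria."""
    if state4_jo2 <= 0:
        raise ValueError("state 4 flux must be positive")
    return state3_jo2 / state4_jo2

# Hypothetical fluxes in pmol O2 / (s * mg tissue)
rcr = respiratory_control_ratio(112.0, 28.0)
```

Comparing RCR before and after cryopreservation is one simple way to confirm, as the authors report, that coupling is preserved.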

  11. Identical linkage and cooperativity of oxygen and carbon monoxide binding to Octopus dofleini hemocyanin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connelly, P.R.; Gill, S.J.; Miller, K.I.

    1989-02-21

    Employment of high-precision thin-layer methods has enabled detailed functional characterization of oxygen and carbon monoxide binding for (1) the fully assembled form, with 70 binding sites, and (2) the isolated chains, with 7 binding sites, of Octopus dofleini hemocyanin. The striking difference in the cooperativities of the two ligands for the assembled decamer is revealed through an examination of the binding capacities and the partition coefficient, determined as functions of the activities of both ligands. A global analysis of the data sets was supported by a two-state allosteric model assuming an allosteric unit of 7. Higher-level allosteric interactions were not indicated. This contrasts with results obtained for arthropod hemocyanins. Oxygen and carbon monoxide experiments performed on the isolated subunit chain confirmed the presence of the functional heterogeneity reported previously. The analysis shows two types of binding sites in the ratio of 4:3.
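    The two-state allosteric model invoked above is the Monod-Wyman-Changeux scheme, for which the fractional saturation of an allosteric unit of n = 7 sites has a closed form. A sketch with illustrative constants L and c, not the fitted values from this study:

```python
import numpy as np

def mwc_saturation(x, L=1.0e4, c=0.01, n=7):
    """Fractional saturation Y of a two-state (MWC) allosteric unit of n sites.

    x = ligand activity / K_R; L = [T0]/[R0] equilibrium constant;
    c = K_R/K_T (ratio of microscopic dissociation constants)."""
    a = np.asarray(x, dtype=float)
    num = a * (1 + a) ** (n - 1) + L * c * a * (1 + c * a) ** (n - 1)
    den = (1 + a) ** n + L * (1 + c * a) ** n
    return num / den

# Saturation curve over four decades of ligand activity
y = mwc_saturation(np.array([0.1, 1.0, 10.0, 100.0]))
```

With these constants the curve is strongly sigmoidal, which is the signature of the cooperativity the authors quantify; setting L to 0 recovers a simple non-cooperative binding isotherm.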

  12. Negative hallucinations, dreams and hallucinations: The framing structure and its representation in the analytic setting.

    PubMed

    Perelberg, Rosine Jozef

    2016-12-01

    This paper explores the meaning of a patient's hallucinatory experiences in the course of a five times a week analysis. I will locate my understanding within the context of André Green's ideas on the role of the framing structure and the negative hallucination in the structuring of the mind. The understanding of the transference and countertransference was crucial in the creation of meaning and enabling the transformations that took place in the analytic process. Through a detailed analysis of a clinical example the author examines Bion's distinction between hysterical hallucinations and psychotic hallucinations and formulates her own hypothesis about the distinctions between the two. The paper suggests that whilst psychotic hallucinations express a conflict between life and death, in the hysterical hallucination it is between love and hate. The paper also contains some reflections on the dramatic nature of the analytic encounter. Copyright © 2016 Institute of Psychoanalysis.

  13. Equation-free multiscale computation: algorithms and applications.

    PubMed

    Kevrekidis, Ioannis G; Samaey, Giovanni

    2009-01-01

    In traditional physicochemical modeling, one derives evolution equations at the (macroscopic, coarse) scale of interest; these are used to perform a variety of tasks (simulation, bifurcation analysis, optimization) using an arsenal of analytical and numerical techniques. For many complex systems, however, although one observes evolution at a macroscopic scale of interest, accurate models are only given at a more detailed (fine-scale, microscopic) level of description (e.g., lattice Boltzmann, kinetic Monte Carlo, molecular dynamics). Here, we review a framework for computer-aided multiscale analysis, which enables macroscopic computational tasks (over extended spatiotemporal scales) using only appropriately initialized microscopic simulation on short time and length scales. The methodology bypasses the derivation of macroscopic evolution equations when these equations conceptually exist but are not available in closed form; hence the term equation-free. We selectively discuss basic algorithms and underlying principles and illustrate the approach through representative applications. We also discuss potential difficulties and outline areas for future research.
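    A core equation-free ingredient is coarse projective integration: a short burst of fine-scale simulation damps fast transients and yields an estimate of the coarse time derivative, which is then extrapolated over a much larger step. A scalar toy sketch (forward-Euler inner integrator; the test problem and all step sizes are illustrative, not from the review):

```python
import math

def projective_step(u, f, dt_fine, k_inner, dt_jump):
    """One coarse step: k_inner fine (Euler) steps damp fast transients,
    the last two fine states give a coarse slope, and that slope is
    extrapolated over the much larger dt_jump."""
    for _ in range(k_inner):
        u_prev = u
        u = u + dt_fine * f(u)
    slope = (u - u_prev) / dt_fine   # coarse derivative from last fine step
    return u + dt_jump * slope       # projective (extrapolation) step

# Toy problem du/dt = -u (exact solution exp(-t)); each coarse step
# advances time by k_inner*dt_fine + dt_jump = 0.1, so 10 steps reach t = 1
u = 1.0
for _ in range(10):
    u = projective_step(u, lambda x: -x, dt_fine=0.01, k_inner=5, dt_jump=0.05)
err = abs(u - math.exp(-1.0))
```

In the equation-free setting the "fine integrator" is the microscopic simulator itself (kinetic Monte Carlo, molecular dynamics, ...) wrapped in lifting and restriction operators; the extrapolation logic is unchanged.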

  14. Visualizing time-related data in biology, a review

    PubMed Central

    Secrier, Maria; Schneider, Reinhard

    2014-01-01

    Time is of the essence in biology as in so much else. For example, monitoring disease progression or the timing of developmental defects is important for the processes of drug discovery and therapy trials. Furthermore, an understanding of the basic dynamics of biological phenomena that are often strictly time regulated (e.g. circadian rhythms) is needed to make accurate inferences about the evolution of biological processes. Recent advances in technologies have enabled us to measure timing effects more accurately and in more detail. This has driven related advances in visualization and analysis tools that try to effectively exploit this data. Beyond timeline plots, notable attempts at more involved temporal interpretation have been made in recent years, but awareness of the available resources is still limited within the scientific community. Here, we review some advances in biological visualization of time-driven processes and consider how they aid data analysis and interpretation. PMID:23585583

  15. A Spherical Torus Nuclear Fusion Reactor Space Propulsion Vehicle Concept for Fast Interplanetary Travel

    NASA Technical Reports Server (NTRS)

    Williams, Craig H.; Borowski, Stanley K.; Dudzinski, Leonard A.; Juhasz, Albert J.

    1998-01-01

    A conceptual vehicle design enabling fast outer solar system travel was produced predicated on a small aspect ratio spherical torus nuclear fusion reactor. Initial requirements were for a human mission to Saturn with a greater than 5% payload mass fraction and a one-way trip time of less than one year. Analysis revealed that the vehicle could deliver a 108 mt crew habitat payload to Saturn rendezvous in 235 days, with an initial mass in low Earth orbit of 2,941 mt. Engineering conceptual design, analysis, and assessment was performed on all major systems including payload, central truss, nuclear reactor (including divertor and fuel injector), power conversion (including turbine, compressor, alternator, radiator, recuperator, and conditioning), magnetic nozzle, neutral beam injector, tankage, start/re-start reactor and battery, refrigeration, communications, reaction control, and in-space operations. Detailed assessment was done on reactor operations, including plasma characteristics, power balance, power utilization, and component design.

  16. A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0

    PubMed Central

    Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.

    2014-01-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072

  17. A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.

    PubMed

    Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P

    2014-09-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  18. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    NASA Technical Reports Server (NTRS)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases in trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture as well as the result of the higher fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  19. Development and validation of an APCI-MS/GC–MS approach for the classification and prediction of Cheddar cheese maturity

    PubMed Central

    Gan, Heng Hui; Yan, Bingnan; Linforth, Robert S.T.; Fisk, Ian D.

    2016-01-01

    Headspace techniques have been extensively employed in food analysis to measure volatile compounds, which play a central role in the perceived quality of food. In this study atmospheric pressure chemical ionisation-mass spectrometry (APCI-MS), coupled with gas chromatography–mass spectrometry (GC–MS), was used to investigate the complex mix of volatile compounds present in Cheddar cheeses of different maturity, processing and recipes to enable characterisation of the cheeses based on their ripening stages. Partial least squares-linear discriminant analysis (PLS-DA) provided a 70% success rate in correct prediction of the age of the cheeses based on their key headspace volatile profiles. In addition to predicting maturity, the analytical results coupled with chemometrics offered a rapid and detailed profiling of the volatile component of Cheddar cheeses, which could offer a new tool for quality assessment and accelerate product development. PMID:26212994

  20. Residual transglutaminase in collagen - effects, detection, quantification, and removal.

    PubMed

    Schloegl, W; Klein, A; Fürst, R; Leicht, U; Volkmer, E; Schieker, M; Jus, S; Guebitz, G M; Stachel, I; Meyer, M; Wiggenhorn, M; Friess, W

    2012-02-01

    In the present study, we developed an enzyme-linked immunosorbent assay (ELISA) for microbial transglutaminase (mTG) from Streptomyces mobaraensis to overcome the lack of a quantification method for mTG. We further performed a detailed follow-on analysis of insoluble porcine collagen type I enzymatically modified with mTG, primarily focusing on residuals of mTG. Repeated washing (4 ×) reduced mTG levels in the washing fluids but did not quantitatively remove mTG from the material (p < 0.000001). Substantial amounts of up to 40% of the enzyme utilized in the crosslinking mixture remained associated with the modified collagen. Binding was non-covalent, as could be demonstrated by Western blot analysis. Acidic and alkaline dialysis of mTG-treated collagen material enabled complete removal of the enzyme. Treatment with guanidinium chloride, urea, or sodium chloride was less effective in reducing the mTG content. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Development of a GC/Quadrupole-Orbitrap Mass Spectrometer, Part I: Design and Characterization

    PubMed Central

    2015-01-01

    Identification of unknown compounds is of critical importance in GC/MS applications (metabolomics, environmental toxin identification, sports doping, petroleomics, and biofuel analysis, among many others) and remains a technological challenge. Derivation of elemental composition is the first step to determining the identity of an unknown compound by MS, for which high accuracy mass and isotopomer distribution measurements are critical. Here, we report on the development of a dedicated, applications-grade GC/MS employing an Orbitrap mass analyzer, the GC/Quadrupole-Orbitrap. Built from the basis of the benchtop Orbitrap LC/MS, the GC/Quadrupole-Orbitrap maintains the performance characteristics of the Orbitrap, enables quadrupole-based isolation for sensitive analyte detection, and includes numerous analysis modalities to facilitate structural elucidation. We detail the design and construction of the instrument, discuss its key figures-of-merit, and demonstrate its performance for the characterization of unknown compounds and environmental toxins. PMID:25208235
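    Derivation of elemental composition from an accurate mass, named above as the first step of unknown identification, can be illustrated by brute-force enumeration over C/H/N/O counts; real instrument software adds isotope-pattern scoring and chemical-plausibility filters. The monoisotopic masses below are standard values, while the ppm tolerance, count limits, and the caffeine example are our own illustration:

```python
from itertools import product

# Monoisotopic masses (u) of elements common in GC/MS analytes
MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def candidate_formulas(neutral_mass, tol_ppm=5.0, max_counts=(30, 60, 5, 10)):
    """Brute-force C/H/N/O compositions whose monoisotopic mass matches the
    measured neutral mass within tol_ppm."""
    tol = neutral_mass * tol_ppm * 1e-6
    hits = []
    for c, h, n, o in product(*(range(m + 1) for m in max_counts)):
        m = (c * MASS["C"] + h * MASS["H"]
             + n * MASS["N"] + o * MASS["O"])
        if abs(m - neutral_mass) <= tol:
            hits.append(f"C{c}H{h}N{n}O{o}")
    return hits

# Caffeine, C8H10N4O2: monoisotopic neutral mass ~194.08038 u
hits = candidate_formulas(194.08038)
```

The tighter the mass accuracy (sub-ppm for an Orbitrap analyzer), the shorter this candidate list becomes, which is precisely why high mass accuracy and clean isotopomer distributions matter for unknown identification.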

  2. Realizing "2001: A Space Odyssey": Piloted Spherical Torus Nuclear Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Williams, Craig H.; Dudzinski, Leonard A.; Borowski, Stanley K.; Juhasz, Albert J.

    2005-01-01

    A conceptual vehicle design enabling fast, piloted outer solar system travel was created predicated on a small aspect ratio spherical torus nuclear fusion reactor. The initial requirements were satisfied by the vehicle concept, which could deliver a 172 mt crew payload from Earth to Jupiter rendezvous in 118 days, with an initial mass in low Earth orbit of 1,690 mt. Engineering conceptual design, analysis, and assessment was performed on all major systems including artificial gravity payload, central truss, nuclear fusion reactor, power conversion, magnetic nozzle, fast wave plasma heating, tankage, fuel pellet injector, startup/re-start fission reactor and battery bank, refrigeration, reaction control, communications, mission design, and space operations. Detailed fusion reactor design included analysis of plasma characteristics, power balance/utilization, first wall, toroidal field coils, heat transfer, and neutron/x-ray radiation. Technical comparisons are made between the vehicle concept and the interplanetary spacecraft depicted in the motion picture 2001: A Space Odyssey.

  3. Numerical Analysis of Incipient Separation on 53 Deg Swept Diamond Wing

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.

    2015-01-01

    A systematic analysis of incipient separation and subsequent vortex formation from moderately swept blunt leading edges is presented for a 53 deg swept diamond wing. This work contributes to a collective body of knowledge generated within the NATO/STO AVT-183 Task Group titled 'Reliable Prediction of Separated Flow Onset and Progression for Air and Sea Vehicles'. The objective is to extract insights from the experimentally measured and numerically computed flow fields that might enable turbulence experts to further improve their models for predicting swept blunt leading-edge flow separation. Details of vortex formation are inferred from numerical solutions after establishing a good correlation of the global flow field and surface pressure distributions between wind tunnel measurements and computed flow solutions. From this, significant and sometimes surprising insights into the nature of incipient separation and part-span vortex formation are derived from the wealth of information available in the computational solutions.

  4. Modeling Electronic-Nuclear Interactions for Excitation Energy Transfer Processes in Light-Harvesting Complexes.

    PubMed

    Lee, Mi Kyung; Coker, David F

    2016-08-18

    An accurate approach for computing intermolecular and intrachromophore contributions to spectral densities to describe the electronic-nuclear interactions relevant for modeling excitation energy transfer processes in light harvesting systems is presented. The approach is based on molecular dynamics (MD) calculations of classical correlation functions of long-range contributions to excitation energy fluctuations and a separate harmonic analysis and single-point gradient quantum calculations for electron-intrachromophore vibrational couplings. A simple model is also presented that enables detailed analysis of the shortcomings of standard MD-based excitation energy fluctuation correlation function approaches. The method introduced here avoids these problems, and its reliability is demonstrated in accurate predictions for bacteriochlorophyll molecules in the Fenna-Matthews-Olson pigment-protein complex, where excellent agreement with experimental spectral densities is found. This efficient approach can provide instantaneous spectral densities for treating the influence of fluctuations in environmental dissipation on fast electronic relaxation.

  5. Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images

    NASA Technical Reports Server (NTRS)

    Sams, Clarence F.

    2016-01-01

    The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.

  6. Mechanisms of Evolution in High-Consequence Drug Resistance Plasmids

    PubMed Central

    He, Susu; Chandler, Michael; Varani, Alessandro M.; Hickman, Alison B.; Dekker, John P.

    2016-01-01

    ABSTRACT The dissemination of resistance among bacteria has been facilitated by the fact that resistance genes are usually located on a diverse and evolving set of transmissible plasmids. However, the mechanisms generating diversity and enabling adaptation within highly successful resistance plasmids have remained obscure, despite their profound clinical significance. To understand these mechanisms, we have performed a detailed analysis of the mobilome (the entire mobile genetic element content) of a set of previously sequenced carbapenemase-producing Enterobacteriaceae (CPE) from the National Institutes of Health Clinical Center. This analysis revealed that plasmid reorganizations occurring in the natural context of colonization of human hosts were overwhelmingly driven by genetic rearrangements carried out by replicative transposons working in concert with the process of homologous recombination. A more complete understanding of the molecular mechanisms and evolutionary forces driving rearrangements in resistance plasmids may lead to fundamentally new strategies to address the problem of antibiotic resistance. PMID:27923922

  7. FMEA and consideration of real work situations for safer design of production systems.

    PubMed

    Lux, Aurélien; Mawo De Bikond, Johann; Etienne, Alain; Quillerou-Grivot, Edwige

    2016-12-01

    Production equipment designers must ensure the health and safety of future users; in this regard, they augment requirements for standardizing and controlling operator work. This contrasts with the ergonomic view of the activity, which recommends leaving operators leeway (margins for manoeuvre) in performing their task, while safeguarding their health. Following a brief analysis of design practices in the car industry, we detail how the Failure Modes and Effects Analysis (FMEA) approach is implemented in this sector. We then suggest an adaptation that enables designers to consider real work situations. This new protocol, namely, work situation FMEA, allows experience feedback to be used to defend the health standpoint during designer project reviews, which usually only address quality and performance issues. We subsequently illustrate the advantage of this approach using two examples of work situations at car parts manufacturers: the first from the literature and the second from an in-company industrial project.

  8. Photothermal method of determining calorific properties of coal

    DOEpatents

    Amer, N.M.

    1983-05-16

    Predetermined amounts of heat are generated within a coal sample by directing pump light pulses of predetermined energy content into a small surface region of the sample. A beam of probe light is directed along the sample surface, and deflection of the probe beam caused by thermally induced changes in the index of refraction of the fluid medium adjacent to the heated region is detected. The deflection amplitude and the phase lag of the deflection, relative to the initiating pump light pulse, are indicative of the calorific value and the porosity of the sample. The method provides rapid, accurate, and nondestructive analysis of the heat-producing capabilities of coal samples. In the preferred form, sequences of pump light pulses of increasing durations are directed into the sample at each of a series of minute regions situated along a raster scan path, enabling detailed analysis of variations of thermal properties at different areas of the sample and at different depths.

  9. Risk of Subsequent Leukemia After a Solid Tumor in Childhood: Impact of Bone Marrow Radiation Therapy and Chemotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allodji, Rodrigue S., E-mail: rodrigue.allodji@gustaveroussy.fr; Gustave Roussy, Villejuif; Paris Sud University, Orsay

    Purpose: To investigate the roles of radiation therapy and chemotherapy in the occurrence of subsequent leukemia after childhood cancer. Methods and Materials: We analyzed data from a case-control study with 35 cases and 140 controls. The active bone marrow (ABM) was segmented into 19 compartments, and the radiation dose was estimated in each. The chemotherapy drug doses were also estimated to enable adjustments. Models capable of accounting for radiation dose heterogeneity were implemented for analysis. Results: Univariate analysis showed a significant trend in the increase of secondary leukemia risk with radiation dose, after accounting for dose heterogeneity (P=.046). This trend became nonsignificant after adjustment for doses of epipodophyllotoxins, alkylating agents, and platinum compounds and the first cancer on multivariate analysis (P=.388). The role of the radiation dose appeared to be dwarfed, mostly by the alkylating agents (odds ratio 6.9, 95% confidence interval 1.9-25.0). Among the patients who had received >16 Gy to the ABM, the radiogenic risk of secondary leukemia was about 4 times greater in the subgroup with no alkylating agents than in the subgroup receiving ≥10 g/m². Conclusions: Notwithstanding the limitations resulting from the size of our study population and the quite systematic co-treatment with chemotherapy, the use of detailed information on the radiation dose distribution to ABM enabled consideration of the role of radiation therapy in secondary leukemia induction after childhood cancer.

  10. Front-End Electron Transfer Dissociation Coupled to a 21 Tesla FT-ICR Mass Spectrometer for Intact Protein Sequence Analysis

    NASA Astrophysics Data System (ADS)

    Weisbrod, Chad R.; Kaiser, Nathan K.; Syka, John E. P.; Early, Lee; Mullen, Christopher; Dunyach, Jean-Jacques; English, A. Michelle; Anderson, Lissa C.; Blakney, Greg T.; Shabanowitz, Jeffrey; Hendrickson, Christopher L.; Marshall, Alan G.; Hunt, Donald F.

    2017-09-01

    High resolution mass spectrometry is a key technology for in-depth protein characterization. High-field Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) enables high-level interrogation of intact proteins in the most detail to date. However, an appropriate complement of fragmentation technologies must be paired with FTMS to provide comprehensive sequence coverage, as well as characterization of sequence variants, and post-translational modifications. Here we describe the integration of front-end electron transfer dissociation (FETD) with a custom-built 21 tesla FT-ICR mass spectrometer, which yields unprecedented sequence coverage for proteins ranging from 2.8 to 29 kDa, without the need for extensive spectral averaging (e.g., 60% sequence coverage for apo-myoglobin with four averaged acquisitions). The system is equipped with a multipole storage device separate from the ETD reaction device, which allows accumulation of multiple ETD fragment ion fills. Consequently, an optimally large product ion population is accumulated prior to transfer to the ICR cell for mass analysis, which improves mass spectral signal-to-noise ratio, dynamic range, and scan rate. We find a linear relationship between protein molecular weight and minimum number of ETD reaction fills to achieve optimum sequence coverage, thereby enabling more efficient use of instrument data acquisition time. Finally, real-time scaling of the number of ETD reactions fills during method-based acquisition is shown, and the implications for LC-MS/MS top-down analysis are discussed.

  11. Nanopore sequencing in microgravity

    PubMed Central

    McIntyre, Alexa B R; Rizzardi, Lindsay; Yu, Angela M; Alexander, Noah; Rosen, Gail L; Botkin, Douglas J; Stahl, Sarah E; John, Kristen K; Castro-Wallace, Sarah L; McGrath, Ken; Burton, Aaron S; Feinberg, Andrew P; Mason, Christopher E

    2016-01-01

    Rapid DNA sequencing and analysis has been a long-sought goal in remote research and point-of-care medicine. In microgravity, DNA sequencing can facilitate novel astrobiological research and close monitoring of crew health, but spaceflight places stringent restrictions on the mass and volume of instruments, crew operation time, and instrument functionality. The recent emergence of portable, nanopore-based tools with streamlined sample preparation protocols finally enables DNA sequencing on missions in microgravity. As a first step toward sequencing in space and aboard the International Space Station (ISS), we tested the Oxford Nanopore Technologies MinION during a parabolic flight to understand the effects of variable gravity on the instrument and data. In a successful proof-of-principle experiment, we found that the instrument generated DNA reads over the course of the flight, including the first ever sequenced in microgravity, and additional reads measured after the flight concluded its parabolas. Here we detail modifications to the sample-loading procedures to facilitate nanopore sequencing aboard the ISS and in other microgravity environments. We also evaluate existing analysis methods and outline two new approaches, the first based on a wave-fingerprint method and the second on entropy signal mapping. Computationally light analysis methods offer the potential for in situ species identification, but are limited by the error profiles (stays, skips, and mismatches) of older nanopore data. Higher accuracies attainable with modified sample processing methods and the latest version of flow cells will further enable the use of nanopore sequencers for diagnostics and research in space. PMID:28725742
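    The record names "entropy signal mapping" as one of its computationally light analysis approaches but gives no implementation details. The sketch below is a generic windowed Shannon-entropy computation over a synthetic, nanopore-like current trace; the window length, bin count, and signal are invented for illustration and are not the authors' published pipeline.

```python
import numpy as np

def windowed_entropy(signal, window=200, bins=16):
    """Shannon entropy of a 1-D signal in non-overlapping windows.

    Illustrative sketch only: parameters and the discretization scheme
    are assumptions, not the method from the paper.
    """
    # Discretize the full signal range into `bins` amplitude levels.
    edges = np.linspace(signal.min(), signal.max(), bins + 1)
    entropies = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        counts, _ = np.histogram(chunk, bins=edges)
        p = counts / counts.sum()
        p = p[p > 0]  # treat 0 * log(0) as 0
        entropies.append(-np.sum(p * np.log2(p)))
    return np.array(entropies)

rng = np.random.default_rng(0)
# Synthetic trace: a quiet, low-variance segment followed by a noisier one.
trace = np.concatenate([80.0 + rng.normal(0, 0.5, 1000),
                        80.0 + rng.normal(0, 8.0, 1000)])
ent = windowed_entropy(trace)
# The noisier half occupies more amplitude bins, so its entropy is higher.
print(ent[:5].mean() < ent[5:].mean())
```

    A per-window summary statistic like this is cheap enough to compute in situ, which is the appeal of such methods for onboard species identification.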

  12. Extended possibilities of pharmaceuticals delivery to patients using dematerialized prescriptions.

    PubMed

    Urbański, Andrzej P

    2004-01-01

    This paper considers the requirements for e-commerce-enabled delivery of pharmaceutical prescriptions. First, currently available solutions are critically reviewed and an ideal solution is specified; the concept of the proposed solution, the Dematerialized Prescription (DP), is then outlined. Next, the information flows required to enable such a service are considered. The paper then examines a number of possible services that could be made available with DP to deliver medicines to patients. Finally, a proposed solution that enables physicians to fill dematerialized prescriptions online using inexpensive mobile Internet devices is presented in detail, the advantages of such a model are summarized, and future research directions are suggested.

  13. A flexible ontology for inference of emergent whole cell function from relationships between subcellular processes.

    PubMed

    Hansen, Jens; Meretzky, David; Woldesenbet, Simeneh; Stolovitzky, Gustavo; Iyengar, Ravi

    2017-12-18

    Whole cell responses arise from coordinated interactions between diverse human gene products functioning within various pathways underlying sub-cellular processes (SCPs). Lower-level SCPs interact to form higher-level SCPs, often in a context-specific manner, to give rise to whole cell function. We sought to determine whether capturing such relationships enables us to describe the emergence of whole cell functions from interacting SCPs. We developed the Molecular Biology of the Cell Ontology based on standard cell biology and biochemistry textbooks and review articles. Currently, our ontology contains 5,384 genes, 753 SCPs and 19,180 expertly curated gene-SCP associations. Our algorithm to populate the SCPs with genes enables extension of the ontology on demand and adaptation of the ontology to continuously growing cell biological knowledge. Since whole cell responses most often arise from the coordinated activity of multiple SCPs, we developed a dynamic enrichment algorithm that flexibly predicts SCP-SCP relationships beyond the current taxonomy. This algorithm enables us to identify interactions between SCPs as a basis for higher-order function in a context-dependent manner, allowing us to provide a detailed description of how SCPs together can give rise to whole cell functions. We conclude that this ontology can, from omics data sets, enable the development of detailed SCP networks for predictive modeling of emergent whole cell functions.

  14. A 0.5 Tesla Transverse-Field Alternating Magnetic Field Demagnetizer

    NASA Astrophysics Data System (ADS)

    Schillinger, W. E.; Morris, E. R.; Finn, D. R.; Coe, R. S.

    2015-12-01

    We have built an alternating field demagnetizer that can routinely achieve a maximum field of 0.5 Tesla. It uses an amorphous magnetic core with an air-cooled coil. We started with a 0.5 T design, which satisfies most of our immediate needs, but higher fields are certainly achievable. In our design, the magnetic field is transverse to the bore and uniform to 1% over a standard (25 mm) paleomagnetic sample. It is powered by a 1 kW power amplifier and is compatible with our existing sample handler for automated demagnetization and measurement (Morris et al., 2009). Its much higher peak field has enabled us to completely demagnetize many samples that we previously could not with commercial equipment. This capability is especially needed for high-coercivity sedimentary and igneous rocks that contain magnetic minerals that alter during thermal demagnetization. It will also enable detailed automated demagnetization of high-coercivity phases in extraterrestrial samples, such as the native iron, iron-alloy and sulfide minerals that are common in lunar rocks and meteorites. Furthermore, it has opened the door for us to use the rock-magnetic technique of component analysis, using coercivity distributions derived from very detailed AF demagnetization of NRM and of remanence produced in the laboratory to characterize the magnetic mineralogy of sedimentary rocks. In addition to the many benefits this instrument has brought to our own research, a much broader potential impact is to replace the transverse coils in automated AF demagnetization systems, which typically are limited to peak fields around 0.1 T.

  15. Simulation of nitrate reduction in groundwater - An upscaling approach from small catchments to the Baltic Sea basin

    NASA Astrophysics Data System (ADS)

    Hansen, A. L.; Donnelly, C.; Refsgaard, J. C.; Karlsson, I. B.

    2018-01-01

    This paper describes a modeling approach proposed to simulate the impact of local-scale, spatially targeted N-mitigation measures for the Baltic Sea Basin. Spatially targeted N-regulations aim at exploiting the considerable spatial differences in the natural N-reduction taking place in groundwater and surface water. While such measures can be simulated using local-scale physically-based catchment models, use of such detailed models for the 1.8 million km² Baltic Sea basin is not feasible due to constraints on input data and computing power. Large-scale models that are able to simulate the Baltic Sea basin, on the other hand, do not have adequate spatial resolution to simulate some of the field-scale measures. Our methodology combines knowledge and results from two local-scale physically-based MIKE SHE catchment models, the large-scale and more conceptual E-HYPE model, and auxiliary data in order to enable E-HYPE to simulate how spatially targeted regulation of agricultural practices may affect N-loads to the Baltic Sea. We conclude that the use of E-HYPE with this upscaling methodology enables the simulation of the impact on N-loads of applying a spatially targeted regulation at the Baltic Sea basin scale to the correct order of magnitude. The E-HYPE model together with the upscaling methodology therefore provides a sound basis for large-scale policy analysis; however, we do not expect it to be sufficiently accurate to be useful for the detailed design of local-scale measures.

  16. A review of job-exposure matrix methodology for application to workers exposed to radiation from internally deposited plutonium or other radioactive materials.

    PubMed

    Liu, Hanhua; Wakeford, Richard; Riddell, Anthony; O'Hagan, Jacqueline; MacGregor, David; Agius, Raymond; Wilson, Christine; Peace, Mark; de Vocht, Frank

    2016-03-01

    Any potential health effects of radiation emitted from radionuclides deposited in the bodies of workers exposed to radioactive materials can be directly investigated through epidemiological studies. However, estimates of radionuclide exposure and consequent tissue-specific doses, particularly for early workers for whom monitoring was relatively crude but exposures tended to be highest, can be uncertain, limiting the accuracy of risk estimates. We review the use of job-exposure matrices (JEMs) in peer-reviewed epidemiological and exposure assessment studies of nuclear industry workers exposed to radioactive materials as a method for addressing gaps in exposure data, and discuss methodology and comparability between studies. We identified nine studies of nuclear worker cohorts in France, Russia, the USA and the UK that had incorporated JEMs in their exposure assessments. All these JEMs were study or cohort-specific, and although broadly comparable methodologies were used in their construction, this is insufficient to enable the transfer of any one JEM to another study. Moreover there was often inadequate detail on whether, or how, JEMs were validated. JEMs have become more detailed and more quantitative, and this trend may eventually enable better comparison across, and the pooling of, studies. We conclude that JEMs have been shown to be a valuable exposure assessment methodology for imputation of missing exposure data for nuclear worker cohorts with data not missing at random. The next step forward for direct comparison or pooled analysis of complete cohorts would be the use of transparent and transferable methods.

  17. DNA Bipedal Motor Achieves a Large Number of Steps Due to Operation Using Microfluidics-Based Interface.

    PubMed

    Tomov, Toma E; Tsukanov, Roman; Glick, Yair; Berger, Yaron; Liber, Miran; Avrahami, Dorit; Gerber, Doron; Nir, Eyal

    2017-04-25

    Realization of bioinspired molecular machines that can perform many and diverse operations in response to external chemical commands is a major goal in nanotechnology, but current molecular machines respond to only a few sequential commands. Lack of effective methods for introduction and removal of command compounds and low efficiencies of the reactions involved are major reasons for the limited performance. We introduce here a user interface based on a microfluidics device and single-molecule fluorescence spectroscopy that allows efficient introduction and removal of chemical commands and enables detailed study of the reaction mechanisms involved in the operation of synthetic molecular machines. The microfluidics provided 64 consecutive DNA strand commands to a DNA-based motor system immobilized inside the microfluidics, driving a bipedal walker to perform 32 steps on a DNA origami track. The microfluidics enabled removal of redundant strands, resulting in a 6-fold increase in processivity relative to an identical motor operated without strand removal and significantly more operations than previously reported for user-controlled DNA nanomachines. In the motor operated without strand removal, redundant strands interfere with motor operation and reduce its performance. The microfluidics also enabled computer control of motor direction and speed. Furthermore, analysis of the reaction kinetics and motor performance in the absence of redundant strands, made possible by the microfluidics, enabled accurate modeling of the walker processivity. This enabled identification of dynamic boundaries and provided an explanation, based on the "trap state" mechanism, for why the motor did not perform an even larger number of steps. This understanding is very important for the development of future motors with significantly improved performance. 
Our universal interface enables two-way communication between user and molecular machine and, relying on concepts similar to that of solid-phase synthesis, removes limitations on the number of external stimuli. This interface, therefore, is an important step toward realization of reliable, processive, reproducible, and useful externally controlled DNA nanomachines.

  18. The Elimination of Transfer Distances Is an Important Part of Hospital Design.

    PubMed

    Karvonen, Sauli; Nordback, Isto; Elo, Jussi; Havulinna, Jouni; Laine, Heikki-Jussi

    2017-04-01

    The objective of the present study was to describe how a specific patient flow analysis with from-to charts can be used in hospital design and layout planning. As part of a large renewal project at a university hospital, a detailed patient flow analysis was applied to planning the musculoskeletal surgery unit (orthopedics and traumatology, hand surgery, and plastic surgery). First, the main activities of the unit were determined. Next, the routes of all patients treated over the course of 1 year were studied, and their physical movements in the current hospital were calculated. An ideal layout of the new hospital was then generated to minimize transfer distances by placing the main activities close to each other, according to the patient flow analysis. The actual architectural design was based on the ideal layout plan. Finally, we compared the current transfer distances to the distances patients will move in the new hospital. The method enabled us to estimate an approximately 50% reduction in transfer distances for inpatients (from 3,100 km/year to 1,600 km/year) and a 30% reduction for outpatients (from 2,100 km/year to 1,400 km/year). Patient transfers are non-value-added activities. This study demonstrates that a detailed patient flow analysis with from-to charts can substantially shorten transfer distances, thereby minimizing extraneous patient and personnel movements. This reduction supports productivity improvement, cross-professional teamwork, and patient safety by placing all patient flow activities close to each other. Thus, this method is a valuable additional tool in hospital design.

  19. Picowatt Resolution Calorimetry for Micro and Nanoscale Energy Transport Studies

    NASA Astrophysics Data System (ADS)

    Sadat, Seid H.

    Precise quantification of energy transport is key to obtaining insights into a wide range of phenomena across various disciplines, including physics, chemistry, biology and engineering. This thesis describes technical advancements in heat-flow calorimetry that enable measurement of energy transport at micro- and nanoscales with picowatt resolution. I have developed two types of microfabricated calorimeter devices and demonstrated single-digit picowatt resolution at room temperature. Both devices incorporate two distinct features: an active area isolated by a thermal conductance (G_Th) of less than 1 µW/K, and a high-resolution thermometer with temperature resolution (ΔT_res) in the microkelvin regime. Together, these features enable measurements of heat currents (q) with picowatt resolution (q = G_Th × ΔT_res). In the first device the active area is suspended via silicon nitride beams with excellent thermal isolation (~600 nW/K) and a bimaterial cantilever (BMC) thermometer with temperature resolution of ~6 µK. Taken together, this design enabled calorimetric measurements with 4 pW resolution. In the second device, the BMC thermometry technique is replaced by a high-resolution resistance thermometry scheme. A detailed noise analysis of resistance thermometers, confirmed by experimental data, enabled me to correctly predict the resolution of different measurement schemes and to propose techniques that achieve an order-of-magnitude improvement in the resolution of resistive thermometers. By incorporating resistance thermometers with temperature resolution of ~30 µK, combined with a thermal isolation of ~150 nW/K, I demonstrated an all-electrical calorimeter device with a resolution of ~5 pW. Finally, I used these calorimeters to study Near-Field Radiative Heat Transfer (NF-RHT). Using these devices, we studied, for the first time, the effect of film thickness on the NF-RHT between two dielectric surfaces.
    We showed that even a very thin film (~50 nm) of silicon dioxide deposited on a gold surface dramatically enhances NF-RHT between the coated surface and a second silica surface. Specifically, we find that the resulting heat fluxes are very similar to those between two bulk silicon dioxide surfaces when the gap size is reduced to be comparable to the film thickness. This interesting effect is understood on the basis of detailed computational analysis, which shows that the NF-RHT in gaps comparable to the film thickness is completely dominated by contributions from surface phonon-polaritons whose effective skin depth is comparable to the film thickness. These results are expected to hold true for various dielectric surfaces where heat transport is dominated by surface phonon-polaritons and have important implications for near-field-based thermophotovoltaic devices and for near-field-based thermal management.
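    The quoted resolution figures follow directly from multiplying each device's thermal isolation by its temperature resolution, i.e. the smallest resolvable heat flow is the conductance times the smallest resolvable temperature rise. The check below is pure arithmetic on the numbers stated in the abstract:

```python
# Arithmetic check of the abstract's resolution claims: minimum resolvable
# heat flow = thermal conductance * temperature resolution.
def q_min_pW(g_th_nW_per_K, dT_res_uK):
    """Minimum resolvable heat flow in picowatts."""
    return g_th_nW_per_K * 1e-9 * dT_res_uK * 1e-6 * 1e12

print(q_min_pW(600, 6))   # BMC device: 3.6 pW, consistent with the 4 pW claim
print(q_min_pW(150, 30))  # resistive device: 4.5 pW, consistent with ~5 pW
```

    Both quoted resolutions (4 pW and ~5 pW) are thus within about 10-20% of the simple conductance-times-resolution estimate.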

  20. Analysis of the Waggle Dance Motion of Honeybees for the Design of a Biomimetic Honeybee Robot

    PubMed Central

    Landgraf, Tim; Rojas, Raúl; Nguyen, Hai; Kriegel, Fabian; Stettin, Katja

    2011-01-01

    The honeybee dance “language” is one of the most popular examples of information transfer in the animal world. Today, more than 60 years after its discovery it still remains unknown how follower bees decode the information contained in the dance. In order to build a robotic honeybee that allows a deeper investigation of the communication process we have recorded hundreds of videos of waggle dances. In this paper we analyze the statistics of visually captured high-precision dance trajectories of European honeybees (Apis mellifera carnica). The trajectories were produced using a novel automatic tracking system and represent the most detailed honeybee dance motion information available. Although honeybee dances seem very variable, some properties turned out to be invariant. We use these properties as a minimal set of parameters that enables us to model the honeybee dance motion. We provide a detailed statistical description of various dance properties that have not been characterized before and discuss the role of particular dance components in the communication process. PMID:21857906

  1. Damped-driven granular chains: An ideal playground for dark breathers and multibreathers

    NASA Astrophysics Data System (ADS)

    Chong, C.; Li, F.; Yang, J.; Williams, M. O.; Kevrekidis, I. G.; Kevrekidis, P. G.; Daraio, C.

    2014-03-01

    By applying an out-of-phase actuation at the boundaries of a uniform chain of granular particles, we demonstrate experimentally that time-periodic and spatially localized structures with a nonzero background (so-called dark breathers) emerge for a wide range of parameter values and initial conditions. We demonstrate a remarkable control over the number of breathers within the multibreather pattern that can be "dialed in" by varying the frequency or amplitude of the actuation. The values of the frequency (or amplitude) where the transition between different multibreather states occurs are predicted accurately by the proposed theoretical model, which is numerically shown to support exact dark breather and multibreather solutions. Moreover, we visualize detailed temporal and spatial profiles of breathers and, especially, of multibreathers using a full-field probing technology and enable a systematic favorable comparison among theory, computation, and experiments. A detailed bifurcation analysis reveals that the dark and multibreather families are connected in a "snaking" pattern, providing a roadmap for the identification of such fundamental states and their bistability in the laboratory.

  2. A parallel multi-domain solution methodology applied to nonlinear thermal transport problems in nuclear fuel pins

    DOE PAGES

    Philip, Bobby; Berrill, Mark A.; Allu, Srikanth; ...

    2015-01-26

    We describe an efficient and nonlinearly consistent parallel solution methodology for solving coupled nonlinear thermal transport problems that occur in nuclear reactor applications over hundreds of individual 3D physical subdomains. Efficiency is obtained by leveraging knowledge of the physical domains, the physics on individual domains, and the couplings between them for preconditioning within a Jacobian-Free Newton-Krylov method. Details of the computational infrastructure that enabled this work, namely the open-source Advanced Multi-Physics (AMP) package developed by the authors, are described. Details of verification and validation experiments, and parallel performance analyses in weak and strong scaling studies demonstrating the achieved efficiency of the algorithm, are presented. Moreover, numerical experiments demonstrate that the preconditioner developed is independent of the number of fuel subdomains in a fuel rod, which is particularly important when simulating different types of fuel rods. Finally, we demonstrate the power of the coupling methodology by considering problems with couplings between surface and volume physics and coupling of nonlinear thermal transport in fuel rods to an external radiation transport code.

  3. GPU-accelerated FDTD modeling of radio-frequency field-tissue interactions in high-field MRI.

    PubMed

    Chi, Jieru; Liu, Feng; Weber, Ewald; Li, Yu; Crozier, Stuart

    2011-06-01

    The analysis of high-field RF field-tissue interactions requires high-performance finite-difference time-domain (FDTD) computing. Conventional CPU-based FDTD calculations offer limited computing performance in a PC environment. This study presents a graphics processing unit (GPU)-based parallel-computing framework, producing substantially boosted computing efficiency (with a two-order speedup factor) at a PC-level cost. Specific details of implementing the FDTD method on a GPU architecture have been presented and the new computational strategy has been successfully applied to the design of a novel 8-element transceive RF coil system at 9.4 T. Facilitated by the powerful GPU-FDTD computing, the new RF coil array offers optimized fields (averaging 25% improvement in sensitivity, and 20% reduction in loop coupling compared with conventional array structures of the same size) for small animal imaging with a robust RF configuration. The GPU-enabled acceleration paves the way for FDTD to be applied for both detailed forward modeling and inverse design of MRI coils, which were previously impractical.
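    FDTD parallelizes so well on GPUs because every grid cell's field update depends only on its immediate neighbours from the previous half-step, so each cell maps naturally onto one GPU thread. The free-space 1-D sketch below shows that core update structure; it is illustrative only, since the paper's solver is 3-D, includes tissue electromagnetic properties, and runs as GPU kernels rather than NumPy array operations.

```python
import numpy as np

# 1-D free-space FDTD on a staggered (Yee) grid, normalized units.
nz, steps = 200, 250
ez = np.zeros(nz)       # electric field at integer grid points
hy = np.zeros(nz - 1)   # magnetic field, staggered half a cell
courant = 0.5           # normalized Courant number (stability needs <= 1)

for t in range(steps):
    hy += courant * np.diff(ez)        # H update from the spatial curl of E
    ez[1:-1] += courant * np.diff(hy)  # E update from the spatial curl of H
    ez[nz // 2] += np.exp(-((t - 30) / 10) ** 2)  # soft Gaussian source

print(float(np.abs(ez).max()))
```

    On a GPU, the two vectorized `np.diff` updates become kernels launched over the whole grid each time step, which is where the reported two-order-of-magnitude speedup over CPU loops comes from.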

  4. Integrating Data Clustering and Visualization for the Analysis of 3D Gene Expression Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Data Analysis and Visualization; International Research Training Group ``Visualization of Large and Unstructured Data Sets,'' University of Kaiserslautern, Germany; Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA

    2008-05-12

    The recent development of methods for extracting precise measurements of spatial gene expression patterns from three-dimensional (3D) image data opens the way for new analyses of the complex gene regulatory networks controlling animal development. We present an integrated visualization and analysis framework that supports user-guided data clustering to aid exploration of these new complex datasets. The interplay of data visualization and clustering-based data classification leads to improved visualization and enables a more detailed analysis than previously possible. We discuss (i) integration of data clustering and visualization into one framework; (ii) application of data clustering to 3D gene expression data; (iii) evaluation of the number of clusters k in the context of 3D gene expression clustering; and (iv) improvement of overall analysis quality via dedicated post-processing of clustering results based on visualization. We discuss the use of this framework to objectively define spatial pattern boundaries and temporal profiles of genes and to analyze how mRNA patterns are controlled by their regulatory transcription factors.
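One common way to evaluate the number of clusters k, in the spirit of the evaluation described above, is to compare the within-cluster sum of squared errors as k varies and look for the value where it stops improving sharply. A toy 1D k-means sketch (the `kmeans` function and the data are illustrative assumptions, not the framework's code):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy 1D k-means; returns (centers, within-cluster sum of squared errors)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            clusters[j].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    sse = sum(min((p - c) ** 2 for c in centers) for p in points)
    return centers, sse

# two well-separated "expression level" groups
data = [0.1, 0.2, 0.15, 5.0, 5.1, 4.9]
sse_k1 = kmeans(data, 1)[1]
sse_k2 = kmeans(data, 2)[1]
# the error drops sharply once k matches the true number of groups
```

Real 3D gene expression data is high-dimensional and the paper couples this kind of criterion with visual inspection and post-processing, but the elbow-style comparison above is the basic quantitative ingredient.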

  5. A New System to Monitor Data Analyses and Results of Physics Data Validation Between Pulses at DIII-D

    NASA Astrophysics Data System (ADS)

    Flanagan, S.; Schachter, J. M.; Schissel, D. P.

    2001-10-01

    A Data Analysis Monitoring (DAM) system has been developed to monitor between-pulse physics analysis at the DIII-D National Fusion Facility. The system allows for rapid detection of discrepancies in diagnostic measurements or in the results from physics analysis codes. This enables problems to be detected, and possibly fixed, between pulses rather than after the experimental run has concluded, thus increasing the efficiency of experimental time. An example of a consistency check is comparing the stored energy obtained by integrating the measured kinetic profiles to that calculated from magnetic measurements by EFIT. This new system also tracks the progress of MDSplus dispatching of software for data analysis and the loading of analyzed data into MDSplus. DAM uses a Java Servlet to receive messages, CLIPS to implement expert-system logic, and displays its results to multiple web clients via HTML. If an error is detected by DAM, users can view more detailed information so that steps can be taken to eliminate the error for the next pulse. A demonstration of this system, including a simulated DIII-D pulse cycle, will be presented.

  6. Stirling engine design manual

    NASA Technical Reports Server (NTRS)

    Martini, W. R.

    1978-01-01

    This manual is intended to serve both as an introduction to Stirling engine analysis methods and as a key to the open literature on Stirling engines. Over 800 references are listed, cross-referenced by date of publication, author, and subject. Engine analysis is treated starting from elementary principles and working through cycle analysis. Analysis methodologies are classified as first, second, or third order depending upon degree of complexity and probable application: first order for preliminary engine studies, second order for performance prediction and engine optimization, and third order for detailed hardware evaluation and engine research. A few comparisons between theory and experiment are made. A second-order design procedure is documented step by step, with calculation sheets and a worked-out example to follow. Current high-power engines are briefly described, and a directory of companies and individuals who are active in Stirling engine development is included. Much remains to be done. Some of the more complicated and potentially very useful design procedures are as yet only referred to. Future support will enable a more thorough job of comparing all available design procedures against the experimental data which should soon be available.

  7. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  8. MiMiR – an integrated platform for microarray data sharing, mining and analysis

    PubMed Central

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-01-01

    Background Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low-quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource, was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large-scale mining of experimental and clinical data. Results A user-friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of the meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information, and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, and download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer, thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open-source analysis web portal or via export of data and meta-data into the Rosetta Resolver data analysis package.
Conclusion The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies. PMID:18801157

  9. MiMiR--an integrated platform for microarray data sharing, mining and analysis.

    PubMed

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-09-18

    Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low-quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource, was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large-scale mining of experimental and clinical data. A user-friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of the meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information, and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, and download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer, thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open-source analysis web portal or via export of data and meta-data into the Rosetta Resolver data analysis package.
The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies.

  10. Wavelet-enabled progressive data Access and Storage Protocol (WASP)

    NASA Astrophysics Data System (ADS)

    Clyne, J.; Frank, L.; Lesperance, T.; Norton, A.

    2015-12-01

    Current practices for storing numerical simulation outputs hail from an era when the disparity between compute and I/O performance was not as great as it is today. The memory contents for every sample, computed at every grid point location, are simply saved at some prescribed temporal frequency. Though straightforward, this approach fails to take advantage of the coherency between neighboring grid points that invariably exists in numerical solutions to mathematical models. Exploiting such coherence is essential to digital multimedia; DVD-Video, digital cameras, and streaming movies and audio are all possible today because of transform-based compression schemes that make substantial reductions in data possible by taking advantage of the strong correlation between adjacent samples in both space and time. Such methods can also be exploited to enable progressive data refinement in a manner akin to that used in ubiquitous digital mapping applications: views from far away are shown in coarsened detail to provide context, and can be progressively refined as the user zooms in on a localized region of interest. The NSF-funded WASP project aims to provide a common, NetCDF-compatible software framework supporting wavelet-based, multi-scale, progressive data access, enabling interactive exploration of large data sets for the geoscience communities. This presentation will provide an overview of this work in progress to develop community cyber-infrastructure for the efficient analysis of very large data sets.
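The coarse-to-fine access pattern described above rests on multi-resolution wavelet decompositions. A minimal sketch of one level of the Haar transform, the simplest such building block (illustrative only, not the WASP implementation; the function names are invented):

```python
def haar_forward(x):
    """One level of the Haar transform: pairwise averages (coarse) and differences (detail)."""
    coarse = [(a + b) / 2.0 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / 2.0 for a, b in zip(x[::2], x[1::2])]
    return coarse, detail

def haar_inverse(coarse, detail):
    """Exact reconstruction from coarse + detail coefficients."""
    out = []
    for c, d in zip(coarse, detail):
        out.extend([c + d, c - d])
    return out

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
coarse, detail = haar_forward(signal)
# serving only `coarse` yields a half-resolution preview for distant views;
# transmitting `detail` as the user zooms in restores the data exactly
assert haar_inverse(coarse, detail) == signal
```

Applying the transform recursively to the coarse coefficients gives the ladder of resolutions that a progressive protocol can serve level by level; on smooth simulation fields most detail coefficients are near zero, which is also where the compression gains come from.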

  11. An Automated Platform for High-Resolution Tissue Imaging Using Nanospray Desorption Electrospray Ionization Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lanekoff, Ingela T.; Heath, Brandi S.; Liyu, Andrey V.

    2012-10-02

    An automated platform has been developed for acquisition and visualization of mass spectrometry imaging (MSI) data using nanospray desorption electrospray ionization (nano-DESI). The new system enables robust operation of the nano-DESI imaging source over many hours. This is achieved by controlling the distance between the sample and the probe by mounting the sample holder onto an automated XYZ stage and defining the tilt of the sample plane. This approach is useful for imaging of relatively flat samples such as thin tissue sections. Custom software called MSI QuickView was developed for visualization of large data sets generated in imaging experiments. MSI QuickView enables fast visualization of the imaging data during data acquisition and detailed processing after the entire image is acquired. The performance of the system is demonstrated by imaging rat brain tissue sections. High-resolution mass analysis combined with MS/MS experiments enabled identification of lipids and metabolites in the tissue section. In addition, the high dynamic range and sensitivity of the technique allowed us to generate ion images of low-abundance isobaric lipids. A high-spatial-resolution image acquired over a small region of the tissue section revealed a spatial distribution of an abundant brain metabolite, creatine, in the white and gray matter that is consistent with the literature data obtained using magnetic resonance spectroscopy.

  12. Induced Pluripotent Stem Cell Models to Enable In Vitro Models for Screening in the Central Nervous System.

    PubMed

    Hunsberger, Joshua G; Efthymiou, Anastasia G; Malik, Nasir; Behl, Mamta; Mead, Ivy L; Zeng, Xianmin; Simeonov, Anton; Rao, Mahendra

    2015-08-15

    There is great need to develop more predictive drug discovery tools to identify new therapies to treat diseases of the central nervous system (CNS). Current nonpluripotent stem cell-based models often utilize non-CNS immortalized cell lines and do not enable the development of personalized models of disease. In this review, we discuss why in vitro models are necessary for translational research and outline the unique advantages of induced pluripotent stem cell (iPSC)-based models over those of current systems. We suggest that iPSC-based models can be patient specific and isogenic lines can be differentiated into many neural cell types for detailed comparisons. iPSC-derived cells can be combined to form small organoids, or large panels of lines can be developed that enable new forms of analysis. iPSC and embryonic stem cell-derived cells can be readily engineered to develop reporters for lineage studies or mechanism of action experiments further extending the utility of iPSC-based systems. We conclude by describing novel technologies that include strategies for the development of diversity panels, novel genomic engineering tools, new three-dimensional organoid systems, and modified high-content screens that may bring toxicology into the 21st century. The strategic integration of these technologies with the advantages of iPSC-derived cell technology, we believe, will be a paradigm shift for toxicology and drug discovery efforts.

  13. Composition and Morphology Control of Metal Dichalcogenides via Chemical Vapor Deposition for Photovoltaic and Nanoelectronic Applications

    NASA Astrophysics Data System (ADS)

    Samad, Leith L. J.

    The body of work reviewed here encompasses a variety of metal dichalcogenides all synthesized using chemical vapor deposition (CVD) for solar and electronics applications. The first reported phase-pure CVD synthesis of iron pyrite thin films is presented with detailed structural and electrochemical analysis. The phase-pure thin film and improved crystal growth on a metallic backing material represents one of the best options for potential solar applications using iron pyrite. Large tin-sulfur-selenide solid solution plates with tunable bandgaps were also synthesized via CVD as single-crystals with a thin film geometry. Solid solution tin-sulfur-selenide plates were demonstrated to be a new material for solar cells with the first observed solar conversion efficiencies up to 3.1%. Finally, a low temperature molybdenum disulfide vertical heterostructure CVD synthesis with layered controlled growth was achieved with preferential growth enabled by Van der Waals epitaxy. Through recognition of additional reaction parameters, a fully regulated CVD synthesis enabled the controlled growth of 1-6 molybdenum disulfide monolayers for nanoelectronic applications. The improvements in synthesis and materials presented here were all enabled by the control afforded by CVD such that advances in phase purity, growth, and composition control of several metal dichalcogenides were achieved. Further work will be able to take full advantage of these advances for future solar and electronics technologies.

  14. Geomega: MEGAlib's Uniform Geometry and Detector Description Tool for Geant3, MGGPOD, and Geant4

    NASA Astrophysics Data System (ADS)

    Zoglauer, Andreas C.; Andritschke, R.; Schopper, F.; Wunderer, C. B.

    2006-09-01

    The Medium Energy Gamma-ray Astronomy library MEGAlib is a set of software tools for the analysis of low-to-medium-energy gamma-ray telescopes, especially Compton telescopes. It comprises all necessary data analysis steps from simulation/measurements via event reconstruction to image reconstruction and enables detailed performance assessments. In the energy range of Compton telescopes (with energy deposits from a few keV up to hundreds of MeV), the Geant Monte-Carlo software packages (Geant3 with its MGGPOD extension as well as Geant4) are widely used. Since each tool has its unique advantages, MEGAlib contains a geometry and detector description library, called Geomega, which allows those tools to be used in a uniform way. It incorporates the versatile 3D display facilities available within the ROOT libraries. The same geometry, material, trigger, and detector description can be used for all simulation tools as well as for the later event analysis in the MEGAlib framework. This is done by converting the MEGAlib geometry into the Geant3 or MGGPOD format or by directly linking the Geomega library into Geant4. The geometry description can handle most (and can be extended to handle all) volumes common to Geant3, Geant4 and ROOT. Geomega implements a number of features that are especially useful for optimizing detector geometries: it allows constants to be defined, handles mathematical operations, enables volume scaling, checks for overlaps of detector volumes, performs mass calculations, etc. Used in combination with MEGAlib, Geomega enables discretization, application of detector noise, thresholds, various trigger conditions, defective pixels, etc. The highly modular and completely object-oriented library is written in C++ and based on ROOT. It was originally developed for the tracking Compton-scattering and pair-creation telescope MEGA and has been successfully applied to a wide variety of telescopes, such as ACT, NuSTAR, or GRI.

  15. High-throughput analysis of sub-visible mAb aggregate particles using automated fluorescence microscopy imaging.

    PubMed

    Paul, Albert Jesuran; Bickel, Fabian; Röhm, Martina; Hospach, Lisa; Halder, Bettina; Rettich, Nina; Handrick, René; Herold, Eva Maria; Kiefer, Hans; Hesse, Friedemann

    2017-07-01

    Aggregation of therapeutic proteins is a major concern as aggregates lower the yield and can impact the efficacy of the drug as well as the patient's safety. It can occur at all production stages; thus, it is essential to perform a detailed analysis for protein aggregates. Several methods such as size-exclusion high-performance liquid chromatography (SE-HPLC), light scattering, turbidity, light obscuration, and microscopy-based approaches are used to analyze aggregates. None of these methods allows determination of all types of higher-molecular-weight (HMW) species due to a limited size range. Furthermore, quantification and classification of different HMW species are often not possible. Moreover, automation is an emerging challenge with the advent of automated robotic laboratory systems. Hence, there is a need for a fast, high-throughput-compatible method which can detect a broad size range and enable quantification and classification. We describe a novel approach for the detection of aggregates in the size range of 1 to 1000 μm combining fluorescent dyes for protein aggregate labelling and automated fluorescence microscope imaging (aFMI). After appropriate selection of the dye and method optimization, our method enabled us to detect various types of HMW species of monoclonal antibodies (mAbs). Using 10 μmol L⁻¹ 4,4'-dianilino-1,1'-binaphthyl-5,5'-disulfonate (Bis-ANS) in combination with aFMI allowed the analysis of mAb aggregates induced by different stresses occurring during downstream processing, storage, and administration. Validation of our results was performed by SE-HPLC, UV-Vis spectroscopy, and dynamic light scattering. With this new approach, we could not only reliably detect different HMW species but also quantify and classify them in an automated manner. Our method meets high-throughput requirements, and the selection of various fluorescent dyes enables a broad range of applications.

  16. ATLAS Test Program Generator II (AGEN II). Volume I. Executive Software System.

    DTIC Science & Technology

    1980-08-01

    features. C. To provide detailed descriptions of each of the system components and modules and their corresponding flowcharts. D. To describe methods of...contains the FORTRAN source code listings to enable the programmer to do the expansions and modifications. The methods and details of adding another...characteristics of the network. The top-down implementation method is therefore suggested. This method starts at the top by designing the IVT modules in

  17. A New Mechanism for a Brain

    ERIC Educational Resources Information Center

    Andreae, John H.; Cleary, John G.

    1976-01-01

    The new mechanism, PUSS, enables experience of any complex environment to be accumulated in a predictive model. PURR-PUSS is a teachable robot system based on the new mechanism. Cumulative learning is demonstrated by a detailed example. (Author)

  18. Enhanced culvert inspections - best practices guidebook : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    Culvert inspection is a key enabler that allows MnDOT to manage the state's highway culvert system. When quantitative detail on culvert condition is required, an inspector will need to use enhanced inspection technologies. Enhanced inspection techn...

  19. Let's push things forward: disruptive technologies and the mechanics of tissue assembly.

    PubMed

    Varner, Victor D; Nelson, Celeste M

    2013-09-01

    Although many of the molecular mechanisms that regulate tissue assembly in the embryo have been delineated, the physical forces that couple these mechanisms to actual changes in tissue form remain unclear. Qualitative studies suggest that mechanical loads play a regulatory role in development, but clear quantitative evidence has been lacking. This is partly owing to the complex nature of these problems - embryonic tissues typically undergo large deformations and exhibit evolving, highly viscoelastic material properties. Still, despite these challenges, new disruptive technologies are enabling study of the mechanics of tissue assembly in unprecedented detail. Here, we present novel experimental techniques that enable the study of each component of these physical problems: kinematics, forces, and constitutive properties. Specifically, we detail advances in light sheet microscopy, optical coherence tomography, traction force microscopy, fluorescence force spectroscopy, microrheology and micropatterning. Taken together, these technologies are helping elucidate a more quantitative understanding of the mechanics of tissue assembly.

  20. Viewing the functional consequences of traumatic brain injury by using brain SPECT.

    PubMed

    Pavel, D; Jobe, T; Devore-Best, S; Davis, G; Epstein, P; Sinha, S; Kohn, R; Craita, I; Liu, P; Chang, Y

    2006-03-01

    High-resolution brain SPECT is increasingly benefiting from improved image-processing software and multiple complementary display capabilities. This enables detailed functional mapping of the disturbances in relative perfusion occurring after TBI. The patient population consisted of 26 cases (ages 8-61 years) between 3 months and 6 years after traumatic brain injury. A very strong case can be made for the routine use of brain SPECT in TBI. Indeed, it can provide a detailed evaluation of multiple functional consequences after TBI and is thus capable of supplementing the clinical evaluation and tailoring the therapeutic strategies needed. In so doing, it also provides significant additional information beyond that available from MRI/CT. The critical factor for brain SPECT's clinical relevance is a carefully designed technical protocol, including displays that enable a comprehensive description of the patterns found in a user-friendly mode.

  1. Let's push things forward: disruptive technologies and the mechanics of tissue assembly

    PubMed Central

    Varner, Victor D.; Nelson, Celeste M.

    2013-01-01

    Although many of the molecular mechanisms that regulate tissue assembly in the embryo have been delineated, the physical forces that couple these mechanisms to actual changes in tissue form remain unclear. Qualitative studies suggest that mechanical loads play a regulatory role in development, but clear quantitative evidence has been lacking. This is partly owing to the complex nature of these problems – embryonic tissues typically undergo large deformations and exhibit evolving, highly viscoelastic material properties. Still, despite these challenges, new disruptive technologies are enabling study of the mechanics of tissue assembly in unprecedented detail. Here, we present novel experimental techniques that enable the study of each component of these physical problems: kinematics, forces, and constitutive properties. Specifically, we detail advances in light sheet microscopy, optical coherence tomography, traction force microscopy, fluorescence force spectroscopy, microrheology and micropatterning. Taken together, these technologies are helping elucidate a more quantitative understanding of the mechanics of tissue assembly. PMID:23907401

  2. A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems

    PubMed Central

    Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang

    2016-01-01

    With the growing popularity of complex dynamic activities in manufacturing processes, traceability over the entire life of every product has drawn significant attention, especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141

  3. A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems.

    PubMed

    Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang

    2016-03-17

    With the growing popularity of complex dynamic activities in manufacturing processes, traceability over the entire life of every product has drawn significant attention, especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach.
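The backward-tracing idea in a Petri net setting can be sketched very simply: transitions consume tokens from input places and produce tokens in output places, and a log of firings lets one walk backwards from a suspect item to everything upstream of it. The following toy is an illustration of that mechanism only; the function names, the bee-product places, and the data structures are invented and are not the paper's models or algorithms:

```python
def fire(marking, transition, log):
    """Fire a transition if enabled: consume input tokens, produce output tokens, log it."""
    inputs, outputs, name = transition
    if all(marking.get(p, 0) >= n for p, n in inputs.items()):
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] = marking.get(p, 0) + n
        log.append((name, set(inputs), set(outputs)))
        return True
    return False

def trace_back(log, place):
    """Walk the firing log backwards, collecting every place upstream of `place`."""
    upstream, frontier = set(), {place}
    for name, ins, outs in reversed(log):
        if frontier & outs:      # this firing contributed to something suspect
            upstream |= ins
            frontier |= ins
    return upstream

marking = {"raw_honey": 1, "jars": 1}
t_filter = ({"raw_honey": 1}, {"filtered": 1}, "filter")
t_pack = ({"filtered": 1, "jars": 1}, {"product": 1}, "pack")
log = []
fire(marking, t_filter, log)
fire(marking, t_pack, log)
origins = trace_back(log, "product")   # every place that fed into the product
```

If the packed product turns out to be contaminated, `trace_back` recovers all contributing places (`filtered`, `jars`, `raw_honey`), which is the kind of lifecycle query the paper's generalized models formalize.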

  4. The FLIGHT Drosophila RNAi database

    PubMed Central

    Bursteinas, Borisas; Jain, Ekta; Gao, Qiong; Baum, Buzz; Zvelebil, Marketa

    2010-01-01

    FLIGHT (http://flight.icr.ac.uk/) is an online resource compiling data from high-throughput Drosophila in vivo and in vitro RNAi screens. FLIGHT includes details of RNAi reagents and their predicted off-target effects, alongside RNAi screen hits, scores and phenotypes, including images from high-content screens. The latest release of FLIGHT is designed to enable users to upload, analyze, integrate and share their own RNAi screens. Users can perform multiple normalizations, view quality-control plots, detect and assign screen hits and compare hits from multiple screens using a variety of methods including hierarchical clustering. FLIGHT integrates RNAi screen data with microarray gene expression as well as genomic annotations and genetic/physical interaction datasets to provide a single interface for RNAi screen analysis and data mining in Drosophila. PMID:20855970

  5. Broadband computation of the scattering coefficients of infinite arbitrary cylinders.

    PubMed

    Blanchard, Cédric; Guizal, Brahim; Felbacq, Didier

    2012-07-01

    We employ a time-domain method to compute the near field on a contour enclosing infinitely long cylinders of arbitrary cross section and constitution. We therefore recover the cylindrical Hankel coefficients of the expansion of the field outside the circumscribed circle of the structure. The recovered coefficients enable the wideband analysis of complex systems, e.g., the determination of the radar cross section becomes straightforward. The prescription for constructing such a numerical tool is provided in great detail. The method is validated by computing the scattering coefficients for a homogeneous circular cylinder illuminated by a plane wave, a problem for which an analytical solution exists. Finally, some radiation properties of an optical antenna are examined by employing the proposed technique.

  6. Field enhancement of multiphoton induced luminescence processes in ZnO nanorods

    NASA Astrophysics Data System (ADS)

    Hyyti, Janne; Perestjuk, Marko; Mahler, Felix; Grunwald, Rüdiger; Güell, Frank; Gray, Ciarán; McGlynn, Enda; Steinmeyer, Günter

    2018-03-01

    The near-ultraviolet photoluminescence of ZnO nanorods induced by multiphoton absorption of unamplified Ti:sapphire pulses is investigated. Power dependence measurements have been conducted with an adaptation of the ultrashort pulse characterization method of interferometric frequency-resolved optical gating. These measurements enable the separation of second harmonic and photoluminescence bands due to their distinct coherence properties. A detailed analysis yields fractional power dependence exponents in the range of 3-4, indicating the presence of multiple nonlinear processes. The range in measured exponents is attributed to differences in local field enhancement, which is supported by independent photoluminescence and structural measurements. Simulations based on Keldysh theory suggest contributions by three- and four-photon absorption as well as avalanche ionization in agreement with experimental findings.

  7. A Review of Criticality Accidents 2000 Revision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas P. McLaughlin; Shean P. Monahan; Norman L. Pruvost

    Criticality accidents and the characteristics of prompt power excursions are discussed. Sixty accidental power excursions are reviewed. Sufficient detail is provided to enable the reader to understand the physical situation, the chemistry and material flow, and, when available, the administrative setting leading up to the time of the accident. Information on the power history, energy release, consequences, and causes is also included when available. For those accidents that occurred in process plants, two new sections have been included in this revision. The first is an analysis and summary of the physical and neutronic features of the chain-reacting systems. The second is a compilation of observations and lessons learned. Excursions associated with large power reactors are not included in this report.

  8. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  9. Development of Laboratory Seismic Exploration Experiment for Education and Demonstration

    NASA Astrophysics Data System (ADS)

    Kuwano, O.; Nakanishi, A.

    2016-12-01

    We developed a laboratory experiment that simulates a seismic refraction survey for educational purposes. The experiment is a tabletop-scale experiment using soft hydrogel as an analogue material for a layered crust, so the seismic exploration experiment can be conducted in a laboratory or a classroom. The softness and transparency of the gel material enable us to observe the wave propagation with the naked eye, using the photoelastic technique. By analyzing the waveforms obtained by image analysis of the movie of the experiment, one can estimate the velocities and the structure of the gel specimen in the same way as in an actual seismic survey. We report details of the practical course and the public outreach activities using the experiment.

  10. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE PAGES

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin; ...

    2017-01-01

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  11. Large and linked in scientific publishing

    PubMed Central

    2012-01-01

    We are delighted to announce the launch of GigaScience, an online open-access journal that focuses on research using or producing large datasets in all areas of biological and biomedical sciences. GigaScience is a new type of journal that provides standard scientific publishing linked directly to a database that hosts all the relevant data. The primary goals for the journal, detailed in this editorial, are to promote more rapid data release, broader use and reuse of data, improved reproducibility of results, and direct, easy access between analyses and their data. Direct and permanent connections of scientific analyses and their data (achieved by assigning all hosted data a citable DOI) will enable better analysis and deeper interpretation of the data in the future. PMID:23587310

  12. Large and linked in scientific publishing.

    PubMed

    Goodman, Laurie; Edmunds, Scott C; Basford, Alexandra T

    2012-07-12

    We are delighted to announce the launch of GigaScience, an online open-access journal that focuses on research using or producing large datasets in all areas of biological and biomedical sciences. GigaScience is a new type of journal that provides standard scientific publishing linked directly to a database that hosts all the relevant data. The primary goals for the journal, detailed in this editorial, are to promote more rapid data release, broader use and reuse of data, improved reproducibility of results, and direct, easy access between analyses and their data. Direct and permanent connections of scientific analyses and their data (achieved by assigning all hosted data a citable DOI) will enable better analysis and deeper interpretation of the data in the future.

  13. Use of computational fluid dynamics in respiratory medicine.

    PubMed

    Fernández Tena, Ana; Casan Clarà, Pere

    2015-06-01

    Computational Fluid Dynamics (CFD) is a computer-based tool for simulating fluid movement. The main advantages of CFD over other fluid mechanics studies include: substantial savings in time and cost, the analysis of systems or conditions that are very difficult to simulate experimentally (as is the case of the airways), and a practically unlimited level of detail. We used the Ansys-Fluent CFD program to develop a conducting airway model to simulate different inspiratory flow rates and the deposition of inhaled particles of varying diameters, obtaining results consistent with those reported in the literature using other procedures. We hope this approach will enable clinicians to further individualize the treatment of different respiratory diseases. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.

  14. Facilitating Stewardship of scientific data through standards based workflows

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are, firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between them. All three types of standards have to be utilised by the practicing scientist so that those who ultimately have to steward the data can ensure that it is preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observational or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to transform disaggregated data into scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies.
Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow in which the data is easily discoverable, geophysical processing can be applied to it, and the results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.

  15. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    PubMed

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterization of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigations of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast querying. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights into the biological activities of a complex botanical such as BDPP, based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions.
Our integrative analytical approach provides a novel means for the systematic integrative analysis of heterogeneous data types in the development of complex botanicals, such as polyphenols, for eventual clinical and translational applications.

  16. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.

    PubMed

    Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian

    2016-07-05

    This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  17. MEMS: Enabled Drug Delivery Systems.

    PubMed

    Cobo, Angelica; Sheybani, Roya; Meng, Ellis

    2015-05-01

    Drug delivery systems play a crucial role in the treatment and management of medical conditions. Microelectromechanical systems (MEMS) technologies have allowed the development of advanced miniaturized devices for medical and biological applications. This Review presents the use of MEMS technologies to produce drug delivery devices detailing the delivery mechanisms, device formats employed, and various biomedical applications. The integration of dosing control systems, examples of commercially available microtechnology-enabled drug delivery devices, remaining challenges, and future outlook are also discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
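
    The Kalman update step referred to above is standard; the following is a minimal sketch of how such an update could reconcile a fine-grained model state with an observation supplied by a coarser abstraction. The state dimensions and matrices are hypothetical illustrations, not the paper's actual formulation.

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """Standard Kalman measurement update.
    x_pred, P_pred: predicted state and covariance (model at one level of detail);
    z: observation from another abstraction level; H: measurement model mapping
    between the two representations; R: accuracy (covariance) of that observation."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: a 2-state fine-grained model updated with a 1-D observation
# from a coarser model that resolves only the first state component.
x = np.array([1.0, 0.0])
P = np.eye(2) * 4.0
H = np.array([[1.0, 0.0]])
x2, P2 = kalman_update(x, P, np.array([2.0]), H, np.array([[1.0]]))
```

    After the update, the uncertainty of the observed component shrinks while the unobserved component is untouched, which is how accuracy is managed across abstraction levels.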

  19. Broadband radio jet emission and variability of γ-ray blazars

    NASA Astrophysics Data System (ADS)

    Nestoras, Ioannis

    2015-07-01

    AGN (Active Galactic Nuclei), and in particular their subclass blazars, are among the most energetic objects observed in the universe, featuring extreme phenomenological characteristics such as rapid broadband flux density and polarization variability, fast superluminal motion, high degree of polarization, and a broadband, double-humped spectral energy distribution (SED). The details of the emission processes and violent variability of blazars are still poorly understood. Variability studies give important clues about the size, structure, physics and dynamics of the emitting region, making AGN/blazar monitoring programs of utmost importance in providing the necessary constraints for understanding the origin of energy production. In this framework the F-gamma program was initiated, monitoring 60 Fermi-detected AGN/blazars monthly at 12 frequencies between 2.6 and 345 GHz since 2007. For the thesis at hand, observations and data analysis were performed within the realms of the F-gamma program, using the Effelsberg (EB) 100 m and Pico Veleta (PV) 30 m telescopes at 10 frequency bands ranging from 2.64 to 142 GHz. The cm to short-mm variability/spectral characteristics are monitored for a sample of 59 sources over a period of five years, enabling for the first time a detailed study of the observed flaring activity in both the light curve and spectral domains for such a large number of sources at such high cadence. The observing systems and methods are also introduced, as well as the data reduction techniques. The thesis at hand is structured as follows: Chapter 3 presents the reduction methods and post-measurement corrections applied to the data, such as pointing offsets, gain-elevation and sensitivity corrections, as well as specific corrections applied for each of the Effelsberg and Pico Veleta observing systems respectively. 
Chapter 4 presents the analysis tools and methods that were used: variability characteristics, flare amplitudes with a new method for estimating the intrinsic standard deviation, flare time scales using Structure Function analysis, spectral indices and spectral peak estimations. Chapter 5 presents the results of the analysis performed on the five-year light curves. The significance of variability is estimated through a χ² test, as well as the flare amplitudes using the intrinsic variability of the light curves along with a newly proposed k-index. The introduction of the k-index enables the characterization of the observed variability amplitudes across frequency, thus permitting us to limit the parameter space of various physical models. Flare time scales, brightness temperatures and Doppler factors are also reported. Chapter 6 presents the corresponding analysis in the spectral domain, including results for spectral indices and an S_max-ν_max analysis. By determining the spectral peak of each spectrum for a selected number of sources, it is possible to track the evolution of the flaring activity in the S_max-ν_max plane, enabling us to discriminate between the different underlying physical mechanisms that are in action. Finally, Chapter 7 includes the overall discussion and a summary of the results obtained.
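
The Structure Function analysis used above for flare time scales can be sketched as follows. This is the generic first-order structure function, not necessarily the thesis' exact implementation, and the light curve below is synthetic.

```python
import numpy as np

def structure_function(s, lags):
    """First-order structure function SF(tau) = <(s(t+tau) - s(t))^2>
    for a regularly sampled light curve s, a common estimator of
    variability time scales (a plateau in SF marks the maximum
    correlation time scale)."""
    sf = []
    for lag in lags:
        diffs = s[lag:] - s[:-lag]
        sf.append(np.mean(diffs ** 2))
    return np.array(sf)

# Deterministic check: a linearly rising flux S(t) = a*t gives
# SF(tau) = (a*tau)^2 exactly.
t = np.arange(100, dtype=float)
s = 0.5 * t
sf = structure_function(s, lags=[1, 2, 4])
```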

  20. Facilitating Navigation Through Large Archives

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Troung, Dat; Hodgson, Terry R.

    2005-01-01

    Automated Visual Access (AVA) is a computer program that effectively makes a large collection of information visible in a manner that enables a user to quickly and efficiently locate information resources, with minimal need for conventional keyword searches and perusal of complex hierarchical directory systems. AVA includes three key components: (1) a taxonomy that comprises a collection of words and phrases, clustered according to meaning, that are used to classify information resources; (2) a statistical indexing and scoring engine; and (3) a component that generates a graphical user interface that uses the scoring data to generate a visual map of resources and topics. The top level of an AVA display is a pictorial representation of an information archive. The user enters the depicted archive by clicking on the depiction of a subject-area cluster, selecting a topic from a list, or entering a query into a text box. The resulting display enables the user to view candidate information entities at various levels of detail. Resources are grouped spatially by topic, with greatest generality at the top layer and increasing detail with depth. The user can zoom in or out of specific sites or into greater or lesser content detail.
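
    The abstract does not detail AVA's statistical indexing and scoring engine; as a hedged illustration of how resources might be scored against a query, here is a generic TF-IDF scorer over a hypothetical mini-archive (document names and contents are invented).

```python
import math
from collections import Counter

# Hypothetical mini-archive; AVA's actual scoring scheme is not
# described in the abstract, so this is a generic TF-IDF sketch.
docs = {
    "doc1": "rocket propulsion engine test data",
    "doc2": "engine thermal analysis report",
    "doc3": "crew training schedule archive",
}

def tfidf_scores(query, docs):
    """Score each document against the query with term frequency
    weighted by inverse document frequency."""
    n = len(docs)
    tokenized = {d: text.split() for d, text in docs.items()}
    df = Counter()
    for words in tokenized.values():
        df.update(set(words))       # document frequency of each term
    scores = {}
    for d, words in tokenized.items():
        tf = Counter(words)
        scores[d] = sum(
            tf[w] / len(words) * math.log(n / df[w])
            for w in query.split() if w in tf
        )
    return scores

scores = tfidf_scores("engine test", docs)
```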

  1. Molecular cytogenetic analysis consistently identifies translocations involving chromosomes 1, 2 and 15 in five embryonal rhabdomyosarcoma cell lines and a PAX-FOXO1A fusion gene negative alveolar rhabdomyosarcoma cell line.

    PubMed

    Roberts, I; Gordon, A; Wang, R; Pritchard-Jones, K; Shipley, J; Coleman, N

    2001-01-01

    Rhabdomyosarcoma in children is a "small round blue cell tumour" that displays skeletal muscle differentiation. Two main histological variants are recognised, alveolar (ARMS) and embryonal (ERMS) rhabdomyosarcoma. Whereas consistent chromosome translocations characteristic of ARMS have been reported, no such cytogenetic abnormality has yet been described in ERMS. We have used multiple colour chromosome painting to obtain composite karyotypes for five ERMS cell lines and one PAX-FOXO1A fusion gene negative ARMS. The cell lines were assessed by spectral karyotyping (SKY), tailored multi-fluorophore fluorescence in situ hybridisation (M-FISH) using series of seven colour paint sets generated to examine specific abnormalities, and comparative genomic hybridisation (CGH). This approach enabled us to obtain karyotypes of the cell lines in greater detail than previously possible. Several recurring cytogenetic abnormalities were demonstrated, including translocations involving chromosomes 1 and 15 and chromosomes 2 and 15, in 4/6 and 2/6 cell lines respectively. All six cell lines demonstrated abnormalities of chromosome 15. Translocations between chromosomes 1 and 15 have previously been recorded in two primary cases of ERMS by conventional cytogenetics. Analysis of the translocation breakpoints may suggest mechanisms of ERMS tumourigenesis and may enable the development of novel approaches to the clinical management of this tumour. Copyright 2002 S. Karger AG, Basel

  2. CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline.

    PubMed

    Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V; Mahurkar, Anup; Fricke, W Florian

    2017-04-27

    The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. CloVR-Comparative runs reference-free multiple whole-genome alignments to determine unique, shared and core coding sequences (CDSs) and single nucleotide polymorphisms (SNPs). Output includes short summary reports and detailed text-based results files, graphical visualizations (phylogenetic trees, circular figures), and a database file linked to the Sybil comparative genome browser. Data upload and download, pipeline configuration and monitoring, and access to Sybil are managed through the CloVR-Comparative web interface. CloVR-Comparative and Sybil are distributed as part of the CloVR virtual appliance, which runs on local computers or the Amazon EC2 cloud. Representative datasets (e.g. 40 draft and complete Escherichia coli genomes) are processed in <36 h on a local desktop or at a cost of <$20 on EC2. CloVR-Comparative allows anybody with Internet access to run comparative genomics projects, while eliminating the need for on-site computational resources and expertise.

  3. A correlative and quantitative imaging approach enabling characterization of primary cell-cell communication: Case of human CD4+ T cell-macrophage immunological synapses.

    PubMed

    Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie

    2018-05-22

    Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.

  4. A Standardized Interface for Obtaining Digital Planetary and Heliophysics Time Series Data

    NASA Astrophysics Data System (ADS)

    Vandegriff, Jon; Weigel, Robert; Faden, Jeremy; King, Todd; Candey, Robert

    2016-10-01

    We describe a low level interface for accessing digital Planetary and Heliophysics data, focusing primarily on time-series data from in-situ instruments. As the volume and variety of planetary data has increased, it has become harder to merge diverse datasets into a common analysis environment. Thus we are building low-level computer-to-computer infrastructure to enable data from different missions or archives to be able to interoperate. The key to enabling interoperability is a simple access interface that standardizes the common capabilities available from any data server: 1. identify the data resources that can be accessed; 2. describe each resource; and 3. get the data from a resource. We have created a standardized way for data servers to perform each of these three activities. We are also developing a standard streaming data format for the actual data content to be returned (i.e., the result of item 3). Our proposed standard access interface is simple enough that it could be implemented on top of or beside existing data services, or it could even be fully implemented by a small data provider as a way to ensure that the provider's holdings can participate in larger data systems or joint analysis with other datasets. We present details of the interface and of the streaming format, including a sample server designed to illustrate the data request and streaming capabilities.
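
    The three standardized capabilities listed above (identify resources, describe a resource, get its data) can be sketched as a minimal in-memory service. All names and the sample dataset below are hypothetical illustrations; the actual interface and streaming format are defined by the authors' specification, which is not reproduced here.

```python
# Hypothetical in-memory "archive" standing in for a real data server.
ARCHIVE = {
    "mag_field": {
        "description": "Vector magnetometer time series",
        "parameters": ["time", "bx", "by", "bz"],
        "data": [("2016-01-01T00:00:00Z", 1.2, -0.3, 4.5),
                 ("2016-01-01T00:00:10Z", 1.1, -0.2, 4.6)],
    },
}

def catalog():
    """Capability 1: identify the data resources that can be accessed."""
    return sorted(ARCHIVE)

def info(dataset_id):
    """Capability 2: describe one resource (metadata only, no data)."""
    meta = ARCHIVE[dataset_id]
    return {"description": meta["description"], "parameters": meta["parameters"]}

def data(dataset_id, start, stop):
    """Capability 3: get the data, filtered to a requested time range.
    ISO 8601 strings compare correctly as plain strings, so a
    lexicographic filter suffices for this sketch."""
    return [rec for rec in ARCHIVE[dataset_id]["data"] if start <= rec[0] < stop]
```

    A provider implementing just these three entry points over HTTP could participate in joint analysis with other archives, which is the interoperability goal the abstract describes.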

  5. Acidithiobacillus ferrooxidans's comprehensive model driven analysis of the electron transfer metabolism and synthetic strain design for biomining applications.

    PubMed

    Campodonico, Miguel A; Vaisman, Daniela; Castro, Jean F; Razmilic, Valeria; Mercado, Francesca; Andrews, Barbara A; Feist, Adam M; Asenjo, Juan A

    2016-12-01

    Acidithiobacillus ferrooxidans is a gram-negative chemolithoautotrophic γ-proteobacterium. It typically grows at an external pH of 2 using the oxidation of ferrous ions by oxygen, producing ferric ions and water, while fixing carbon dioxide from the environment. A. ferrooxidans is of great interest for biomining and environmental applications, as it can process mineral ores and alleviate the negative environmental consequences derived from the mining processes. In this study, the first genome-scale metabolic reconstruction of A. ferrooxidans ATCC 23270 was generated (iMC507). A total of 587 metabolic and transport/exchange reactions, 507 genes and 573 metabolites organized in over 42 subsystems were incorporated into the model. Based on a new genetic algorithm approach that integrates flux balance analysis, chemiosmotic theory, and physiological data, the proton translocation stoichiometry for a number of enzymes and maintenance parameters under aerobic chemolithoautotrophic conditions using three different electron donors were estimated. Furthermore, detailed electron transfer and carbon flux distributions during chemolithoautotrophic growth using ferrous ion, tetrathionate and thiosulfate were determined and reported. Finally, 134 growth-coupled designs were calculated that enable extracellular polysaccharide production. iMC507 serves as a knowledgebase for summarizing and categorizing the information currently available for A. ferrooxidans and enables the understanding and engineering of Acidithiobacillus and similar species from a comprehensive model-driven perspective for biomining applications.
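
    Flux balance analysis, at the core of the model-driven approach described above, can be sketched on a toy network. The three-reaction pathway below is hypothetical and vastly smaller than iMC507; it only shows the standard linear-programming formulation (maximize an objective flux subject to steady-state mass balance and flux bounds).

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis on a hypothetical linear pathway
# uptake -> conversion -> biomass (not the actual iMC507 network).
# Columns: v1 (uptake), v2 (A -> B), v3 (biomass drain).
# Rows: internal metabolites A and B.
S = np.array([
    [1.0, -1.0, 0.0],   # A: produced by v1, consumed by v2
    [0.0, 1.0, -1.0],   # B: produced by v2, consumed by v3
])

# Steady state S v = 0, uptake capped at 10; maximize the biomass
# flux v3 (linprog minimizes, so the objective is negated).
res = linprog(c=[0.0, 0.0, -1.0],
              A_eq=S, b_eq=np.zeros(2),
              bounds=[(0.0, 10.0), (0.0, None), (0.0, None)])
optimal_biomass = res.x[2]
```

    Mass balance forces v1 = v2 = v3 here, so the optimum is pinned by the uptake bound, which is the same mechanism that constrains growth-coupled designs in genome-scale models.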

  6. Clinical modeling--a critical analysis.

    PubMed

    Blobel, Bernd; Goossen, William; Brochhausen, Mathias

    2014-01-01

    Modeling clinical processes (and their informational representation) is a prerequisite for optimally enabling and supporting high-quality and safe care through information and communication technology and meaningful use of gathered information. The paper investigates existing approaches to clinical modeling, systematically analyzing the underlying principles, the consistency with and opportunities for integration into other existing or emerging projects, and the correctness of representing the reality of health and health services. The analysis is performed using an architectural framework for modeling real-world systems. In addition, fundamental work on the representation of facts, relations, and processes in the clinical domain by ontologies is applied, thereby including the integration of advanced methodologies such as translational and systems medicine. The paper demonstrates fundamental weaknesses, differing maturity, and differing evolutionary potential in the approaches considered. It offers a development process starting with the business domain and its ontologies, continuing with the Reference Model-Open Distributed Processing (RM-ODP) related conceptual models in the ICT ontology space, the information and the computational view, and concluding with the implementation details represented as the engineering and technology view, respectively. The existing approaches reflect the clinical domain at different levels and put their main focus on different phases of the development process instead of first establishing a representation of the real business process; they therefore enable the involvement of domain experts quite differently and, in part, only to a limited extent. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Four-Formation In-Track Configuration Maintenance Strategy

    NASA Technical Reports Server (NTRS)

    Lamy, Alain; Costes, Thierry

    2007-01-01

    The aim of this paper is to present the analysis conducted by CNES for the maintenance of a formation made of several LEO satellites (typically 4) in several planes (typically 2), 100 km or so apart from each other. The along-track separations between the satellites have to be controlled to within 15 km thanks to orbit correction maneuvers assumed to be performed every 2 weeks. The main difficulty is related to solar activity, which is expected to be close to its maximum for the entire mission's lifespan. As a matter of fact, a high solar activity makes orbit prediction harder, and makes it impossible to keep the altitude of the formation constant. Thus, a specific relative maintenance strategy had to be devised in order to meet the mission's requirements. The first part describes the mission analysis process that has taken place. The method used to evaluate the maneuver frequency, based on the effects of atmospheric drag on the orbit, is detailed. The second part is dedicated to the maintenance strategy that has been designed, and particularly to the computation of the reference orbits and of the velocity increments that enable the in-track inter-satellite distances to be maintained within the desired bounds. Finally a few simulation results are presented; they enable the performance of the maintenance strategy to be checked in a more realistic context.
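The link between differential drag and maneuver frequency can be bounded with a back-of-the-envelope model: a differential semi-major-axis decay rate makes the along-track separation drift quadratically in time. This sketch is not the CNES algorithm; the altitude, decay rate, and control bound are illustrative assumptions:

```python
# Rough estimate of the interval between along-track correction maneuvers when
# differential drag drives a quadratic drift of the inter-satellite distance.
# All numbers are illustrative, not mission values.
import math

mu = 3.986004418e14          # Earth gravitational parameter [m^3/s^2]
a = 6378.137e3 + 700e3       # assumed ~700 km LEO semi-major axis [m]
n = math.sqrt(mu / a**3)     # mean motion [rad/s]

adot_diff = 0.5 / 86400      # assumed differential decay rate, 0.5 m/day [m/s]
bound = 15e3                 # allowed along-track excursion [m]

# Delta-a grows linearly (adot_diff * t); the along-track drift rate is
# -(3/2) * n * delta_a, so the excursion grows as (3/4) * n * adot_diff * t^2.
t = math.sqrt(4 * bound / (3 * n * adot_diff))
print(t / 86400)  # days between maneuvers under these assumptions
```

For these assumed numbers the interval comes out on the order of a few weeks, the same order as the 2-week cycle the abstract mentions.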

  8. Statical longitudinal stability of airplanes

    NASA Technical Reports Server (NTRS)

    Warner, Edward P

    1921-01-01

    This report, which is a continuation of the "Preliminary report on free flight testing" (report no. NACA-TR-70), presents a detailed theoretical analysis of statical stability with free and locked controls and also the results of many free flight tests on several types of airplanes. In developing the theory of stability with locked controls an expression for pitching moment is derived in simple terms by considering the total moment as the sum of the moments due to wings and tail surface. This expression, when differentiated with respect to angle of incidence, enables an analysis to be made of the factors contributing to the pitching moment. The effects of slipstream and downwash are also considered and it is concluded that the C.G. location has but slight effect on stability, and that stability is much improved by increasing the efficiency of the tail surfaces, which may be done by using an "inverted" tail plane. The results of free flight tests with locked controls are discussed at length and it is shown that the agreement between the experimental results and theory is very satisfactory. The theory of stability with free controls is not amenable to the simple mathematical treatment used in the case of locked controls, but a clear statement of the conditions enables several conclusions to be drawn, one of which is that the fixed tail surfaces should be much larger than the movable surfaces.
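The wing-plus-tail decomposition and its derivative can be sketched numerically: the airplane is statically stable when the slope of total pitching moment with angle of incidence is negative. The coefficients below are invented for illustration, not taken from the report:

```python
# Static longitudinal stability sketch: total pitching-moment coefficient as
# wing term + tail term, then dCm/dalpha by central difference.
# All aerodynamic coefficients are illustrative assumptions.
def cm_total(alpha, cm0_wing=-0.05, dcm_wing=0.02,
             a_tail=3.5, v_tail=0.5, deps_dalpha=0.4, i_tail=-0.03):
    """Cm(alpha) with alpha in radians."""
    cm_wing = cm0_wing + dcm_wing * alpha
    # Downwash reduces the incidence seen by the tail.
    alpha_tail = alpha * (1 - deps_dalpha) + i_tail
    # Tail lift acting through the tail volume coefficient opposes pitch-up.
    cm_tail = -v_tail * a_tail * alpha_tail
    return cm_wing + cm_tail

# Numerical slope dCm/dalpha at alpha = 0: negative means statically stable.
h = 1e-6
slope = (cm_total(h) - cm_total(-h)) / (2 * h)
print(slope < 0)  # True for this configuration
```

Enlarging the tail (raising `a_tail` or `v_tail`) makes the slope more negative, which is the report's conclusion that tail-surface efficiency, not C.G. location, drives stability.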

  9. Correlated Heterospectral Lipidomics for Biomolecular Profiling of Remyelination in Multiple Sclerosis

    PubMed Central

    2017-01-01

    Analyzing lipid composition and distribution within the brain is important to study white matter pathologies that present focal demyelination lesions, such as multiple sclerosis. Some lesions can endogenously re-form myelin sheaths. Therapies aim to enhance this repair process in order to reduce neurodegeneration and disability progression in patients. In this context, a lipidomic analysis providing both precise molecular classification and well-defined localization is crucial to detect changes in myelin lipid content. Here we develop a correlated heterospectral lipidomic (HSL) approach based on coregistered Raman spectroscopy, desorption electrospray ionization mass spectrometry (DESI-MS), and immunofluorescence imaging. We employ HSL to study the structural and compositional lipid profile of demyelination and remyelination in an induced focal demyelination mouse model and in multiple sclerosis lesions from patients ex vivo. Pixelwise coregistration of Raman spectroscopy and DESI-MS imaging generated a heterospectral map used to interrelate biomolecular structure and composition of myelin. Multivariate regression analysis enabled Raman-based assessment of highly specific lipid subtypes in complex tissue for the first time. This method revealed the temporal dynamics of remyelination and provided the first indication that newly formed myelin has a different lipid composition compared to normal myelin. HSL enables detailed molecular myelin characterization that can substantially improve upon the current understanding of remyelination in multiple sclerosis and provides a strategy to assess remyelination treatments in animal models. PMID:29392175
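The multivariate-regression step can be illustrated in miniature: each Raman spectrum is modeled as a linear mixture of reference lipid spectra, and the composition is recovered by least squares. The spectra below are synthetic stand-ins, not the paper's data:

```python
# Sketch of multivariate regression for lipid-subtype quantification: solve
# refs @ comp = spectrum in the least-squares sense. Spectra are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_channels = 200
refs = rng.random((n_channels, 3))        # reference spectra of 3 lipid subtypes
true_comp = np.array([0.6, 0.3, 0.1])     # true fractional composition
spectrum = refs @ true_comp + rng.normal(0, 0.001, n_channels)  # measured pixel

comp, *_ = np.linalg.lstsq(refs, spectrum, rcond=None)
print(np.round(comp, 2))  # recovered composition close to [0.6, 0.3, 0.1]
```

With many spectral channels and few components, the fit is heavily overdetermined, which is why pixelwise composition maps are feasible.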

  10. Comparison of ASTER- and AVIRIS-Derived Mineral and Vegetation Maps of the White Horse Replacement Alunite Deposit and Surrounding Area, Marysvale Volcanic Field, Utah

    USGS Publications Warehouse

    Rockwell, Barnaby W.

    2009-01-01

    This report presents and compares mineral and vegetation maps of parts of the Marysvale volcanic field in west-central Utah that were published in a recent paper describing the White Horse replacement alunite deposit. Detailed, field-verified maps of the deposit were produced from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired from a low-altitude Twin Otter turboprop airborne platform. Reconnaissance-level maps of surrounding areas including the central and northern Tushar Mountains, Pahvant Range, and portions of the Sevier Plateau to the east were produced from visible, near-infrared, and shortwave-infrared data acquired by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor carried aboard the Terra satellite platform. These maps are also compared to a previously published mineral map of the same area generated from AVIRIS data acquired from the high-altitude NASA ER-2 jet platform. All of the maps were generated by similar analysis methods, enabling the direct comparison of the spatial scale and mineral composition of surface geologic features that can be identified using the three types of remote sensing data. The high spatial (2-17 meter) and spectral (224 bands) resolution AVIRIS data can be used to generate detailed mineral and vegetation maps suitable for geologic and geoenvironmental studies of individual deposits, mines, and smelters. The lower spatial (15-30 meter) and spectral (9 bands) resolution ASTER data are better suited to less detailed mineralogical studies of lithology and alteration across entire hydrothermal systems and mining districts, including regional mineral resource and geoenvironmental assessments. The results presented here demonstrate that minerals and mineral mixtures can be directly identified using AVIRIS and ASTER data to elucidate spatial patterns of mineralogic zonation; AVIRIS data can enable the generation of maps with significantly greater detail and accuracy. 
The vegetation mapping results suggest that ASTER data may provide an efficient alternative to spectroscopic data for studies of burn severity after wildland fires. A new, semiautomated methodology for the analysis of ASTER data is presented that is currently being applied to ASTER data coverage of large areas for regional assessments of mineral-resource potential and mineral-environmental effects. All maps are presented in a variety of digital formats, including jpeg, pdf, and ERDAS Imagine (.img). The Imagine format files are georeferenced and suitable for viewing with other geospatial data in Imagine, ArcGIS, and ENVI. The mineral and vegetation maps are attributed so that the material identified for a pixel can be determined easily in ArcMap by using the Identify tool and in Imagine by using the Inquire Cursor tool.

  11. Calculation of the nuclear material inventory in a sealed vault by 3D radiation mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adsley, Ian; Klepikov, Alexander; Tur, Yevgeniy

    2013-07-01

    The paper relates to the determination of the amount of nuclear material contained in a closed, concrete lined vault at the Aktau fast breeder reactor in Kazakhstan. This material had been disposed into the vault after examination in an experimental hot cell directly above the vault. In order to comply with IAEA Safeguards requirements it was necessary to determine the total quantities of nuclear materials (enriched uranium and plutonium) that were held within Kazakhstan. Although it was possible to determine the inventory of all of the accessible nuclear material, the quantity remaining in the vault was unknown. As part of the Global Threat Reduction Programme the UK Government funded a project to determine the inventory of these nuclear materials in this vault. This involved drilling three penetrations through the concrete lined roof of the vault; this enabled the placement of lights and a camera into the vault through two penetrations, while the third penetration enabled a lightweight manipulator arm to be introduced into the vault. This was used to provide a detailed 3D mapping of the dose rate within the vault and it also enabled the collection of samples for radionuclide analysis. The deconvolution of the 3D dose rate profile within the vault enabled the determination of the gamma emitting source distribution on the floor and walls of the vault. The samples were analysed to determine the fingerprint of those radionuclides producing the gamma dose, namely ¹³⁷Cs and ⁶⁰Co, to the nuclear materials. The combination of the dose rate source terms on the surfaces of the vault and the fingerprint then enabled the quantities of nuclear materials to be determined. The project was a major success and enabled the Kazakhstan Government to comply with IAEA Safeguards requirements. It also enabled the UK DECC Ministry to develop a technology of national (and international) use.
Finally, the technology was well received by IAEA Safeguards as an acceptable methodology for future studies. (authors)
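The deconvolution step has the structure of a linear inverse problem: measured dose rates respond to unknown surface source strengths through a geometric (inverse-square) response matrix, and non-negative least squares recovers the source distribution. Geometry and numbers below are illustrative only:

```python
# Sketch of dose-rate deconvolution: doses = R @ strengths, with R built from
# inverse-square geometry; recover non-negative source strengths with NNLS.
# Positions and strengths are invented for illustration.
import numpy as np
from scipy.optimize import nnls

sources = np.array([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])    # source patches [m]
detectors = np.array([[0.5, 1.0], [2.0, 1.5], [3.5, 1.0], [1.0, 2.0]])

# Response matrix: dose at detector i from a unit source j ~ 1 / r^2.
R = np.array([[1.0 / np.sum((d - s) ** 2) for s in sources] for d in detectors])

true_strength = np.array([5.0, 0.0, 2.0])
doses = R @ true_strength                  # simulated dose-rate measurements
est, _ = nnls(R, doses)                    # recovered source strengths
print(np.round(est, 3))
```

In the real survey the "detectors" are the 3D manipulator-arm measurement points and the sources tile the vault floor and walls, so R is much larger, but the algebra is the same.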

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Salvador B.

    SNL has a combination of experimental facilities and expertise in nuclear engineering, nuclear security, severe nuclear accidents, and nuclear safeguards that can enable significant progress towards molten salts and fuels for Molten Salt Reactors (MSRs). The following areas and opportunities are discussed in more detail in this white paper.

  13. Construction of prestressed concrete single-tee bridge superstructures.

    DOT National Transportation Integrated Search

    1977-01-01

    This report discusses in detail the construction of the first five precast, prestressed concrete, single-tee beam bridge superstructures to be let to contract in Virginia. The data suggest that this single-tee beam enables efficient construction of t...

  14. Geologic Exploration Enabled by Optimized Science Operations on the Lunar Surface

    NASA Astrophysics Data System (ADS)

    Heldmann, J. L.; Lim, D. S. S.; Colaprete, A.; Garry, W. B.; Hughes, S. S.; Kobs Nawotniak, S.; Sehlke, A.; Neish, C.; Osinski, G. R.; Hodges, K.; Abercromby, A.; Cohen, B. A.; Cook, A.; Elphic, R.; Mallonee, H.; Matiella Novak, A.; Rader, E.; Sears, D.; Sears, H.; Finesse Team; Basalt Team

    2017-10-01

    We present detailed geologic field studies that can best be accomplished through in situ investigations on the Moon, and the associated recommendations for human and robotic mission capabilities and concepts of operations for lunar surface missions.

  15. 37 CFR 1.71 - Detailed description and specification of the invention.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... enable any person skilled in the art or science to which the invention or discovery appertains, or with... specification must include a written description of the invention or discovery and of the manner and process of...

  16. 37 CFR 1.71 - Detailed description and specification of the invention.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... enable any person skilled in the art or science to which the invention or discovery appertains, or with... specification must include a written description of the invention or discovery and of the manner and process of...

  17. A Transpiration Experiment Requiring Critical Thinking Skills.

    ERIC Educational Resources Information Center

    Ford, Rosemary H.

    1998-01-01

    Details laboratory procedures that enable students to understand the concept of how differences in water potential drive the movement of water within a plant in response to transpiration. Students compare transpiration rates for upper and lower surfaces of leaves. (DDR)

  18. Issues and opportunities: beam simulations for heavy ion fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A

    1999-07-15

    UCRL-JC-134975 PREPRINT: a code offering 3-D, axisymmetric, and "transverse slice" (steady flow) geometries, with a hierarchy of models for the "lattice" of focusing, bending, and accelerating elements. Interactive and script-driven code steering is afforded through an interpreter interface. The code runs with good parallel scaling on the T3E. Detailed simulations of machine segments and of complete small experiments, as well as simplified full-system runs, have been carried out, partially benchmarking the code. A magnetoinductive model, with module impedance and multi-beam effects, is under study. Experiments include an injector scalable to multi-beam arrays, a high-current beam transport and acceleration experiment, and a scaled final-focusing experiment. These "phase I" projects are laying the groundwork for the next major step in HIF development, the Integrated Research Experiment (IRE). Simulations aimed directly at the IRE must enable us to: design a facility with maximum power on target at minimal cost; set requirements for hardware tolerances, beam steering, etc.; and evaluate proposed chamber propagation modes. Finally, simulations must enable us to study all issues which arise in the context of a fusion driver, and must facilitate the assessment of driver options. In all of this, maximum advantage must be taken of emerging terascale computer architectures, requiring an aggressive code development effort. An organizing principle should be pursuit of the goal of integrated and detailed source-to-target simulation, alongside moment-based methods for analysis of the beam dynamics in the various machine concepts, for purposes of design, waveform synthesis, steering algorithm synthesis, etc.
Three classes of discrete-particle models should be coupled: (1) electrostatic/magnetoinductive PIC simulations should track the beams from the source through the final-focusing optics, passing details of the time-dependent distribution function to (2) electromagnetic or magnetoinductive PIC or hybrid PIC/fluid simulations in the fusion chamber (which would finally pass their particle trajectory information to the radiation-hydrodynamics codes used for target design); in parallel, (3) detailed PIC, delta-f, core/test-particle, and perhaps continuum Vlasov codes should be used to study individual sections of the driver and chamber very carefully; consistency may be assured by linking data from the PIC sequence, and knowledge gained may feed back into that sequence.

  19. Time-resolved confocal fluorescence microscopy: novel technical features and applications for FLIM, FRET and FCS using a sophisticated data acquisition concept in TCSPC

    NASA Astrophysics Data System (ADS)

    Koberling, Felix; Krämer, Benedikt; Kapusta, Peter; Patting, Matthias; Wahl, Michael; Erdmann, Rainer

    2007-05-01

    In recent years time-resolved fluorescence measurement and analysis techniques became a standard in single molecule microscopy. However, considering the equipment and experimental implementation they are typically still an add-on and offer only limited possibilities to study the mutual dependencies with common intensity and spectral information. In contrast, we are using a specially designed instrument with an unrestricted photon data acquisition approach which allows storing spatial, temporal, spectral and intensity information in a generalized format preserving the full experimental information. This format allows us not only to easily study dependencies between various fluorescence parameters but also to use, for example, the photon arrival time for sorting and weighting the detected photons to improve the significance in common FCS and FRET analysis schemes. The power of this approach will be demonstrated for different techniques: In FCS experiments the concentration determination accuracy can be easily improved by a simple time-gated photon analysis to suppress the fast decaying background signal. A more detailed analysis of the arrival times even allows FCS curves to be separated for species which differ in their fluorescence lifetime but, for example, cannot be distinguished spectrally. In multichromophoric systems like a photonic wire which undergoes unidirectional multistep FRET the lifetime information significantly complements the intensity-based analysis and helps to assign the respective FRET partners. Moreover, together with pulsed excitation the time-correlated analysis directly enables taking advantage of alternating multi-colour laser excitation. This pulsed interleaved excitation (PIE) can be used to identify and rule out inactive FRET molecules which cause interfering artefacts in standard FRET efficiency analysis. We used a piezo scanner based confocal microscope with compact picosecond diode lasers as excitation sources.
The timing performance can be significantly increased by using new SPAD detectors which enable, in conjunction with new TCSPC electronics, an overall IRF width of less than 120 ps while maintaining single-molecule sensitivity.
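The time-gating idea is simple to demonstrate: discard photons whose TCSPC microtime falls in the window dominated by the fast-decaying background before correlating. The photon stream below is simulated; the lifetimes and gate position are illustrative assumptions:

```python
# Time-gated photon analysis sketch: a gate on TCSPC arrival times suppresses
# a fast-decaying background component. Data are simulated, not instrument data.
import numpy as np

rng = np.random.default_rng(1)
# Microtimes [ns]: fluorescence (3 ns lifetime) plus fast background (0.2 ns).
signal = rng.exponential(3.0, 5000)
background = rng.exponential(0.2, 5000)
microtimes = np.concatenate([signal, background])
labels = np.array([True] * 5000 + [False] * 5000)   # True = signal photon

gate = 1.0                              # discard photons arriving before 1 ns
kept = microtimes > gate

purity_before = labels.mean()           # 50% of all photons are signal
purity_after = labels[kept].mean()      # after gating, almost all are signal
print(round(purity_after, 2))
```

Since exp(-1/0.2) is tiny while exp(-1/3) is not, the gate removes nearly all background at a modest cost in signal photons, which is why gated FCS curves show better concentration accuracy.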

  20. Detailed analysis of evolution of the state of polarization in all-fiber polarization transformers.

    PubMed

    Zhu, Xiushan; Jain, Ravinder K

    2006-10-30

    We present a detailed analysis of key attributes and performance characteristics of controllably-spun birefringent-fiber-based all-fiber waveplates or "all-fiber polarization transformers" (AFPTs), first proposed and demonstrated by Huang [11]; these AFPTs consist essentially of a long, carefully-designed "spin-twisted" high-birefringence fiber, fabricated by slowly varying the spin rate of a birefringent fiber preform (either from very fast to very slow or vice versa) while the fiber is being drawn. The evolution of the eigenstate from a linear polarization state to a circular polarization state, induced by slow variation of the intrinsic structure from linear anisotropy at the unspun end to circular anisotropy at the fast-spun end, enables the AFPT to behave like an all-fiber quarter-wave plate independent of the wavelength of operation. Power coupling between local eigenstates causes unique evolution of the polarization state along the fiber, and has been studied to gain insight into, and to understand detailed characteristics of, the polarization transformation behavior. This has been graphically illustrated via plots of the relative power in these local eigenstates as a function of distance along the length of the fiber and plots of the extinction ratio of the output state of polarization (SOP) as a function of distance and the normalized spin rate. Deeper understanding of such polarization transformers has been further elucidated by quantitative calculations related to two crucial requirements for fabricating practical AFPT devices. Our calculations have also indicated that the polarization mode dispersion behavior of the AFPT is much smaller than that of the original birefringent fiber. Finally, a specific AFPT was experimentally investigated at two widely-separated wavelengths (1310 nm and 1550 nm) of interest in telecommunications systems applications, further demonstrating and elucidating the broadband character of such AFPTs.
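The net effect the abstract describes, linear in, circular out, is what a quarter-wave element with its fast axis at 45° does in Jones calculus. The sketch below models only that net transformation, not the adiabatic spin-rate profile of the fiber:

```python
# Jones-calculus sketch: a quarter-wave element at 45 degrees converts linear
# polarization to circular, the net behavior attributed to the AFPT.
import numpy as np

def qwp(theta):
    """Jones matrix of a quarter-wave plate with fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.array([[1, 0], [0, 1j]])   # 90-degree retardance on one axis
    return rot @ retarder @ rot.T

linear_x = np.array([1.0, 0.0])              # linear input SOP
out = qwp(np.pi / 4) @ linear_x

# Circular output: equal component magnitudes, 90-degree phase difference.
print(np.round(np.abs(out), 3))              # [0.707 0.707]
```

What distinguishes the AFPT from a bulk quarter-wave plate is that its retardance comes from an adiabatic change of the local eigenstates rather than a fixed path-length difference, which is why it is broadband.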

  1. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    PubMed Central

    Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M.; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A.; Davis, Ronald W.; Javey, Ali

    2016-01-01

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual's state of health [1–12]. Sampling human sweat, which is rich in physiological information [13], could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state [14–18]. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications. PMID:26819044
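The role of the skin-temperature channel can be sketched with the Nernst equation: an ion-selective electrode's slope depends on absolute temperature, so the concentration read-out must use the measured temperature. The voltages and offset below are invented for illustration:

```python
# Temperature calibration sketch for a potentiometric (ion-selective) sensor.
# E0 and the simulated reading are illustrative assumptions, not device values.
import math

R, F = 8.314, 96485.0                        # gas constant, Faraday constant

def nernst_slope(temp_c):
    """Ideal Nernstian slope [V per decade] for a monovalent ion."""
    return math.log(10) * R * (temp_c + 273.15) / F

def concentration(voltage, temp_c, e0=0.2):
    """Concentration [mM] from electrode voltage, temperature-corrected."""
    return 10 ** ((voltage - e0) / nernst_slope(temp_c))

v = 0.2 + nernst_slope(33.0) * math.log10(60.0)   # simulated 60 mM sodium at 33 C
print(round(concentration(v, 33.0), 1))            # correct at the true temperature
print(round(concentration(v, 20.0), 1))            # biased if temperature is ignored
```

Assuming room temperature when the skin is warmer inflates the inferred concentration by tens of percent, which is why the array co-measures temperature on-body.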

  2. Complete genome sequence and comparative analysis of Acetobacter pasteurianus 386B, a strain well-adapted to the cocoa bean fermentation ecosystem.

    PubMed

    Illeghems, Koen; De Vuyst, Luc; Weckx, Stefan

    2013-08-01

    Acetobacter pasteurianus 386B, an acetic acid bacterium originating from a spontaneous cocoa bean heap fermentation, proved to be an ideal functional starter culture for cocoa bean fermentations. It is able to dominate the fermentation process, thereby resisting high acetic acid concentrations and temperatures. However, the molecular mechanisms underlying its metabolic capabilities and niche adaptations are unknown. In this study, whole-genome sequencing and comparative genome analysis were used to investigate this strain's mechanisms to dominate the cocoa bean fermentation process. The genome sequence of A. pasteurianus 386B is composed of a 2.8-Mb chromosome and seven plasmids. The annotation of 2875 protein-coding sequences revealed important characteristics, including several metabolic pathways, the occurrence of strain-specific genes such as an endopolygalacturonase, and the presence of mechanisms involved in tolerance towards various stress conditions. Furthermore, the low number of transposases in the genome and the absence of complete phage genomes indicate that this strain might be more genetically stable compared with other A. pasteurianus strains, which is an important advantage for the use of this strain as a functional starter culture. Comparative genome analysis with other members of the Acetobacteraceae confirmed the functional properties of A. pasteurianus 386B, such as its thermotolerant nature and unique genetic composition. Genome analysis of A. pasteurianus 386B provided detailed insights into the underlying mechanisms of its metabolic features, niche adaptations, and tolerance towards stress conditions. Combination of these data with previous experimental knowledge enabled an integrated, global overview of the functional characteristics of this strain. This knowledge will enable improved fermentation strategies and selection of appropriate acetic acid bacteria strains as functional starter culture for cocoa bean fermentation processes.

  3. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench

    PubMed Central

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-01-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus, an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on a H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. PMID:28289155
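One preprocessing stage named in the abstract, normalization across samples of different sequencing depth, can be illustrated with a reads-per-million scaling. The counts below are made up; the Workbench offers this and other normalization schemes:

```python
# Reads-per-million (RPM) normalization sketch for sRNA count data, so that
# samples sequenced to different depths become comparable. Counts are invented.
import numpy as np

counts = np.array([
    [120, 300],     # sRNA-1 in sample A, sample B
    [ 30,  90],     # sRNA-2
    [850, 610],     # sRNA-3
])
library_sizes = counts.sum(axis=0)           # total reads per sample
rpm = counts / library_sizes * 1e6           # scale each sample to 1M reads

# Log2 fold change between samples, computed on normalized values.
log2fc = np.log2(rpm[:, 1] / rpm[:, 0])
print(np.round(log2fc, 2))
```

Differential-expression calls are then made on the normalized matrix, typically with variance modeling on top of this simple scaling.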

  4. Fluorescence multi-scale endoscopy and its applications in the study and diagnosis of gastro-intestinal diseases: set-up design and software implementation

    NASA Astrophysics Data System (ADS)

    Gómez-García, Pablo Aurelio; Arranz, Alicia; Fresno, Manuel; Desco, Manuel; Mahmood, Umar; Vaquero, Juan José; Ripoll, Jorge

    2015-06-01

    Endoscopy is frequently used in the diagnosis of several gastro-intestinal pathologies such as Crohn's disease, ulcerative colitis or colorectal cancer. It has great potential as a non-invasive screening technique capable of detecting suspicious alterations in the intestinal mucosa, such as inflammatory processes. However, these early lesions usually cannot be detected with conventional endoscopes, due to lack of cellular detail and the absence of specific markers. Due to this lack of specificity, the development of new endoscopy technologies, which are able to show microscopic changes in the mucosa structure, is necessary. We here present a confocal endomicroscope which, in combination with a wide-field fluorescence endoscope, offers fast and specific macroscopic information through the use of activatable probes and a detailed analysis at cellular level of the possibly altered tissue areas. This multi-modal and multi-scale imaging module, compatible with commercial endoscopes, combines near-infrared fluorescence (NIRF) measurements (enabling specific imaging of markers of disease and prognosis) and confocal endomicroscopy making use of a fiber bundle, providing cellular-level resolution. The system will be used in animal models exhibiting gastro-intestinal diseases in order to analyze the use of potential diagnostic markers in colorectal cancer. In this work, we present in detail the set-up design and the software implementation in order to obtain simultaneous RGB/NIRF measurements and short confocal scanning times.

  5. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodologies of how to apply the immersed boundary method to this moving boundary problem, we will provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based on nominal take-off and cruise flow conditions. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.
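The spectral-analysis step amounts to transforming a near-field pressure signal and picking out the dominant tone (for a rotor, typically a blade-passing frequency and its harmonics). The signal below is simulated with invented parameters, not LAVA output:

```python
# Spectral analysis sketch: FFT of a simulated pressure trace to locate the
# dominant tone. The 680 Hz "blade-passing frequency" is an illustrative value.
import numpy as np

fs = 8192.0                                  # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
bpf = 680.0                                  # assumed blade-passing frequency [Hz]
rng = np.random.default_rng(2)
p = (np.sin(2 * np.pi * bpf * t)
     + 0.3 * np.sin(2 * np.pi * 2 * bpf * t)     # first harmonic
     + 0.1 * rng.standard_normal(t.size))        # broadband noise

spec = np.abs(np.fft.rfft(p))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spec[1:]) + 1]    # skip the DC bin
print(dominant)  # ~680 Hz
```

In the paper's setting the same transform is applied to probe signals extracted from the unsteady CFD solution to characterize the near-field propagation pattern.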

  6. An atmospheric pressure high-temperature laminar flow reactor for investigation of combustion and related gas phase reaction systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oßwald, Patrick; Köhler, Markus

    A new high-temperature flow reactor experiment utilizing the powerful molecular beam mass spectrometry (MBMS) technique for detailed observation of gas phase kinetics in reacting flows is presented. The reactor design provides a consequent extension of the experimental portfolio of validation experiments for combustion reaction kinetics. Temperatures up to 1800 K are applicable by three individually controlled temperature zones with this atmospheric pressure flow reactor. Detailed speciation data are obtained using the sensitive MBMS technique, providing in situ access to almost all chemical species involved in the combustion process, including highly reactive species such as radicals. Strategies for quantifying the experimental data are presented alongside a careful analysis of the characterization of the experimental boundary conditions to enable precise numeric reproduction of the experimental results. The general capabilities of this new analytical tool for the investigation of reacting flows are demonstrated for a selected range of conditions, fuels, and applications. A detailed dataset for the well-known gaseous fuels, methane and ethylene, is provided and used to verify the experimental approach. Furthermore, application for liquid fuels and fuel components important for technical combustors like gas turbines and engines is demonstrated. Besides the detailed investigation of novel fuels and fuel components, the wide range of operation conditions gives access to extended combustion topics, such as super rich conditions at high temperature important for gasification processes, or the peroxy chemistry governing the low temperature oxidation regime. These demonstrations are accompanied by a first kinetic modeling approach, examining the opportunities for model validation purposes.

  7. Rapid-estimation method for assessing scour at highway bridges

    USGS Publications Warehouse

    Holnbeck, Stephen R.

    1998-01-01

    A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.
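
    The envelope-curve idea in this record can be sketched numerically: bin the calculated Level 2 scour depths by a field-measurable surrogate variable and take the per-bin maximum as a conservative upper bound. This is only an illustrative sketch of the general technique; the function names, bin count, and synthetic data below are assumptions, not part of the USGS method.

```python
import numpy as np

def envelope_curve(surrogate, scour_depth, n_bins=8):
    """Upper envelope of scour depths, binned by a surrogate variable."""
    x = np.asarray(surrogate, dtype=float)
    y = np.asarray(scour_depth, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    # Assign each site to a bin (clip so x == x.max() lands in the last bin)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    env = np.array([y[idx == i].max() if np.any(idx == i) else np.nan
                    for i in range(n_bins)])
    return centers, env

def rapid_estimate(centers, env, x_new):
    """Conservative scour estimate: interpolate along the envelope curve."""
    valid = ~np.isnan(env)
    return np.interp(x_new, centers[valid], env[valid])
```

    Because the envelope passes through the per-bin maxima, estimates read from it are by construction at least as large as every Level 2 result in the same bin, matching the "conservatively greater" behaviour reported above.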

  8. Pea Border Cell Maturation and Release Involve Complex Cell Wall Structural Dynamics

    PubMed Central

    2017-01-01

    The adhesion of plant cells is vital for support and protection of the plant body and is maintained by a variety of molecular associations between cell wall components. In some specialized cases, though, plant cells are programmed to detach, and root cap-derived border cells are examples of this. Border cells (in some species known as border-like cells) provide an expendable barrier between roots and the environment. Their maturation and release is an important but poorly characterized cell separation event. To gain a deeper insight into the complex cellular dynamics underlying this process, we undertook a systematic, detailed analysis of pea (Pisum sativum) root tip cell walls. Our study included immunocarbohydrate microarray profiling, monosaccharide composition determination, Fourier transform infrared microspectroscopy, quantitative reverse transcription-PCR of cell wall biosynthetic genes, analysis of hydrolytic activities, transmission electron microscopy, and immunolocalization of cell wall components. Using this integrated glycobiology approach, we identified multiple novel modes of cell wall structural and compositional rearrangement during root cap growth and the release of border cells. Our findings provide a new level of detail about border cell maturation and enable us to develop a model of the separation process. We propose that loss of adhesion by the dissolution of homogalacturonan in the middle lamellae is augmented by an active biophysical process of cell curvature driven by the polarized distribution of xyloglucan and extensin epitopes. PMID:28400496

  9. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike trains (MST) data, defines their functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input and the direct functional connection of two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, a visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections, such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying the functional connectivity of multiple spike trains. This method can identify accurately all the direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
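
    The pairwise CCF ingredient that the ACG builds on can be illustrated with a toy cross-correlogram: histogram the delays between spikes of two trains, then flag a peak that stands well above the baseline, reporting both its height and its time delay. This is only a sketch of the underlying idea, not the authors' implementation; the simple z-score significance threshold is an assumption.

```python
import numpy as np

def cross_correlogram(train_a, train_b, bin_ms=1.0, max_lag_ms=50.0):
    """Histogram of spike-time delays t_b - t_a within +/- max_lag_ms."""
    lags = []
    for t in train_a:
        d = train_b - t
        lags.extend(d[np.abs(d) <= max_lag_ms])
    edges = np.arange(-max_lag_ms, max_lag_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(lags, bins=edges)
    centers = edges[:-1] + bin_ms / 2.0
    return centers, counts

def significant_peak(centers, counts, n_sigma=3.0):
    """Return (delay, height) of the CCF peak if it clears mean + n_sigma*sd."""
    i = int(np.argmax(counts))
    if counts[i] > counts.mean() + n_sigma * counts.std():
        return float(centers[i]), int(counts[i])
    return None
```

    Both features extracted here, the peak height (connection strength) and its lag (direction and latency), are exactly the two quantities the abstract says the ACG uses to classify connections.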

  10. 78 FR 66929 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis AGENCY: Policy and... Federal Register notice informing the public of its intent to conduct a detailed economic impact analysis... subject to a detailed economic impact analysis. DATES: The Federal Register notice published on August 5...

  11. Resurfacing event observed in Morpheos basin (Eridania Planitia) and the implications to the formation and timing of Waikato and Reull Valles, Mars

    NASA Astrophysics Data System (ADS)

    Kostama, V.-P.; Kukkonen, S.; Raitala, J.

    2017-06-01

    The large-scale outflow channels of the Hellas impact basin are characteristic of its eastern rim region. Although the majority of the valles are located in the large-scale topographic trough connecting Hesperia Planum and the Hellas basin, the most far-reaching of them, Reull Vallis, is situated to the south-southeast of this trough, cutting through Promethei Terra. Reull Vallis and the general geology of the region have been studied in the past, but new higher-resolution image data enable us to look into the details of the features that record the fluvial history of the region. Photogeological mapping using the available data and extensive crater counting utilizing CTX, HiRISE and HRSC provided new insights into the timing of the regional events and episodes. The study resulted in more detailed age constraints compared to the previous results from Viking images. These calculations and the geological study of the upper WMR system (Waikato Vallis - Morpheos basin - Reull Vallis) region and southern Hesperia Planum enabled us to estimate the time frame of the (fluid) infilling of this reservoir to a model time period of 3.67-3.52 Ga, which is thus also the time of the visible activity of the upper Reull Vallis and Waikato Vallis outflow channels. The results also more explicitly defined the size of the previously identified Morpheos basin (confined to the 500-550 m contour lines). We also present a geological analysis of the upper parts of the WMR system and, using the observations and calculations, present an updated view of the evolution of the system and associated region.

  12. Direct Scaling of Leaf-Resolving Biophysical Models from Leaves to Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, B.; Mahaffee, W.; Hernandez Ochoa, M.

    2017-12-01

    Recent advances in the development of biophysical models and high-performance computing have enabled rapid increases in the level of detail that can be represented by simulations of plant systems. However, increasingly detailed models typically require increasingly detailed inputs, which can be a challenge to accurately specify. In this work, we explore the use of terrestrial LiDAR scanning data to accurately specify geometric inputs for high-resolution biophysical models that enables direct up-scaling of leaf-level biophysical processes. Terrestrial LiDAR scans generate "clouds" of millions of points that map out the geometric structure of the area of interest. However, points alone are often not particularly useful in generating geometric model inputs, as additional data processing techniques are required to provide necessary information regarding vegetation structure. A new method was developed that directly reconstructs as many leaves as possible that are in view of the LiDAR instrument, and uses a statistical backfilling technique to ensure that the overall leaf area and orientation distribution matches that of the actual vegetation being measured. This detailed structural data is used to provide inputs for leaf-resolving models of radiation, microclimate, evapotranspiration, and photosynthesis. Model complexity is afforded by utilizing graphics processing units (GPUs), which allows for simulations that resolve scales ranging from leaves to canopies. The model system was used to explore how heterogeneity in canopy architecture at various scales affects scaling of biophysical processes from leaves to canopies.

  13. Three-dimensional bright-field scanning transmission electron microscopy elucidates novel nanostructure in microbial biofilms.

    PubMed

    Hickey, William J; Shetty, Ameesha R; Massey, Randall J; Toso, Daniel B; Austin, Jotham

    2017-01-01

    Bacterial biofilms play key roles in environmental and biomedical processes, and understanding their activities requires comprehension of their nanoarchitectural characteristics. Electron microscopy (EM) is an essential tool for nanostructural analysis, but conventional EM methods are limited in that they either provide topographical information alone, or are suitable for imaging only relatively thin (<300 nm) sample volumes. For biofilm investigations, these are significant restrictions. Understanding structural relations between cells requires imaging of a sample volume sufficiently large to encompass multiple cells and the capture of both external and internal details of cell structure. An emerging EM technique with such capabilities is bright-field scanning transmission electron microscopy (BF-STEM), and in the present report BF-STEM was coupled with tomography to elucidate nanostructure in biofilms formed by the polycyclic aromatic hydrocarbon-degrading soil bacterium Delftia acidovorans Cs1-4. Dual-axis BF-STEM enabled high-resolution (6-10 nm) 3-D tomographic reconstruction and visualization of thick (1250 and 1500 nm) sections. The 3-D data revealed that novel extracellular structures, termed nanopods, were polymorphic and formed complex networks within cell clusters. BF-STEM tomography enabled visualization of conduits formed by nanopods that could permit intercellular movement of outer membrane vesicles, and thereby enable direct communication between cells. This report is the first to document application of dual-axis BF-STEM tomography to obtain high-resolution 3-D images of novel nanostructures in bacterial biofilms. Future work with dual-axis BF-STEM tomography combined with correlative light electron microscopy may provide deeper insights into physiological functions associated with nanopods as well as other nanostructures. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  14. Technology Investment Agendas to Expand Human Space Futures

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent

    2012-01-01

    The paper develops four alternative core-technology advancement specifications, one for each of the four strategic goal options for government investment in human space flight. Already discussed in the literature, these are: Explore Mars; Settle the Moon; accelerate commercial development of Space Passenger Travel; and enable industrial scale-up of Space Solar Power for Earth. In the case of the Explore Mars goal, the paper starts with the contemporary NASA accounting of ~55 Mars-enabling technologies. The analysis decomposes that technology agenda into technologies applicable only to the Explore Mars goal, versus those applicable more broadly to the other three options. Salient technology needs of all four options are then elaborated to a comparable level of detail. The comparison differentiates how technologies or major developments that may seem the same at the level of budget lines or headlines (e.g., heavy-lift Earth launch) would in fact diverge widely if developed in the service of one or another of the HSF goals. The paper concludes that the explicit choice of human space flight goal matters greatly; an expensive portfolio of challenging technologies would not only enable a particular option, it would foreclose the others. Technologies essential to enable human exploration of Mars cannot prepare interchangeably for alternative futures; they would not allow us to choose later to Settle the Moon, unleash robust growth of Space Passenger Travel industries, or help the transition to a post-petroleum future with Space Solar Power for Earth. The paper concludes that a decades-long decision in the U.S.--whether made consciously or by default--to focus technology investment toward achieving human exploration of Mars someday would effectively preclude the alternative goals in our lifetime.

  15. Progress in coherent lithography using table-top extreme ultraviolet lasers

    NASA Astrophysics Data System (ADS)

    Li, Wei

    Nanotechnology has drawn a wide variety of attention, as interesting phenomena occur when the dimension of the structures is in the nanometer scale. The particular characteristics of nanoscale structures have enabled new applications in different fields of science and technology. Our capability to fabricate these nanostructures routinely will surely impact the advancement of nanoscience. Apart from the high-volume manufacturing in the semiconductor industry, a small-scale but reliable nanofabrication tool can dramatically help research in the field of nanotechnology. This dissertation describes alternative extreme ultraviolet (EUV) lithography techniques which combine a table-top EUV laser and various cost-effective imaging strategies. For each technique, numerical simulations, system design, experimental results and their analysis will be presented. In chapter II, a brief review of the main characteristics of table-top EUV lasers will be addressed, concentrating on the high power and large coherence radius that enable the lithography applications described herein. The development of a Talbot EUV lithography system capable of printing 50 nm half-pitch nanopatterns will be illustrated in chapter III. A detailed discussion of its resolution limit will be presented, followed by the development of the X-Y-Z positioning stage, the fabrication protocol for the diffractive EUV mask, the pattern transfer using self-developed ion beam etching, and the dose control unit. In addition, this dissertation demonstrates the capability to fabricate functional periodic nanostructures using Talbot EUV lithography. After that, resolution enhancement techniques such as multiple exposure, displacement Talbot EUV lithography, fractional Talbot EUV lithography, and Talbot lithography using an 18.9 nm amplified spontaneous emission laser will be demonstrated.
Chapter IV will describe a hybrid EUV lithography technique which combines Talbot imaging and interference lithography, rendering a high-resolution interference pattern whose lattice is modified by a custom-designed Talbot mask. In other words, this method enables filling an arbitrary Talbot cell with ultra-fine interference nanofeatures. Detailed optics modeling, system design and experimental results using a He-Ne laser and a table-top EUV laser are included. The last part of chapter IV will analyze its exclusive advantages over traditional Talbot or interference lithography.
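
    For orientation, the self-imaging condition underlying Talbot lithography is the paraxial Talbot length z_T = 2p²/λ for a periodic mask of period p illuminated at wavelength λ. The numbers below (a 100 nm period mask and a 46.9 nm EUV line) are illustrative assumptions for a quick sanity check, not parameters taken from the dissertation.

```python
def talbot_length(period_m: float, wavelength_m: float) -> float:
    """Paraxial Talbot self-imaging distance z_T = 2 * p**2 / lambda, in metres."""
    return 2.0 * period_m**2 / wavelength_m

# Hypothetical example: 100 nm period mask at a 46.9 nm EUV wavelength
z_t = talbot_length(100e-9, 46.9e-9)  # roughly 0.43 micrometres
```

    The quadratic dependence on the period is why sub-100 nm Talbot printing demands such small mask-to-wafer gaps, and hence the precise Z positioning stage mentioned above.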

  16. CMOS Enabled Microfluidic Systems for Healthcare Based Applications.

    PubMed

    Khan, Sherjeel M; Gumus, Abdurrahman; Nassar, Joanna M; Hussain, Muhammad M

    2018-04-01

    With the increased global population, it is more important than ever to expand accessibility to affordable personalized healthcare. In this context, a seamless integration of microfluidic technology for bioanalysis and drug delivery and complementary metal oxide semiconductor (CMOS) technology enabled data-management circuitry is critical. Therefore, here, the fundamentals, integration aspects, and applications of CMOS-enabled microfluidic systems for affordable personalized healthcare systems are presented. Critical components, like sensors, actuators, and their fabrication and packaging, are discussed and reviewed in detail. With the emergence of the Internet-of-Things and the upcoming Internet-of-Everything for a people-process-data-device connected world, now is the time to take CMOS-enabled microfluidics technology to as many people as possible. There is enormous potential for microfluidic technologies in affordable healthcare for everyone, and CMOS technology will play a major role in making that happen. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. 75 FR 48329 - Tribal Drinking Water Operator Certification Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-10

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9186-8] Tribal Drinking Water Operator Certification Program... details of EPA's voluntary Tribal Drinking Water Operator Certification Program, effective October 1, 2010. The program enables qualified drinking water operators at public water systems in Indian country to be...

  18. UKRmol: a low-energy electron- and positron-molecule scattering suite

    NASA Astrophysics Data System (ADS)

    Carr, J. M.; Galiatsatos, P. G.; Gorfinkiel, J. D.; Harvey, A. G.; Lysaght, M. A.; Madden, D.; Mašín, Z.; Plummer, M.; Tennyson, J.; Varambhia, H. N.

    2012-03-01

    We describe the UK computational implementation of the R-matrix method for the treatment of electron and positron scattering from molecules. Recent developments in the UKRmol suite are detailed, together with the collision processes it enables us to treat.

  19. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1993-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  20. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1992-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  1. NASA's Long-range Technology Goals

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This document is part of the Final Report performed under contract NASW-3864, titled "NASA's Long-Range Technology Goals". The objectives of the effort were: To identify technologies whose development falls within NASA's capability and purview, and which have high potential for leapfrog advances in the national industrial posture in the 2005-2010 era. To define which of these technologies can also enable quantum jumps in the national space program. To assess mechanisms of interaction between NASA and industry constituencies for realizing the leapfrog technologies. This Volume details the findings pertaining to the advanced space-enabling technologies.

  2. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  3. NTCP-Reconstituted In Vitro HBV Infection System.

    PubMed

    Sun, Yinyan; Qi, Yonghe; Peng, Bo; Li, Wenhui

    2017-01-01

    Sodium taurocholate cotransporting polypeptide (NTCP) has been identified as a functional receptor for hepatitis B virus (HBV). Expressing human NTCP in human hepatoma HepG2 cells (HepG2-NTCP) renders these cells susceptible to HBV infection. The HepG2-NTCP stably transfected cell line provides a much-needed and easily accessible platform for studying the virus. HepG2-NTCP cells can also be used to identify chemicals targeting key steps of the virus life cycle, including HBV covalently closed circular (ccc) DNA, and enable the development of novel antivirals against the infection. Many factors may contribute to the efficiency of HBV infection of HepG2-NTCP cells, with clonal differences among cell line isolates, the source of viral inoculum, and the infection medium among the most critical. Here, we provide detailed protocols for efficient HBV infection of HepG2-NTCP cells in culture; generation and selection of single-cell clones of HepG2-NTCP; production of infectious HBV virion stock through DNA transfection of a recombinant plasmid, which enables studying primary clinical HBV isolates; and assessment of the infection with immunostaining of HBV antigens and Southern blot analysis of HBV cccDNA.

  4. Nonparametric Online Learning Control for Soft Continuum Robot: An Enabling Technique for Effective Endoscopic Navigation

    PubMed Central

    Lee, Kit-Hang; Fu, Denny K.C.; Leong, Martin C.W.; Chow, Marco; Fu, Hing-Choi; Althoefer, Kaspar; Sze, Kam Yim; Yeung, Chung-Kwong

    2017-01-01

    Bioinspired robotic structures comprising soft actuation units have attracted increasing research interest. Taking advantage of their inherent compliance, soft robots can ensure safe interaction with external environments, provided that precise and effective manipulation can be achieved. Endoscopy is a typical application. However, previous model-based control approaches often require simplified geometric assumptions about the soft manipulator, which can be very inaccurate in the presence of unmodeled external interaction forces. In this study, we propose a generic control framework based on nonparametric, online, and local training to learn the inverse model directly, without prior knowledge of the robot's structural parameters. Detailed experimental evaluation was conducted on a soft robot prototype with control redundancy, performing trajectory tracking in dynamically constrained environments. Advanced element formulation of finite element analysis is employed to initialize the control policy, hence eliminating the need for random exploration in the robot's workspace. The proposed control framework enabled a soft fluid-driven continuum robot to follow a 3D trajectory precisely, even under dynamic external disturbance. Such enhanced control accuracy and adaptability would facilitate effective endoscopic navigation in complex and changing environments. PMID:29251567

  5. A Progressive Damage Model for unidirectional Fibre Reinforced Composites with Application to Impact and Penetration Simulation

    NASA Astrophysics Data System (ADS)

    Kerschbaum, M.; Hopmann, C.

    2016-06-01

    The computationally efficient simulation of the progressive damage behaviour of continuous fibre reinforced plastics is still a challenging task with currently available computer-aided engineering methods. This paper presents an original approach for an energy-based continuum damage model which accounts for stress/strain nonlinearities, transverse and shear stress interaction phenomena, quasi-plastic shear strain components, strain rate effects, regularised damage evolution and consideration of load reversal effects. The physically based modelling approach enables experimental determination of all parameters at the ply level, avoiding expensive inverse analysis procedures. The modelling strategy, implementation and verification of this model using commercially available explicit finite element software are detailed. The model is then applied to simulate the impact and penetration of carbon fibre reinforced cross-ply specimens with variation of the impact speed. The simulation results show that the presented approach enables a good representation of the force/displacement curves and especially good agreement with the experimentally observed fracture patterns. In addition, the mesh dependency of the results was assessed for one impact case, showing only very little change in the simulation results, which emphasises the general applicability of the presented method.

  6. Nonparametric Online Learning Control for Soft Continuum Robot: An Enabling Technique for Effective Endoscopic Navigation.

    PubMed

    Lee, Kit-Hang; Fu, Denny K C; Leong, Martin C W; Chow, Marco; Fu, Hing-Choi; Althoefer, Kaspar; Sze, Kam Yim; Yeung, Chung-Kwong; Kwok, Ka-Wai

    2017-12-01

    Bioinspired robotic structures comprising soft actuation units have attracted increasing research interest. Taking advantage of their inherent compliance, soft robots can ensure safe interaction with external environments, provided that precise and effective manipulation can be achieved. Endoscopy is a typical application. However, previous model-based control approaches often require simplified geometric assumptions about the soft manipulator, which can be very inaccurate in the presence of unmodeled external interaction forces. In this study, we propose a generic control framework based on nonparametric, online, and local training to learn the inverse model directly, without prior knowledge of the robot's structural parameters. Detailed experimental evaluation was conducted on a soft robot prototype with control redundancy, performing trajectory tracking in dynamically constrained environments. Advanced element formulation of finite element analysis is employed to initialize the control policy, hence eliminating the need for random exploration in the robot's workspace. The proposed control framework enabled a soft fluid-driven continuum robot to follow a 3D trajectory precisely, even under dynamic external disturbance. Such enhanced control accuracy and adaptability would facilitate effective endoscopic navigation in complex and changing environments.

  7. Probing different regimes of strong field light-matter interaction with semiconductor quantum dots and few cavity photons

    NASA Astrophysics Data System (ADS)

    Hargart, F.; Roy-Choudhury, K.; John, T.; Portalupi, S. L.; Schneider, C.; Höfling, S.; Kamp, M.; Hughes, S.; Michler, P.

    2016-12-01

    In this work we present an extensive experimental and theoretical investigation of different regimes of strong field light-matter interaction for cavity-driven quantum dot (QD) cavity systems. The electric field enhancement inside a high-Q micropillar cavity facilitates exceptionally strong interaction with few cavity photons, enabling the simultaneous investigation for a wide range of QD-laser detuning. In the case of a resonant drive, the formation of dressed states and a Mollow triplet sideband splitting of up to 45 μeV is measured for a mean cavity photon number ⟨n_c⟩ ≤ 1. In the asymptotic limit of the linear AC Stark effect we systematically investigate the power and detuning dependence of more than 400 QDs. Some QD-cavity systems exhibit an unexpected anomalous Stark shift, which can be explained by an extended dressed 4-level QD model. We provide a detailed analysis of the QD-cavity systems' properties enabling this novel effect. The experimental results are successfully reproduced using a polaron master equation approach for the QD-cavity system, which includes the driving laser field, exciton-cavity and exciton-phonon interactions.
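
    For a driven two-level system, the Mollow sideband splitting from the central line follows the generalized Rabi energy, ħΩ' = √((ħΩ_R)² + (ħδ)²), reducing to the bare Rabi energy on resonance. A quick numeric sketch (the detuned value is illustrative, not a measurement from the paper):

```python
import math

def mollow_splitting_ueV(rabi_ueV, detuning_ueV=0.0):
    """Generalized Rabi energy in micro-eV: sideband offset from the
    central Mollow line, sqrt((hbar*Omega_R)^2 + (hbar*delta)^2)."""
    return math.hypot(rabi_ueV, detuning_ueV)

# on resonance (delta = 0) the splitting equals the bare Rabi energy
print(mollow_splitting_ueV(45.0))        # 45.0 ueV, as reported for <n_c> <= 1
# a finite detuning increases the splitting
print(mollow_splitting_ueV(45.0, 30.0))  # ~54.1 ueV
```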

  8. Towards programmable plant genetic circuits.

    PubMed

    Medford, June I; Prasad, Ashok

    2016-07-01

    Synthetic biology enables the construction of genetic circuits with predictable gene functions in plants. Detailed quantitative descriptions of the transfer function, or input-output function, for genetic parts (promoters, 5' and 3' untranslated regions, etc.) are collected. These data are then used in computational simulations to determine the parts' robustness and desired properties, thereby enabling the best components to be selected for experimental testing in plants. In addition, the process forms an iterative workflow which allows substantial improvement of validated elements with sub-optimal function. These processes enable computational functions such as digital logic in living plants and follow the pathway of technological advances which took us from vacuum tubes to cell phones. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  9. Revolutionary Deep Space Science Missions Enabled by Onboard Autonomy

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Debban, Theresa; Yen, Chen wan; Sherwood, Robert; Castano, Rebecca; Cichy, Benjamin; Davies, Ashley; Brul, Michael; Fukunaga, Alex

    2003-01-01

    Breakthrough autonomy technologies enable a new range of space missions that acquire vast amounts of data and return only the most scientifically important data to Earth. These missions would monitor science phenomena in great detail (either with frequent observations or at extremely high spatial resolution) and analyze the data onboard to detect specific science events of interest. These missions would monitor volcanic eruptions, formation and movement of aeolian features, and atmospheric phenomena. The autonomous spacecraft would respond to science events by planning its future operations to revisit or perform complementary observations. In this paradigm, the spacecraft acts as the scientist's agent, enabling optimization of the downlink data volume resource. This paper describes preliminary efforts to define and design such missions.

  10. Magnetization dynamics in dilute Pd1-xFex thin films and patterned microstructures considered for superconducting electronics

    NASA Astrophysics Data System (ADS)

    Golovchanskiy, I. A.; Bolginov, V. V.; Abramov, N. N.; Stolyarov, V. S.; Ben Hamida, A.; Chichkov, V. I.; Roditchev, D.; Ryazanov, V. V.

    2016-10-01

    Motivated by the recent burst of applications of ferromagnetic layers in superconducting digital and quantum elements, we study the magnetism of thin films and patterned microstructures of Pd0.99Fe0.01. In this dilute ferromagnetic system, a high-sensitivity ferromagnetic resonance (FMR) experiment reveals spectroscopic signatures of re-magnetization and enables the estimation of the saturation magnetization, the anisotropy field, and the Gilbert damping constant. The detailed analysis of FMR spectra links the observed unexpectedly high reduced anisotropy field (0.06-0.14) with the internal anisotropy, points towards a cluster nature of the ferromagnetism, and allows estimating the characteristic time scale for magnetization dynamics in Pd-Fe based cryogenic memory elements as (3-5) × 10⁻⁹ s.
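
    The nanosecond timescale quoted above can be connected to FMR observables through the Kittel relation for an in-plane magnetized film, f = (γ/2π)√(H(H + 4πM_s)). The sketch below uses illustrative dilute-ferromagnet numbers, not the paper's fitted parameters:

```python
import math

GAMMA_OVER_2PI = 2.8e6  # Hz/Oe, electron gyromagnetic ratio over 2*pi

def kittel_in_plane(H_Oe, four_pi_Ms_G):
    """FMR precession frequency (Hz) of an in-plane magnetized thin film."""
    return GAMMA_OVER_2PI * math.sqrt(H_Oe * (H_Oe + four_pi_Ms_G))

# illustrative values for a dilute ferromagnet with low saturation magnetization
f = kittel_in_plane(H_Oe=100.0, four_pi_Ms_G=300.0)
print(f / 1e9, "GHz")   # sub-GHz precession
print(1.0 / f, "s")     # characteristic timescale on the nanosecond scale
```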

  11. Squeeze film dampers with oil hole feed

    NASA Technical Reports Server (NTRS)

    Chen, P. Y. P.; Hahn, E. J.

    1994-01-01

    To improve the damping capability of squeeze film dampers, oil hole feed rather than circumferential groove feed is a practical proposition. However, circular orbit response can no longer be assumed, significantly complicating the design analysis. This paper details a feasible transient solution procedure for such dampers, with particular emphasis on the additional difficulties due to the introduction of oil holes. It is shown how a cosine power series solution may be utilized to evaluate the oil hole pressure contributions, enabling appropriate tabular data to be compiled. The solution procedure is shown to be applicable even in the presence of flow restrictors, albeit at the expense of introducing an iteration at each time step. Though not of primary interest, the procedure is also applicable to dynamically loaded journal bearings with oil hole feed.

  12. Modeling Photo-Bleaching Kinetics to Create High Resolution Maps of Rod Rhodopsin in the Human Retina

    PubMed Central

    Ehler, Martin; Dobrosotskaya, Julia; Cunningham, Denise; Wong, Wai T.; Chew, Emily Y.; Czaja, Wojtek; Bonner, Robert F.

    2015-01-01

    We introduce and describe a novel non-invasive in-vivo method for mapping local rod rhodopsin distribution in the human retina over a 30-degree field. Our approach is based on analyzing the brightening of detected lipofuscin autofluorescence within small pixel clusters in registered imaging sequences taken with a commercial 488 nm confocal scanning laser ophthalmoscope (cSLO) over a 1-minute period. We modeled the kinetics of rhodopsin bleaching by applying variational optimization techniques from applied mathematics. The physical model and the numerical analysis with its implementation are outlined in detail. This new technique enables the creation of spatial maps of the retinal rhodopsin and retinal pigment epithelium (RPE) bisretinoid distribution with an ≈50 μm resolution. PMID:26196397
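
    Photo-bleaching of this kind is commonly modeled as a first-order process, with the autofluorescence brightening as AF(t) = AF∞ − ΔAF·e^(−kt). A minimal sketch of recovering the rate constant from such a sequence (synthetic noiseless data and a simple log-linear fit, not the authors' variational scheme; in practice AF∞ is itself an unknown fit parameter):

```python
import numpy as np

# synthetic 1-minute brightening sequence: AF(t) = AF_inf - dAF * exp(-k t)
k_true, AF_inf, dAF = 0.08, 100.0, 40.0        # k in 1/s (illustrative)
t = np.linspace(0.0, 60.0, 61)                 # one frame per second
af = AF_inf - dAF * np.exp(-k_true * t)

# linearize: log(AF_inf - AF) = log(dAF) - k*t, then fit the slope
y = np.log(AF_inf - af)
slope, intercept = np.polyfit(t, y, 1)
k_est = -slope
print(k_est)   # recovers k_true on noiseless data
```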

  13. Cavity-Enhanced Raman Spectroscopy for Food Chain Management

    PubMed Central

    Sandfort, Vincenz; Goldschmidt, Jens; Wöllenstein, Jürgen

    2018-01-01

    Comprehensive food chain management requires the monitoring of many parameters including temperature, humidity, and multiple gases. The latter is highly challenging because no low-cost technology for the simultaneous chemical analysis of multiple gaseous components currently exists. This contribution proposes the use of cavity enhanced Raman spectroscopy to enable online monitoring of all relevant components using a single laser source. A laboratory scale setup is presented and characterized in detail. Power enhancement of the pump light is achieved in an optical resonator with a Finesse exceeding 2500. A simulation for the light scattering behavior shows the influence of polarization on the spatial distribution of the Raman scattered light. The setup is also used to measure three relevant showcase gases to demonstrate the feasibility of the approach, including carbon dioxide, oxygen and ethene. PMID:29495501
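
    The quoted power enhancement is governed by the resonator finesse; for a symmetric two-mirror cavity with mirror reflectivity R, F = π√R/(1 − R). A quick sketch (the reflectivity value is illustrative, not taken from the paper):

```python
import math

def finesse(R):
    """Finesse of a symmetric two-mirror cavity, mirrors of reflectivity R."""
    return math.pi * math.sqrt(R) / (1.0 - R)

R = 0.999            # illustrative high-reflectivity mirrors
F = finesse(R)
print(F)             # ~3140, comfortably above the reported 2500
# on resonance the intracavity power buildup scales with the finesse
# (roughly F/pi for an impedance-matched, low-loss cavity)
print(F / math.pi)
```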

  14. Environmental control system transducer development study

    NASA Technical Reports Server (NTRS)

    Brudnicki, M. J.

    1973-01-01

    A failure evaluation of the transducers used in the environmental control systems of the Apollo command service module, lunar module, and portable life support system is presented in matrix form for several generic categories of transducers to enable identification of chronic failure modes. Transducer vendors were contacted and asked to supply detailed information. The evaluation data generated for each category of transducer were compiled and published in failure design evaluation reports. The evaluation reports also present a review of the failure and design data for the transducers and suggest both design criteria to improve reliability of the transducers and, where necessary, design concepts for required redesign of the transducers. Remedial designs were implemented on a family of pressure transducers and on the oxygen flow transducer. The design concepts were subjected to analysis, breadboard fabrication, and verification testing.

  15. Stereotype threat and group differences in test performance: a question of measurement invariance.

    PubMed

    Wicherts, Jelte M; Dolan, Conor V; Hessen, David J

    2005-11-01

    Studies into the effects of stereotype threat (ST) on test performance have shed new light on race and sex differences in achievement and intelligence test scores. In this article, the authors relate ST theory to the psychometric concept of measurement invariance and show that ST effects may be viewed as a source of measurement bias. As such, ST effects are detectable by means of multi-group confirmatory factor analysis. This enables research into the generalizability of ST effects to real-life or high-stakes testing. The modeling approach is described in detail and applied to 3 experiments in which the amount of ST for minorities and women was manipulated. Results indicate that ST results in measurement bias of intelligence and mathematics tests. ((c) 2005 APA, all rights reserved).

  16. The role of genetic and epigenetic alterations in neuroblastoma disease pathogenesis

    PubMed Central

    Domingo-Fernandez, Raquel; Watters, Karen; Piskareva, Olga; Bray, Isabella

    2013-01-01

    Neuroblastoma is a highly heterogeneous tumor accounting for 15% of all pediatric cancer deaths. Clinical behavior ranges from the spontaneous regression of localized, asymptomatic tumors, as well as metastasized tumors in infants, to rapid progression and resistance to therapy. Genomic amplification of the MYCN oncogene has been used to predict outcome in neuroblastoma for over 30 years; however, recent methodological advances including miRNA and mRNA profiling, comparative genomic hybridization (array-CGH), and whole-genome sequencing have enabled the detailed analysis of the neuroblastoma genome, leading to the identification of new prognostic markers and better patient stratification. In this review, we will describe the main genetic factors responsible for these diverse clinical phenotypes in neuroblastoma, the chronology of their discovery, and the impact on patient prognosis. PMID:23274701

  17. Evaluation of the feasibility and viability of modular pumped storage hydro (m-PSH) in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witt, Adam M.; Hadjerioua, Boualem; Martinez, Rocio

    The viability of modular pumped storage hydro (m-PSH) is examined in detail through the conceptual design, cost scoping, and economic analysis of three case studies. Modular PSH refers to both the compactness of the project design and the proposed nature of product fabrication and performance. A modular project is assumed to consist of pre-fabricated standardized components and equipment, tested and assembled into modules before arrival on site. This technology strategy could enable m-PSH projects to deploy with less substantial civil construction and equipment component costs. The concept of m-PSH is technically feasible using currently available conventional pumping and turbine equipment, and may offer a path to reducing the project development cycle from inception to commissioning.

  18. Evidence of Phyllosilicate in Wooly Patch+: An Altered Rock Encountered on the Spirit Rover Traverse

    NASA Technical Reports Server (NTRS)

    Wang, Alian; Haskin, Larry A.; Korotev, Randy L.; Jolliff, Brad L.; deSouza, Paulo, Jr.; Kusack, Alastair G.

    2005-01-01

    In the course of examining rocks along the traverse of the Spirit rover toward the Columbia Hills [1, 9], we noticed that the chemistry of a rock named "Wooly Patch" was neither basaltic, like the rocks near the landing site [8], nor that of slightly altered basalt inferred from regolith in plains trenches [10]. The major cation ratios appear to match those of phyllosilicates [11]. The presence of phyllosilicate minerals on Mars has been predicted [12]; reasons for the rarity or absence of phyllosilicates have also been discussed [13]. We have thus done as detailed an analysis of Wooly Patch as the data enable, which suggests that phyllosilicates of kaolinite, serpentine, and chlorite types, plus some feldspar and pyroxene, are prime candidates to constitute Wooly Patch.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallarno, George; Rogers, James H; Maxwell, Don E

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  20. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    PubMed

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, gaining simultaneously structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce a MATLAB-based software that uses three-dimensional data (EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph®)) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL moviemaker; and, third, applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.
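
    The core operations of such a tool, mapping a spectral window to an image and a pixel region to a spectrum from a 3-D datacube, can be sketched in a few lines. This uses synthetic NumPy data purely for illustration; the actual package is MATLAB-based and reads dm3 files:

```python
import numpy as np

# synthetic spectrum image: (rows, cols, energy channels)
cube = np.random.default_rng(1).random((64, 64, 1024))
energy = np.linspace(0.0, 40.0, 1024)          # eV, illustrative axis

def energy_map(cube, energy, e_lo, e_hi):
    """Image of intensity integrated over an energy window."""
    sel = (energy >= e_lo) & (energy < e_hi)
    return cube[:, :, sel].sum(axis=2)

def region_spectrum(cube, r0, r1, c0, c1):
    """Mean spectrum of a rectangular pixel region."""
    return cube[r0:r1, c0:c1, :].mean(axis=(0, 1))

img = energy_map(cube, energy, 10.0, 12.0)     # spectrally dependent image
spec = region_spectrum(cube, 20, 30, 20, 30)   # position-dependent spectrum
print(img.shape, spec.shape)                   # (64, 64) (1024,)
```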

  1. Single Cell Gene Expression Profiling of Skeletal Muscle-Derived Cells.

    PubMed

    Gatto, Sole; Puri, Pier Lorenzo; Malecova, Barbora

    2017-01-01

    Single cell gene expression profiling is a fundamental tool for studying the heterogeneity of a cell population by addressing the phenotypic and functional characteristics of each cell. Technological advances that have coupled microfluidic technologies with high-throughput quantitative RT-PCR analyses have enabled detailed analyses of single cells in various biological contexts. In this chapter, we describe the procedure for isolating the skeletal muscle interstitial cells termed Fibro-Adipogenic Progenitors (FAPs) and their gene expression profiling at the single cell level. Moreover, we accompany our bench protocol with bioinformatics analysis designed to process raw data as well as to visualize single cell gene expression data. Single cell gene expression profiling is therefore a useful tool in the investigation of FAP heterogeneity and their contribution to muscle homeostasis.
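
    Before visualization, qRT-PCR data of this kind are typically converted from raw Ct values to relative expression, most simply via the standard 2^(−ΔΔCt) transform. A minimal sketch (the Ct values and the choice of housekeeping reference are illustrative, not from the chapter's protocol):

```python
import numpy as np

def ddct_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Relative expression by the 2^-ddCt method:
    dCt  = Ct(target gene) - Ct(housekeeping reference), per cell;
    ddCt = dCt(cell) - dCt(control cell); expression = 2**-ddCt."""
    dct = np.asarray(ct_target, float) - np.asarray(ct_reference, float)
    dct_ctrl = ct_target_ctrl - ct_reference_ctrl
    return 2.0 ** -(dct - dct_ctrl)

# three cells: Ct of a target gene vs a housekeeping gene (illustrative values)
expr = ddct_expression([24.0, 26.0, 23.0], [18.0, 18.5, 18.0],
                       ct_target_ctrl=25.0, ct_reference_ctrl=18.0)
print(expr)   # fold changes relative to the control cell
```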

  2. Targeted Research and Technology Within NASA's Living With a Star Program

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2003-01-01

    NASA's Living With a Star (LWS) initiative is a systematic, goal-oriented research program targeting those aspects of the Sun-Earth system that affect society. The Targeted Research and Technology (TR&T) component of LWS provides the theory, modeling, and data analysis necessary to enable an integrated, system-wide picture of Sun-Earth connection science with societal relevance. Recognizing the central and essential role that TR&T would have for the success of the LWS initiative, the LWS Science Architecture Team (SAT) recommended that a Science Definition Team (SDT), with the same status as a flight mission definition team, be formed to design and coordinate a TR&T program having prioritized goals and objectives that focused on practical societal benefits. This report details the SDT recommendations for the TR&T program.

  3. Aerodynamic loads on buses due to crosswind gusts: extended analysis

    NASA Astrophysics Data System (ADS)

    Drugge, Lars; Juhlin, Magnus

    2010-12-01

    The objective of this work is to use inverse simulations on measured vehicle data in order to estimate the aerodynamic loads on a bus when exposed to crosswind situations. Tyre forces, driver input, wind velocity and vehicle response were measured on a typical coach when subjected to natural crosswind gusts. Based on these measurements and a detailed MBS vehicle model, the aerodynamic loads were estimated through inverse simulations. In order to estimate the lift force, roll and pitch moments in addition to the lateral force and yaw moment, the simulation model was extended by also incorporating the estimation of the vertical road disturbances. The proposed method enables the estimation of aerodynamic loads due to crosswind gusts without using a full scale wind tunnel adapted for crosswind excitation.
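
    In its simplest form, inverse simulation recovers the unmeasured aerodynamic load as the residual between the inertial reaction and the known (measured) forces. A one-degree-of-freedom sketch of the lateral equation only (all masses, gust shapes, and forces are illustrative; the paper uses a detailed MBS model):

```python
import numpy as np

# 1-DOF lateral dynamics: m * ay = F_tyre + F_aero  =>  F_aero = m*ay - F_tyre
m = 12000.0                                               # bus mass, kg
t = np.linspace(0.0, 4.0, 401)
F_aero_true = 3000.0 * np.exp(-((t - 2.0) / 0.5) ** 2)    # crosswind gust, N

F_tyre = -2500.0 * np.exp(-((t - 2.1) / 0.6) ** 2)        # measured tyre force
ay = (F_tyre + F_aero_true) / m                           # measured response

F_aero_est = m * ay - F_tyre                              # inverse step
print(np.max(np.abs(F_aero_est - F_aero_true)))           # ~0 for this toy model
```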

  4. Clustering and assembly dynamics of a one-dimensional microphase former.

    PubMed

    Hu, Yi; Charbonneau, Patrick

    2018-05-23

    Both ordered and disordered microphases ubiquitously form in suspensions of particles that interact through competing short-range attraction and long-range repulsion (SALR). While ordered microphases are more appealing materials targets, understanding the rich structural and dynamical properties of their disordered counterparts is essential to controlling their mesoscale assembly. Here, we study the disordered regime of a one-dimensional (1D) SALR model, whose simplicity enables detailed analysis by transfer matrices and Monte Carlo simulations. We first characterize the signature of the clustering process on macroscopic observables, and then assess the equilibration dynamics of various simulation algorithms. We notably find that cluster moves markedly accelerate the mixing time, but that event chains are of limited help in the clustering regime. These insights will inspire further study of three-dimensional microphase formers.
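
    The transfer-matrix tractability of 1D models can be sketched for the simplest nearest-neighbour lattice gas (the actual SALR model adds a longer-range repulsion, which enlarges the matrix): the free energy per site follows from the largest eigenvalue.

```python
import numpy as np

def free_energy_per_site(beta, mu, eps):
    """1D nearest-neighbour lattice gas by transfer matrix.
    Occupation n in {0, 1}; energy = -eps*n_i*n_{i+1} - mu*n_i, so
    T[n, m] = exp(beta * (eps*n*m + mu*(n + m)/2)) (symmetric split)."""
    T = np.empty((2, 2))
    for n in (0, 1):
        for m in (0, 1):
            T[n, m] = np.exp(beta * (eps * n * m + mu * (n + m) / 2.0))
    lam = np.linalg.eigvalsh(T).max()     # T is symmetric by construction
    return -np.log(lam) / beta

f = free_energy_per_site(beta=1.0, mu=0.0, eps=1.0)
print(f)
# density from a numerical chemical-potential derivative: rho = -df/dmu
h = 1e-5
rho = -(free_energy_per_site(1.0, h, 1.0)
        - free_energy_per_site(1.0, -h, 1.0)) / (2 * h)
print(rho)   # attraction at mu = 0 biases occupation above one half
```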

  5. An Electronic Tree Inventory for Arboriculture Management

    NASA Astrophysics Data System (ADS)

    Tait, Roger J.; Allen, Tony J.; Sherkat, Nasser; Bellett-Travers, Marcus D.

    The integration of Global Positioning System (GPS) technology into mobile devices provides them with an awareness of their physical location. This geospatial context can be employed in a wide range of applications including locating nearby places of interest as well as guiding emergency services to incidents. In this research, a GPS-enabled Personal Digital Assistant (PDA) is used to create a computerised tree inventory for the management of arboriculture. Using the General Packet Radio Service (GPRS), GPS information and arboreal image data are sent to a web-server. An office-based PC running customised Geographical Information Software (GIS) then automatically retrieves the GPS tagged image data for display and analysis purposes. The resulting application allows an expert user to view the condition of individual trees in greater detail than is possible using remotely sensed imagery.

  6. Surface plasmon resonances of arbitrarily shaped nanometallic structures in the small-screening-length limit

    PubMed Central

    Giannini, Vincenzo; Maier, Stefan A.; Craster, Richard V.

    2016-01-01

    According to the hydrodynamic Drude model, surface plasmon resonances of metallic nanostructures blueshift owing to the non-local response of the metal’s electron gas. The screening length characterizing the non-local effect is often small relative to the overall dimensions of the metallic structure, which enables us to derive a coarse-grained non-local description using matched asymptotic expansions; a perturbation theory for the blueshifts of arbitrary-shaped nanometallic structures is then developed. The effect of non-locality is not always a perturbation and we present a detailed analysis of the ‘bonding’ modes of a dimer of nearly touching nanowires where the leading-order eigenfrequencies and eigenmode distributions are shown to be a renormalization of those predicted assuming a local metal permittivity. PMID:27493575

  7. Band Excitation for Scanning Probe Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jesse, Stephen

    2017-01-02

    The Band Excitation (BE) technique for scanning probe microscopy uses a precisely determined waveform that contains specific frequencies to excite the cantilever or sample in an atomic force microscope, in order to extract more information, and more reliable information, from a sample. There are a myriad of details and complexities associated with implementing the BE technique, so there is a need for a user-friendly interface that gives typical microscopists access to this methodology. This software enables users of atomic force microscopes to easily: build complex band-excitation waveforms; set up the microscope scanning conditions; configure the input and output electronics for generating the waveform as a voltage signal and capturing the response of the system; perform analysis on the captured response; and display the results of the measurement.

  8. A semiconductor radiation imaging pixel detector for space radiation dosimetry.

    PubMed

    Kroupa, Martin; Bahadori, Amir; Campbell-Ricketts, Thomas; Empl, Anton; Hoang, Son Minh; Idarraga-Munoz, John; Rios, Ryan; Semones, Edward; Stoffle, Nicholas; Tlustos, Lukas; Turecek, Daniel; Pinsky, Lawrence

    2015-07-01

    Progress in the development of high-performance semiconductor radiation imaging pixel detectors based on technologies developed for use in high-energy physics applications has enabled the development of a completely new generation of compact low-power active dosimeters and area monitors for use in space radiation environments. Such detectors can provide real-time information concerning radiation exposure, along with detailed analysis of the individual particles incident on the active medium. Recent results from the deployment of detectors based on the Timepix from the CERN-based Medipix2 Collaboration on the International Space Station (ISS) are reviewed, along with a glimpse of developments to come. Preliminary results from Orion MPCV Exploration Flight Test 1 are also presented. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.

  9. Layer 1 VPN services in distributed next-generation SONET/SDH networks with inverse multiplexing

    NASA Astrophysics Data System (ADS)

    Ghani, N.; Muthalaly, M. V.; Benhaddou, D.; Alanqar, W.

    2006-05-01

    Advances in next-generation SONET/SDH along with GMPLS control architectures have enabled many new service provisioning capabilities. In particular, a key services paradigm is the emergent Layer 1 virtual private network (L1 VPN) framework, which allows multiple clients to utilize a common physical infrastructure and provision their own 'virtualized' circuit-switched networks. This precludes expensive infrastructure builds and increases resource utilization for carriers. Along these lines, a novel L1 VPN services resource management scheme for next-generation SONET/SDH networks is proposed that fully leverages advanced virtual concatenation and inverse multiplexing features. Additionally, both centralized and distributed GMPLS-based implementations are also tabled to support the proposed L1 VPN services model. Detailed performance analysis results are presented along with avenues for future research.

  10. Microfluidic Arrayed Lab-On-A-Chip for Electrochemical Capacitive Detection of DNA Hybridization Events.

    PubMed

    Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza

    2017-01-01

    A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high-throughput analysis of microliter-volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose-responsive sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.

  11. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    PubMed

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  12. Nmrglue: An Open Source Python Package for the Analysis of Multidimensional NMR Data

    PubMed Central

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license. PMID:23456039

  13. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: Boeing Helicopters airframe finite element modeling

    NASA Technical Reports Server (NTRS)

    Gabel, R.; Lang, P.; Reed, D.

    1993-01-01

    Mathematical models based on the finite element method of structural analysis, as embodied in the NASTRAN computer code, are routinely used by the helicopter industry to calculate airframe static internal loads used for sizing structural members. Historically, less reliance has been placed on the vibration predictions based on these models. Beginning in the early 1980s, NASA's Langley Research Center initiated an industry-wide program with the objective of engendering the needed trust in vibration predictions using these models and establishing a body of modeling guides which would enable confident future prediction of airframe vibration as part of the regular design process. Emphasis in this paper is placed on the successful modeling of the Army/Boeing CH-47D, which showed reasonable correlation with test data. A principal finding indicates that improved dynamic analysis requires greater attention to detail, and perhaps a finer mesh, especially for the mass distribution, than the usual stress model. Post-program modeling efforts show improved correlation, placing key modal frequencies in the b/rev range within 4 percent of the test frequencies.

  14. A scoring metric for multivariate data for reproducibility analysis using chemometric methods

    PubMed Central

    Sheen, David A.; de Carvalho Rocha, Werickson Fortunato; Lippa, Katrice A.; Bearden, Daniel W.

    2017-01-01

    Process quality control and reproducibility in emerging measurement fields such as metabolomics is normally assured by interlaboratory comparison testing. As a part of this testing process, spectral features from a spectroscopic method such as nuclear magnetic resonance (NMR) spectroscopy are attributed to particular analytes within a mixture, and it is the metabolite concentrations that are returned for comparison between laboratories. However, data quality may also be assessed directly by using binned spectral data before the time-consuming identification and quantification. Use of the binned spectra has some advantages, including preserving information about trace constituents and enabling identification of process difficulties. In this paper, we demonstrate the use of binned NMR spectra to conduct a detailed interlaboratory comparison and composition analysis. Spectra of synthetic and biologically-obtained metabolite mixtures, taken from a previous interlaboratory study, are compared with cluster analysis using a variety of distance and entropy metrics. The individual measurements are then evaluated based on where they fall within their clusters, and a laboratory-level scoring metric is developed, which provides an assessment of each laboratory’s individual performance. PMID:28694553
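
    The laboratory-level score can be sketched as a normalized distance of each lab's binned spectrum to its cluster centroid, so that discrepant measurements stand out. This uses synthetic data and a single Euclidean metric for illustration; the paper evaluates several distance and entropy metrics:

```python
import numpy as np

def lab_scores(spectra, labels):
    """Score each measurement by its distance to its cluster centroid,
    normalized by the cluster's median distance (higher = more discrepant)."""
    spectra = np.asarray(spectra, float)
    labels = np.asarray(labels)
    scores = np.empty(len(spectra))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        centroid = spectra[idx].mean(axis=0)
        d = np.linalg.norm(spectra[idx] - centroid, axis=1)
        scores[idx] = d / (np.median(d) + 1e-12)
    return scores

# synthetic binned spectra: 6 labs measure the same mixture, one is an outlier
rng = np.random.default_rng(2)
base = rng.random(200)                        # "true" binned spectrum
spectra = base + 0.01 * rng.standard_normal((6, 200))
spectra[5] += 0.2                             # lab 5 has a systematic offset
labels = np.zeros(6, dtype=int)               # all measurements, one mixture
scores = lab_scores(spectra, labels)
print(np.argmax(scores))                      # flags the discrepant lab
```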

  15. Comparative structural analysis of human DEAD-box RNA helicases.

    PubMed

    Schütz, Patrick; Karlberg, Tobias; van den Berg, Susanne; Collins, Ruairi; Lehtiö, Lari; Högbom, Martin; Holmberg-Schiavone, Lovisa; Tempel, Wolfram; Park, Hee-Won; Hammarström, Martin; Moche, Martin; Thorsell, Ann-Gerd; Schüler, Herwig

    2010-09-30

    DEAD-box RNA helicases play various, often critical, roles in all processes where RNAs are involved. Members of this family of proteins are linked to human disease, including cancer and viral infections. DEAD-box proteins contain two conserved domains that both contribute to RNA and ATP binding. Despite recent advances, the molecular details of how these enzymes convert chemical energy into RNA remodeling are unknown. We present crystal structures of the isolated DEAD-domains of human DDX2A/eIF4A1, DDX2B/eIF4A2, DDX5, DDX10/DBP4, DDX18/myc-regulated DEAD-box protein, DDX20, DDX47, DDX52/ROK1, and DDX53/CAGE, and of the helicase domains of DDX25 and DDX41. Together with prior knowledge, this enables a family-wide comparative structural analysis. We propose a general mechanism for opening of the RNA binding site. This analysis also provides insights into the diversity of DExD/H proteins, with implications for understanding the functions of individual family members.

  16. Comparative Structural Analysis of Human DEAD-Box RNA Helicases

    PubMed Central

    Schütz, Patrick; Karlberg, Tobias; van den Berg, Susanne; Collins, Ruairi; Lehtiö, Lari; Högbom, Martin; Holmberg-Schiavone, Lovisa; Tempel, Wolfram; Park, Hee-Won; Hammarström, Martin; Moche, Martin; Thorsell, Ann-Gerd; Schüler, Herwig

    2010-01-01

    DEAD-box RNA helicases play various, often critical, roles in all processes where RNAs are involved. Members of this family of proteins are linked to human disease, including cancer and viral infections. DEAD-box proteins contain two conserved domains that both contribute to RNA and ATP binding. Despite recent advances, the molecular details of how these enzymes convert chemical energy into RNA remodeling are unknown. We present crystal structures of the isolated DEAD-domains of human DDX2A/eIF4A1, DDX2B/eIF4A2, DDX5, DDX10/DBP4, DDX18/myc-regulated DEAD-box protein, DDX20, DDX47, DDX52/ROK1, and DDX53/CAGE, and of the helicase domains of DDX25 and DDX41. Together with prior knowledge, this enables a family-wide comparative structural analysis. We propose a general mechanism for opening of the RNA binding site. This analysis also provides insights into the diversity of DExD/H proteins, with implications for understanding the functions of individual family members. PMID:20941364

  17. Domain motions of Argonaute, the catalytic engine of RNA interference

    PubMed Central

    Ming, Dengming; Wall, Michael E; Sanbonmatsu, Kevin Y

    2007-01-01

    Background The Argonaute protein is the core component of the RNA-induced silencing complex, playing the central role of cleaving the mRNA target. Visual inspection of static crystal structures has already enabled researchers to suggest conformational changes of Argonaute that might occur during RNA interference. We have taken the next step by performing an all-atom normal mode analysis of the Pyrococcus furiosus and Aquifex aeolicus Argonaute crystal structures, allowing us to quantitatively assess the feasibility of these conformational changes. To perform the analysis, we begin with the energy-minimized X-ray structures. Normal modes are then calculated using an all-atom molecular mechanics force field. Results The analysis reveals low-frequency vibrations that facilitate the accommodation of RNA duplexes, an essential step in target recognition. The Pyrococcus furiosus and Aquifex aeolicus Argonaute proteins both exhibit low-frequency torsion and hinge motions; however, differences in the overall architecture of the proteins cause the detailed dynamics to be significantly different. Conclusion Overall, low-frequency vibrations of Argonaute are consistent with mechanisms within the current reaction cycle model for RNA interference. PMID:18053142
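    The core normal-mode step this record describes (start from an energy-minimized structure, then diagonalize a mass-weighted Hessian to obtain vibrational frequencies and mode vectors) can be illustrated with a toy system. The one-dimensional harmonic chain below stands in for an all-atom force field; its masses and spring constant are arbitrary assumptions for illustration, not the study's actual model.

    ```python
    import numpy as np

    n_atoms = 6
    masses = np.full(n_atoms, 12.0)  # arbitrary equal masses
    k = 1.0                          # harmonic spring constant

    # Hessian of a free linear chain with nearest-neighbour springs
    # (one coordinate per atom).
    H = np.zeros((n_atoms, n_atoms))
    for i in range(n_atoms - 1):
        H[i, i] += k
        H[i + 1, i + 1] += k
        H[i, i + 1] -= k
        H[i + 1, i] -= k

    # Mass-weight the Hessian, F = M^{-1/2} H M^{-1/2}, then diagonalize.
    inv_sqrt_m = 1.0 / np.sqrt(masses)
    F = H * np.outer(inv_sqrt_m, inv_sqrt_m)
    eigvals, modes = np.linalg.eigh(F)

    # Eigenvalues are omega^2; the near-zero eigenvalue is the rigid-body
    # translation of the free chain, and the smallest nonzero eigenvalues
    # correspond to the soft, low-frequency motions discussed above.
    frequencies = np.sqrt(np.clip(eigvals, 0.0, None))
    print(frequencies)
    ```

    For a real protein the Hessian has 3N coordinates and comes from the second derivatives of the molecular mechanics energy at the minimized structure, but the mass-weighting and eigendecomposition proceed exactly as in this sketch.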

  18. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual data from a numerical simulation. The results show that the new SCV is able to describe more complex scientific perceptions.

  19. Models in palaeontological functional analysis

    PubMed Central

    Anderson, Philip S. L.; Bright, Jen A.; Gill, Pamela G.; Palmer, Colin; Rayfield, Emily J.

    2012-01-01

    Models are a principal tool of modern science. By definition, and in practice, models are not literal representations of reality but provide simplifications or substitutes of the events, scenarios or behaviours that are being studied or predicted. All models make assumptions, and palaeontological models in particular require additional assumptions to study unobservable events in deep time. In the case of functional analysis, the degree of missing data associated with reconstructing musculoskeletal anatomy and neuronal control in extinct organisms has, in the eyes of some scientists, rendered detailed functional analysis of fossils intractable. Such a prognosis may indeed be realized if palaeontologists attempt to recreate elaborate biomechanical models based on missing data and loosely justified assumptions. Yet multiple enabling methodologies and techniques now exist: tools for bracketing the boundaries of reality; more rigorous consideration of soft tissues and missing data; and methods drawing on physical principles that all organisms must adhere to. As with many aspects of science, the utility of such biomechanical models depends on the questions they seek to address, and the accuracy and validity of the models themselves. PMID:21865242

  20. Vibrational spectroscopic study of poldervaartite CaCa[SiO3(OH)](OH)

    NASA Astrophysics Data System (ADS)

    Frost, Ray L.; López, Andrés; Scholz, Ricardo; Lima, Rosa Malena Fernandes

    2015-02-01

    We have studied the mineral poldervaartite CaCa[SiO3(OH)](OH), which forms a series with its manganese analogue olmiite CaMn[SiO3(OH)](OH), using a range of techniques including scanning electron microscopy, thermogravimetric analysis, and Raman and infrared spectroscopy. Chemical analysis shows the mineral is reasonably pure, containing predominantly calcium and manganese with low amounts of Al and F. Thermogravimetric analysis shows the mineral decomposes at 485 °C with a mass loss of 7.6%, compared with the theoretical mass loss of 7.7%. A strong Raman band at 852 cm-1 is assigned to the SiO stretching vibration of the SiO3(OH) units. Two Raman bands at 914 and 953 cm-1 are attributed to the antisymmetric stretching vibrations. Prominent peaks observed at 3487, 3502, 3509, 3521 and 3547 cm-1 are assigned to the OH stretching vibrations of the SiO3(OH) units. The observation of multiple OH bands supports the non-equivalence of the OH units. Vibrational spectroscopy enables a detailed assessment of the molecular structure of poldervaartite.
