Sample records for present method enables

  1. A cross-domain communication resource scheduling method for grid-enabled communication networks

    NASA Astrophysics Data System (ADS)

    Zheng, Xiangquan; Wen, Xiang; Zhang, Yongding

    2011-10-01

    To support a wide range of grid applications in environments where heterogeneous communication networks coexist, it is important to enable advanced capabilities for on-demand, dynamic integration and efficient co-sharing of cross-domain heterogeneous communication resources, thus providing communication services that no single communication resource could afford. Based on plug-and-play co-sharing and soft integration of communication resources, a grid-enabled communication network is flexibly built up to provide on-demand communication services for grid applications with various quality-of-service requirements. Based on an analysis of joint job and communication resource scheduling in grid-enabled communication networks (GECNs), this paper presents a cooperative cross-multi-domain communication resource scheduling method and describes its main processes, including traffic requirement resolution for communication services, cross-multi-domain negotiation over communication resources, and on-demand communication resource scheduling. The presented method provides communication service capability for cross-domain traffic delivery in GECNs. Directions for further work on validation and implementation of the presented method are noted at the end.

  2. System and Method for Multi-Wavelength Optical Signal Detection

    NASA Technical Reports Server (NTRS)

    McGlone, Thomas D. (Inventor)

    2017-01-01

    The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.

  3. Method for forming a nano-textured substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, Sangmoo; Hu, Liangbing; Cui, Yi

    A method for forming a nano-textured surface on a substrate is disclosed. An illustrative embodiment of the present invention comprises dispensing a nanoparticle ink of nanoparticles and solvent onto the surface of a substrate, distributing the ink to form a substantially uniform, nascent liquid layer, and enabling the solvent to evaporate from the nanoparticle ink, thereby inducing the nanoparticles to assemble into a textured layer. Methods in accordance with the present invention enable rapid formation of large-area substrates having a nano-textured surface. Embodiments of the present invention are well suited for texturing substrates using high-speed, large-scale, roll-to-roll coating equipment, such as that used in office product, film coating, and flexible packaging applications. Further, embodiments of the present invention are well suited for use with rigid or flexible substrates.

  4. Quantitative Photochemical Immobilization of Biomolecules on Planar and Corrugated Substrates: A Versatile Strategy for Creating Functional Biointerfaces

    PubMed Central

    Martin, Teresa A.; Herman, Christine T.; Limpoco, Francis T.; Michael, Madeline C.; Potts, Gregory K.; Bailey, Ryan C.

    2014-01-01

    Methods for the generation of substrates presenting biomolecules in a spatially controlled manner are enabling tools for applications in biosensor systems, microarray technologies, fundamental biological studies, and biointerface science. We have implemented a method to create biomolecular patterns by using light to control the direct covalent immobilization of biomolecules onto benzophenone-modified glass substrates. We have generated substrates presenting up to three different biomolecules patterned in sequence, and demonstrate biomolecular photopatterning on corrugated substrates. The chemistry of the underlying monolayer was optimized to incorporate poly(ethylene glycol) to enable selective cell adhesion onto patterned extracellular matrix proteins. Substrates were characterized with contact angle goniometry, AFM, and immunofluorescence microscopy. Importantly, radioimmunoassays were performed to quantify the site density of immobilized biomolecules on photopatterned substrates. Retention of function of photopatterned proteins was demonstrated both by native ligand recognition and cell adhesion to photopatterned substrates, revealing that substrates generated with this method are suitable for probing specific cell receptor-ligand interactions. This molecularly general photochemical patterning method is an enabling tool that will allow the creation of substrates presenting both biochemical and topographical variation, which is an important feature of many native biointerfaces. PMID:21793535

  5. From the Teachers' Perspective: A Way of Simplicity for Multimedia Design

    ERIC Educational Resources Information Center

    Hirca, Necati

    2009-01-01

    Presently, teaching and presentation methods are changing from chalk and blackboards to interactive methods. Multimedia technology is now used in many schools; however, much of the commercially available software does not allow teachers to share their experiences. Adobe Captivate 3 is a computer program that enables teachers, without…

  6. Summary of transformation equations and equations of motion used in free flight and wind tunnel data reduction and analysis

    NASA Technical Reports Server (NTRS)

    Gainer, T. G.; Hoffman, S.

    1972-01-01

    Basic formulations for developing coordinate transformations and motion equations used with free-flight and wind-tunnel data reduction are presented. The general forms presented include axes transformations that enable transfer back and forth between any of the five axes systems that are encountered in aerodynamic analysis. Equations of motion are presented that enable calculation of motions anywhere in the vicinity of the earth. A bibliography of publications on methods of analyzing flight data is included.
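    The axes transformations described in this record can be illustrated with a standard yaw-pitch-roll direction-cosine matrix. This is a generic sketch of one such transformation (body axes to Earth-fixed axes, Z-Y-X Euler sequence), not the report's full set of five axis systems; the function names are ours.

```python
import math

def body_to_earth_matrix(yaw, pitch, roll):
    """Direction-cosine matrix rotating body-axis vectors into
    Earth-fixed axes via the yaw-pitch-roll (Z-Y-X) Euler sequence.
    Angles are in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply(matrix, vec):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]
```

    Transforming back (Earth to body) uses the transpose of the same matrix, since rotation matrices are orthogonal.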

  7. Extension of the thermal porosimetry method to high gas pressure for nanoporosimetry estimation

    NASA Astrophysics Data System (ADS)

    Jannot, Y.; Degiovanni, A.; Camus, M.

    2018-04-01

    Standard pore size determination methods like mercury porosimetry, nitrogen sorption, microscopy, or X-ray tomography are not suited to highly porous, low-density, and thus very fragile materials. For such materials, a method based on thermal characterization has been developed in a previous study. This method has been used with air pressure varying from 10^-1 to 10^5 Pa for materials having a thermal conductivity less than 0.05 W m^-1 K^-1 at atmospheric pressure. It enables the estimation of pore size distributions between 100 nm and 1 mm. In this paper, we present a new experimental device enabling thermal conductivity measurement under gas pressure up to 10^6 Pa, enabling the estimation of the volume fraction of pores having a 10 nm diameter. It is also demonstrated that the main thermal conductivity models (parallel, series, Maxwell, Bruggeman, self-consistent) lead to the same estimation of the pore size distribution as the extended parallel model (EPM) presented in this paper and then used to process the experimental data. Three materials with thermal conductivities at atmospheric pressure ranging from 0.014 W m^-1 K^-1 to 0.04 W m^-1 K^-1 are studied. The thermal conductivity measurement results obtained with the three materials are presented, and the corresponding pore size distributions between 10 nm and 1 mm are presented and discussed.
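    As a rough illustration of why conductivity-versus-pressure measurements encode pore size, here is a hedged sketch of a parallel mixing model with a Knudsen-corrected gas conductivity (a Kaganer-type expression). The EPM itself is not detailed in the abstract, and all parameter values below (gas kinetic diameter, the coefficient beta, the solid-phase conductivity) are illustrative assumptions, not values from the paper.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(pressure, temperature=300.0, d_gas=3.7e-10):
    """Gas-molecule mean free path (m); d_gas is the kinetic
    diameter of the gas molecule (roughly that of air)."""
    return KB * temperature / (math.sqrt(2) * math.pi * d_gas ** 2 * pressure)

def gas_conductivity(pressure, pore_size, lambda_g0=0.026, beta=1.6):
    """Knudsen-corrected gas conductivity inside a pore (W/m/K):
    lambda_g0 / (1 + 2*beta*Kn), with Kn = mean free path / pore size.
    Small pores at low pressure suppress the gas contribution."""
    kn = mean_free_path(pressure) / pore_size
    return lambda_g0 / (1.0 + 2.0 * beta * kn)

def parallel_model(pressure, pore_size, porosity, lambda_solid=0.2):
    """Simple parallel mixing of solid and gas phases."""
    return (porosity * gas_conductivity(pressure, pore_size)
            + (1.0 - porosity) * lambda_solid)
```

    In this picture, smaller pores require higher gas pressure before their gas phase conducts fully, which is why extending the measurements to 10^6 Pa resolves pores down to about 10 nm.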

  8. Direct amide formation using radiofrequency heating.

    PubMed

    Houlding, Thomas K; Tchabanenko, Kirill; Rahman, Md Taifur; Rebrov, Evgeny V

    2013-07-07

    We present a simple method for direct and solvent-free formation of amides from carboxylic acids and amines using radiofrequency heating. The direct energy coupling of the AC magnetic field via nickel ferrite magnetic nanoparticles enables fast and controllable heating, as well as enabling facile work-up via magnetic separation.

  9. Computer Aided Evaluation of Higher Education Tutors' Performance

    ERIC Educational Resources Information Center

    Xenos, Michalis; Papadopoulos, Thanos

    2007-01-01

    This article presents a method for computer-aided tutor evaluation: Bayesian Networks are used for organizing the collected data about tutors and for enabling accurate estimations and predictions about future tutor behavior. The model provides indications about each tutor's strengths and weaknesses, which enables the evaluator to exploit strengths…

  10. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi -LAT data

    DOE PAGES

    Lott, B.; Escande, L.; Larsson, S.; ...

    2012-07-19

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with data from the Fermi Large Area Telescope (LAT). The adaptive-binning method encapsulates more information within the light curve than the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The mean flux and spectral index (assuming the spectrum is a power-law distribution) in each interval are then calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
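    The adaptive-binning idea can be sketched as follows: extend each bin over consecutive count samples until the estimated Poisson relative uncertainty reaches a target. This is an illustrative simplification, not the actual LAT analysis pipeline, and the function name is ours.

```python
def adaptive_bins(counts, rel_uncertainty=0.1):
    """Group consecutive count samples into bins whose Poisson
    relative uncertainty sqrt(N)/N is at or below the target,
    i.e. each bin accumulates at least 1/rel_uncertainty**2 counts.
    Returns (start_index, end_index_exclusive, total_counts) tuples;
    a trailing partial bin below the threshold is dropped."""
    needed = 1.0 / rel_uncertainty ** 2
    bins, start, total = [], 0, 0
    for i, c in enumerate(counts):
        total += c
        if total >= needed:
            bins.append((start, i + 1, total))
            start, total = i + 1, 0
    return bins
```

    Bright intervals thus yield many short bins and faint intervals few long ones, which is how the light curve carries more information at a constant significance per point.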

  11. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, James; Alexander, Thomas; Aalseth, Craig

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  13. Optimal cooperative time-fixed impulsive rendezvous

    NASA Technical Reports Server (NTRS)

    Mirfakhraie, Koorosh; Conway, Bruce A.

    1990-01-01

    New capabilities have been added to a method previously developed for determining optimal, i.e., minimum-fuel, trajectories for the fixed-time cooperative rendezvous of two spacecraft. The method utilizes primer vector theory. The new capabilities enable the method to accommodate cases in which there are fuel constraints on the spacecraft and/or to add a mid-course impulse to one of the vehicles' trajectories. Results are presented for a large number of cases, and the effect of varying parameters such as vehicle fuel constraints, vehicle initial masses, and time allowed for the rendezvous is demonstrated.

  14. Tracking perturbations in Boolean networks with spectral methods

    NASA Astrophysics Data System (ADS)

    Kesseli, Juha; Rämö, Pauli; Yli-Harja, Olli

    2005-08-01

    In this paper we present a method for predicting the spread of perturbations in Boolean networks. The method is applicable to networks that have no regular topology. The prediction of perturbations can be performed easily using a presented result that enables efficient computation of the required iterative formulas. This result is based on an abstract Fourier transform of the functions in the network. In this paper the method is applied to show the spread of perturbations in networks containing a distribution of functions found in biological data. These advances in the study of the spread of perturbations can be applied directly to quantifying chaos in Boolean networks. Derrida plots over an arbitrary number of time steps can be computed, and thus distributions of functions can be compared with each other with respect to the amount of order they create in random networks.
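    The paper's Fourier-based computation is not reproduced here, but the quantity it predicts, the spread of a perturbation after one update step (one point of a Derrida plot), can be estimated directly by Monte-Carlo simulation of a random Boolean network. A minimal sketch, with all function names ours:

```python
import random

def random_boolean_network(n, k, rng):
    """Random Boolean network: each node gets k random inputs
    and a random truth table over those inputs."""
    inputs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    tables = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, net):
    """One synchronous update of every node."""
    inputs, tables = net
    nxt = []
    for node in range(len(state)):
        idx = 0
        for inp in inputs[node]:
            idx = (idx << 1) | state[inp]
        nxt.append(tables[node][idx])
    return nxt

def derrida_point(net, n, dist0, trials, rng):
    """Average normalized Hamming distance after one update,
    starting from random state pairs differing in dist0 bits."""
    total = 0
    for _ in range(trials):
        s1 = [rng.randrange(2) for _ in range(n)]
        s2 = list(s1)
        for i in rng.sample(range(n), dist0):
            s2[i] ^= 1
        total += sum(a != b for a, b in zip(step(s1, net), step(s2, net)))
    return total / (trials * n)
```

    Sweeping `dist0` over 1..n traces out one Derrida curve; curves above the diagonal near the origin indicate chaotic (perturbation-amplifying) dynamics.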

  15. Multiple Frequency Audio Signal Communication as a Mechanism for Neurophysiology and Video Data Synchronization

    PubMed Central

    Topper, Nicholas C.; Burke, S.N.; Maurer, A.P.

    2014-01-01

    BACKGROUND Current methods for aligning neurophysiology and video data are either prepackaged, requiring the additional purchase of a software suite, or use a blinking LED with a stationary pulse width and frequency. These methods lack a significant user interface for adaptation, are expensive, or risk a misalignment of the two data streams. NEW METHOD A cost-effective means of obtaining high-precision alignment of behavioral and neurophysiological data is achieved by generating an audio pulse embedded with two domains of information: a low-frequency binary-counting signal and a high, randomly changing frequency. This enables the derivation of temporal information while maintaining enough entropy in the system for algorithmic alignment. RESULTS The sample-to-frame index constructed using the audio input correlation method described in this paper enables video and data acquisition to be aligned at a sub-frame level of precision. COMPARISON WITH EXISTING METHODS Traditionally, a synchrony pulse is recorded on-screen via a flashing diode. The higher sampling rate of the camcorder's audio input enables the timing of an event to be detected with greater precision. CONCLUSIONS While on-line analysis and synchronization using specialized equipment may be ideal in some cases, the method presented in the current paper is a viable, low-cost alternative and gives the flexibility to interface with custom off-line analysis tools. Moreover, the ease of constructing and implementing the set-up presented in the current paper makes it applicable to a wide variety of applications that require video recording. PMID:25256648
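    The off-line alignment step can be sketched as a cross-correlation lag search between the pulse train logged by the acquisition system and the camcorder's audio track. This is a minimal, hedged illustration (the actual method also decodes the embedded binary-counting signal); the function name is ours.

```python
def best_lag(reference, recorded, max_lag):
    """Return the lag (in samples) that maximizes the dot-product
    cross-correlation of `recorded` against `reference`, searched
    over lags in [-max_lag, max_lag]."""
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, r in enumerate(reference):
            j = i + lag
            if 0 <= j < len(recorded):
                score += r * recorded[j]
        if score > best_score:
            best, best_score = lag, score
    return best
```

    Because the embedded frequency changes randomly, the correlation peak is sharp and unambiguous, which is the entropy argument made in the abstract.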

  16. Urea Biosynthesis Using Liver Slices

    ERIC Educational Resources Information Center

    Teal, A. R.

    1976-01-01

    Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)

  17. Improvement of the edge method for on-orbit MTF measurement.

    PubMed

    Viallefont-Robinet, Françoise; Léger, Dominique

    2010-02-15

    The edge method is a widely used way to assess the on-orbit Modulation Transfer Function (MTF). Since good edge quality is required, the higher the spatial resolution, the better the results. In this case, an artificial target can be built and used to ensure good edge quality. For moderate spatial resolutions, only natural targets are available; hence the edge quality is unknown and generally rather poor. Improvements to the method have been researched in order to compensate for the poor quality of natural edges. This has been done through the use of symmetry and/or a transfer function model, which enables the elimination of noise. This has also been applied to artificial targets, where use of the model overcomes incomplete sampling when the target is too small, or gives the opportunity to assess the defocus of the sensor. This paper begins with a review of the method, followed by a presentation of the changes relying on a parametric transfer function model. The transfer function model and the process corresponding to the changes are described. Applications of these changes to several satellites of the French space agency are presented: for SPOT 1, the method enables assessment of the XS MTF with natural edges; for SPOT 5, it enables use of the Salon-de-Provence artificial target for MTF assessment in the HM mode; and for the forthcoming Pleiades, it enables estimation of the defocus.

  18. A Brain-Computer Interface (BCI) system to use arbitrary Windows applications by directly controlling mouse and keyboard.

    PubMed

    Spuler, Martin

    2015-08-01

    A Brain-Computer Interface (BCI) allows one to control a computer by brain activity alone, without the need for muscle control. In this paper, we present an EEG-based BCI system based on code-modulated visual evoked potentials (c-VEPs) that enables the user to work with arbitrary Windows applications. Other BCI systems, like the P300 speller or BCI-based browsers, allow control of one dedicated application designed for use with a BCI. In contrast, the system presented in this paper does not consist of one dedicated application, but enables the user to control mouse cursor and keyboard input at the level of the operating system, thereby making it possible to use arbitrary applications. As the c-VEP BCI method was shown to enable very fast communication speeds (writing more than 20 error-free characters per minute), the presented system is the next step in replacing the traditional mouse and keyboard and enabling complete brain-based control of a computer.

  19. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools developed or currently in development to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  20. Airborne Doppler Wind Lidar Post Data Processing Software DAPS-LV

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J. (Inventor); Beyon, Jeffrey Y. (Inventor); Koch, Grady J. (Inventor)

    2015-01-01

    Systems, methods, and devices of the present invention enable post processing of airborne Doppler wind LIDAR data. In an embodiment, airborne Doppler wind LIDAR data software written in LabVIEW may be provided and may run two versions of different airborne wind profiling algorithms. A first algorithm may be the Airborne Wind Profiling Algorithm for Doppler Wind LIDAR ("APOLO") using airborne wind LIDAR data from two orthogonal directions to estimate wind parameters, and a second algorithm may be a five direction based method using pseudo inverse functions to estimate wind parameters. The various embodiments may enable wind profiles to be compared using different algorithms, may enable wind profile data for long haul color displays to be generated, may display long haul color displays, and/or may enable archiving of data at user-selectable altitudes over a long observation period for data distribution and population.
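    The five-direction pseudo-inverse idea can be sketched as follows: each line-of-sight Doppler speed is the projection of the wind vector onto the beam's unit vector, and five such projections over-determine the three wind components, which least squares then recovers. This is a generic sketch of the mathematics, not the patented algorithm; all names are ours.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination
    with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def wind_from_los(directions, los_speeds):
    """Least-squares wind vector from line-of-sight Doppler speeds:
    minimizes sum_i (u_i . v - s_i)^2 via the normal equations
    (the pseudo-inverse). `directions` are unit beam vectors."""
    AtA = [[sum(u[i] * u[j] for u in directions) for j in range(3)]
           for i in range(3)]
    Atb = [sum(u[i] * s for u, s in zip(directions, los_speeds))
           for i in range(3)]
    return solve3(AtA, Atb)
```

    With only two orthogonal directions (as in the APOLO case), the same projection relation holds but the vertical component must be supplied or assumed, which is why the two algorithms give comparable but not identical profiles.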

  1. Determining the transferability of flight simulator data

    NASA Technical Reports Server (NTRS)

    Green, David

    1992-01-01

    This paper presents a method for collecting and graphically correlating subjective ratings and objective flight test data. The method enables flight-simulation engineers to enhance the simulator characterization of rotorcraft flight in order to achieve maximum transferability of simulator experience.

  2. Data-driven Green's function retrieval and application to imaging with multidimensional deconvolution

    NASA Astrophysics Data System (ADS)

    Broggini, Filippo; Wapenaar, Kees; van der Neut, Joost; Snieder, Roel

    2014-01-01

    An iterative method is presented that allows one to retrieve the Green's function originating from a virtual source located inside a medium using reflection data measured only at the acquisition surface. In addition to the reflection response, an estimate of the travel times corresponding to the direct arrivals is required. However, no detailed information about the heterogeneities in the medium is needed. The iterative scheme generalizes the Marchenko equation for inverse scattering to the seismic reflection problem. To give insight in the mechanism of the iterative method, its steps for a simple layered medium are analyzed using physical arguments based on the stationary phase method. The retrieved Green's wavefield is shown to correctly contain the multiples due to the inhomogeneities present in the medium. Additionally, a variant of the iterative scheme enables decomposition of the retrieved wavefield into its downgoing and upgoing components. These wavefields then enable creation of a ghost-free image of the medium with either cross correlation or multidimensional deconvolution, presenting an advantage over standard prestack migration.

  3. A vector matching method for analysing logic Petri nets

    NASA Astrophysics Data System (ADS)

    Du, YuYue; Qi, Liang; Zhou, MengChu

    2011-11-01

    Batch processing functions and passing-value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method for determining the enabled transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation describing the marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent by analysing LPNs directly. Finally, the validity and reliability of the proposed method are illustrated by an example from electronic commerce.
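    The enabling test and the marking-change equation have a simple form for ordinary Petri nets (the logic extensions of LPNs are omitted here). A minimal sketch using pre- and post-incidence matrices, with incidence matrix C = Post - Pre; the function names are ours:

```python
def enabled(marking, pre, t):
    """Transition t is enabled when every input place holds at
    least as many tokens as the pre-incidence matrix requires."""
    return all(marking[p] >= pre[p][t] for p in range(len(marking)))

def fire(marking, pre, post, t):
    """Fire transition t: M' = M - Pre[:, t] + Post[:, t],
    i.e. the marking equation with C = Post - Pre."""
    if not enabled(marking, pre, t):
        raise ValueError("transition not enabled")
    return [m - pre[p][t] + post[p][t] for p, m in enumerate(marking)]
```

    A reachability tree is then built by repeatedly applying `fire` to every enabled transition of each generated marking.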

  4. Evaluating Presentation Skills of Volunteer Trainers.

    ERIC Educational Resources Information Center

    Mohan, Donna K.

    A systematic method for evaluating the presentation skills of volunteer trainers would enable the discovery of hidden problems. It would also increase individual trainer skills and satisfaction and improve the overall effectiveness of the training program. A first step is to determine the general presentation skills a successful volunteer trainer…

  5. Lipid Vesicle Shape Analysis from Populations Using Light Video Microscopy and Computer Vision

    PubMed Central

    Zupanc, Jernej; Drašler, Barbara; Boljte, Sabina; Kralj-Iglič, Veronika; Iglič, Aleš; Erdogmus, Deniz; Drobne, Damjana

    2014-01-01

    We present a method for giant lipid vesicle shape analysis that combines manually guided large-scale video microscopy and computer vision algorithms to enable analyzing vesicle populations. The method retains the benefits of light microscopy and enables non-destructive analysis of vesicles from suspensions containing up to several thousands of lipid vesicles (1–50 µm in diameter). For each sample, image analysis was employed to extract data on vesicle quantity and size distributions of their projected diameters and isoperimetric quotients (measure of contour roundness). This process enables a comparison of samples from the same population over time, or the comparison of a treated population to a control. Although vesicles in suspensions are heterogeneous in sizes and shapes and have distinctively non-homogeneous distribution throughout the suspension, this method allows for the capture and analysis of repeatable vesicle samples that are representative of the population inspected. PMID:25426933
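    The shape descriptors named above are straightforward to compute for a polygonal vesicle contour. A minimal sketch of projected area (shoelace formula), perimeter, and the isoperimetric quotient 4*pi*A/P^2, which equals 1 for a circle and is smaller for any other shape; the function names are ours:

```python
import math

def polygon_area(pts):
    """Shoelace formula for the area of a closed polygon."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1]
            - pts[(i + 1) % n][0] * pts[i][1] for i in range(n))
    return abs(s) / 2.0

def polygon_perimeter(pts):
    """Sum of edge lengths of a closed polygon."""
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def isoperimetric_quotient(pts):
    """4*pi*A / P**2: 1 for a circle, < 1 for any other contour."""
    p = polygon_perimeter(pts)
    return 4.0 * math.pi * polygon_area(pts) / (p * p)
```

    Applied to every detected contour in a frame, these two numbers (projected diameter from the area, and the quotient as a roundness measure) give the per-population distributions the method compares across samples.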

  6. Computational Analysis of the Caenorhabditis elegans Germline to Study the Distribution of Nuclei, Proteins, and the Cytoskeleton.

    PubMed

    Gopal, Sandeep; Pocock, Roger

    2018-04-19

    The Caenorhabditis elegans (C. elegans) germline is used to study several biologically important processes including stem cell development, apoptosis, and chromosome dynamics. While the germline is an excellent model, the analysis is often two dimensional due to the time and labor required for three-dimensional analysis. Major readouts in such studies are the number/position of nuclei and protein distribution within the germline. Here, we present a method to perform automated analysis of the germline using confocal microscopy and computational approaches to determine the number and position of nuclei in each region of the germline. Our method also analyzes germline protein distribution that enables the three-dimensional examination of protein expression in different genetic backgrounds. Further, our study shows variations in cytoskeletal architecture in distinct regions of the germline that may accommodate specific spatial developmental requirements. Finally, our method enables automated counting of the sperm in the spermatheca of each germline. Taken together, our method enables rapid and reproducible phenotypic analysis of the C. elegans germline.

  7. Developing an expert panel process to refine health outcome definitions in observational data.

    PubMed

    Fox, Brent I; Hollingsworth, Joshua C; Gray, Michael D; Hollingsworth, Michael L; Gao, Juan; Hansen, Richard A

    2013-10-01

    Drug safety surveillance using observational data requires valid adverse event, or health outcome of interest (HOI) measurement. The objectives of this study were to develop a method to review HOI definitions in claims databases using (1) web-based digital tools to present de-identified patient data, (2) a systematic expert panel review process, and (3) a data collection process enabling analysis of concepts-of-interest that influence panelists' determination of HOI. De-identified patient data were presented via an interactive web-based dashboard to enable case review and determine if specific HOIs were present or absent. Criteria for determining HOIs and their severity were provided to each panelist. Using a modified Delphi method, six panelist pairs independently reviewed approximately 200 cases across each of three HOIs (acute liver injury, acute kidney injury, and acute myocardial infarction) such that panelist pairs independently reviewed the same cases. Panelists completed an assessment within the dashboard for each case that included their assessment of the presence or absence of the HOI, HOI severity (if present), and data contributing to their decision. Discrepancies within panelist pairs were resolved during a consensus process. Dashboard development was iterative, focusing on data presentation and recording panelists' assessments. Panelists reported quickly learning how to use the dashboard. The assessment module was used consistently. The dashboard was reliable, enabling an efficient review process for panelists. Modifications were made to the dashboard and review process when necessary to facilitate case review. Our methods should be applied to other health outcomes of interest to further refine the dashboard and case review process. The expert review process was effective and was supported by the web-based dashboard. Our methods for case review and classification can be applied to future methods for case identification in observational data sources. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. NavP: Structured and Multithreaded Distributed Parallel Programming

    NASA Technical Reports Server (NTRS)

    Pan, Lei; Xu, Jingling

    2006-01-01

    This slide presentation reviews some of the issues around distributed parallel programming. It compares and contrasts two programming methods: Single Program Multiple Data (SPMD) and Navigational Programming (NavP). It then reviews the distributed sequential computing (DSC) method and the methodology of NavP. Case studies are presented. It also reviews the work being done to enable the NavP system.

  9. Lensless Photoluminescence Hyperspectral Camera Employing Random Speckle Patterns.

    PubMed

    Žídek, Karel; Denk, Ondřej; Hlubuček, Jiří

    2017-11-10

    We propose and demonstrate a spectrally resolved photoluminescence imaging setup based on the so-called single-pixel camera, a compressive sensing technique that enables imaging with a single-pixel photodetector. The method relies on encoding an image with a series of random patterns. In our approach, the image encoding was performed via laser speckle patterns generated by an excitation laser beam scattered on a diffusor. By using a spectrometer as the single-pixel detector, we attained a spectrally resolved photoluminescence camera of unmatched simplicity. We present reconstructed hyperspectral images of several model scenes. We also discuss parameters affecting the imaging quality, such as the correlation degree of the speckle patterns, pattern fineness, and number of data points. Finally, we compare the presented technique to hyperspectral imaging based on sample scanning. The presented method enables photoluminescence imaging for a broad range of coherent excitation sources and detection spectral ranges.

  10. New calibration method for I-scan sensors to enable the precise measurement of pressures delivered by 'pressure garments'.

    PubMed

    Macintyre, Lisa

    2011-11-01

    Accurate measurement of the pressure delivered by medical compression products is highly desirable both in monitoring treatment and in developing new pressure inducing garments or products. There are several complications in measuring pressure at the garment/body interface and at present no ideal pressure measurement tool exists for this purpose. This paper summarises a thorough evaluation of the accuracy and reproducibility of measurements taken following both of Tekscan Inc.'s recommended calibration procedures for I-scan sensors; and presents an improved method for calibrating and using I-scan pressure sensors. The proposed calibration method enables accurate (±2.1 mmHg) measurement of pressures delivered by pressure garments to body parts with a circumference ≥30 cm. This method is too cumbersome for routine clinical use but is very useful, accurate and reproducible for product development or clinical evaluation purposes. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.

  11. Modern Focused-Ion-Beam-Based Site-Specific Specimen Preparation for Atom Probe Tomography.

    PubMed

    Prosa, Ty J; Larson, David J

    2017-04-01

    Approximately 30 years after the first use of focused ion beam (FIB) instruments to prepare atom probe tomography specimens, this technique has grown to be used by hundreds of researchers around the world. This past decade has seen tremendous advances in atom probe applications, enabled by the continued development of FIB-based specimen preparation methodologies. In this work, we provide a short review of the origin of the FIB method and the standard methods used today for lift-out and sharpening, using the annular milling method as applied to atom probe tomography specimens. Key steps for enabling correlative analysis with transmission electron-beam backscatter diffraction, transmission electron microscopy, and atom probe tomography are presented, and strategies for preparing specimens for modern microelectronic device structures are reviewed and discussed in detail. Examples are used for discussion of the steps for each of these methods. We conclude with examples of the challenges presented by complex topologies such as nanowires, nanoparticles, and organic materials.

  12. Transforming paper-based assessment forms to a digital format: Exemplified by the Housing Enabler prototype app.

    PubMed

    Svarre, Tanja; Lunn, Tine Bieber Kirkegaard; Helle, Tina

    2017-11-01

The aim of this paper is to provide the reader with an overall impression of the stepwise user-centred design approach including the specific methods used and lessons learned when transforming paper-based assessment forms into a prototype app, taking the Housing Enabler as an example. Four design iterations were performed, building on a domain study, workshops, expert evaluation and controlled and realistic usability tests. The user-centred design process involved purposefully selected participants with different Housing Enabler knowledge and housing adaptation experience. The design iterations resulted in the development of a Housing Enabler prototype app. The prototype app has several features and options that are new compared with the original paper-based Housing Enabler assessment form. These new features include a user-friendly overview of the assessment form; easy navigation by swiping back and forth between items; onsite data analysis; ranking of the accessibility score; photo documentation; and a data export facility. Based on the presented stepwise approach, a high-fidelity Housing Enabler prototype app was successfully developed. The development process has emphasized the importance of combining design participants' knowledge and experiences, and has shown that methods should seem relevant to participants to increase their engagement.

  13. Role of land use planning in noise control

    Treesearch

    Stephanie J. Caswell; Karl Jakus

    1977-01-01

    A method for controlling outdoor noise through land use planning is presented. The method utilizes a computer model that broadly assesses the likely noise environments of a community on the basis of generalized land use and highway noise production and transmission conditions. The method is designed to enable town planners and other community decision-makers to...

  14. Methods for producing reinforced carbon nanotubes

    DOEpatents

    Ren, Zhifen [Newton, MA; Wen, Jian Guo [Newton, MA; Lao, Jing Y [Chestnut Hill, MA; Li, Wenzhi [Brookline, MA

    2008-10-28

    Methods for producing reinforced carbon nanotubes having a plurality of microparticulate carbide or oxide materials formed substantially on the surface of such reinforced carbon nanotubes composite materials are disclosed. In particular, the present invention provides reinforced carbon nanotubes (CNTs) having a plurality of boron carbide nanolumps formed substantially on a surface of the reinforced CNTs that provide a reinforcing effect on CNTs, enabling their use as effective reinforcing fillers for matrix materials to give high-strength composites. The present invention also provides methods for producing such carbide reinforced CNTs.

  15. Multiple frequency audio signal communication as a mechanism for neurophysiology and video data synchronization.

    PubMed

    Topper, Nicholas C; Burke, Sara N; Maurer, Andrew Porter

    2014-12-30

Current methods for aligning neurophysiology and video data are either prepackaged, requiring the additional purchase of a software suite, or use a blinking LED with a stationary pulse-width and frequency. These methods lack a significant user interface for adaptation, are expensive, or risk a misalignment of the two data streams. High-precision alignment of behavioral and neurophysiological data is obtained cost-effectively by generating an audio pulse embedded with two domains of information, a low-frequency binary-counting signal and a high, randomly changing frequency. This enables the derivation of temporal information while maintaining enough entropy in the system for algorithmic alignment. The sample-to-frame index constructed using the audio input correlation method described in this paper enables video and data acquisition to be aligned at a sub-frame level of precision. Traditionally, a synchrony pulse is recorded on-screen via a flashing diode. The higher sampling rate of the audio input of the camcorder enables the timing of an event to be detected with greater precision. While on-line analysis and synchronization using specialized equipment may be the ideal situation in some cases, the method presented in the current paper is a viable, low-cost alternative and gives the flexibility to interface with custom off-line analysis tools. Moreover, the ease of constructing and implementing this set-up makes it applicable to a wide variety of applications that require video recording. Copyright © 2014 Elsevier B.V. All rights reserved.
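The alignment idea can be illustrated with a toy cross-correlation: a synthetic noise burst stands in for the entropy-rich, randomly changing high-frequency component, and its sample-accurate offset in the audio track is converted to a (sub-)frame index. The sampling rate, frame rate, and signal shapes below are illustrative assumptions, not the authors' exact design:

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 44100                      # assumed camcorder audio sampling rate (Hz)

# Entropy-rich synchronization pulse (stand-in for the randomly
# changing high-frequency component described in the paper).
pulse = rng.standard_normal(2048)

# Camcorder audio track: one second with the pulse at a known offset.
true_offset = 12345
audio = np.zeros(fs)
audio[true_offset:true_offset + pulse.size] += pulse
audio += 0.05 * rng.standard_normal(audio.size)   # recording noise

# Cross-correlate to locate the pulse with sample-level accuracy.
corr = np.correlate(audio, pulse, mode="valid")
found_offset = int(np.argmax(corr))

# Map the audio sample index to a video frame index (30 fps assumed);
# the much higher audio rate yields sub-frame precision.
fps = 30
frame = found_offset * fps / fs
print(found_offset, round(frame, 3))
```

The binary-counting low-frequency component would then disambiguate which pulse in a long recording was found; here a single pulse keeps the sketch short.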

  16. Whole-genome regression and prediction methods applied to plant and animal breeding.

    PubMed

    de Los Campos, Gustavo; Hickey, John M; Pong-Wong, Ricardo; Daetwyler, Hans D; Calus, Mario P L

    2013-02-01

    Genomic-enabled prediction is becoming increasingly important in animal and plant breeding and is also receiving attention in human genetics. Deriving accurate predictions of complex traits requires implementing whole-genome regression (WGR) models where phenotypes are regressed on thousands of markers concurrently. Methods exist that allow implementing these large-p with small-n regressions, and genome-enabled selection (GS) is being implemented in several plant and animal breeding programs. The list of available methods is long, and the relationships between them have not been fully addressed. In this article we provide an overview of available methods for implementing parametric WGR models, discuss selected topics that emerge in applications, and present a general discussion of lessons learned from simulation and empirical data analysis in the last decade.
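A minimal sketch of the large-p with small-n regression setup the abstract describes, in the style of ridge-regression WGR (RR-BLUP). The simulated genotypes, phenotypes, and penalty value are illustrative, not data from the article; the dual-system trick is a standard linear-algebra identity:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 5000                 # n individuals, p markers: large p, small n

# Simulated 0/1/2 marker genotypes and a sparse set of true effects.
X = rng.integers(0, 3, size=(n, p)).astype(float)
mu = X.mean(axis=0)
Xc = X - mu                      # centered genotypes
beta = np.zeros(p)
beta[rng.choice(p, 50, replace=False)] = 0.3 * rng.standard_normal(50)
y = Xc @ beta + rng.standard_normal(n)   # phenotypes

# Ridge-regression WGR: regress phenotypes on all p markers at once.
# With p >> n, solve the n x n dual system instead of the p x p primal
# one -- the two solutions are mathematically identical.
lam = 100.0
K = Xc @ Xc.T                    # n x n marker-similarity matrix
alpha = np.linalg.solve(K + lam * np.eye(n), y)
beta_hat = Xc.T @ alpha          # estimated effects for all p markers

# Genome-enabled prediction: a new individual with genotypes x_new
# would be predicted as (x_new - mu) @ beta_hat.
y_hat = Xc @ beta_hat
r = np.corrcoef(y, y_hat)[0, 1]
print(f"in-sample correlation: {r:.2f}")
```

Bayesian WGR variants differ mainly in replacing the single ridge penalty with marker-specific shrinkage priors.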

  17. Stiffness of RBC optical confinement affected by optical clearing

    NASA Astrophysics Data System (ADS)

    Grishin, Oleg V.; Fedosov, Ivan V.; Tuchin, Valery V.

    2017-03-01

In vivo optical trapping is a novel applied direction of optical manipulation that enables noninvasive measurement of the mechanical properties of cells and tissues directly in living animals. However, the range of application of this direction is limited by the strong scattering of many biological tissues. Optical clearing reduces this scattering and therefore increases the depth of light penetration, decreases the distortion of the light beam, and improves resolution in imaging applications. Novel methods have recently appeared for measuring the degree of optical clearing at the cellular level, but these methods are not applicable in vivo. In this paper we present a novel method for estimating the degree of optical clearing, based on measurement of the optical trap stiffness. Our method may be applicable in vivo.

  18. Whole-Genome Regression and Prediction Methods Applied to Plant and Animal Breeding

    PubMed Central

    de los Campos, Gustavo; Hickey, John M.; Pong-Wong, Ricardo; Daetwyler, Hans D.; Calus, Mario P. L.

    2013-01-01

    Genomic-enabled prediction is becoming increasingly important in animal and plant breeding and is also receiving attention in human genetics. Deriving accurate predictions of complex traits requires implementing whole-genome regression (WGR) models where phenotypes are regressed on thousands of markers concurrently. Methods exist that allow implementing these large-p with small-n regressions, and genome-enabled selection (GS) is being implemented in several plant and animal breeding programs. The list of available methods is long, and the relationships between them have not been fully addressed. In this article we provide an overview of available methods for implementing parametric WGR models, discuss selected topics that emerge in applications, and present a general discussion of lessons learned from simulation and empirical data analysis in the last decade. PMID:22745228

  19. Next-generation genome-scale models for metabolic engineering.

    PubMed

    King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O

    2015-12-01

Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods is now being developed, encompassing many biological processes and simulation strategies, and these next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Designing Systems for Many Possible Futures. An RSC-based Method for Affordable Concept Selection (RMACS) with Multi-Era Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaffner, Michael

    2014-06-01

The current downward trend in funding for U.S. defense systems seems to be on a collision course with the state of the practice in systems engineering, which typically yields an increased pace and scale of capabilities and a correspondingly increased cost for complex national defense systems. Recent advances in the state of the art in systems engineering methodology can be leveraged to address this growing challenge. The present work leverages advanced constructs and methods for early-phase conceptual design of complex systems, when committed costs are still low and management influence is still high. First, a literature review is presented of the topics relevant to this work, including approaches to the design of affordable systems, assumptions and methods of exploratory modeling, and enabling techniques to help mitigate the computational challenges involved. The types, purposes, and limits of early-phase, exploratory models are then elucidated. The RSC-based Method for Affordable Concept Selection (RMACS) is described, which comprises nine processes in the three main thrusts of information gathering, evaluation, and analysis. The method is then applied to a naval ship case example, described as the Next-Generation Combat Ship, with representational information outputs and discussions of affordability with respect to each process. The ninth process, Multi-Era Analysis (MERA), is introduced and explicated, including required and optional informational components, temporal and change-related considerations, required and optional activities involved, and the potential types of outputs from the process. The MERA process is then applied to a naval ship case example similar to that of the RMACS application, but with discrete change options added to enable a tradespace network. The seven activities of the MERA process are demonstrated, with the salient outputs of each given and discussed. 
Additional thoughts are presented on MERA and RMACS, and 8 distinct areas are identified for further research in the MERA process, along with a brief description of the directions that such research might take. It is concluded that the affordability of complex systems can be better enabled through a conceptual design method that incorporates MERA as well as metrics such as Multi-Attribute Expense, Max Expense, and Expense Stability. It is also found that the affordability of changeable systems can be better enabled through the use of existing path-planning algorithms in efficient evaluation and analysis of long-term strategies. Finally, it is found that MERA enables the identification and analysis of path-dependent considerations related to designs, epochs, strategies, and change options, in many possible futures.

  1. The Use of the Visualisation of Multidimensional Data Using PCA to Evaluate Possibilities of the Division of Coal Samples Space Due to their Suitability for Fluidised Gasification

    NASA Astrophysics Data System (ADS)

    Jamróz, Dariusz; Niedoba, Tomasz; Surowiak, Agnieszka; Tumidajski, Tadeusz

    2016-09-01

Methods that visualise multidimensional data by transforming a multidimensional space into a two-dimensional one make it possible to present such data on a computer screen. Thanks to this, qualitative analysis of the data can be performed in the way most natural for humans, through the sense of sight. An example of such a multidimensional data visualisation method is PCA (principal component analysis). This method was used in this work to present and analyse a set of seven-dimensional data (seven selected properties) describing coal samples obtained from the Janina and Wieczorek coal mines. Coal from these mines had previously been separated by means of a laboratory ring jig consisting of ten rings; five layers of each type of coal (of two rings each) were obtained in this way. It was decided to check whether the method of multidimensional data visualisation enables the space of such divided samples to be partitioned into areas of different suitability for the fluidised gasification process. To that end, the card of technological suitability of coal was used (Sobolewski et al., 2012; 2013), in which key, relevant and additional parameters affecting the gasification process are described. As a result of the analyses, it was stated that effective determination of the suitability of coal samples for the on-surface gasification process in a fluidised reactor is possible. The PCA method enables the visualisation of the optimal subspace containing the set requirements concerning the properties of coals intended for this process.
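The projection from seven dimensions to two can be sketched with a few lines of NumPy; the data matrix here is synthetic (the real inputs were the measured properties of the Janina and Wieczorek coal samples):

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for seven measured properties of 60 coal samples.
X = rng.standard_normal((60, 7))
X[:30] += np.array([2, 0, 1, 0, 0, 1, 0])   # two loose sample groups

# PCA by SVD: center, decompose, project onto the first two components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T        # 2-D coordinates for on-screen plotting

# Fraction of total variance captured by the two plotted axes.
explained = s[:2] ** 2 / np.sum(s ** 2)
print(scores.shape, explained.round(3))
```

Plotting `scores` colored by jig layer would then show whether samples of different gasification suitability occupy separable regions of the plane.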

  2. Combination of High-density Microelectrode Array and Patch Clamp Recordings to Enable Studies of Multisynaptic Integration.

    PubMed

    Jäckel, David; Bakkum, Douglas J; Russell, Thomas L; Müller, Jan; Radivojevic, Milos; Frey, Urs; Franke, Felix; Hierlemann, Andreas

    2017-04-20

We present a novel, all-electric approach to record and to precisely control the activity of tens of individual presynaptic neurons. The method allows for parallel mapping of the efficacy of multiple synapses and of the resulting dynamics of postsynaptic neurons in a cortical culture. For the measurements, we combine an extracellular high-density microelectrode array, featuring 11,000 electrodes for extracellular recording and stimulation, with intracellular patch-clamp recording. We are able to identify the contributions of individual presynaptic neurons - including inhibitory and excitatory synaptic inputs - to postsynaptic potentials, which enables us to study dendritic integration. Since the electrical stimuli can be controlled at microsecond resolution, our method makes it possible to evoke action potentials at tens of presynaptic cells in precisely orchestrated sequences of high reliability and minimum jitter. We demonstrate the potential of this method by evoking short- and long-term synaptic plasticity through manipulation of multiple synaptic inputs to a specific neuron.

  3. Story and Healing in Action: New Methods for Fostering Heart-to-Heart Dialogue about Race

    ERIC Educational Resources Information Center

    Saury, Rachel E.; Alexander, John

    2003-01-01

In this article, the authors present new interdisciplinary methods called "Story and Healing." These methods enable teachers to enter a discussion of the ills of white privilege and the epidemic of racism through the "back door" approach. First, students are asked to contemplate transpersonal experiences such as healing and suffering, second,…

  4. Optical power transfer and communication methods for wireless implantable sensing platforms.

    PubMed

    Mujeeb-U-Rahman, Muhammad; Adalian, Dvin; Chang, Chieh-Feng; Scherer, Axel

    2015-09-01

    Ultrasmall scale implants have recently attracted focus as valuable tools for monitoring both acute and chronic diseases. Semiconductor optical technologies are the key to miniaturizing these devices to the long-sought sub-mm scale, which will enable long-term use of these devices for medical applications. This can also enable the use of multiple implantable devices concurrently to form a true body area network of sensors. We demonstrate optical power transfer techniques and methods to effectively harness this power for implantable devices. Furthermore, we also present methods for optical data transfer from such implants. Simultaneous use of these technologies can result in miniaturized sensing platforms that can allow for large-scale use of such systems in real world applications.

  5. Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems

    PubMed Central

    2013-01-01

This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze separate images with a root tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information of entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day old maize root system, coupled with a spatial analysis of water uptake patterns. PMID:23286457
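Step (4), combining per-piece tracings, amounts to shifting each piece's traced polylines into a common rhizotron coordinate frame and merging them. The offsets, polylines, and helper function below are hypothetical illustrations of that bookkeeping, not details from the paper:

```python
# Each scanned piece yields traced root polylines in its own scanner
# coordinates; combining them is a rigid shift into the rhizotron frame.
pieces = {
    "top":    {"offset": (0.0, 0.0),
               "roots": [[(1.0, 2.0), (1.5, 6.0)]]},
    "bottom": {"offset": (0.0, 20.0),   # this piece sat 20 cm lower
               "roots": [[(1.5, 0.0), (2.0, 5.0)]]},
}

def combine(pieces):
    """Merge per-piece tracings into one vectorized root system."""
    merged = []
    for piece in pieces.values():
        dx, dy = piece["offset"]
        for poly in piece["roots"]:
            merged.append([(x + dx, y + dy) for x, y in poly])
    return merged

root_system = combine(pieces)

# Once merged, global morphological measures such as total root length
# can be computed over the whole system.
total_len = sum(
    sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(poly, poly[1:]))
    for poly in root_system
)
print(len(root_system), round(total_len, 2))
```

In practice the offsets would come from registration marks on the rhizotron or scanner bed rather than being hard-coded.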

  6. Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems.

    PubMed

    Lobet, Guillaume; Draye, Xavier

    2013-01-04

This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze separate images with a root tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information of entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day old maize root system, coupled with a spatial analysis of water uptake patterns.

  7. Optical power transfer and communication methods for wireless implantable sensing platforms

    NASA Astrophysics Data System (ADS)

    Mujeeb-U-Rahman, Muhammad; Adalian, Dvin; Chang, Chieh-Feng; Scherer, Axel

    2015-09-01

    Ultrasmall scale implants have recently attracted focus as valuable tools for monitoring both acute and chronic diseases. Semiconductor optical technologies are the key to miniaturizing these devices to the long-sought sub-mm scale, which will enable long-term use of these devices for medical applications. This can also enable the use of multiple implantable devices concurrently to form a true body area network of sensors. We demonstrate optical power transfer techniques and methods to effectively harness this power for implantable devices. Furthermore, we also present methods for optical data transfer from such implants. Simultaneous use of these technologies can result in miniaturized sensing platforms that can allow for large-scale use of such systems in real world applications.

  8. The Body That Speaks: Recombining Bodies and Speech Sources in Unscripted Face-to-Face Communication.

    PubMed

    Gillespie, Alex; Corti, Kevin

    2016-01-01

    This article examines advances in research methods that enable experimental substitution of the speaking body in unscripted face-to-face communication. A taxonomy of six hybrid social agents is presented by combining three types of bodies (mechanical, virtual, and human) with either an artificial or human speech source. Our contribution is to introduce and explore the significance of two particular hybrids: (1) the cyranoid method that enables humans to converse face-to-face through the medium of another person's body, and (2) the echoborg method that enables artificial intelligence to converse face-to-face through the medium of a human body. These two methods are distinct in being able to parse the unique influence of the human body when combined with various speech sources. We also introduce a new framework for conceptualizing the body's role in communication, distinguishing three levels: self's perspective on the body, other's perspective on the body, and self's perspective of other's perspective on the body. Within each level the cyranoid and echoborg methodologies make important research questions tractable. By conceptualizing and synthesizing these methods, we outline a novel paradigm of research on the role of the body in unscripted face-to-face communication.

  9. The Body That Speaks: Recombining Bodies and Speech Sources in Unscripted Face-to-Face Communication

    PubMed Central

    Gillespie, Alex; Corti, Kevin

    2016-01-01

    This article examines advances in research methods that enable experimental substitution of the speaking body in unscripted face-to-face communication. A taxonomy of six hybrid social agents is presented by combining three types of bodies (mechanical, virtual, and human) with either an artificial or human speech source. Our contribution is to introduce and explore the significance of two particular hybrids: (1) the cyranoid method that enables humans to converse face-to-face through the medium of another person's body, and (2) the echoborg method that enables artificial intelligence to converse face-to-face through the medium of a human body. These two methods are distinct in being able to parse the unique influence of the human body when combined with various speech sources. We also introduce a new framework for conceptualizing the body's role in communication, distinguishing three levels: self's perspective on the body, other's perspective on the body, and self's perspective of other's perspective on the body. Within each level the cyranoid and echoborg methodologies make important research questions tractable. By conceptualizing and synthesizing these methods, we outline a novel paradigm of research on the role of the body in unscripted face-to-face communication. PMID:27660616

  10. Social Activity Method (SAM): A Fractal Language for Mathematics

    ERIC Educational Resources Information Center

    Dowling, Paul

    2013-01-01

    In this paper I shall present and develop my organisational language, "social activity method" (SAM), and illustrate some of its applications. I shall introduce a new scheme for "modes of recontextualisation" that enables the analysis of the ways in which one activity--which might be school mathematics or social research or any…

  11. A New Approach for Proving or Generating Combinatorial Identities

    ERIC Educational Resources Information Center

    Gonzalez, Luis

    2010-01-01

    A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also generate many different combinatorial identities (not being required to know them "a…

  12. Creating an effective poster presentation.

    PubMed

    Taggart, H M; Arslanian, C

    2000-01-01

    One way to build knowledge in nursing is to share research findings or clinical program outcomes. The dissemination of these findings is often a difficult final step in a project that has taken months or years to complete. One method of sharing findings in a relaxed and informal setting is a poster presentation. This method is an effective form for presenting findings using an interactive approach. The milieu of a poster presentation enables the presenters to interact and dialogue with colleagues. Guidelines for size and format require that the poster is clear and informative. Application of design helps to create visually appealing posters. This article summarizes elements of designing and conducting a poster presentation.

  13. Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.

    PubMed

    Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko

    2015-01-01

Here, we set out to improve our previously developed methylmercury analytical method, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. The modification enabled derivatization at optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The present analytical method was validated as suitable for determination of compliance with the provisional regulation value for methylmercury in fish, set in the Food Sanitation Law.

  14. Advances in dual-tone development for pitch frequency doubling

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Somervell, Mark; Scheer, Steven; Kuwahara, Yuhei; Nafus, Kathleen; Gronheid, Roel; Tarutani, Shinji; Enomoto, Yuuichiro

    2010-04-01

Dual-tone development (DTD) has been previously proposed as a potential cost-effective double patterning technique1. DTD was reported as early as the late 1990s2. The basic principle of dual-tone imaging involves processing exposed resist latent images in both positive tone (aqueous base) and negative tone (organic solvent) developers. Conceptually, DTD has attractive cost benefits since it enables pitch doubling without the need for multiple etch steps of patterned resist layers. While the concept of the DTD technique is simple to understand, there are many challenges that must be overcome and understood in order to make it a manufacturing solution. Previous work by the authors demonstrated feasibility of DTD imaging for 50nm half-pitch features at 0.80NA (k1 = 0.21) and discussed challenges lying ahead for printing sub-40nm half-pitch features with DTD. While previous experimental results suggested that clever processing on the wafer track can be used to enable DTD beyond 50nm half-pitch, they also suggest that identifying suitable resist materials or chemistries is essential for achieving successful imaging results with novel resist processing methods on the wafer track. In this work, we present recent advances in the search for resist materials that work in conjunction with novel resist processing methods on the wafer track to enable DTD. Recent experimental results with new resist chemistries, specifically designed for DTD, are presented in this work. We also present simulation studies that help identify resist properties that could enable DTD imaging, ultimately leading to viable DTD resist materials.

  15. Sorting Rotating Micromachines by Variations in Their Magnetic Properties

    NASA Astrophysics Data System (ADS)

    Howell, Taylor A.; Osting, Braxton; Abbott, Jake J.

    2018-05-01

    We consider sorting for the broad class of micromachines (also known as microswimmers, microrobots, micropropellers, etc.) propelled by rotating magnetic fields. We present a control policy that capitalizes on the variation in magnetic properties between otherwise-homogeneous micromachines to enable the sorting of a select fraction of a group from the remainder and prescribe its net relative movement, using a uniform magnetic field that is applied equally to all micromachines. The method enables us to accomplish this sorting task using open-loop control, without relying on a structured environment or localization information of individual micromachines. With our method, the control time to perform the sort is invariant to the number of micromachines. The method is verified through simulations and scaled experiments. Finally, we include an extended discussion about the limitations of the method and address open questions related to its practical application.

  16. Direct evaluation of free energy for large system through structure integration approach.

    PubMed

    Takeuchi, Kazuhito; Tanaka, Ryohei; Yuge, Koretaka

    2015-09-30

We propose a new approach, 'structure integration', enabling direct evaluation of the configurational free energy of large systems. The present approach is based on statistical information about the lattice. Through first-principles-based simulation, we find that the present method evaluates the configurational free energy accurately in disordered states above the critical temperature.

  17. Improved Method Of Bending Concentric Pipes

    NASA Technical Reports Server (NTRS)

    Schroeder, James E.

    1995-01-01

    Proposed method for bending two concentric pipes simultaneously while maintaining void between them replaces present tedious, messy, and labor-intensive method. Array of rubber tubes inserted in gap between concentric pipes. Tubes then inflated with relatively incompressible liquid to fill gap. Enables bending to be done faster and more cleanly, and makes significant portion of bending process amenable to automation on computer numerically controlled (CNC) tube-bending machinery.

  18. Development of a virtual speaking simulator using Image Based Rendering.

    PubMed

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and cannot be controlled individually. To overcome these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and a chroma-key simultaneously. IBR enables the creation of realistic virtual environments in which images taken with a digital camera are stitched panoramically. The use of chroma-keying puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohenberger, Erik; Freitag, Nathan; Rosenmann, Daniel

    Here, we present a facile method for fabricating nanostructured silver films containing a high density of nanoscopic gap features through a surface-directed phenomenon utilizing nanoporous scaffolds rather than through traditional lithographic patterning processes. This method enables tunability of the silver film growth by simply adjusting the formulation and processing conditions of the nanoporous film prior to metallization. We further demonstrate that this process can produce nanoscopic gaps in thick (100 nm) silver films supporting localized surface plasmon resonance with large field amplification within the gaps while enabling launching of propagating surface plasmons within the silver grains. These enhanced fields provide metal-enhanced fluorescence with enhancement factors as high as 21 times compared to glass, as well as enable visualization of single-fluorophore emission. This work provides a low-cost, rapid approach for producing novel nanostructures capable of broadband fluorescence amplification, with potential applications in plasmonic and fluorescence-based optical sensing and imaging.

  20. Surface preparation for high purity alumina ceramics enabling direct brazing in hydrogen atmospheres

    DOEpatents

    Cadden, Charles H.; Yang, Nancy Yuan Chi; Hosking, Floyd M.

    2001-01-01

    The present invention relates to a method for preparing the surface of a high purity alumina ceramic or sapphire specimen that enables direct brazing in a hydrogen atmosphere using an active braze alloy. The present invention also relates to a method for directly brazing a high purity alumina ceramic or sapphire specimen to a ceramic or metal member using this method of surface preparation, and to articles produced by this brazing method. The presence of silicon, in the form of a SiO.sub.2 -containing surface layer, can more than double the tensile bond strength in alumina ceramic joints brazed in a hydrogen atmosphere using an active Au-16Ni-0.75 Mo-1.75V filler metal. A thin silicon coating applied by PVD processing can, after air firing, produce a semi-continuous coverage of the alumina surface with a SiO.sub.2 film. Room temperature tensile strength was found to be proportional to the fraction of air fired surface covered by silicon-containing films. Similarly, the ratio of substrate fracture versus interface separation was also related to the amount of surface silicon present prior to brazing. This process can replace the need to perform a "moly-manganese" metallization step.

  1. Boolean logic analysis for flow regime recognition of gas-liquid horizontal flow

    NASA Astrophysics Data System (ADS)

    Ramskill, Nicholas P.; Wang, Mi

    2011-10-01

    In order to develop a flowmeter for the accurate measurement of multiphase flows, it is of the utmost importance to correctly identify the prevailing flow regime so that the optimal metering method can be selected. In this study, the horizontal flow of air and water in a pipeline was studied under a multitude of conditions using electrical resistance tomography, but the flow regimes presented in this paper are limited to plug and bubble air-water flows. This study proposes a novel method for recognition of the prevalent flow regime using only a fraction of the data, thus rendering the analysis more efficient. By considering the average conductivity of five zones along the central axis of the tomogram, key features can be identified, enabling recognition of the prevalent flow regime. Boolean logic and frequency spectrum analysis have been applied for flow regime recognition. Visualization of the flow using the reconstructed images provides a qualitative comparison between different flow regimes. Application of the Boolean logic scheme enables a quantitative comparison of the flow patterns, thus reducing the subjectivity in the identification of the prevalent flow regime.
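    The zone-based Boolean scheme described above can be illustrated with a small sketch. All thresholds, the zone ordering, and the rules below are illustrative assumptions, not the published classifier:

```python
import numpy as np

def classify_flow(zone_conductivity, water_sigma=1.0, threshold=0.5):
    """Toy Boolean-logic flow-regime classifier for horizontal
    air-water flow, from the average normalized conductivity of five
    zones along the vertical axis of the tomogram (index 0 = top)."""
    gas = np.asarray(zone_conductivity) < threshold * water_sigma

    top_gas = bool(gas[0])            # gas collects at the top of the pipe
    lower_liquid = not gas[1:].any()  # remaining zones stay liquid-continuous

    if top_gas and lower_liquid:
        return "plug"      # elongated bubbles riding along the pipe top
    if not gas.any():
        return "bubble"    # dispersed bubbles: every zone stays conductive
    return "unclassified"

print(classify_flow([0.2, 0.9, 0.95, 0.97, 0.98]))  # plug
print(classify_flow([0.8, 0.9, 0.95, 0.97, 0.98]))  # bubble
```

    In the same spirit as the paper, only five scalars per frame are inspected, so the rules run on a small fraction of the tomographic data.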

  2. The ADE scorecards: a tool for adverse drug event detection in electronic health records.

    PubMed

    Chazard, Emmanuel; Băceanu, Adrian; Ferret, Laurie; Ficheur, Grégoire

    2011-01-01

    Although several methods exist for detecting Adverse Drug Events (ADEs) in past hospitalizations, no tool yet exists to display those ADEs to physicians. This article presents the ADE Scorecards, a Web tool that enables physicians to screen past hospitalizations extracted from Electronic Health Records (EHR), using a set of ADE detection rules, currently rules discovered by data mining. The tool enables physicians to (1) get contextualized statistics about the ADEs that happen in their medical department, (2) see the rules that are useful in their department, i.e. the rules that could have helped prevent those ADEs, and (3) review the ADE cases in detail, through a comprehensive interface displaying the diagnoses, procedures, lab results, administered drugs and anonymized records. The article demonstrates the tool through a use case.

  3. Power Series Solution to the Pendulum Equation

    ERIC Educational Resources Information Center

    Benacka, Jan

    2009-01-01

    This note gives a power series solution to the pendulum equation that enables the system to be investigated analytically, i.e. without numerical methods. A method of determining the number of terms needed to achieve a required relative error is presented, based on bounding the series from above and below with geometric series. The solution is suitable for modelling the…
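    The idea of bounding the truncation error with a geometric series can be sketched as follows. The bound c·rⁿ on the series terms is an assumption for illustration, not the note's actual estimate:

```python
import math

def terms_needed(c, r, eps):
    """Smallest N such that the geometric tail c * r**(N+1) / (1 - r)
    is at most eps, assuming the series terms are bounded by c * r**n
    with 0 < r < 1 (an illustrative stand-in for bracketing a power
    series between a bigger and a lesser geometric series)."""
    assert 0 < r < 1 and c > 0 and eps > 0
    # Solve c * r**(N+1) / (1 - r) <= eps for N.
    n = math.log(eps * (1 - r) / c) / math.log(r) - 1
    return max(0, math.ceil(n))

N = terms_needed(c=1.0, r=0.5, eps=1e-6)
print(N)  # 20
assert 1.0 * 0.5 ** (N + 1) / (1 - 0.5) <= 1e-6  # the tail really is small enough
```

    Summing the first N terms then guarantees the truncation error is below the tolerance, without evaluating any further terms.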

  4. Sonic Boom Modeling Technical Challenge

    NASA Technical Reports Server (NTRS)

    Sullivan, Brenda M.

    2007-01-01

    This viewgraph presentation reviews the technical challenges in modeling sonic booms. The goal of this program is to develop knowledge, capabilities and technologies to enable overland supersonic flight. The specific objectives of the modeling are: (1) Develop and validate sonic boom propagation model through realistic atmospheres, including effects of turbulence (2) Develop methods enabling prediction of response of and acoustic transmission into structures impacted by sonic booms (3) Develop and validate psychoacoustic model of human response to sonic booms under both indoor and outdoor listening conditions, using simulators.

  5. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using MACs and MALs) does not enable complete and objective assessment of working conditions in the construction industry, owing to multiple confounding factors. A solution to this complicated problem, including the analysis of the various correlated elements of the system "human--work conditions--environment", may be aided by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the whole situation and reveal the points of risk.

  6. Automatic computation and solution of generalized harmonic balance equations

    NASA Astrophysics Data System (ADS)

    Peyton Jones, J. C.; Yaser, K. S. A.; Stevenson, J.

    2018-02-01

    Generalized methods are presented for generating and solving the harmonic balance equations for a broad class of nonlinear differential or difference equations and for a general set of harmonics chosen by the user. In particular, a new algorithm for automatically generating the Jacobian of the balance equations enables efficient solution of these equations using continuation methods. Efficient numeric validation techniques are also presented, and the combined algorithm is applied to the analysis of dc, fundamental, second and third harmonic response of a nonlinear automotive damper.
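    A minimal illustration of harmonic balance with a numerically generated Jacobian, using a single-harmonic Duffing oscillator as a stand-in for the paper's generalized multi-harmonic formulation (all parameter values are arbitrary, and the paper builds the Jacobian automatically and exactly rather than by finite differences):

```python
import numpy as np

def residual(z, omega, c=0.2, k=1.0, a=0.5, F=1.0):
    """Single-harmonic balance residual for the Duffing oscillator
    x'' + c x' + k x + a x**3 = F cos(omega t) with the ansatz
    x(t) ~ A cos(omega t) + B sin(omega t)."""
    A, B = z
    amp2 = A * A + B * B
    # The fundamental Fourier component of x**3 is (3/4) * amp2 * x.
    return np.array([
        (k - omega**2) * A + c * omega * B + 0.75 * a * amp2 * A - F,
        (k - omega**2) * B - c * omega * A + 0.75 * a * amp2 * B,
    ])

def solve_hb(omega, z0=(1.0, 0.5), tol=1e-10, h=1e-7, max_iter=100):
    """Newton iteration with a finite-difference Jacobian."""
    z = np.array(z0, float)
    for _ in range(max_iter):
        r = residual(z, omega)
        if np.linalg.norm(r) < tol:
            break
        J = np.column_stack([(residual(z + h * e, omega) - r) / h
                             for e in np.eye(2)])
        z = z - np.linalg.solve(J, r)
    return z

A, B = solve_hb(omega=1.2)
print("fundamental amplitude:", round(float(np.hypot(A, B)), 3))
```

    Once such a solver is available, sweeping omega (continuation) traces the harmonic response curve, which is where an efficient Jacobian pays off.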

  7. Cell separation using electric fields

    NASA Technical Reports Server (NTRS)

    Eppich, Henry M. (Inventor); Mangano, Joseph A. (Inventor)

    2003-01-01

    The present invention involves methods and devices which enable discrete objects having a conducting inner core, surrounded by a dielectric membrane to be selectively inactivated by electric fields via irreversible breakdown of their dielectric membrane. One important application of the invention is in the selection, purification, and/or purging of desired or undesired biological cells from cell suspensions. According to the invention, electric fields can be utilized to selectively inactivate and render non-viable particular subpopulations of cells in a suspension, while not adversely affecting other desired subpopulations. According to the inventive methods, the cells can be selected on the basis of intrinsic or induced differences in a characteristic electroporation threshold, which can depend, for example, on a difference in cell size and/or critical dielectric membrane breakdown voltage. The invention enables effective cell separation without the need to employ undesirable exogenous agents, such as toxins or antibodies. The inventive method also enables relatively rapid cell separation involving a relatively low degree of trauma or modification to the selected, desired cells. The inventive method has a variety of potential applications in clinical medicine, research, etc., with two of the more important foreseeable applications being stem cell enrichment/isolation, and cancer cell purging.

  8. Cell separation using electric fields

    NASA Technical Reports Server (NTRS)

    Mangano, Joseph (Inventor); Eppich, Henry (Inventor)

    2009-01-01

    The present invention involves methods and devices which enable discrete objects having a conducting inner core, surrounded by a dielectric membrane to be selectively inactivated by electric fields via irreversible breakdown of their dielectric membrane. One important application of the invention is in the selection, purification, and/or purging of desired or undesired biological cells from cell suspensions. According to the invention, electric fields can be utilized to selectively inactivate and render non-viable particular subpopulations of cells in a suspension, while not adversely affecting other desired subpopulations. According to the inventive methods, the cells can be selected on the basis of intrinsic or induced differences in a characteristic electroporation threshold, which can depend, for example, on a difference in cell size and/or critical dielectric membrane breakdown voltage. The invention enables effective cell separation without the need to employ undesirable exogenous agents, such as toxins or antibodies. The inventive method also enables relatively rapid cell separation involving a relatively low degree of trauma or modification to the selected, desired cells. The inventive method has a variety of potential applications in clinical medicine, research, etc., with two of the more important foreseeable applications being stem cell enrichment/isolation, and cancer cell purging.

  9. Reappraising Accretion to Vesta and the Angrite Parent Body Through Mineral-Scale Platinum Group Element and Os-Isotope Analyses

    NASA Astrophysics Data System (ADS)

    Riches, A. J. V.; Burton, K. W.; Nowell, G. M.; Dale, C. W.; Ottley, C. J.

    2016-08-01

    New methods presented here enable quantitative determination of mineral-scale PGE-abundances and Os-isotope compositions in meteorite materials thereby providing valuable new insight into planetary evolution.

  10. Using permeable membranes to produce hydrogen and oxygen from water

    NASA Technical Reports Server (NTRS)

    Sanders, A. P.; Williams, R. J.; Downs, W. R.; Mcbryar, H.

    1975-01-01

    Concept may make it profitable to obtain hydrogen fuel from water. Laboratory tests have demonstrated that method enables decomposition of water several orders of magnitude beyond equilibrium state where only small amounts of free hydrogen are present.

  11. Primary School Science: Implementation of Domain-General Strategies into Teaching Didactics

    ERIC Educational Resources Information Center

    Dejonckheere, Peter J. N.; Van de Keere, Kristof; Tallir, Isabel; Vervaet, Stephanie

    2013-01-01

    We present a didactic method to help children aged 11 and 12 learn science in a way that enables a dynamic interaction between domain-general strategies and the development of conceptual knowledge, while considering each type of scientific process (forming hypotheses, experimenting, and evaluating). We have…

  12. Using Instant Feedback System and Micro Exams to Enhance Active Learning

    ERIC Educational Resources Information Center

    Sabag, N.; Kosolapov, S.

    2012-01-01

    This paper presents the outcomes of a preliminary survey in which the IFS method was used to integrate motivating questions into lecture presentations in order to increase students' involvement. An Instant Feedback System (IFS) enables educators to improve their own teaching by getting instant, real-time feedback about how clear…

  13. Multi-photon microfabrication of three-dimensional capillary-scale vascular networks

    NASA Astrophysics Data System (ADS)

    Skylar-Scott, Mark A.; Liu, Man-Chi; Wu, Yuelong; Yanik, Mehmet Fatih

    2017-02-01

    Biomimetic models of microvasculature could enable assays of complex cellular behavior at the capillary level, and enable efficient nutrient perfusion for the maintenance of tissues. However, existing three-dimensional printing methods for generating perfusable microvasculature have insufficient resolution to recapitulate the microscale geometry of capillaries. Here, we present a collection of multiphoton microfabrication methods that enable the production of precise, three-dimensional, branched microvascular networks in collagen. When endothelial cells are added to the channels, they form perfusable lumens with diameters as small as 10 μm. Using a similar photochemistry, we also demonstrate the micropatterning of proteins embedded in microfabricated collagen scaffolds, producing hybrid scaffolds with both defined microarchitecture and integrated gradients of chemical cues. We provide examples of how these hybrid microfabricated scaffolds could be used in angiogenesis and cell-homing assays. Finally, we describe a new method for increasing the micropatterning speed by synchronous laser and stage scanning. Using these technologies, we are working towards large-scale (>1 cm), high-resolution (~1 μm) scaffolds with both microarchitecture and embedded protein cues, with applications in three-dimensional assays of cellular behavior.

  14. Zone plate method for electronic holographic display using resolution redistribution technique.

    PubMed

    Takaki, Yasuhiro; Nakamura, Junya

    2011-07-18

    The resolution redistribution (RR) technique can increase the horizontal viewing-zone angle and screen size of electronic holographic display. The present study developed a zone plate method that would reduce hologram calculation time for the RR technique. This method enables calculation of an image displayed on a spatial light modulator by performing additions of the zone plates, while the previous calculation method required performing the Fourier transform twice. The derivation and modeling of the zone plate are shown. In addition, the look-up table approach was introduced for further reduction in computation time. Experimental verification using a holographic display module based on the RR technique is presented.
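    The idea of building the hologram by summing zone plates (instead of performing two Fourier transforms) can be sketched as follows. The paraxial cosine form and all parameter values are illustrative assumptions, not the paper's derivation:

```python
import numpy as np

def zone_plate(nx, ny, pitch, x0, y0, z0, wavelength):
    """Real-valued (cosine) Fresnel zone plate of a point source at
    (x0, y0, z0), sampled on an nx-by-ny SLM grid with the given pixel
    pitch; paraxial phase pi * r^2 / (lambda * z)."""
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    r2 = (X - x0) ** 2 + (Y - y0) ** 2
    return np.cos(np.pi * r2 / (wavelength * z0))

# Hologram of an object = sum of the zone plates of its points,
# replacing the two Fourier transforms of the previous method.
points = [(0.0, 0.0, 0.1), (1e-4, 0.0, 0.1)]   # two point sources (m)
holo = sum(zone_plate(256, 256, 8e-6, *p, wavelength=633e-9)
           for p in points)
print(holo.shape)  # (256, 256)
```

    A look-up table would precompute zone plates for a grid of source depths, reducing each point's contribution to a table fetch and an addition.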

  15. Parameterizing Coefficients of a POD-Based Dynamical System

    NASA Technical Reports Server (NTRS)

    Kalb, Virginia L.

    2010-01-01

    A method of parameterizing the coefficients of a dynamical system based on a proper orthogonal decomposition (POD) representing the flow dynamics of a viscous fluid has been introduced. (A brief description of POD is presented in the immediately preceding article.) The present parameterization method is intended to enable construction of the dynamical system to accurately represent the temporal evolution of the flow dynamics over a range of Reynolds numbers. The need for this or a similar method arises as follows: A procedure that includes direct numerical simulation, followed by POD, followed by Galerkin projection to a dynamical system, has been proven to enable representation of flow dynamics by a low-dimensional model at the Reynolds number of the simulation. However, a more difficult task is to obtain models that are valid over a range of Reynolds numbers. Extrapolation of low-dimensional models by use of straightforward Reynolds-number-based parameter continuation has proven to be inadequate for successful prediction of flows. A key part of constructing a dynamical system that accurately represents the temporal evolution of the flow dynamics over a range of Reynolds numbers is understanding and providing for the variation of the coefficients of the dynamical system with the Reynolds number. Prior methods do not enable capture of temporal dynamics over ranges of Reynolds numbers in low-dimensional models, and are not satisfactory even when large numbers of modes are used. The basic idea of the present method is to solve the problem through a suitable parameterization of the coefficients of the dynamical system. The parameterization computations utilize the transfer of kinetic energy between modes as a function of Reynolds number. The thus-parameterized dynamical system accurately predicts the flow dynamics and is applicable to a range of flow problems in the dynamical regime around the Hopf bifurcation. Parameter-continuation software can be used on the parameterized dynamical system to derive a bifurcation diagram that accurately predicts the temporal flow behavior.

  16. Supercolor coding methods for large-scale multiplexing of biochemical assays.

    PubMed

    Rajagopal, Aditya; Scherer, Axel; Homyk, Andrew; Kartalov, Emil

    2013-08-20

    We present a novel method for the encoding and decoding of multiplexed biochemical assays. The method enables a theoretically unlimited number of independent targets to be detected and uniquely identified, in any combination, in the same sample. For example, the method offers easy access to 12-plex and larger PCR assays, in contrast to current 4-plex assays. This advancement would allow large panels of tests to be run simultaneously in the same sample, saving reagents, time, consumables, and manual labor, while also avoiding the traditional loss of sensitivity due to sample aliquoting. Thus, the presented method is a major technological breakthrough with far-reaching impact on biotechnology, biomedical science, and clinical diagnostics. Herein, we present the mathematical theory behind the method as well as its experimental proof of principle using Taqman PCR on sequences specific to infectious diseases.
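    A toy version of combinatorial color coding: with n distinguishable dyes, the 2ⁿ − 1 nonempty dye subsets give unique labels, so 4 dyes already exceed conventional 4-plex. The dye names are illustrative, and the published scheme is more general than this subset enumeration:

```python
from itertools import combinations

def color_codes(dyes):
    """All 2**n - 1 nonempty dye subsets, usable as unique target labels."""
    return [frozenset(c)
            for k in range(1, len(dyes) + 1)
            for c in combinations(dyes, k)]

def decode(observed_channels, codebook):
    """Map an observed set of fluorescent channels back to its code."""
    s = frozenset(observed_channels)
    return s if s in codebook else None

dyes = ["FAM", "HEX", "ROX", "Cy5"]   # illustrative dye names
codes = color_codes(dyes)
print(len(codes))                     # 15 distinct labels from 4 dyes

# A target labeled with FAM + ROX is identified by exactly that channel set.
assert decode({"FAM", "ROX"}, set(codes)) == frozenset({"FAM", "ROX"})
```

    Because every subset is distinct, any combination of targets present in the sample maps to a unique set of observed channel combinations.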

  17. Three-Component Reaction Discovery Enabled by Mass Spectrometry of Self-Assembled Monolayers

    PubMed Central

    Montavon, Timothy J.; Li, Jing; Cabrera-Pardo, Jaime R.; Mrksich, Milan; Kozmin, Sergey A.

    2011-01-01

    Multi-component reactions have been extensively employed in many areas of organic chemistry. Despite significant progress, the discovery of such enabling transformations remains challenging. Here, we present the development of a parallel, label-free reaction-discovery platform, which can be used for identification of new multi-component transformations. Our approach is based on the parallel mass spectrometric screening of interfacial chemical reactions on arrays of self-assembled monolayers. This strategy enabled the identification of a simple organic phosphine that can catalyze a previously unknown condensation of siloxy alkynes, aldehydes and amines to produce 3-hydroxy amides with high efficiency and diastereoselectivity. The reaction was further optimized using solution phase methods. PMID:22169871

  18. Concept of a staged FEL enabled by fast synchrotron radiation cooling of laser-plasma accelerated beam by solenoidal magnetic fields in plasma bubble

    NASA Astrophysics Data System (ADS)

    Seryi, Andrei; Lesz, Zsolt; Andreev, Alexander; Konoplev, Ivan

    2017-03-01

    A novel method for generating GigaGauss solenoidal fields in a laser-plasma bubble, using screw-shaped laser pulses, has been recently presented. Such magnetic fields enable fast synchrotron radiation cooling of the beam emittance of laser-plasma accelerated leptons. This recent finding opens a novel approach for design of laser-plasma FELs or colliders, where the acceleration stages are interleaved with laser-plasma emittance cooling stages. In this concept paper, we present an outline of what a staged plasma-acceleration FEL could look like, and discuss further studies needed to investigate the feasibility of the concept in detail.

  19. NXE pellicle: development update

    NASA Astrophysics Data System (ADS)

    Brouns, Derk; Bendiksen, Aage; Broman, Par; Casimiri, Eric; Colsters, Paul; de Graaf, Dennis; Harrold, Hilary; Hennus, Piet; Janssen, Paul; Kramer, Ronald; Kruizinga, Matthias; Kuntzel, Henk; Lafarre, Raymond; Mancuso, Andrea; Ockwell, David; Smith, Daniel; van de Weg, David; Wiley, Jim

    2016-09-01

    ASML introduced the NXE pellicle concept, a removable pellicle solution that is compatible with current and future patterned-mask inspection methods. We will present how we have taken the idea from concept to a demonstrated solution, enabling the industry to use EUV pellicles in high-volume manufacturing. We will give an update on the development of the next generation of pellicle films with higher power capability. Further, we will provide an update on the top-level requirements for pellicles and the external interface requirements needed to support NXE pellicle adoption at a mask shop. Finally, we will present ASML's pellicle handling equipment, which enables pellicle use at mask shops, and our NXE pellicle roadmap outlining future improvements.

  20. Printable semiconductor structures and related methods of making and assembling

    DOEpatents

    Nuzzo, Ralph G.; Rogers, John A.; Menard, Etienne; Lee, Keon Jae; Khang; , Dahl-Young; Sun, Yugang; Meitl, Matthew; Zhu, Zhengtao; Ko, Heung Cho; Mack, Shawn

    2013-03-12

    The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.

  1. Printable semiconductor structures and related methods of making and assembling

    DOEpatents

    Nuzzo, Ralph G [Champaign, IL; Rogers, John A [Champaign, IL; Menard, Etienne [Durham, NC; Lee, Keon Jae [Tokyo, JP; Khang, Dahl-Young [Urbana, IL; Sun, Yugang [Westmont, IL; Meitl, Matthew [Raleigh, NC; Zhu, Zhengtao [Rapid City, SD; Ko, Heung Cho [Urbana, IL; Mack, Shawn [Goleta, CA

    2011-10-18

    The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.

  2. Printable semiconductor structures and related methods of making and assembling

    DOEpatents

    Nuzzo, Ralph G.; Rogers, John A.; Menard, Etienne; Lee, Keon Jae; Khang, Dahl-Young; Sun, Yugang; Meitl, Matthew; Zhu, Zhengtao; Ko, Heung Cho; Mack, Shawn

    2010-09-21

    The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.

  3. Neurometric assessment of intraoperative anesthetic

    DOEpatents

    Kangas, Lars J.; Keller, Paul E.

    1998-01-01

    The present invention is a method and apparatus for collecting EEG data, reducing the EEG data into coefficients, correlating those coefficients with a depth of unconsciousness or anesthetic depth, and obtaining a bounded first derivative of anesthetic depth to indicate trends. The present invention provides an artificial-neural-network-based method capable of continuously analyzing EEG data to discriminate between awake and anesthetized states in an individual and of continuously monitoring anesthetic depth trends in real time. The present invention enables an anesthesiologist to respond immediately to changes in the anesthetic depth of the patient during surgery and to administer the correct amount of anesthetic.

  4. State estimation for advanced control of wave energy converters

    DOE Data Explorer

    Coe, Ryan; Bacelli, Giorgio

    2017-04-25

    A report on state estimation for advanced control of wave energy converters (WECs), with supporting data models and slides from the overview presentation. The methods discussed are intended for use to enable real-time closed loop control of WECs.

  5. Quantitative image analysis for evaluating the coating thickness and pore distribution in coated small particles.

    PubMed

    Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K

    2009-04-01

    This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with Confocal Laser Scanning Microscopy (CLSM). The coating thicknesses were determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of the porosity, thickness, and pore size distribution of a coating. These parameters are considered the important coating properties critical to coating functionality. Additionally, the effect of coating process variations on coating quality can be straightforwardly assessed. By enabling good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, ultimately enabling process tailoring.
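    A minimal sketch of measuring coating thickness along the particle perimeter from a segmented (binary) coating image, here by casting rays from the particle centre on a synthetic annulus. This is a simplified stand-in for the CLSM analysis, not the paper's MATLAB implementation:

```python
import numpy as np

def thickness_stats(mask, n_angles=360):
    """Coating thickness along the particle perimeter, measured by
    counting coating pixels along rays cast from the image centre.
    Returns (min, mean, span) of the per-angle thickness in pixels."""
    cy, cx = mask.shape[0] / 2.0, mask.shape[1] / 2.0
    r = np.arange(0, int(min(mask.shape) / 2))
    per_angle = []
    for theta in np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False):
        ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, mask.shape[0] - 1)
        xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, mask.shape[1] - 1)
        per_angle.append(int(mask[ys, xs].sum()))
    t = np.asarray(per_angle, dtype=float)
    return t.min(), t.mean(), t.max() - t.min()

# Synthetic particle: a uniform 10 px coating (annulus between radii 30 and 40).
yy, xx = np.mgrid[:128, :128]
rr = np.hypot(yy - 64, xx - 64)
coating = (rr >= 30) & (rr < 40)
tmin, tmean, tspan = thickness_stats(coating)
print(f"mean thickness ~ {tmean:.1f} px")  # close to 10 px
```

    The per-angle values feed directly into the kind of statistics the paper reports, such as the minimum thickness and the span of the thickness distribution.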

  6. Stimulated Raman scattering (SRS) spectroscopic OCT (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Robles, Francisco E.; Zhou, Kevin C.; Fischer, Martin C.; Warren, Warren S.

    2017-02-01

    Optical coherence tomography (OCT) enables non-invasive, high-resolution, tomographic imaging of biological tissues by leveraging principles of low coherence interferometry; however, OCT lacks molecular specificity. Spectroscopic OCT (SOCT) overcomes this limitation by providing depth-resolved spectroscopic signatures of chromophores, but SOCT has been limited to a couple of endogenous molecules, namely hemoglobin and melanin. Stimulated Raman scattering, on the other hand, can provide highly specific molecular information of many endogenous species, but lacks the spatial and spectral multiplexing capabilities of SOCT. In this work we integrate the two methods, SRS and SOCT, to enable simultaneously multiplexed spatial and spectral imaging with sensitivity to many endogenous biochemical species that play an important role in biology and medicine. The method, termed SRS-SOCT, has the potential to achieve fast, volumetric, and highly sensitive label-free molecular imaging, which would be valuable for many applications. We demonstrate the approach by imaging excised human adipose tissue and detecting the lipids' Raman signatures in the high-wavenumber region. Details of this method along with validations and results will be presented.

  7. MorphDB: Prioritizing Genes for Specialized Metabolism Pathways and Gene Ontology Categories in Plants.

    PubMed

    Zwaenepoel, Arthur; Diels, Tim; Amar, David; Van Parys, Thomas; Shamir, Ron; Van de Peer, Yves; Tzfadia, Oren

    2018-01-01

    Recent times have seen an enormous growth of "omics" data, of which high-throughput gene expression data are arguably the most important from a functional perspective. Despite huge improvements in computational techniques for the functional classification of gene sequences, common similarity-based methods often fall short of providing full and reliable functional information. Recently, the combination of comparative genomics with approaches in functional genomics has received considerable interest for gene function analysis, leveraging both gene expression based guilt-by-association methods and annotation efforts in closely related model organisms. Besides the identification of missing genes in pathways, these methods also typically enable the discovery of biological regulators (i.e., transcription factors or signaling genes). A previously built guilt-by-association method is MORPH, which was proven to be an efficient algorithm that performs particularly well in identifying and prioritizing missing genes in plant metabolic pathways. Here, we present MorphDB, a resource where MORPH-based candidate genes for large-scale functional annotations (Gene Ontology, MapMan bins) are integrated across multiple plant species. Besides a gene centric query utility, we present a comparative network approach that enables researchers to efficiently browse MORPH predictions across functional gene sets and species, facilitating efficient gene discovery and candidate gene prioritization. MorphDB is available at http://bioinformatics.psb.ugent.be/webtools/morphdb/morphDB/index/. We also provide a toolkit, named "MORPH bulk" (https://github.com/arzwa/morph-bulk), for running MORPH in bulk mode on novel data sets, enabling researchers to apply MORPH to their own species of interest.

  8. X-ray Moiré deflectometry using synthetic reference images

    DOE PAGES

    Stutman, Dan; Valdivia, Maria Pia; Finkenthal, Michael

    2015-06-25

    Moiré fringe deflectometry with grating interferometers is a technique that enables refraction-based x-ray imaging using a single exposure of an object. To obtain the refraction image, the method requires a reference fringe pattern (without the object). Our study shows that, in order to avoid artifacts, the reference pattern must be exactly matched in phase with the object fringe pattern. In experiments, however, it is difficult to produce a perfectly matched reference pattern due to unavoidable interferometer drifts. We present a simple method to obtain matched reference patterns using a phase-scan procedure to generate synthetic Moiré images. As a result, the method will enable deflectometric diagnostics of transient phenomena such as laser-produced plasmas and could improve the sensitivity and accuracy of medical phase-contrast imaging.
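
    The core of phase-matching a synthetic reference can be illustrated in one dimension: fit the offset, amplitude and phase of a measured fringe pattern by linear least squares (assuming the carrier frequency is known from the grating geometry) and synthesize a reference with exactly that phase. This is a hedged sketch, not the published phase-scan algorithm:

```python
import numpy as np

# 1-D fringe pattern I(x) = a + b*cos(k*x + phi); k is assumed known.
x = np.linspace(0, 1, 500)
k = 2 * np.pi * 20                        # 20 fringes across the field
a_true, b_true, phi_true = 1.0, 0.4, 0.7
fringes = a_true + b_true * np.cos(k * x + phi_true)

# cos(kx + phi) = cos(phi)cos(kx) - sin(phi)sin(kx), so the model is
# linear in [1, cos(kx), sin(kx)] and solvable by least squares.
A = np.column_stack([np.ones_like(x), np.cos(k * x), np.sin(k * x)])
c0, c1, c2 = np.linalg.lstsq(A, fringes, rcond=None)[0]

b_fit = np.hypot(c1, c2)
phi_fit = np.arctan2(-c2, c1)

# Synthetic reference, exactly phase-matched to the measured pattern.
reference = c0 + b_fit * np.cos(k * x + phi_fit)
```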

  9. [Biocybernetic approach to the thermometric methods of blood supply measurements of periodontal tissues].

    PubMed

    Pastusiak, J; Zakrzewski, J

    1988-11-01

    A biocybernetic approach to the determination of the blood supply of periodontal tissues by means of thermometric methods is presented. Compartment models of the measuring procedure are given, and a dilutodynamic methodology and classification are applied. Such an approach enables the selection of appropriate biophysical parameters describing the state of blood supply of periodontal tissues, as well as the optimal design of transducers and measuring methods.

  10. A Unified Approach to Genotype Imputation and Haplotype-Phase Inference for Large Data Sets of Trios and Unrelated Individuals

    PubMed Central

    Browning, Brian L.; Browning, Sharon R.

    2009-01-01

    We present methods for imputing data for ungenotyped markers and for inferring haplotype phase in large data sets of unrelated individuals and parent-offspring trios. Our methods make use of known haplotype phase when it is available, and our methods are computationally efficient so that the full information in large reference panels with thousands of individuals is utilized. We demonstrate that substantial gains in imputation accuracy accrue with increasingly large reference panel sizes, particularly when imputing low-frequency variants, and that unphased reference panels can provide highly accurate genotype imputation. We place our methodology in a unified framework that enables the simultaneous use of unphased and phased data from trios and unrelated individuals in a single analysis. For unrelated individuals, our imputation methods produce well-calibrated posterior genotype probabilities and highly accurate allele-frequency estimates. For trios, our haplotype-inference method is four orders of magnitude faster than the gold-standard PHASE program and has excellent accuracy. Our methods enable genotype imputation to be performed with unphased trio or unrelated reference panels, thus accounting for haplotype-phase uncertainty in the reference panel. We present a useful measure of imputation accuracy, allelic R2, and show that this measure can be estimated accurately from posterior genotype probabilities. Our methods are implemented in version 3.0 of the BEAGLE software package. PMID:19200528
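
    The expected-dosage bookkeeping behind such accuracy measures can be sketched as follows. This uses one widely used estimator of imputation quality (variance of expected dosages over the Hardy-Weinberg variance 2p(1-p)); it is not necessarily the exact allelic R2 formula implemented in BEAGLE, and the probabilities are hypothetical:

```python
import numpy as np

# Posterior genotype probabilities per individual: P(0), P(1), P(2) copies
# of the alternate allele (rows sum to 1). Hypothetical values.
post = np.array([
    [0.90, 0.09, 0.01],
    [0.05, 0.90, 0.05],
    [0.01, 0.09, 0.90],
    [0.80, 0.15, 0.05],
])

dosage = post @ np.array([0.0, 1.0, 2.0])   # expected allele count
p_hat = dosage.mean() / 2                   # estimated allele frequency

# Ratio of the variance of the expected dosages to the variance expected
# under Hardy-Weinberg equilibrium; values near 1 indicate confident
# imputation (the estimator can slightly exceed 1 in small samples).
allelic_r2 = dosage.var() / (2 * p_hat * (1 - p_hat))
```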

  11. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introducing new methods to clinics requires basic tools for the collection and communication of the resulting data. Our aim is to develop tools that, with minimal interference, offer new opportunities to enhance current interview-based cognitive examinations. We suggest methods, and discuss the process, by which established cognitive tests can be adapted for digital data collection on pen-enabled tablets. We discuss a number of methods for evaluating the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform: it represents a foundation for further collaborative development by the wider research community and enthusiasts, and it is open-source and free to download and use. In summary, we introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology, enabling the adaptation of technology for cognitive examination in clinical settings. These tools provide the first step in a possible transition toward standardized mental state examination using computers.
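
    As an illustration of the kind of objective features a pen-enabled tablet makes available beyond visual inspection, the sketch below computes ink time, path length, and pen lifts from timestamped pen samples. This is a minimal sketch; the sample format and metric names are assumptions, not the CogExTools API:

```python
import math

# Hypothetical pen samples from a digitizing tablet: (t_seconds, x, y, pen_down).
samples = [
    (0.00, 10.0, 10.0, True),
    (0.05, 13.0, 14.0, True),
    (0.10, 17.0, 17.0, True),
    (0.30, 40.0, 40.0, False),   # pen lifted and moved
    (0.35, 42.0, 41.0, True),
    (0.40, 45.0, 45.0, True),
]

def stroke_metrics(samples):
    """Objective drawing features: time with ink flowing, drawn path
    length, and number of pen lifts."""
    ink_time = path_len = 0.0
    lifts = 0
    for (t0, x0, y0, d0), (t1, x1, y1, d1) in zip(samples, samples[1:]):
        if d0 and d1:                    # pen stayed on the surface
            ink_time += t1 - t0
            path_len += math.hypot(x1 - x0, y1 - y0)
        if d0 and not d1:                # pen left the surface
            lifts += 1
    return {"ink_time": ink_time, "path_length": path_len, "pen_lifts": lifts}

metrics = stroke_metrics(samples)
```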

  12. Recent Enhancements to the Development of CFD-Based Aeroelastic Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    2007-01-01

    Recent enhancements to the development of CFD-based unsteady aerodynamic and aeroelastic reduced-order models (ROMs) are presented. These enhancements include the simultaneous application of structural modes as CFD input, static aeroelastic analysis using a ROM, and matched-point solutions using a ROM. The simultaneous application of structural modes as CFD input enables the computation of the unsteady aerodynamic state-space matrices with a single CFD execution, independent of the number of structural modes. The responses obtained from a simultaneous excitation of the CFD-based unsteady aerodynamic system are processed using system identification techniques in order to generate an unsteady aerodynamic state-space ROM. Once the unsteady aerodynamic state-space ROM is generated, a method for computing the static aeroelastic response using this unsteady aerodynamic ROM and a state-space model of the structure is presented. Finally, a method is presented that enables the computation of matched-point solutions using a single ROM that is applicable over a range of dynamic pressures and velocities for a given Mach number. These enhancements represent a significant advancement of unsteady aerodynamic and aeroelastic ROM technology.

  13. New Method to Study the Vibrational Modes of Biomolecules in the Terahertz Range Based on a Single-Stage Raman Spectrometer.

    PubMed

    Kalanoor, Basanth S; Ronen, Maria; Oren, Ziv; Gerber, Doron; Tischler, Yaakov R

    2017-03-31

    The low-frequency vibrational (LFV) modes of biomolecules reflect specific intramolecular and intermolecular thermally induced fluctuations that are driven by external perturbations, such as ligand binding, protein interaction, electron transfer, and enzymatic activity. Large efforts have been invested over the years to develop methods to access the LFV modes due to their importance in the studies of the mechanisms and biological functions of biomolecules. Here, we present a method to measure the LFV modes of biomolecules based on Raman spectroscopy that combines volume holographic filters with a single-stage spectrometer, to obtain high signal-to-noise-ratio spectra in short acquisition times. We show that this method enables LFV mode characterization of biomolecules even in a hydrated environment. The measured spectra exhibit distinct features originating from intra- and/or intermolecular collective motion and lattice modes. The observed modes are highly sensitive to the overall structure, size, long-range order, and configuration of the molecules, as well as to their environment. Thus, the LFV Raman spectrum acts as a fingerprint of the molecular structure and conformational state of a biomolecule. The comprehensive method we present here is widely applicable, thus enabling high-throughput study of LFV modes of biomolecules.

  14. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework results in improved noise robustness, a decrease in acquisition time, and the ability to take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential of single-pixel imaging.
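
    A minimal frame-theoretic reconstruction can be sketched as follows: with more patterns than pixels, the rows of the measurement matrix form an overcomplete frame, and the canonical dual-frame reconstruction reduces to a least-squares solve via the pseudoinverse. This is an illustrative sketch, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                  # pixels in the (flattened) scene
m = 96                                  # number of structured patterns

# Measurement frame: each row is one illumination pattern; with m > n the
# rows form an overcomplete frame for R^n, which improves noise robustness.
patterns = rng.standard_normal((m, n))
scene = rng.random(n)

# The single-pixel detector records one inner product per pattern.
measurements = patterns @ scene

# Canonical dual-frame reconstruction = least squares via the pseudoinverse.
recon = np.linalg.pinv(patterns) @ measurements
```

    With noisy measurements the same pseudoinverse averages the noise over the redundant patterns, which is the robustness benefit of an overcomplete frame.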

  15. METHODOLOGY FOR MEASURING PM 2.5 SEPARATOR CHARACTERISTICS USING AN AEROSIZER

    EPA Science Inventory

    A method is presented that enables the measurement of the particle size separation characteristics of an inertial separator in a rapid fashion. Overall penetration is determined for discrete particle sizes using an Aerosizer (Model LD, TSI, Incorporated, Particle Instruments/Am...

  16. Modelling of percolation rate of stormwater from underground infiltration systems.

    PubMed

    Burszta-Adamiak, Ewa; Lomotowski, Janusz

    2013-01-01

    Underground or surface stormwater storage tank systems that enable the infiltration of water into the ground are basic elements used in Sustainable Urban Drainage Systems (SUDS). So far, the design methods for such facilities have not taken into account the phenomenon of ground clogging during stormwater infiltration. Top layer sealing of the filter bed influences the infiltration rate of water into the ground. This study presents an original mathematical model describing changes in the infiltration rate variability in the phases of filling and emptying the storage and infiltration tank systems, which enables the determination of the degree of top ground layer clogging. The input data for modelling were obtained from studies conducted on experimental sites on objects constructed on a semi-technological scale. The experiment conducted has proven that the application of the model developed for the phase of water infiltration enables us to estimate the degree of module clogging. However, this method is more suitable for reservoirs embedded in more permeable soils than for those located in cohesive soils.

  17. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples.

    PubMed

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-07-05

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.

  18. Defining and Enabling Resiliency of Electric Distribution Systems With Multiple Microgrids

    DOE PAGES

    Chanda, Sayonsom; Srivastava, Anurag K.

    2016-05-02

    This paper presents a method for quantifying and enabling the resiliency of a power distribution system (PDS) using the analytical hierarchical process and percolation theory. Using this metric, quantitative analysis can be performed to assess the impact of possible control decisions and to pro-actively enable the resilient operation of a distribution system with multiple microgrids and other resources. The developed resiliency metric can also be used in short-term distribution system planning. The ability to quantify resiliency can help distribution system planning engineers and operators justify control actions, compare different reconfiguration algorithms, and develop proactive control actions to avert power system outages due to impending catastrophic weather or other adverse events. Validation of the proposed method is performed using modified CERTS microgrids and a modified industrial distribution system. Simulation results show a topological metric and a composite metric that considers power system characteristics to quantify the resiliency of a distribution system with the proposed methodology, as well as improvements in resiliency obtained using a two-stage reconfiguration algorithm and multiple microgrids.
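
    The analytical-hierarchical-process half of such a metric can be sketched in a few lines: priority weights are the normalized principal eigenvector of a pairwise-comparison matrix. The criteria below are hypothetical, and the percolation-theory component is omitted:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three resiliency criteria
# (e.g. redundancy, microgrid availability, restoration speed), using the
# usual reciprocal 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP priority weights = principal eigenvector, normalized to sum to 1
# (dividing by the sum also fixes the eigenvector's arbitrary sign).
vals, vecs = np.linalg.eig(A)
principal = np.real(vecs[:, np.argmax(np.real(vals))])
weights = principal / principal.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); small CI means the
# pairwise judgments are close to internally consistent.
n = A.shape[0]
lam_max = np.max(np.real(vals))
ci = (lam_max - n) / (n - 1)
```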

  19. Direct on-chip DNA synthesis using electrochemically modified gold electrodes as solid support

    NASA Astrophysics Data System (ADS)

    Levrie, Karen; Jans, Karolien; Schepers, Guy; Vos, Rita; Van Dorpe, Pol; Lagae, Liesbet; Van Hoof, Chris; Van Aerschot, Arthur; Stakenborg, Tim

    2018-04-01

    DNA microarrays have propelled important advancements in the field of genomic research by enabling the monitoring of thousands of genes in parallel. The throughput can be increased even further by scaling down the microarray feature size. In this respect, microelectronics-based DNA arrays are promising as they can leverage semiconductor processing techniques with lithographic resolutions. We propose a method that enables the use of metal electrodes for de novo DNA synthesis without the need for an insulating support. By electrochemically functionalizing gold electrodes, these electrodes can act as solid support for phosphoramidite-based synthesis. The proposed method relies on the electrochemical reduction of diazonium salts, enabling site-specific incorporation of hydroxyl groups onto the metal electrodes. An automated DNA synthesizer was used to couple phosphoramidite moieties directly onto the OH-modified electrodes to obtain the desired oligonucleotide sequence. Characterization was done via cyclic voltammetry and fluorescence microscopy. Our results present a valuable proof-of-concept for the integration of solid-phase DNA synthesis with microelectronics.

  20. A facile route towards large area self-assembled nanoscale silver film morphologies and their applications towards metal enhanced fluorescence

    DOE PAGES

    Hohenberger, Erik; Freitag, Nathan; Rosenmann, Daniel; ...

    2017-04-19

    Here, we present a facile method for fabricating nanostructured silver films containing a high density of nanoscopic gap features through a surface-directed phenomenon utilizing nanoporous scaffolds rather than traditional lithographic patterning processes. This method enables tunability of the silver film growth by simply adjusting the formulation and processing conditions of the nanoporous film prior to metallization. We further demonstrate that this process can produce nanoscopic gaps in thick (100 nm) silver films supporting localized surface plasmon resonance with large field amplification within the gaps while enabling the launching of propagating surface plasmons within the silver grains. These enhanced fields provide metal enhanced fluorescence with enhancement factors as high as 21 times compared to glass, as well as enable visualization of single-fluorophore emission. This work provides a low-cost, rapid approach for producing novel nanostructures capable of broadband fluorescence amplification, with potential applications including plasmonic and fluorescence-based optical sensing and imaging.

  1. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    PubMed

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment.

  2. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE PAGES

    Moran, James; Alexander, Thomas; Aalseth, Craig; ...

    2017-01-26

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. Here, we present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We also identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. Furthermore, this enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment.

  3. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor.

    PubMed

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-Ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-03-05

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it assumes that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimating the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons generated by each pixel's photodiode among multiple exposure taps, and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes.
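
    The classical three-light photometric stereo step that underlies the method can be sketched for a single Lambertian pixel: solve the linear system relating known lighting directions to observed intensities, then split the solution into albedo and normal. The lighting directions below are assumptions for illustration, not the authors' multi-tap setup:

```python
import numpy as np

# Known, linearly independent lighting directions (one per exposure tap).
L = np.array([
    [0.0, 0.0, 1.0],
    [0.5, 0.0, np.sqrt(0.75)],
    [0.0, 0.5, np.sqrt(0.75)],
])

# Ground truth for the check: albedo rho and a unit surface normal.
rho = 0.8
n_true = np.array([0.3, -0.2, np.sqrt(0.87)])

# Lambertian intensities observed by one pixel under the three lights.
I = rho * (L @ n_true)

# Classical photometric stereo: solve L g = I, then factor g into the
# albedo (its norm) and the surface normal (its direction).
g = np.linalg.solve(L, I)
rho_est = np.linalg.norm(g)
n_est = g / rho_est
```

    In practice this solve is repeated per pixel; the multi-tap sensor's contribution is that the three differently lit images are captured with almost identical timing, so the static-scene assumption holds even for moving objects.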

  4. Computationally efficient approach for solving time dependent diffusion equation with discrete temporal convolution applied to granular particles of battery electrodes

    NASA Astrophysics Data System (ADS)

    Senegačnik, Jure; Tavčar, Gregor; Katrašnik, Tomaž

    2015-03-01

    The paper presents a computationally efficient method for solving the time dependent diffusion equation in a granule of the Li-ion battery's granular solid electrode. The method, called Discrete Temporal Convolution method (DTC), is based on a discrete temporal convolution of the analytical solution of the step function boundary value problem. This approach enables modelling concentration distribution in the granular particles for arbitrary time dependent exchange fluxes that do not need to be known a priori. It is demonstrated in the paper that the proposed method features faster computational times than finite volume/difference methods and Padé approximation at the same accuracy of the results. It is also demonstrated that all three addressed methods feature higher accuracy compared to the quasi-steady polynomial approaches when applied to simulate the current densities variations typical for mobile/automotive applications. The proposed approach can thus be considered as one of the key innovative methods enabling real-time capability of the multi particle electrochemical battery models featuring spatial and temporal resolved particle concentration profiles.
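
    The superposition idea behind DTC can be sketched with a generic linear step response. The exponential below is an illustrative stand-in for the analytical spherical-diffusion solution, and a flux with a single step change is used so the result is easy to check; arbitrary flux histories are handled identically:

```python
import numpy as np

def step_response(t):
    """Response to a unit step in flux. Illustrative first-order stand-in;
    the paper uses the analytical series solution for diffusion in a sphere."""
    return np.where(t >= 0, 1.0 - np.exp(-t), 0.0)

# Uniform time grid and an exchange flux that is not known a priori:
# here it steps from 2.0 down to 0.5 at t = 5.
t = np.linspace(0.0, 10.0, 1001)
flux = np.where(t < 5.0, 2.0, 0.5)

# Discrete Temporal Convolution: the diffusion equation is linear, so the
# response to an arbitrary flux history is the superposition of step
# responses weighted by the discrete flux increments (discrete Duhamel sum).
dflux = np.diff(flux, prepend=0.0)
response = np.zeros_like(t)
for k, dk in enumerate(dflux):
    if dk != 0.0:
        response += dk * step_response(t - t[k])
```

    Because each step response is analytical, no spatial discretization of the granule is needed, which is where the speed advantage over finite-volume and finite-difference schemes comes from.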

  5. A method for validation of finite element forming simulation on basis of a pointwise comparison of distance and curvature

    NASA Astrophysics Data System (ADS)

    Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank

    2016-10-01

    Thermoforming of continuously fiber-reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented which enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus/Explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by the proposed validation method.

  6. Enabling fast charging - A battery technology gap assessment

    NASA Astrophysics Data System (ADS)

    Ahmed, Shabbir; Bloom, Ira; Jansen, Andrew N.; Tanim, Tanvir; Dufek, Eric J.; Pesaran, Ahmad; Burnham, Andrew; Carlson, Richard B.; Dias, Fernando; Hardy, Keith; Keyser, Matthew; Kreuzer, Cory; Markel, Anthony; Meintz, Andrew; Michelbacher, Christopher; Mohanpurkar, Manish; Nelson, Paul A.; Robertson, David C.; Scoffield, Don; Shirk, Matthew; Stephens, Thomas; Vijayagopal, Ram; Zhang, Jiucai

    2017-11-01

    The battery technology literature is reviewed, with an emphasis on key elements that limit extreme fast charging. Key gaps in existing elements of the technology are presented, as well as developmental needs. Among these needs are: advanced models and methods to detect and prevent lithium plating; new positive-electrode materials which are less prone to stress-induced failure; better electrode designs to accommodate very rapid diffusion in and out of the electrode; measurement of temperature distributions during fast charge to enable and validate models; and thermal management and pack designs that accommodate the higher operating voltage.

  7. Investigation of metabolic objectives in cultured hepatocytes.

    PubMed

    Uygun, Korkut; Matthew, Howard W T; Huang, Yinlun

    2007-06-15

    Using optimization based methods to predict fluxes in metabolic flux balance models has been a successful approach for some microorganisms, enabling construction of in silico models and even inference of some regulatory motifs. However, this success has not been translated to mammalian cells. The lack of knowledge about metabolic objectives in mammalian cells is a major obstacle that prevents utilization of various metabolic engineering tools and methods for tissue engineering and biomedical purposes. In this work, we investigate and identify possible metabolic objectives for hepatocytes cultured in vitro. To achieve this goal, we present a special data-mining procedure for identifying metabolic objective functions in mammalian cells. This multi-level optimization based algorithm enables identifying the major fluxes in the metabolic objective from MFA data in the absence of information about critical active constraints of the system. Further, once the objective is determined, active flux constraints can also be identified and analyzed. This information can be potentially used in a predictive manner to improve cell culture results or clinical metabolic outcomes. As a result of the application of this method, it was found that in vitro cultured hepatocytes maximize oxygen uptake, coupling of urea and TCA cycles, and synthesis of serine and urea. Selection of these fluxes as the metabolic objective enables accurate prediction of the flux distribution in the system given a limited amount of flux data; thus presenting a workable in silico model for cultured hepatocytes. It is observed that an overall homeostasis picture is also emergent in the findings.
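
    The flux-balance substrate of such objective-identification methods can be sketched as a small linear program: maximize a candidate objective flux subject to steady-state stoichiometry and capacity bounds. The toy three-reaction network and the choice of objective below are assumptions for illustration, not the authors' multi-level algorithm:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# uptake, conversion, output). Steady state requires S v = 0.
S = np.array([
    [1.0, -1.0, 0.0],   # A: produced by uptake, consumed by conversion
    [0.0, 1.0, -1.0],   # B: produced by conversion, consumed by output
])

bounds = [(0, 10), (0, 100), (0, 100)]   # uptake capacity limits the system

# Candidate metabolic objective: maximize the output flux v3
# (linprog minimizes, hence the sign flip on the objective).
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
              method="highs")
fluxes = res.x
```

    Objective identification inverts this picture: given measured fluxes, it searches for the objective coefficients (and active constraints) under which the observed flux distribution is optimal.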

  8. Copper(II) acetate promoted intramolecular diamination of unactivated olefins.

    PubMed

    Zabawa, Thomas P; Kasi, Dhanalakshmi; Chemler, Sherry R

    2005-08-17

    A concise method for the synthesis of cyclic sulfamides and vicinal diamines is presented. This method is enabled by Cu(OAc)2 and demonstrates a new transformation for this metal. Both five- and six-membered vicinal diamine-containing heterocycles have been synthesized in good to excellent yields, and substrate-based asymmetric induction has been achieved. This is the first reported example of intramolecular diamination of olefins.

  9. A concept for holistic whole body MRI data analysis, Imiomics

    PubMed Central

    Malmberg, Filip; Johansson, Lars; Lind, Lars; Sundbom, Magnus; Ahlström, Håkan; Kullberg, Joel

    2017-01-01

    Purpose: To present and evaluate a whole-body image analysis concept, Imiomics (imaging–omics), and an image registration method that enables Imiomics analyses by deforming all image data to a common coordinate system, so that the information in each voxel can be compared between persons, or within a person over time, and integrated with non-imaging data. Methods: The presented image registration method utilizes relative elasticity constraints of different tissues obtained from whole-body water-fat MRI. The registration method is evaluated by inverse consistency and Dice coefficients, and the Imiomics concept is evaluated by example analyses of importance for metabolic research using non-imaging parameters for which the expected outcome is known. The example analyses include whole-body imaging atlas creation, anomaly detection, and cross-sectional and longitudinal analysis. Results: The image registration method evaluation on 128 subjects shows low inverse consistency errors and high Dice coefficients. Also, the statistical atlas of fat content intensity values shows low standard deviations, indicating successful deformation to the common coordinate system. The example analyses show expected associations and correlations which agree with explicit measurements, thereby illustrating the usefulness of the proposed Imiomics concept. Conclusions: The registration method is well-suited for Imiomics analyses, which enable analyses of relationships to non-imaging data, e.g. clinical data, in new types of holistic, targeted and untargeted big-data analysis. PMID:28241015
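
    The voxel-wise half of an Imiomics-style analysis can be sketched directly: once every scan is deformed to the common coordinate system, correlate each voxel's intensity with a non-imaging variable across subjects. The sketch below uses synthetic data with one planted association and assumes the registration step is already done:

```python
import numpy as np

rng = np.random.default_rng(1)

# After registration, every subject's scan lives in a common coordinate
# system: one intensity per voxel per subject (here 30 subjects on a toy
# 4x4x4 volume), plus one non-imaging (e.g. clinical) value per subject.
n_subj = 30
volumes = rng.standard_normal((n_subj, 4, 4, 4))
clinical = rng.standard_normal(n_subj)

# Plant a true association in one voxel so the result is checkable.
volumes[:, 2, 2, 2] = 0.9 * clinical + 0.1 * rng.standard_normal(n_subj)

# Voxel-wise Pearson correlation between intensity and the clinical variable.
v_c = volumes.reshape(n_subj, -1) - volumes.reshape(n_subj, -1).mean(axis=0)
c_c = clinical - clinical.mean()
corr = (v_c * c_c[:, None]).sum(axis=0) / (
    np.sqrt((v_c ** 2).sum(axis=0)) * np.sqrt((c_c ** 2).sum())
)
corr_map = corr.reshape(4, 4, 4)
```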

  10. Measurement in Physical Education. 5th Edition.

    ERIC Educational Resources Information Center

    Mathews, Donald K.

    Concepts of measurement in physical education are presented in this college-level text to enable the preservice physical education major to develop skills in determining pupil status, designing effective physical activity programs, and measuring student progress. Emphasis is placed upon discussion of essential statistical methods, test…

  11. Euclid and Descartes: A Partnership.

    ERIC Educational Resources Information Center

    Wasdovich, Dorothy Hoy

    1991-01-01

    Presented is a method of reorganizing a high school geometry course to integrate coordinate geometry together with Euclidean geometry at an earlier stage in the course, thus enabling students to prove subsequent theorems from either perspective. Several examples contrasting different proofs from both perspectives are provided. (MDH)

  12. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each solution in a uniform and concise manner, enabling users to identify those available for their experimental needs. The website also enables user feedback, evaluations, and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software; the aim of such a toolbox is to help users find solutions, and to give developers a way to exchange and communicate about their work.

  13. Blood flow estimation in gastroscopic true-color images

    NASA Astrophysics Data System (ADS)

    Jacoby, Raffael S.; Herpers, Rainer; Zwiebel, Franz M.; Englmeier, Karl-Hans

    1995-05-01

    The assessment of blood flow in the gastrointestinal mucosa might be an important factor for the diagnosis and treatment of several diseases such as ulcers, gastritis, colitis, or early cancer. The quantity of blood flow is roughly estimated by computing the spatial hemoglobin distribution in the mucosa. The presented method enables a practical realization by approximating the hemoglobin concentration from a spectrophotometric analysis of endoscopic true-color images, which are recorded during routine examinations. A system model based on the Kubelka-Munk law of reflectance spectroscopy is derived that enables an estimation of the hemoglobin concentration from the color values of the images. Additionally, a transformation of the color values is developed to improve luminance independence. Applying this transformation and estimating the hemoglobin concentration for each pixel of interest, the hemoglobin distribution can be computed. The obtained results are largely independent of luminance. An initial validation of the presented method is performed by a quantitative estimation of its reproducibility.
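    The abstract does not give the paper's calibrated model, but the core of a Kubelka-Munk-based estimate is the standard remission function F(R) = (1 - R)^2 / (2R), which relates diffuse reflectance to the absorption-to-scattering ratio and hence, in a hemoglobin absorption band, to hemoglobin concentration. A minimal per-pixel sketch (all reflectance values illustrative):

```python
import numpy as np

def km_remission(reflectance):
    """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R).

    F(R) equals the ratio of absorption to scattering coefficients
    (K/S) for an optically thick, diffusely scattering layer such
    as mucosal tissue, so it rises with hemoglobin absorption.
    """
    r = np.clip(np.asarray(reflectance, dtype=float), 1e-6, 1.0)
    return (1.0 - r) ** 2 / (2.0 * r)

# Toy "endoscopic image": per-pixel reflectance in a hemoglobin
# absorption band (e.g. the green channel). Higher F(R) indicates
# stronger absorption, i.e. higher hemoglobin concentration.
reflectance_green = np.array([[0.25, 0.50],
                              [0.75, 0.90]])
hb_index = km_remission(reflectance_green)   # F(0.25) = 1.125
print(hb_index)
```

The actual method additionally transforms the color values for luminance independence, which this sketch omits.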

  14. Method and system rapid piece handling

    DOEpatents

    Spletzer, Barry L.

    1996-01-01

    The advent of high-speed fabric cutters has made necessary the development of automated techniques for the collection and sorting of garment pieces into collated piles ready for assembly. The present invention provides a new method for such handling and sorting of garment parts, and an apparatus capable of carrying out this method. The common thread is the application of computer-controlled shuttling bins, capable of picking up a desired piece of fabric and dropping it in collated order for assembly. Such apparatus with appropriate computer control relieves the bottleneck now presented by the sorting and collation procedure, thus greatly increasing the overall rate at which garments can be assembled.

  15. Neural dynamic optimization for control systems. I. Background.

    PubMed

    Seong, C Y; Widrow, B

    2001-01-01

    The paper presents neural dynamic optimization (NDO) as a method of optimal feedback control for nonlinear multi-input-multi-output (MIMO) systems. The main feature of NDO is that it enables neural networks to approximate the optimal feedback solution whose existence dynamic programming (DP) justifies, thereby reducing the complexities of computation and storage problems of the classical methods such as DP. This paper mainly describes the background and motivations for the development of NDO, while the two other subsequent papers of this topic present the theory of NDO and demonstrate the method with several applications including control of autonomous vehicles and of a robot arm, respectively.

  16. Neural dynamic optimization for control systems.III. Applications.

    PubMed

    Seong, C Y; Widrow, B

    2001-01-01

    For pt. II see ibid., p. 490-501. The paper presents neural dynamic optimization (NDO) as a method of optimal feedback control for nonlinear multi-input-multi-output (MIMO) systems. The main feature of NDO is that it enables neural networks to approximate the optimal feedback solution whose existence dynamic programming (DP) justifies, thereby reducing the complexities of computation and storage problems of the classical methods such as DP. This paper demonstrates NDO with several applications including control of autonomous vehicles and of a robot arm, while the two other companion papers of this topic describe the background for the development of NDO and present the theory of the method, respectively.

  17. Aeroelastic Tailoring of Transport Aircraft Wings: State-of-the-Art and Potential Enabling Technologies

    NASA Technical Reports Server (NTRS)

    Jutte, Christine; Stanford, Bret K.

    2014-01-01

    This paper provides a brief overview of the state-of-the-art for aeroelastic tailoring of subsonic transport aircraft and offers additional resources on related research efforts. Emphasis is placed on aircraft having straight or aft swept wings. The literature covers computational synthesis tools developed for aeroelastic tailoring and numerous design studies focused on discovering new methods for passive aeroelastic control. Several new structural and material technologies are presented as potential enablers of aeroelastic tailoring, including selectively reinforced materials, functionally graded materials, fiber tow steered composite laminates, and various nonconventional structural designs. In addition, smart materials and structures whose properties or configurations change in response to external stimuli are presented as potential active approaches to aeroelastic tailoring.

  18. Complete mitochondrial genome sequence of a Middle Pleistocene cave bear reconstructed from ultrashort DNA fragments.

    PubMed

    Dabney, Jesse; Knapp, Michael; Glocke, Isabelle; Gansauge, Marie-Theres; Weihmann, Antje; Nickel, Birgit; Valdiosera, Cristina; García, Nuria; Pääbo, Svante; Arsuaga, Juan-Luis; Meyer, Matthias

    2013-09-24

    Although an inverse relationship is expected in ancient DNA samples between the number of surviving DNA fragments and their length, ancient DNA sequencing libraries are strikingly deficient in molecules shorter than 40 bp. We find that a loss of short molecules can occur during DNA extraction and present an improved silica-based extraction protocol that enables their efficient retrieval. In combination with single-stranded DNA library preparation, this method enabled us to reconstruct the mitochondrial genome sequence from a Middle Pleistocene cave bear (Ursus deningeri) bone excavated at Sima de los Huesos in the Sierra de Atapuerca, Spain. Phylogenetic reconstructions indicate that the U. deningeri sequence forms an early diverging sister lineage to all Western European Late Pleistocene cave bears. Our results prove that authentic ancient DNA can be preserved for hundreds of thousands of years outside of permafrost. Moreover, the techniques presented enable the retrieval of phylogenetically informative sequences from samples in which virtually all DNA is degraded to fragments shorter than 50 bp.

  19. Complete mitochondrial genome sequence of a Middle Pleistocene cave bear reconstructed from ultrashort DNA fragments

    PubMed Central

    Dabney, Jesse; Knapp, Michael; Glocke, Isabelle; Gansauge, Marie-Theres; Weihmann, Antje; Nickel, Birgit; Valdiosera, Cristina; García, Nuria; Pääbo, Svante; Arsuaga, Juan-Luis; Meyer, Matthias

    2013-01-01

    Although an inverse relationship is expected in ancient DNA samples between the number of surviving DNA fragments and their length, ancient DNA sequencing libraries are strikingly deficient in molecules shorter than 40 bp. We find that a loss of short molecules can occur during DNA extraction and present an improved silica-based extraction protocol that enables their efficient retrieval. In combination with single-stranded DNA library preparation, this method enabled us to reconstruct the mitochondrial genome sequence from a Middle Pleistocene cave bear (Ursus deningeri) bone excavated at Sima de los Huesos in the Sierra de Atapuerca, Spain. Phylogenetic reconstructions indicate that the U. deningeri sequence forms an early diverging sister lineage to all Western European Late Pleistocene cave bears. Our results prove that authentic ancient DNA can be preserved for hundreds of thousands of years outside of permafrost. Moreover, the techniques presented enable the retrieval of phylogenetically informative sequences from samples in which virtually all DNA is degraded to fragments shorter than 50 bp. PMID:24019490

  20. Electrochemical thermodynamic measurement system

    DOEpatents

    Reynier, Yvan [Meylan, FR; Yazami, Rachid [Los Angeles, CA; Fultz, Brent T [Pasadena, CA

    2009-09-29

    The present invention provides systems and methods for accurately characterizing thermodynamic and materials properties of electrodes and of electrochemical energy storage and conversion systems. Systems and methods of the present invention are configured for simultaneously collecting a suite of measurements characterizing a plurality of interconnected electrochemical and thermodynamic parameters relating to the electrode reaction state of advancement, voltage and temperature. The enhanced sensitivity provided by the present methods and systems, combined with measurement conditions that reflect thermodynamically stabilized electrode conditions, allows very accurate measurement of thermodynamic parameters, including state functions such as the Gibbs free energy, enthalpy and entropy of electrode/electrochemical cell reactions, which enable prediction of important performance attributes of electrode materials and electrochemical systems, such as the energy, power density, current rate and cycle life of an electrochemical cell.
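    The patent's apparatus is not reproduced here, but the thermodynamic state functions it measures follow from standard electrochemistry: ΔG = -nFE, ΔS = nF(∂E/∂T), ΔH = ΔG + TΔS, where E is the open-circuit voltage at a stabilized electrode state. A sketch with synthetic, illustrative OCV-versus-temperature data:

```python
import numpy as np

F = 96485.33  # Faraday constant, C/mol
n = 1         # electrons transferred per reaction (assumed)

# Synthetic open-circuit voltage at several stabilized temperatures;
# the slope dE/dT carries the entropy signal.  Values illustrative.
T = np.array([288.15, 293.15, 298.15, 303.15, 308.15])   # K
E = 3.70 + 1.0e-4 * (T - 298.15)                         # V

dEdT, E0 = np.polyfit(T - 298.15, E, 1)   # slope and OCV at 298.15 K

dG = -n * F * E0               # Gibbs free energy, J/mol
dS = n * F * dEdT              # entropy, J/(mol K)
dH = dG + 298.15 * dS          # enthalpy, J/mol
print(f"dS = {dS:.3f} J/(mol K), dG = {dG/1000:.1f} kJ/mol")
```

In the actual method each point must be taken at a thermodynamically stabilized electrode state, which is what the patent's measurement conditions ensure.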

  1. Neurometric assessment of intraoperative anesthetic

    DOEpatents

    Kangas, L.J.; Keller, P.E.

    1998-07-07

    The present invention is a method and apparatus for collecting EEG data, reducing the EEG data into coefficients, and correlating those coefficients with a depth of unconsciousness or anesthetic depth, and which obtains a bounded first derivative of anesthetic depth to indicate trends. The present invention provides a developed artificial neural network based method capable of continuously analyzing EEG data to discriminate between awake and anesthetized states in an individual and continuously monitoring anesthetic depth trends in real-time. The present invention enables an anesthesiologist to respond immediately to changes in anesthetic depth of the patient during surgery and to administer the correct amount of anesthetic. 7 figs.
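    The patent's neural network is not described in enough detail to reproduce, but the trend indicator it mentions, a bounded first derivative of anesthetic depth, can be sketched directly (all numbers illustrative):

```python
import numpy as np

def depth_trend(depth, dt, bound):
    """Bounded first derivative of an anesthetic-depth series.

    Clipping the derivative to +/-bound suppresses transient
    artifacts so the anesthesiologist sees a stable trend rather
    than raw sample-to-sample noise.
    """
    d = np.gradient(np.asarray(depth, dtype=float), dt)
    return np.clip(d, -bound, bound)

# Illustrative depth estimates (0 = awake, 1 = deeply anesthetized)
# sampled every 2 s, with one artifact-induced jump at index 3.
depth = np.array([0.62, 0.63, 0.65, 0.90, 0.66, 0.64])
trend = depth_trend(depth, dt=2.0, bound=0.05)
print(trend)   # every entry lies within [-0.05, 0.05]
```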

  2. Neurometric assessment of intraoperative anesthetic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kangas, L.J.; Keller, P.E.

    1998-07-07

    The present invention is a method and apparatus for collecting EEG data, reducing the EEG data into coefficients, and correlating those coefficients with a depth of unconsciousness or anesthetic depth, and which obtains a bounded first derivative of anesthetic depth to indicate trends. The present invention provides a developed artificial neural network based method capable of continuously analyzing EEG data to discriminate between awake and anesthetized states in an individual and continuously monitoring anesthetic depth trends in real-time. The present invention enables an anesthesiologist to respond immediately to changes in anesthetic depth of the patient during surgery and to administer the correct amount of anesthetic. 7 figs.

  3. Use of CFD modelling for analysing air parameters in auditorium halls

    NASA Astrophysics Data System (ADS)

    Cichowicz, Robert

    2017-11-01

    Modelling with numerical methods is currently the most popular approach to solving scientific and engineering problems. Computer methods make it possible, for example, to comprehensively describe the conditions in a given room and to determine thermal comfort, a complex issue that includes the subjective sensations of the persons in the room. The article presents the results of measurements and numerical computations that enabled an assessment of environment parameters, taking into consideration microclimate, thermal comfort, air speeds in the zone of human presence, and dustiness in auditorium halls. For this purpose, temperature, relative humidity and dustiness were measured with a digital microclimate meter and a laser dust-particle counter. Numerical computations were then performed with the DesignBuilder application, and the obtained results enabled determination of the PMV comfort indicator in the selected rooms.

  4. Teaching Network Security with IP Darkspace Data

    ERIC Educational Resources Information Center

    Zseby, Tanja; Iglesias Vázquez, Félix; King, Alistair; Claffy, K. C.

    2016-01-01

    This paper presents a network security laboratory project for teaching network traffic anomaly detection methods to electrical engineering students. The project design follows a research-oriented teaching principle, enabling students to make their own discoveries in real network traffic, using data captured from a large IP darkspace monitor…

  5. Human health risk assessment (HHRA) for environmental development and transfer of antibiotic resistance

    EPA Science Inventory

    Objective: Here we present possible approaches and identify research needs to enable human health risk assessments that focus on the role the environment plays in antibiotic treatment failure of patients. Methods: The authors participated in a workshop sub-committee to define t...

  6. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  7. Surface and allied studies in silicon solar cells

    NASA Technical Reports Server (NTRS)

    Lindholm, F. A.

    1983-01-01

    Two main results are presented. The first deals with a simple method that determines the minority-carrier lifetime and the effective surface recombination velocity of the quasi-neutral base of silicon solar cells. The method requires the observation of only a single transient, and is amenable to automation for in-process monitoring in manufacturing. This method, which is called short-circuit current decay, avoids distortion in the observed transient and the consequent inaccuracies that arise from the presence of mobile holes and electrons stored in the p/n junction space-charge region at the initial instant of the transient. The second main result is a formulation of the relevant boundary-value problems that resembles linear two-port network theory. This formulation enables comparisons among various contending methods for measuring material parameters of p/n junction devices, and allows the time-domain description of the transient studies to be expressed as an infinite series, although closed-form solutions are also possible.
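    The report's full analysis is not given in the abstract, but the essence of a single-transient lifetime extraction can be sketched: if the asymptotic decay of the short-circuit current is exponential, I(t) = I0 exp(-t/tau), a log-linear fit recovers the decay constant. All numbers below are illustrative, and a single bulk lifetime is assumed (the actual method also folds in surface recombination):

```python
import numpy as np

# Synthetic short-circuit current decay I(t) = I0 * exp(-t / tau).
tau_true = 10e-6                           # s, assumed lifetime
t = np.linspace(0.0, 50e-6, 200)           # s
current = 1.0e-3 * np.exp(-t / tau_true)   # A

# Log-linear least-squares fit recovers the decay constant.
slope, intercept = np.polyfit(t, np.log(current), 1)
tau_est = -1.0 / slope
print(f"estimated lifetime: {tau_est * 1e6:.2f} us")
```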

  8. Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy

    DOE PAGES

    Rosewater, David; Ferreira, Summer; Schoenwald, David; ...

    2018-01-25

    Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts, enabling better control over BESS charge/discharge schedules.
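    The paper's two reformulated models are not reproduced in the abstract; the sketch below is the simplest power-integration SoC forecast such models build on, with one-way efficiencies and feasibility clipping. Function name and all parameter values are illustrative:

```python
import numpy as np

def forecast_soc(soc0, power_kw, dt_h, capacity_kwh,
                 eta_chg=0.95, eta_dis=0.95):
    """Forecast state of charge under a charge/discharge schedule.

    power_kw > 0 charges the battery, power_kw < 0 discharges it;
    one-way efficiencies are applied accordingly and SoC is clipped
    to its feasible [0, 1] range at each step.
    """
    soc = [soc0]
    for p in power_kw:
        energy = p * dt_h * (eta_chg if p > 0 else 1.0 / eta_dis)
        soc.append(float(np.clip(soc[-1] + energy / capacity_kwh,
                                 0.0, 1.0)))
    return np.array(soc)

# Example: 10 kWh system, 30-minute steps, charge then discharge.
schedule = [2.0, 2.0, -4.0]   # kW
soc = forecast_soc(0.50, schedule, dt_h=0.5, capacity_kwh=10.0)
print(soc)   # SoC after each step: 0.5 -> 0.595 -> 0.69 -> ~0.479
```

A grid controller would run such a forecast over a candidate schedule and reject any schedule that hits the clipped limits.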

  9. Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosewater, David; Ferreira, Summer; Schoenwald, David

    Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts, enabling better control over BESS charge/discharge schedules.

  10. Analysis of Curved Target-Type Thrust Reversers

    DTIC Science & Technology

    1974-06-07

    …methods for two-dimensional cases, the Levi-Civita method provides a variety of bucket shapes and enables one to round off the sharp corners of… In the present work three methods are employed to investigate the deflection of inviscid, incompressible flow by curved surfaces: Levi-Civita's… The resulting shapes are shown in the figures; a special case for σ1 = 0.31416 and σ2 = 0.47124 is shown in Fig. 4. Evidently, Levi-Civita's…

  11. Exact analytic solutions of Maxwell's equations describing propagating nonparaxial electromagnetic beams.

    PubMed

    Garay-Avendaño, Roger L; Zamboni-Rached, Michel

    2014-07-10

    In this paper, we propose a method that is capable of describing in exact and analytic form the propagation of nonparaxial scalar and electromagnetic beams. The main features of the method presented here are its mathematical simplicity and the fast convergence in the cases of highly nonparaxial electromagnetic beams, enabling us to obtain high-precision results without the necessity of lengthy numerical simulations or other more complex analytical calculations. The method can be used in electromagnetism (optics, microwaves) as well as in acoustics.

  12. High-resolution quantitative determination of dielectric function by using scattering scanning near-field optical microscopy

    PubMed Central

    Tranca, D. E.; Stanciu, S. G.; Hristu, R.; Stoichita, C.; Tofail, S. A. M.; Stanciu, G. A.

    2015-01-01

    A new method for high-resolution quantitative measurement of the dielectric function by using scattering scanning near-field optical microscopy (s-SNOM) is presented. The method is based on a calibration procedure that uses the s-SNOM oscillating dipole model of the probe-sample interaction and quantitative s-SNOM measurements. The nanoscale capabilities of the method have the potential to enable novel applications in various fields such as nano-electronics, nano-photonics, biology or medicine. PMID:26138665
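    The abstract does not detail the calibration procedure, but the oscillating-dipole picture it references is commonly written as the point-dipole model of the tip-sample interaction: a sphere of polarizability alpha above a surface whose response enters through beta = (eps - 1)/(eps + 1). The sketch below uses that standard form; the tip permittivity, radius, and gap values are assumptions, not the paper's:

```python
import numpy as np

def effective_polarizability(eps_sample, a=20e-9, z=1e-9,
                             eps_tip=100.0):
    """Point-dipole model of the s-SNOM tip-sample interaction.

    The tip is modeled as a sphere of radius a with polarizability
    alpha; its image dipole in a surface of dielectric function
    eps_sample (surface response beta) enhances the scattered field.
    """
    alpha = 4 * np.pi * a**3 * (eps_tip - 1) / (eps_tip + 2)
    beta = (eps_sample - 1) / (eps_sample + 1)
    return alpha * (1 + beta) / (
        1 - alpha * beta / (16 * np.pi * (a + z) ** 3))

# Scattering contrast between silicon (eps ~ 11.7) and glass (~ 2.1):
contrast = (abs(effective_polarizability(11.7))
            / abs(effective_polarizability(2.1)))
print(f"Si/glass near-field contrast: {contrast:.2f}")
```

Inverting this relation against a reference sample of known dielectric function is the kind of calibration step the paper's quantitative procedure performs.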

  13. Nanoscale methods for single-molecule electrochemistry.

    PubMed

    Mathwig, Klaus; Aartsma, Thijs J; Canters, Gerard W; Lemay, Serge G

    2014-01-01

    The development of experiments capable of probing individual molecules has led to major breakthroughs in fields ranging from molecular electronics to biophysics, allowing direct tests of knowledge derived from macroscopic measurements and enabling new assays that probe population heterogeneities and internal molecular dynamics. Although still somewhat in their infancy, such methods are also being developed for probing molecular systems in solution using electrochemical transduction mechanisms. Here we outline the present status of this emerging field, concentrating in particular on optical methods, metal-molecule-metal junctions, and electrochemical nanofluidic devices.

  14. Method of fabricating a high aspect ratio microstructure

    DOEpatents

    Warren, John B.

    2003-05-06

    The present invention is a method of fabricating a high aspect ratio, freestanding microstructure. The fabrication method modifies the exposure process for SU-8, a negative-acting, ultraviolet-sensitive photoresist used for microfabrication: a UV-absorbent glass substrate, chosen for complete absorption of UV radiation at 380 nanometers or less, is coated with the negative photoresist, then exposed and developed according to standard practice. The UV-absorbent glass enables the fabrication of cylindrical cavities in negative photoresist microstructures with aspect ratios of 8:1.

  15. Investigation of source location determination from Magsat magnetic anomalies: The Euler method approach

    NASA Technical Reports Server (NTRS)

    Ravat, Dhananjay

    1996-01-01

    The applicability of the Euler method of source location determination was investigated for several model situations pertinent to satellite-data scales, as well as for Magsat data of Europe. Our investigations enabled us to identify the end-member cases for which the Euler method works with present satellite magnetic data, and the cases for which the assumptions implicit in the Euler method are not met by such data. These results were presented in an invited lecture at the Indo-US workshop on Geomagnetism in Studies of the Earth's Interior in August 1994 in Pune, India, and in a presentation at the 21st General Assembly of the IUGG in July 1995 in Boulder, CO. A new method, called the Anomaly Attenuation Rate (AAR) method (based on the Euler method), was developed during this study. This method is scale-independent and is appropriate for locating the centroids of semi-compact three-dimensional sources of gravity and magnetic anomalies. The method was presented during the 1996 Spring AGU meeting, and a manuscript describing it is being prepared for submission to a high-ranking journal. The grant has resulted in three papers and presentations at national and international meetings and one manuscript of a paper to be submitted shortly to a reputable journal.
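    The report's AAR variant is not described here, but classical Euler deconvolution, its starting point, reduces to a linear least-squares problem via Euler's homogeneity equation (x - x0) df/dx + (z - z0) df/dz = N (B - f), where N is the structural index and B a background level. A sketch on a synthetic profile with analytic gradients (all values illustrative):

```python
import numpy as np

# Synthetic anomaly profile from a compact source at (x0, z0) with
# structural index N = 3 (point-dipole-like falloff f = k / r^N).
x0_true, z0_true, N, k = 50.0, 10.0, 3, 1.0e4
x = np.linspace(0.0, 100.0, 101)                 # observations at z = 0
r2 = (x - x0_true) ** 2 + z0_true ** 2
f = k * r2 ** (-N / 2)
fx = -N * k * r2 ** (-N / 2 - 1) * (x - x0_true)     # df/dx (analytic)
fz = -N * k * r2 ** (-N / 2 - 1) * (0.0 - z0_true)   # df/dz at z = 0

# Rearranged Euler equation:  x0*fx + z0*fz + N*B = x*fx + z*fz + N*f,
# a linear system in the unknowns (x0, z0, B), solved per window:
A = np.column_stack([fx, fz, np.full_like(x, float(N))])
rhs = x * fx + 0.0 * fz + N * f                  # z = 0 on the profile
x0_est, z0_est, b_est = np.linalg.lstsq(A, rhs, rcond=None)[0]
print(f"recovered source: x0 = {x0_est:.2f}, depth = {z0_est:.2f}")
```

In practice the structural index must be chosen correctly, which is exactly the kind of implicit assumption the report found satellite data may not satisfy.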

  16. Hybrid rendering of the chest and virtual bronchoscopy [corrected].

    PubMed

    Seemann, M D; Seemann, O; Luboldt, W; Gebicke, K; Prime, G; Claussen, C D

    2000-10-30

    Thin-section spiral computed tomography was used to acquire volume data sets of the thorax. The tracheobronchial system and pathological changes of the chest were visualized using a color-coded surface rendering method. The structures of interest were then superimposed on a volume rendering of the other thoracic structures, producing a hybrid rendering. The hybrid rendering technique exploits the advantages of both rendering methods and enables virtual bronchoscopic examinations using different representation models. Virtual bronchoscopic examination with a transparent color-coded shaded-surface model enables the simultaneous visualization of the airways and the adjacent structures behind the tracheobronchial wall, and therefore offers a practical alternative to fiberoptic bronchoscopy. Hybrid rendering and virtual endoscopy obviate the need for time-consuming detailed analysis and presentation of axial source images.

  17. [Determination of cost-effective strategies in colorectal cancer screening].

    PubMed

    Dervaux, B; Eeckhoudt, L; Lebrun, T; Sailly, J C

    1992-01-01

    The object of the article is to implement particular methodologies in order to determine which strategies are cost-effective in the mass screening of colorectal cancer after a positive Hemoccult test. The first approach proposes a method that enables all the admissible diagnostic strategies to be determined. The second approach enables a minimal cost function to be estimated using an adaptation of Data Envelopment Analysis. This method proves particularly successful in cost-efficiency analysis when the performance indicators are numerous and hard to aggregate. The results show that there are two cost-effective strategies after a positive Hemoccult test, colonoscopy and sigmoidoscopy, and they call into question the relevance of the double-contrast barium enema in the diagnosis of colorectal lesions.

  18. Gutzwiller renormalization group

    DOE PAGES

    Lanatà, Nicola; Yao, Yong -Xin; Deng, Xiaoyu; ...

    2016-01-06

    We develop a variational scheme called the “Gutzwiller renormalization group” (GRG), which enables us to calculate the ground state of Anderson impurity models (AIM) with arbitrary numerical precision. Our method exploits the low-entanglement property of the ground state of local Hamiltonians in combination with the framework of the Gutzwiller wave function, and indicates that the ground state of the AIM has a very simple structure, which can be represented very accurately in terms of a surprisingly small number of variational parameters. Furthermore, we perform benchmark calculations of the single-band AIM that validate our theory and suggest that the GRG might enable us to study complex systems beyond the reach of the other methods presently available and pave the way to interesting generalizations, e.g., to nonequilibrium transport in nanostructures.

  19. Inclusion free cadmium zinc tellurium and cadmium tellurium crystals and associated growth method

    DOEpatents

    Bolotnikov, Aleskey E [South Setauket, NY; James, Ralph B [Ridge, NY

    2010-07-20

    The present disclosure provides systems and methods for crystal growth of cadmium zinc tellurium (CZT) and cadmium tellurium (CdTe) crystals with an inverted growth reactor chamber. The inverted growth reactor chamber enables growth of single, large, high purity CZT and CdTe crystals that can be used, for example, in X-ray and gamma detection, substrates for infrared detectors, or the like. The inverted growth reactor chamber enables reductions in the presence of Te inclusions, which are recognized as an important limiting factor in using CZT or CdTe as radiation detectors. The inverted growth reactor chamber can be utilized with existing crystal growth techniques such as the Bridgman crystal growth mechanism and the like. In an exemplary embodiment, the inverted growth reactor chamber is a U-shaped ampoule.

  20. 78 FR 29387 - Government-Owned Inventions, Available for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ....: MSC-24919-1: Systems and Methods for RFID-Enabled Information Collection; NASA Case No.: MSC-25632-1... Methods for RFID-Enabled Dispenser; NASA Case No.: MSC-25313-1: Hydrostatic Hyperbaric Apparatus and...; NASA Case No: MSC-25590-1: Systems and Methods for RFID-Enabled Pressure Sensing Apparatus; NASA Case...

  1. dPCR: A Technology Review

    PubMed Central

    Quan, Phenix-Lan; Sauzade, Martin

    2018-01-01

    Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
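    The Poisson statistics the review describes give dPCR its calibration-free quantification: if a fraction p of partitions is positive, the mean copy number per partition is lambda = -ln(1 - p), and dividing by the partition volume yields the absolute concentration. A minimal sketch (partition count and volume are illustrative):

```python
import math

def dpcr_concentration(positive, total, partition_volume_ul):
    """Absolute target concentration from a digital PCR readout.

    With random partitioning the per-partition occupancy is Poisson
    distributed, so the mean copies per partition is
    lambda = -ln(1 - p), where p is the fraction of
    fluorescence-positive partitions.
    """
    p = positive / total
    lam = -math.log(1.0 - p)           # mean copies per partition
    return lam / partition_volume_ul   # copies per microliter

# Example: 10,000 partitions of 0.85 nL each, 5,000 positive.
conc = dpcr_concentration(5000, 10000, 0.85e-3)
print(f"{conc:.0f} copies/uL")   # ln(2) / 0.85e-3, about 815 copies/uL
```

The correction matters: simply counting positives (5,000 copies) would underestimate the true load, because some partitions hold more than one copy.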

  2. Optical phase conjugation assisted scattering lens: variable focusing and 3D patterning

    PubMed Central

    Ryu, Jihee; Jang, Mooseok; Eom, Tae Joong; Yang, Changhuei; Chung, Euiheon

    2016-01-01

    Variable light focusing is the ability to flexibly select the focal distance of a lens. This feature presents technical challenges, but is significant for optical interrogation of three-dimensional objects. Numerous lens designs have been proposed to provide flexible light focusing, including zoom, fluid, and liquid-crystal lenses. Although these lenses are useful for macroscale applications, they have limited utility in micron-scale applications due to restricted modulation range and exacting requirements for fabrication and control. Here, we present a holographic focusing method that enables variable light focusing without any physical modification to the lens element. In this method, a scattering layer couples low-angle (transverse wave vector) components into a full angular spectrum, and a digital optical phase conjugation (DOPC) system characterizes and plays back the wavefront that focuses through the scattering layer. We demonstrate micron-scale light focusing and patterning over a wide range of focal distances of 22–51 mm. The interferometric nature of the focusing scheme also enables an aberration-free scattering lens. The proposed method provides a unique variable focusing capability for imaging thick specimens or selective photoactivation of neuronal networks. PMID:27049442
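    The DOPC principle behind this lens can be illustrated numerically: a random phase screen scrambles an incident wave into speckle, but playing back the phase conjugate of the transmitted field through the same screen cancels the screen's phase and collapses the far-field energy into one spot. A simplified 1-D sketch, not the paper's apparatus (screen size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 256
# Thin random phase screen (unit-amplitude random phases).
screen = np.exp(1j * rng.uniform(0, 2 * np.pi, n))

def focus_fraction(field):
    """Fraction of far-field energy in the brightest FFT bin."""
    spectrum = np.abs(np.fft.fft(field)) ** 2
    return spectrum.max() / spectrum.sum()

# A plane wave scrambled by the screen produces diffuse speckle.
speckle = focus_fraction(screen * np.ones(n))

# DOPC: measure the transmitted field, play back its phase conjugate
# through the same screen; the screen's phase cancels and the energy
# collapses back into a single far-field spot.
playback = screen * np.conj(screen * np.ones(n))
refocused = focus_fraction(playback)
print(f"speckle: {speckle:.3f}, after conjugation: {refocused:.3f}")
```

In the actual system the conjugate wavefront is synthesized by a spatial light modulator after holographic measurement, and refocusing at different depths is obtained by changing the recorded wavefront rather than any physical lens element.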

  3. X-ray and gamma-ray computed tomography for industrial nondestructive testing and evaluation

    NASA Astrophysics Data System (ADS)

    Costello, Ian; Wells, Peter; Davis, John R.; Benci, Nino; Skerrett, David; Davies, D. R.

    1994-03-01

    This paper presents an overview of two recently constructed computed tomography (CT) scanners that have been designed to provide structural information for industrially relevant materials and components. CT enables cross-sectional slices of an object to be nondestructively imaged and represented as a map of linear attenuation coefficient. As linear attenuation is the product of mass attenuation and density, this usually enables a straightforward interpretation of the image in terms of density. The two instruments are a transportable scanner using a 160 kV(peak) powered x-ray tube for the inspection of wooden power poles up to 450 mm in diameter, and an industrial scanning system designed around an Ir-192 gamma-ray source for materials characterization and the testing and evaluation of castings, ceramics, and composites. The images presented in this paper have generally been reconstructed using the summation convolution back-projection (SCBP) method, and this technique is outlined. Direct Fourier reconstruction is also used and compared with the SCBP method. A brief discussion is offered on incorporating edge detection methods into the image reconstruction process for the improved identification of defects such as cracks and voids.
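    The interpretation step described above, from transmission measurement to attenuation to density, can be sketched directly. Material values below are illustrative, not from the paper:

```python
import math

# Linear attenuation is the product of mass attenuation and density:
#   mu [1/cm] = (mu/rho) [cm^2/g] * rho [g/cm^3]
mass_atten = 0.2     # cm^2/g, aluminium-like value at some energy
density = 2.7        # g/cm^3
mu = mass_atten * density

# Beer-Lambert transmission through a 2 cm path, followed by the
# inversion a CT scanner performs on every ray before reconstruction:
path_cm = 2.0
transmission = math.exp(-mu * path_cm)
mu_recovered = -math.log(transmission) / path_cm
density_value = mu_recovered / mass_atten   # back to density
print(f"mu = {mu:.2f} /cm, recovered density = {density_value:.2f} g/cm^3")
```

Reconstruction methods such as SCBP operate on many such ray integrals taken at different angles to produce the cross-sectional map of mu.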

  4. Force and Conductance Spectroscopy of Single Molecule Junctions

    NASA Astrophysics Data System (ADS)

    Frei, Michael

Investigation of the mechanical properties of single-molecule junctions is crucial to developing an understanding of, and enabling control over, such junctions. This work presents an experimental and analytical approach that enables the statistical evaluation of force and simultaneous conductance data for metallic atomic point contacts and molecular junctions. A conductive atomic force microscope based break-junction technique is developed to form single molecular junctions and collect conductance and force data simultaneously. Improvements to the optical components, in particular the use of a super-luminescent diode, enable a tremendous increase in force resolution. An experimental procedure to collect data for various molecular junctions has been developed and includes deposition, calibration, and analysis methods. For the statistical analysis of force, novel approaches based on two-dimensional histograms and a direct force-identification method are presented. The two-dimensional method allows an unbiased evaluation of force events that are identified using corresponding conductance signatures. This is not always possible, however, and in those situations the force-based identification of junction rearrangement events is an attractive alternative. This combined experimental and analytical approach is then applied to three studies. First, the impact of the molecular backbone on the mechanical behavior of single-molecule junctions is investigated, and it is found that junctions formed with identical linker groups but different backbone structures rupture at different forces. All molecules used show a clear molecular signature, and the force data can be evaluated using the 2D method. Second, the effects of the linker group used to attach molecules to gold electrodes are investigated. A study of four alkane molecules with different linkers finds a drastic difference in the evolution of donor-acceptor versus covalently bonded junctions. In fact, the covalent bond is found to distort the metal-electrode rearrangement so significantly that junction rearrangement events can no longer be identified from a clean, well-defined conductance signature; for this case, the force-based identification process is used. Third, results for break-junction measurements with different metals are presented. It is found that silver and palladium junctions rupture at forces different from those of gold contacts. For silver experiments in ambient conditions, we can also identify oxygen impurities in the silver contact-formation process, leading to force and conductance measurements of silver-oxygen structures. Looking ahead, this work provides an experimental and analytical foundation that will enable insights into single-molecule systems not previously accessible.
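The two-dimensional histogram analysis described above amounts to pooling every (displacement, conductance) sample from many traces into a single density map, so that recurring junction configurations show up as high-count ridges without any trace selection. A schematic NumPy version using synthetic traces (the plateau level, noise model, and bin ranges are illustrative assumptions, not the experimental values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic traces: each trace holds a conductance plateau near 1e-4 G/G0
# that breaks at a random displacement (purely illustrative data).
disp, logg = [], []
for _ in range(500):
    break_at = rng.uniform(0.5, 1.5)                 # rupture point, nm
    x = np.linspace(0.0, 2.0, 200)
    g = np.where(x < break_at, 1e-4, 1e-7) * rng.lognormal(0.0, 0.2, x.size)
    disp.append(x)
    logg.append(np.log10(g))

# Overlay all traces in one 2D histogram: ridges mark recurring configurations
H, xedges, yedges = np.histogram2d(
    np.concatenate(disp), np.concatenate(logg),
    bins=(50, 60), range=[[0.0, 2.0], [-8.0, -3.0]])
```

In the experiments, the simultaneously recorded force traces can be binned the same way and aligned using the conductance signature, which is what makes the evaluation unbiased.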

  5. Network science of biological systems at different scales: A review

    NASA Astrophysics Data System (ADS)

    Gosak, Marko; Markovič, Rene; Dolenšek, Jurij; Slak Rupnik, Marjan; Marhl, Marko; Stožer, Andraž; Perc, Matjaž

    2018-03-01

    Network science is today established as a backbone for description of structure and function of various physical, chemical, biological, technological, and social systems. Here we review recent advances in the study of complex biological systems that were inspired and enabled by methods of network science. First, we present

  6. Kinetographic determination of airplane flight characteristics

    NASA Technical Reports Server (NTRS)

    Raethjen, P; Knott, H

    1927-01-01

The author's first experiments with a glider on flight characteristics demonstrated that an accurate flight-path measurement would enable determination of the polar diagram from a gliding flight. Since then he has endeavored to obtain accurate flight measurements by means of a kinetograph (motion-picture camera). Different methods of accomplishing this are presented.

  7. Observations on Patient and Family Coping with Huntington's Disease.

    ERIC Educational Resources Information Center

    Falek, Arthur

    1979-01-01

Huntington's disease is an autosomal dominantly inherited disorder. No definitive method for the preclinical detection of carriers is known. The consequences of this diagnosis are discussed. The significance of genetic counseling and a description of the phases of psychological coping are presented to enable informed decision making by family…

  8. Stress Management Consultation to Israeli Social Workers during the Gulf War.

    ERIC Educational Resources Information Center

    Cwikel, Julie C.; And Others

    1993-01-01

    Describes Stress Management Consultation (SMC), short-term group intervention designed to enable social workers in Israel during Persian Gulf War to work through stress reactions and model method workers could use with their own target populations. Presents qualitative feedback from participants and administrators indicating that SMC model was…

  9. Guide to the expression of uncertainty in measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathew, Kattathu Joseph

The enabling objectives of this presentation are to: provide a working knowledge of the ISO GUM method for estimating uncertainties in safeguards measurements; introduce GUM terminology; provide a brief historical background of the GUM methodology; and introduce the GUM Workbench software. Isotope ratio measurements by MS will be discussed in the next session.
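The core of the ISO GUM method is the law of propagation of uncertainty: for a measurand y = f(x1, …, xn) with uncorrelated input estimates, the combined standard uncertainty is u_c(y) = sqrt(Σ_i c_i² u²(x_i)), with sensitivity coefficients c_i = ∂f/∂x_i. A minimal sketch (the function name is mine, not part of GUM Workbench):

```python
import math

def combined_standard_uncertainty(sensitivities, uncertainties):
    # GUM law of propagation for y = f(x1, ..., xn), uncorrelated inputs:
    #   u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2
    return math.sqrt(sum((c * u) ** 2
                         for c, u in zip(sensitivities, uncertainties)))

# Example: y = x1 + x2 (both sensitivities 1), u(x1) = 3, u(x2) = 4
u_c = combined_standard_uncertainty([1.0, 1.0], [3.0, 4.0])   # -> 5.0
U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2
```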

  10. Practitioner Assessment of Conflict Resolution Programs. ERIC Digest Number 163.

    ERIC Educational Resources Information Center

    Deutsch, Morton

    There are many ways to assess the effectiveness of school conflict resolution training (CRT) programs. Some methods require extensive resources, but others, conducted by CRT practitioners themselves, also provide useful information. This digest presents a framework for CRT evaluation by practitioners that enables them to reflect productively on…

  11. Electric Motor Thermal Management R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennion, Kevin

    2016-06-07

    Thermal management enables more efficient and cost-effective motors. This Annual Merit Review presentation describes the technical accomplishments and progress in electric motor thermal management R&D over the last year. This project supports a broad industry demand for data, analysis methods, and experimental techniques to improve and better understand motor thermal management.

  12. Metal-catalyzed Decarboxylative Fluoroalkylation Reactions.

    PubMed

    Ambler, Brett R; Yang, Ming-Hsiu; Altman, Ryan A

    2016-12-01

    Metal-catalyzed decarboxylative fluoroalkylation reactions enable the conversion of simple O-based substrates into biologically relevant fluorinated analogs. Herein, we present decarboxylative methods that facilitate the synthesis of trifluoromethyl- and difluoroketone-containing products. We highlight key mechanistic aspects that are critical for efficient catalysis, and that inspired our thinking while developing the reactions.

  13. EMISSIONS INVENTORY OF PM 2.5 TRACE ELEMENTS ACROSS THE U.S.

    EPA Science Inventory

    This abstract describes work done to speciate PM2.5 emissions into emissions of trace metals to enable concentrations of metal species to be predicted by air quality models. Methods are described and initial results are presented. A technique for validating the resul...

  14. Electric Motor Thermal Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennion, Kevin S

    Thermal management enables more efficient and cost-effective motors. This Annual Merit Review presentation describes the technical accomplishments and progress in electric motor thermal management R&D over the last year. This project supports a broad industry demand for data, analysis methods, and experimental techniques to improve and better understand motor thermal management.

  15. Green synthesis of nanomaterials and sustainable applications of nano-catalysts

    EPA Science Inventory

    Green synthesis efforts involving the use of vitamins B1, B2, C, and tea and wine polyphenols which function both as reducing and capping agents will be presented which enables extremely simple, one-pot, green synthetic methods to nanomaterials in water.1a Shape-controlled synth...

  16. Building Staff Competencies and Selecting Communications Methods for Waste Management Programs.

    ERIC Educational Resources Information Center

    Richardson, John G.

    The Waste Management Institute provided in-service training to interested County Extension agents in North Carolina to enable them to provide leadership in developing and delivering a comprehensive county-level waste management program. Training included technical, economic, environmental, social, and legal aspects of waste management presented in…

  17. Boulder Capture System Design Options for the Asteroid Robotic Redirect Mission Alternate Approach Trade Study

    NASA Technical Reports Server (NTRS)

    Belbin, Scott P.; Merrill, Raymond G.

    2014-01-01

This paper presents a boulder acquisition and asteroid surface interaction electromechanical concept developed for the Asteroid Robotic Redirect Mission (ARRM) option to capture a free-standing boulder on the surface of a 100 m or larger Near Earth Asteroid (NEA). It details the down-select process and ranking of potential boulder capture methods, the evolution of a simple yet elegant articulating spaceframe, and ongoing risk reduction and concept refinement efforts. The capture system configuration leverages the spaceframe, heritage manipulators, and a new microspine technology to enable the ARRM boulder capture. While at the NEA, the system enables attenuation of the terminal descent velocity, ascent to escape velocity, and boulder collection and restraint. After departure from the NEA, it enables robotic inspection, sample caching, and crew Extra Vehicular Activities (EVAs).

  18. A Progressive Damage Model for unidirectional Fibre Reinforced Composites with Application to Impact and Penetration Simulation

    NASA Astrophysics Data System (ADS)

    Kerschbaum, M.; Hopmann, C.

    2016-06-01

The computationally efficient simulation of the progressive damage behaviour of continuous fibre reinforced plastics is still a challenging task with currently available computer-aided engineering methods. This paper presents an original approach for an energy-based continuum damage model which accounts for stress-/strain nonlinearities, transverse and shear stress interaction phenomena, quasi-plastic shear strain components, strain rate effects, regularised damage evolution, and load reversal effects. The physically based modelling approach enables experimental determination of all parameters at the ply level, avoiding expensive inverse analysis procedures. The modelling strategy, implementation and verification of this model using commercially available explicit finite element software are detailed. The model is then applied to simulate the impact and penetration of carbon fibre reinforced cross-ply specimens with variation of the impact speed. The simulation results show that the presented approach enables a good representation of the force-/displacement curves and especially good agreement with the experimentally observed fracture patterns. In addition, the mesh dependency of the results was assessed for one impact case, showing only very little change in the simulation results, which emphasises the general applicability of the presented method.

  19. Enabling fast charging – A battery technology gap assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed, Shabbir; Bloom, Ira; Jansen, Andrew N.

The battery technology literature is reviewed, with an emphasis on key elements that limit extreme fast charging. Key gaps in existing elements of the technology are presented, as well as developmental needs. Among these needs are advanced models and methods to detect and prevent lithium plating; new positive-electrode materials which are less prone to stress-induced failure; better electrode designs to accommodate very rapid diffusion into and out of the electrode; measurement of temperature distributions during fast charge to enable and validate models; and thermal management and pack designs that accommodate the higher operating voltage.

  1. X-ray Computed Microtomography technique applied for cementitious materials: A review.

    PubMed

    da Silva, Ítalo Batista

    2018-04-01

The main objective of this article is to present a bibliographic review of the use of the X-ray microtomography method for 3D image processing of the microstructure of cementitious materials, analyzing the pore microstructure and connectivity network and enabling a relationship between permeability and porosity to be established. The technique supports the understanding of the physical, chemical and mechanical properties of cementitious materials and has produced good results: the quality and quantity of accessible information are significant and may contribute to the study of cementitious materials development. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. A new ChainMail approach for real-time soft tissue simulation.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-07-03

    This paper presents a new ChainMail method for real-time soft tissue simulation. This method enables the use of different material properties for chain elements to accommodate various materials. Based on the ChainMail bounding region, a new time-saving scheme is developed to improve computational efficiency for isotropic materials. The proposed method also conserves volume and strain energy. Experimental results demonstrate that the proposed ChainMail method can not only accommodate isotropic, anisotropic and heterogeneous materials but also model incompressibility and relaxation behaviors of soft tissues. Further, the proposed method can achieve real-time computational performance.
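The ChainMail idea is easiest to see in one dimension: when one element is moved, its neighbours are adjusted only as far as needed to keep every inter-element spacing within an allowed [min, max] band, and propagation stops as soon as a constraint is already satisfied, which is what makes the scheme fast. A toy sketch (the 1D setting and the bounds are illustrative; the paper's method additionally assigns per-element material properties and conserves volume and strain energy):

```python
def chainmail_1d(positions, idx, new_pos, min_d=0.5, max_d=1.5):
    # Move element idx to new_pos, then push/pull neighbours just enough
    # to keep every adjacent spacing within [min_d, max_d].
    p = list(positions)
    p[idx] = new_pos
    for i in range(idx + 1, len(p)):          # propagate to the right
        d = p[i] - p[i - 1]
        if d > max_d:
            p[i] = p[i - 1] + max_d
        elif d < min_d:
            p[i] = p[i - 1] + min_d
        else:
            break                             # constraint satisfied: stop early
    for i in range(idx - 1, -1, -1):          # propagate to the left
        d = p[i + 1] - p[i]
        if d > max_d:
            p[i] = p[i + 1] - max_d
        elif d < min_d:
            p[i] = p[i + 1] - min_d
        else:
            break
    return p
```

The early-exit propagation is what gives ChainMail its real-time behaviour: only the region whose constraints are violated is ever touched.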

  3. Robust and accurate vectorization of line drawings.

    PubMed

    Hilaire, Xavier; Tombre, Karl

    2006-06-01

    This paper presents a method for vectorizing the graphical parts of paper-based line drawings. The method consists of separating the input binary image into layers of homogeneous thickness, skeletonizing each layer, segmenting the skeleton by a method based on random sampling, and simplifying the result. The segmentation method is robust with a best bound of 50 percent noise reached for indefinitely long primitives. Accurate estimation of the recognized vector's parameters is enabled by explicitly computing their feasibility domains. Theoretical performance analysis and expression of the complexity of the segmentation method are derived. Experimental results and comparisons with other vectorization systems are also provided.

  4. SIMULTANEOUS MULTISLICE MAGNETIC RESONANCE FINGERPRINTING WITH LOW-RANK AND SUBSPACE MODELING

    PubMed Central

    Zhao, Bo; Bilgic, Berkin; Adalsteinsson, Elfar; Griswold, Mark A.; Wald, Lawrence L.; Setsompop, Kawin

    2018-01-01

    Magnetic resonance fingerprinting (MRF) is a new quantitative imaging paradigm that enables simultaneous acquisition of multiple magnetic resonance tissue parameters (e.g., T1, T2, and spin density). Recently, MRF has been integrated with simultaneous multislice (SMS) acquisitions to enable volumetric imaging with faster scan time. In this paper, we present a new image reconstruction method based on low-rank and subspace modeling for improved SMS-MRF. Here the low-rank model exploits strong spatiotemporal correlation among contrast-weighted images, while the subspace model captures the temporal evolution of magnetization dynamics. With the proposed model, the image reconstruction problem is formulated as a convex optimization problem, for which we develop an algorithm based on variable splitting and the alternating direction method of multipliers. The performance of the proposed method has been evaluated by numerical experiments, and the results demonstrate that the proposed method leads to improved accuracy over the conventional approach. Practically, the proposed method has a potential to allow for a 3x speedup with minimal reconstruction error, resulting in less than 5 sec imaging time per slice. PMID:29060594

  5. How to detect and reduce movement artifacts in near-infrared imaging using moving standard deviation and spline interpolation.

    PubMed

    Scholkmann, F; Spichtig, S; Muehlemann, T; Wolf, M

    2010-05-01

Near-infrared imaging (NIRI) is a neuroimaging technique which enables us to non-invasively measure hemodynamic changes in the human brain. Since the technique is very sensitive, movement of the subject can cause movement artifacts (MAs), which strongly affect signal quality and results. No general method is yet available to reduce these MAs effectively. The aim was to develop a new MA reduction method. A method based on moving standard deviation and spline interpolation was developed. It enables the semi-automatic detection and reduction of MAs in the data. It was validated using simulated and real NIRI signals. The results show that a significant reduction of MAs and an increase in signal quality are achieved. The effectiveness and usability of the method are demonstrated by the improved detection of evoked hemodynamic responses. The present method can be used not only in the postprocessing of NIRI signals but also for other kinds of data containing artifacts, for example ECG or EEG signals.
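The detection-and-reduction scheme can be sketched as follows: compute a moving standard deviation of the signal, flag samples whose local variability exceeds a threshold, and bridge the flagged span by interpolating over the surrounding clean samples. The sketch below substitutes linear interpolation for the authors' spline step, and the window size and threshold factor are illustrative choices:

```python
import numpy as np

def moving_std(x, win):
    # Standard deviation in a sliding window (output same length as x)
    pad = win // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.empty(len(x))
    for i in range(len(x)):
        out[i] = xp[i:i + win].std()
    return out

def reduce_artifacts(x, win=10, k=3.0):
    # Flag samples whose local std exceeds k times the median local std,
    # then bridge the flagged spans by interpolating over clean samples.
    mstd = moving_std(x, win)
    bad = mstd > k * np.median(mstd)
    good = ~bad
    xc = x.astype(float).copy()
    xc[bad] = np.interp(np.flatnonzero(bad), np.flatnonzero(good), x[good])
    return xc, bad
```

A usage example: a sinusoidal "hemodynamic" signal with an added spike is cleaned so that the spike region is replaced by an interpolated bridge while the rest of the signal is untouched.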

  6. Simultaneous multislice magnetic resonance fingerprinting with low-rank and subspace modeling.

    PubMed

    Bo Zhao; Bilgic, Berkin; Adalsteinsson, Elfar; Griswold, Mark A; Wald, Lawrence L; Setsompop, Kawin

    2017-07-01

Magnetic resonance fingerprinting (MRF) is a new quantitative imaging paradigm that enables simultaneous acquisition of multiple magnetic resonance tissue parameters (e.g., T1, T2, and spin density). Recently, MRF has been integrated with simultaneous multislice (SMS) acquisitions to enable volumetric imaging with faster scan time. In this paper, we present a new image reconstruction method based on low-rank and subspace modeling for improved SMS-MRF. Here the low-rank model exploits strong spatiotemporal correlation among contrast-weighted images, while the subspace model captures the temporal evolution of magnetization dynamics. With the proposed model, the image reconstruction problem is formulated as a convex optimization problem, for which we develop an algorithm based on variable splitting and the alternating direction method of multipliers. The performance of the proposed method has been evaluated by numerical experiments, and the results demonstrate that the proposed method leads to improved accuracy over the conventional approach. Practically, the proposed method has a potential to allow for a 3× speedup with minimal reconstruction error, resulting in less than 5 sec imaging time per slice.

  7. Fundamentals of affinity cell separations.

    PubMed

    Zhang, Ye; Lyons, Veronica; Pappas, Dimitri

    2018-03-01

    Cell separations using affinity methods continue to be an enabling science for a wide variety of applications. In this review, we discuss the fundamental aspects of affinity separation, including the competing forces for cell capture and elution, cell-surface interactions, and models for cell adhesion. Factors affecting separation performance such as bond affinity, contact area, and temperature are presented. We also discuss and demonstrate the effects of nonspecific binding on separation performance. Metrics for evaluating cell separations are presented, along with methods of comparing separation techniques for cell isolation using affinity capture. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Statistical study of generalized nonlinear phase step estimation methods in phase-shifting interferometry.

    PubMed

    Langoju, Rajesh; Patil, Abhijit; Rastogi, Pramod

    2007-11-20

Signal processing methods based on maximum-likelihood theory, discrete chirp Fourier transform, and spectral estimation methods have enabled accurate measurement of phase in phase-shifting interferometry in the presence of nonlinear response of the piezoelectric transducer to the applied voltage. We present a statistical study of these generalized nonlinear phase step estimation methods to identify the best method by deriving the Cramér-Rao bound. We also address important aspects of these methods for implementation in practical applications and compare the performance of the best-identified method with other benchmarking algorithms in the presence of harmonics and noise.

  9. Ontology-based geospatial data query and integration

    USGS Publications Warehouse

    Zhao, T.; Zhang, C.; Wei, M.; Peng, Z.-R.

    2008-01-01

Geospatial data sharing is an increasingly important subject as large amounts of data are produced by a variety of sources, stored in incompatible formats, and accessible through different GIS applications. Past efforts to enable sharing have produced standardized data formats such as GML and data access protocols such as the Web Feature Service (WFS). While these standards help enable client applications to gain access to heterogeneous data stored in different formats from diverse sources, the usability of the access is limited due to the lack of data semantics encoded in the WFS feature types. Past research has used ontology languages to describe the semantics of geospatial data, but ontology-based queries cannot be applied directly to legacy data stored in databases or shapefiles, or to feature data in WFS services. This paper presents a method to enable ontology queries on spatial data available from WFS services and on data stored in databases. We do not create ontology instances explicitly and thus avoid the problems of data replication. Instead, user queries are rewritten as WFS GetFeature requests and SQL queries to the database. The method also has the benefit of utilizing existing tools for databases, WFS, and GML while enabling queries based on ontology semantics. © 2008 Springer-Verlag Berlin Heidelberg.
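The query-rewriting idea can be illustrated with a deliberately simplified sketch: a mapping table links ontology concepts to WFS feature types and database tables, and a user query against a concept is rewritten into a WFS GetFeature request or a SQL statement. Everything below (the concept names, the mapping, and the query forms) is hypothetical; the paper's rewriting handles full ontology semantics and GML:

```python
# Hypothetical concept-to-source mapping (the paper's mapping is far richer,
# covering ontology relations, GML geometries, and service capabilities).
MAPPING = {
    "hydro:River": {"wfs_type": "topp:rivers", "table": "rivers"},
}

def rewrite_to_wfs(concept, bbox):
    # Rewrite an ontology-level query into a WFS GetFeature KVP request string
    m = MAPPING[concept]
    return ("service=WFS&version=1.1.0&request=GetFeature"
            f"&typeName={m['wfs_type']}&bbox={','.join(map(str, bbox))}")

def rewrite_to_sql(concept, name):
    # Rewrite the same ontology-level query into SQL against a legacy table
    m = MAPPING[concept]
    return f"SELECT * FROM {m['table']} WHERE name = '{name}'"
```

Because only the queries are rewritten, no ontology instances are materialized, which is how the approach avoids data replication.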

  10. Moire technique utilization for detection and measurement of scoliosis

    NASA Astrophysics Data System (ADS)

    Zawieska, Dorota; Podlasiak, Piotr

    1993-02-01

The moire projection method enables non-contact measurement of the shape or deformation of different surfaces and constructions by fringe pattern analysis. Fringe map acquisition over the whole surface of the object under test is one of the main advantages compared with 'point-by-point' methods. The computer analyzes the shape of the whole surface, and the user can then select different points or cross-sections of the object map. In this paper a few typical examples of the application of the moire technique to solving different medical problems are presented. We also describe the equipment used: the moire pattern analysis is performed in real time using the phase-stepping method with a CCD camera.
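The phase-stepping analysis mentioned at the end works by capturing the fringe pattern at several known phase shifts and recovering the phase per pixel in closed form. A common four-step variant (shifts of 0°, 90°, 180°, 270°) is sketched below; the abstract does not state how many steps the described system uses, so the four-step choice is an illustrative assumption:

```python
import numpy as np

def phase_from_four_steps(I1, I2, I3, I4):
    # Four-step phase shifting with shifts of 0, 90, 180, 270 degrees:
    #   I_k = A + B*cos(phi + k*pi/2),  k = 0..3
    # so that I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi).
    return np.arctan2(I4 - I2, I1 - I3)
```

The same formula applies pixel-wise to whole CCD frames (the arguments can be 2D arrays), which is what makes real-time surface maps possible.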

  11. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor †

    PubMed Central

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-01-01

The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it is assumed that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimation of the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons from a single pixel's photodiode among different exposure taps and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes. PMID:29510599
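The classical photometric stereo computation referred to above solves, per pixel, the Lambertian model I_k = ρ (l_k · n) for the albedo-scaled normal, given k ≥ 3 images and known unit lighting directions. A minimal least-squares sketch (this is the textbook computation, not the multi-tap sensor pipeline itself):

```python
import numpy as np

def photometric_stereo(intensities, lights):
    # intensities: (k, h, w) stack of images; lights: (k, 3) unit directions.
    # Lambertian model I = L @ g with g = rho * n; solve for g per pixel.
    k, h, w = intensities.shape
    I = intensities.reshape(k, -1)
    g, *_ = np.linalg.lstsq(lights, I, rcond=None)
    rho = np.linalg.norm(g, axis=0)            # albedo = |g|
    n = g / np.where(rho > 0, rho, 1.0)        # unit normals
    return n.reshape(3, h, w), rho.reshape(h, w)
```

With exactly three lights the system is square; using more lights turns the solve into an overdetermined least-squares fit that averages out noise.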

  12. Modeling cardiovascular hemodynamics using the lattice Boltzmann method on massively parallel supercomputers

    NASA Astrophysics Data System (ADS)

    Randles, Amanda Elizabeth

    Accurate and reliable modeling of cardiovascular hemodynamics has the potential to improve understanding of the localization and progression of heart diseases, which are currently the most common cause of death in Western countries. However, building a detailed, realistic model of human blood flow is a formidable mathematical and computational challenge. The simulation must combine the motion of the fluid, the intricate geometry of the blood vessels, continual changes in flow and pressure driven by the heartbeat, and the behavior of suspended bodies such as red blood cells. Such simulations can provide insight into factors like endothelial shear stress that act as triggers for the complex biomechanical events that can lead to atherosclerotic pathologies. Currently, it is not possible to measure endothelial shear stress in vivo, making these simulations a crucial component to understanding and potentially predicting the progression of cardiovascular disease. In this thesis, an approach for efficiently modeling the fluid movement coupled to the cell dynamics in real-patient geometries while accounting for the additional force from the expansion and contraction of the heart will be presented and examined. First, a novel method to couple a mesoscopic lattice Boltzmann fluid model to the microscopic molecular dynamics model of cell movement is elucidated. A treatment of red blood cells as extended structures, a method to handle highly irregular geometries through topology driven graph partitioning, and an efficient molecular dynamics load balancing scheme are introduced. These result in a large-scale simulation of the cardiovascular system, with a realistic description of the complex human arterial geometry, from centimeters down to the spatial resolution of red-blood cells. The computational methods developed to enable scaling of the application to 294,912 processors are discussed, thus empowering the simulation of a full heartbeat. 
Second, further extensions to enable the modeling of fluids in vessels with smaller diameters, and a method for introducing the deformational forces exerted on arterial flows by the movement of the heart, borrowing concepts from cosmodynamics, are presented. These additional forces have a great impact on the endothelial shear stress. Third, the fluid model is extended to recover not only Navier-Stokes hydrodynamics but also a wider range of Knudsen numbers, which is especially important in micro- and nano-scale flows. The tradeoffs of many optimization methods, such as the use of deep halo-level ghost cells that, alongside hybrid programming models, reduce the impact of such higher-order models and enable efficient modeling of extreme regimes of computational fluid dynamics, are discussed. Fourth, the extension of these models to other research questions, such as clogging in microfluidic devices and determining the severity of coarctation of the aorta, is presented. These methods are validated by taking real patient data together with the measured pressure value before the narrowing of the aorta and predicting the pressure drop across the coarctation. Comparison with the measured pressure drop in vivo highlights the accuracy and potential impact of such patient-specific simulations. Finally, a method to enable the simulation of longer trajectories in time by discretizing both spatially and temporally is presented. In this method, a serial coarse iterator is used to initialize data at discrete time steps for a fine model that runs in parallel. This coarse solver is based on a larger time step and typically a coarser discretization in space. Iterative refinement enables the compute-intensive fine iterator to be modeled with temporal parallelization. The algorithm consists of a series of predictor-corrector iterations completing when the results have converged within a certain tolerance.
Combined, these developments allow large fluid models to be simulated for longer time durations than previously possible.
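The temporal discretization described in the final paragraph resembles the parareal family of predictor-corrector schemes: a cheap coarse solver sweeps the whole interval serially, the expensive fine solver refines each time slice (in parallel in practice), and the correction is iterated until converged. A serial toy sketch for y' = -y (the Euler solvers and step counts are illustrative, not the thesis's lattice Boltzmann iterators):

```python
def coarse(y, t0, t1):
    # One forward-Euler step across the whole slice (cheap predictor)
    return y * (1.0 - (t1 - t0))

def fine(y, t0, t1, n=100):
    # Many small Euler steps (expensive; would run in parallel per slice)
    dt = (t1 - t0) / n
    for _ in range(n):
        y *= 1.0 - dt
    return y

def parareal(y0, ts, iters):
    # Predictor-corrector update:
    #   Y[j+1] <- coarse(Ynew[j]) + fine(Yold[j]) - coarse(Yold[j])
    n = len(ts) - 1
    Y = [y0]
    for j in range(n):                        # initial serial coarse sweep
        Y.append(coarse(Y[j], ts[j], ts[j + 1]))
    for _ in range(iters):
        F = [fine(Y[j], ts[j], ts[j + 1]) for j in range(n)]
        G = [coarse(Y[j], ts[j], ts[j + 1]) for j in range(n)]
        Ynew = [y0]
        for j in range(n):
            Ynew.append(coarse(Ynew[j], ts[j], ts[j + 1]) + F[j] - G[j])
        Y = Ynew
    return Y
```

After as many iterations as there are time slices the scheme reproduces the serial fine solution exactly; in practice convergence to tolerance is often reached earlier, which is where the temporal speedup comes from.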

  13. High order field-to-field corrections for imaging and overlay to achieve sub 20-nm lithography requirements

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Kubis, Michael; Hinnen, Paul; de Graaf, Roelof; van der Laan, Hans; Padiy, Alexander; Menchtchikov, Boris

    2013-04-01

Immersion lithography is being extended to the 20-nm and 14-nm nodes, and the lithography performance requirements need to be tightened further to enable this shrink. In this paper we present an integral method to enable high-order field-to-field corrections for both imaging and overlay, and we show that this method improves performance by 20%-50%. The lithography architecture we built for these higher-order corrections connects the dynamic scanner actuators with the angle-resolved scatterometer via a separate application server. Improvements in CD uniformity are based on enabling the use of the freeform intra-field dose actuator and field-to-field control of focus. The feedback control loop uses CD and focus targets placed on the production mask. For the overlay metrology we use small in-die diffraction-based overlay targets. Improvements in overlay are based on using the high-order intra-field correction actuators on a field-to-field basis. We use this to reduce the machine matching error, extending the heating control and extending the correction capability for process-induced errors.

  14. Efficient visibility encoding for dynamic illumination in direct volume rendering.

    PubMed

    Kronander, Joel; Jönsson, Daniel; Löw, Joakim; Ljung, Patric; Ynnerman, Anders; Unger, Jonas

    2012-03-01

    We present an algorithm that enables real-time dynamic shading in direct volume rendering using general lighting, including directional lights, point lights, and environment maps. Real-time performance is achieved by encoding local and global volumetric visibility using spherical harmonic (SH) basis functions stored in an efficient multiresolution grid over the extent of the volume. Our method enables high-frequency shadows in the spatial domain, but is limited to a low-frequency approximation of visibility and illumination in the angular domain. In a first pass, level of detail (LOD) selection in the grid is based on the current transfer function setting. This enables rapid online computation and SH projection of the local spherical distribution of visibility information. Using a piecewise integration of the SH coefficients over the local regions, the global visibility within the volume is then computed. By representing the light sources using their SH projections, the integral over lighting, visibility, and isotropic phase functions can be efficiently computed during rendering. The utility of our method is demonstrated in several examples showing the generality and interactive performance of the approach.

  15. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
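    The decomposition step can be illustrated with a toy two-dimensional version (a hypothetical sketch; the tile grid, halo width, and helper names are assumptions, not the authors' code). Each tile carries a halo of ghost pixels so that region competition near a tile border can see its neighborhood; in the distributed setting these ghost layers are what the workers exchange over the network:

```python
import numpy as np

def decompose(image, grid, halo=1):
    """Split a 2-D image into overlapping sub-images, one per worker.

    Returns a list of ((r0, r1, c0, c1), tile) pairs, where each tile
    includes up to `halo` ghost pixels beyond its own region.
    """
    rows, cols = grid
    h_edges = np.linspace(0, image.shape[0], rows + 1, dtype=int)
    w_edges = np.linspace(0, image.shape[1], cols + 1, dtype=int)
    tiles = []
    for i in range(rows):
        for j in range(cols):
            r0 = max(h_edges[i] - halo, 0)
            r1 = min(h_edges[i + 1] + halo, image.shape[0])
            c0 = max(w_edges[j] - halo, 0)
            c1 = min(w_edges[j + 1] + halo, image.shape[1])
            tiles.append(((r0, r1, c0, c1), image[r0:r1, c0:c1].copy()))
    return tiles

# 8x8 toy "image" split into a 2x2 grid of workers with a 1-pixel halo
img = np.arange(64).reshape(8, 8)
tiles = decompose(img, grid=(2, 2), halo=1)
```

In the real algorithm each worker would run region competition on its tile and re-synchronize the halos each iteration; the sketch only shows the data layout.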

  16. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples

    PubMed Central

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-01-01

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome-level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell. DOI: http://dx.doi.org/10.7554/eLife.26580.001 PMID:28678007

  17. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  18. Building Energy Modeling and Control Methods for Optimization and Renewables Integration

    NASA Astrophysics Data System (ADS)

    Burger, Eric M.

    This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. 
This model is capable of approximating unmodeled dynamics within a building by learning from sensor data. Control techniques encompass the application of optimal control theory, model predictive control, and convex distributed optimization to TCLs. First, we present the alternative control trajectory (ACT) representation, a novel method for the approximate optimization of non-convex discrete systems. This approach enables the optimal control of a population of non-convex agents using distributed convex optimization techniques. Second, we present a distributed convex optimization algorithm for the control of a TCL population. Experimental results demonstrate the application of this algorithm to the problem of renewable energy generation following. This dissertation contributes to the development of intelligent energy management systems for buildings by presenting a suite of novel and adaptable modeling and control techniques. Applications focus on optimizing the performance of building operations and on facilitating the integration of renewable energy resources.

  19. What helps or hinders the transformation from a major tertiary center to a major trauma center? Identifying barriers and enablers using the Theoretical Domains Framework.

    PubMed

    Roberts, Neil; Lorencatto, Fabiana; Manson, Joanna; Brundage, Susan I; Jansen, Jan O

    2016-03-12

    Major Trauma Centers (MTCs), as part of a trauma system, improve survival and functional outcomes from injury. Developing such centers from current teaching hospitals is likely to generate diverse beliefs amongst staff. These may act as barriers or enablers. Prior identification of these may make the service development process more efficient. The importance of applying theory to systematically identify barriers and enablers to changing clinical practice in emergency medicine has been emphasized. This study systematically explored theory-based barriers and enablers towards implementing the transformation of a tertiary hospital into a MTC. Our goal was to demonstrate the use of a replicable method to identify targets that could be addressed to achieve a successful transformation from an organization evolved to provide a particular type of clinical care into a clinical system with different demands, requirements and expectations. The Theoretical Domains Framework (TDF) is a tool designed to elicit and analyze beliefs affecting behavior. Semi-structured interviews based around the TDF were conducted in a major tertiary hospital in Scotland due to become a MTC with a purposive sample of major stakeholders including clinicians and nurses from specialties involved in trauma care, clinical managers and administration. Belief statements were identified through qualitative analysis, and assessed for importance according to prevalence, discordance and evidence base. 1728 utterances were recorded and coded into 91 belief statements. 58 were classified as important barriers/enablers. There were major concerns about resource demands, with optimism conditional on these being met. Distracting priorities abound within the Emergency Department. Better communication is needed. Staff motivation is high and they should be engaged in skills development and developing performance improvement processes. 
This study presents a systematic and replicable method of identifying theory-based barriers and enablers towards complex service development. It identifies multiple barriers/enablers that may serve as a basis for developing an implementation intervention to enhance the development of MTCs. This method can be used to address similar challenges in developing specialist centers or implementing clinical practice change in emergency care across both developing and developed countries.

  20. Site-selective oxidation, amination and epimerization reactions of complex polyols enabled by transfer hydrogenation

    NASA Astrophysics Data System (ADS)

    Hill, Christopher K.; Hartwig, John F.

    2017-12-01

    Polyoxygenated hydrocarbons that bear one or more hydroxyl groups comprise a large set of natural and synthetic compounds, often with potent biological activity. In synthetic chemistry, alcohols are important precursors to carbonyl groups, which then can be converted into a wide range of oxygen- or nitrogen-based functionality. Therefore, the selective conversion of a single hydroxyl group in natural products into a ketone would enable the selective introduction of unnatural functionality. However, the methods known to convert a simple alcohol, or even an alcohol in a molecule that contains multiple protected functional groups, are not suitable for selective reactions of complex polyol structures. We present a new ruthenium catalyst with a unique efficacy for the selective oxidation of a single hydroxyl group among many in unprotected polyol natural products. This oxidation enables the introduction of nitrogen-based functional groups into such structures that lack nitrogen atoms and enables a selective alcohol epimerization by stepwise or reversible oxidation and reduction.

  1. Photoionization dynamics of ammonia (B(1)E''): dependence on ionizing photon energy and initial vibrational level.

    PubMed

    Hockett, Paul; Staniforth, Michael; Reid, Katharine L

    2010-10-28

    In this article we present photoelectron spectra and angular distributions in which ion rotational states are resolved. These data enable the comparison of direct and threshold photoionization techniques. We also present angle-resolved photoelectron signals at different total energies, providing a method to scan the structure of the continuum in the near-threshold region. Finally, we have studied the influence of vibrational excitation on the photoionization dynamics.

  2. Identification of fracture zones and its application in automatic bone fracture reduction.

    PubMed

    Paulano-Godino, Félix; Jiménez-Delgado, Juan J

    2017-04-01

    The preoperative planning of bone fractures using information from CT scans increases the probability of obtaining satisfactory results, since specialists are provided with additional information before surgery. The reduction of complex bone fractures requires solving a 3D puzzle in order to place each fragment into its correct position. Computer-assisted solutions may aid in this process by identifying the number of fragments and their location, by calculating the fracture zones or even by computing the correct position of each fragment. The main goal of this paper is the development of an automatic method to calculate contact zones between fragments and thus to ease the computation of bone fracture reduction. In this paper, an automatic method to calculate the contact zone between two bone fragments is presented. In a previous step, bone fragments are segmented and labelled from CT images and a point cloud is generated for each bone fragment. The calculated contact zones enable the automatic reduction of complex fractures. To that end, an automatic method to match bone fragments in complex fractures is also presented. The proposed method has been successfully applied in the calculation of the contact zone of 4 different bones from the ankle area. The calculated fracture zones enabled the reduction of all the tested cases using the presented matching algorithm. The performed tests show that the reduction of these fractures using the proposed methods led to only a small overlap between fragments. The presented method makes the application of puzzle-solving strategies easier, since it does not obtain the entire fracture zone but the contact area between each pair of fragments. Therefore, it is not necessary to find correspondences between fracture zones and fragments may be aligned two by two. The developed algorithms have been successfully applied in different fracture cases in the ankle area.
The small overlap error obtained in the performed tests demonstrates the absence of visible overlap in the resulting figures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Image registration: enabling technology for image guided surgery and therapy.

    PubMed

    Sauer, Frank

    2005-01-01

    Imaging looks inside the patient's body, exposing the patient's anatomy beyond what is visible on the surface. Medical imaging has a very successful history in medical diagnosis. It also plays an increasingly important role as an enabling technology for minimally invasive procedures. Interventional procedures (e.g. catheter-based cardiac interventions) are traditionally supported by intra-procedure imaging (X-ray fluoroscopy, ultrasound). There is real-time feedback, but the images provide limited information. Surgical procedures are traditionally supported with pre-operative images (CT, MR). The image quality can be very good; however, the link between images and patient has been lost. For both cases, image registration can play an essential role: augmenting intra-op images with pre-op images, and mapping pre-op images to the patient's body. We will present examples of both approaches from an application-oriented perspective, covering electrophysiology, radiation therapy, and neurosurgery. Ultimately, as the boundaries between interventional radiology and surgery become blurred, the different methods for image guidance will also merge. Image guidance will draw upon a combination of pre-op and intra-op imaging together with magnetic or optical tracking systems, and enable precise minimally invasive procedures. The information is registered into a common coordinate system, and allows advanced methods for visualization such as augmented reality, or advanced methods for therapy delivery such as robotics.

  4. Image-derived arterial input function for quantitative fluorescence imaging of receptor-drug binding in vivo

    PubMed Central

    Elliott, Jonathan T.; Samkoe, Kimberley S.; Davis, Scott C.; Gunn, Jason R.; Paulsen, Keith D.; Roberts, David W.; Pogue, Brian W.

    2017-01-01

    Receptor concentration imaging (RCI) with targeted-untargeted optical dye pairs has enabled in vivo immunohistochemistry analysis in preclinical subcutaneous tumors. Successful application of RCI to fluorescence guided resection (FGR), so that quantitative molecular imaging of tumor-specific receptors could be performed in situ, would have a high impact. However, assumptions of pharmacokinetics, permeability and retention, as well as the lack of a suitable reference region limit the potential for RCI in human neurosurgery. In this study, an arterial input graphic analysis (AIGA) method is presented which is enabled by independent component analysis (ICA). The percent difference in arterial concentration between the image-derived arterial input function (AIFICA) and that obtained by an invasive method (ICACAR) was 2.0 ± 2.7% during the first hour of circulation of a targeted-untargeted dye pair in mice. Estimates of distribution volume and receptor concentration in tumor bearing mice (n = 5) recovered using the AIGA technique did not differ significantly from values obtained using invasive AIF measurements (p=0.12). The AIGA method, enabled by the subject-specific AIFICA, was also applied in a rat orthotopic model of U-251 glioblastoma to obtain the first reported receptor concentration and distribution volume maps during open craniotomy. PMID:26349671

  5. Cloud Computing as a Core Discipline in a Technology Entrepreneurship Program

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2012-01-01

    Education in entrepreneurship continues to be a developing area of curricula for computer science and information systems students. Entrepreneurship is enabled frequently by cloud computing methods that furnish benefits to especially medium and small-sized firms. Expanding upon an earlier foundation paper, the authors of this paper present an…

  6. Plasticity as a Framing Concept Enabling Transdisciplinary Understanding and Research in Neuroscience and Education

    ERIC Educational Resources Information Center

    García Carrasco, Joaquín; Hernández Serrano, María Jose; Martín García, Antonio Victor

    2015-01-01

    This article examines the emerging literature on the need for a synergy between neuroscience and educational sciences, identifying several differences in approach and methods that hinder the connecting processes between these two disciplines. From this review a transdisciplinary framework is presented which is based on the systemic and lifelong…

  7. Airborne Wind Profiling Algorithm for Doppler Wind LIDAR

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J. (Inventor); Beyon, Jeffrey Y. (Inventor); Koch, Grady J. (Inventor)

    2015-01-01

    Systems, methods, and devices of the present invention enable airborne Doppler Wind LIDAR system measurements and INS/GPS measurements to be combined to estimate wind parameters and compensate for instrument misalignment. In a further embodiment, the wind speed and wind direction may be computed based on two orthogonal line-of-sight LIDAR returns.

  8. Toward Fairness in Assessing Student Groupwork: A Protocol for Peer Evaluation of Individual Contributions

    ERIC Educational Resources Information Center

    Fellenz, Martin R.

    2006-01-01

    A key challenge for management instructors using graded groupwork with students is to find ways to maximize student learning from group projects while ensuring fair and accurate assessment methods. This article presents the Groupwork Peer-Evaluation Protocol (GPEP) that enables the assessment of individual contributions to graded student…

  9. Psychological and Physiological Alternatives in the Control of Human Communicative Behavior.

    ERIC Educational Resources Information Center

    Springhorn, Ron G.

    The paper considers whether precise control over the actions, thoughts, emotions, and desires of individuals is desirable. New technological methods for controlling human behavior enable systematic manipulation of people and promise an even greater degree of manipulation in the near future. Arguments for and against behavior control are presented.…

  10. Lecture Recording: Structural and Symbolic Information vs. Flexibility of Presentation

    ERIC Educational Resources Information Center

    Stolzenberg, Daniel; Pforte, Stefan

    2007-01-01

    Rapid eLearning is an ongoing trend which enables flexible and cost-effective creation of learning materials. Especially, lecture recording has turned out to be a lightweight method particularly suited for existing lectures and blended learning strategies. In order to not only sequentially play back but offer full fledged navigation, search and…

  11. Innovative Methods for Promoting and Assessing Intercultural Competence in Higher Education

    ERIC Educational Resources Information Center

    Hiller, Gundula Gwenn

    2010-01-01

    This paper presents an intercultural training program that was developed by the Center for Intercultural Learning at the European University Viadrina in cooperation with students. A few of the student-generated activities will be described in detail. The program, aimed at enabling students to acquire intercultural competence, was developed at an…

  12. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
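    A minimal sketch of the basic concepts named above (tournament selection, one-point crossover, and bit-flip mutation on bit strings), applied to the toy "one-max" problem; this is a generic illustration, not the software tool developed in the project:

```python
import random

random.seed(1)

def genetic_algorithm(fitness, n_bits=20, pop_size=40, generations=60,
                      p_cross=0.9, p_mut=0.02):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation, with the best-so-far retained."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def tournament():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < p_cross:
                cut = random.randrange(1, n_bits)       # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                # Independent bit-flip mutation on each gene
                child = [1 - g if random.random() < p_mut else g for g in child]
                children.append(child)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# "One-max": maximize the number of 1-bits in the string
solution = genetic_algorithm(sum)
```

With the best individual retained across generations, the population reliably climbs toward the all-ones string.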

  13. Low inlet gas velocity high throughput biomass gasifier

    DOEpatents

    Feldmann, Herman F.; Paisley, Mark A.

    1989-01-01

    The present invention discloses a novel method of operating a gasifier for production of fuel gas from carbonaceous fuels. The process disclosed enables operating in an entrained mode using inlet gas velocities of less than 7 feet per second, feedstock throughputs exceeding 4000 lb/ft^2-hr, and pressures below 100 psia.

  14. Compact fusion energy based on the spherical tokamak

    NASA Astrophysics Data System (ADS)

    Sykes, A.; Costley, A. E.; Windsor, C. G.; Asunta, O.; Brittles, G.; Buxton, P.; Chuyanov, V.; Connor, J. W.; Gryaznevich, M. P.; Huang, B.; Hugill, J.; Kukushkin, A.; Kingham, D.; Langtry, A. V.; McNamara, S.; Morgan, J. G.; Noonan, P.; Ross, J. S. H.; Shevchenko, V.; Slade, R.; Smith, G.

    2018-01-01

    Tokamak Energy Ltd, UK, is developing spherical tokamaks using high temperature superconductor magnets as a possible route to fusion power using relatively small devices. We present an overview of the development programme including details of the enabling technologies, the key modelling methods and results, and the remaining challenges on the path to compact fusion.

  15. Computational trigonometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to gradient descent, conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
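    The trigonometric quantities involved can be illustrated directly. For a symmetric positive definite matrix A with extreme eigenvalues lambda_min and lambda_max, the first antieigenvalue is cos(A) = 2*sqrt(lambda_min*lambda_max)/(lambda_min + lambda_max), and the Kantorovich bound says the A-norm error of steepest descent contracts by at most sin(A) = (lambda_max - lambda_min)/(lambda_max + lambda_min) per step. A minimal numerical check (the test matrix and names are assumptions for illustration, not the author's code):

```python
import numpy as np

def antieigenvalue_angle(eigs):
    """First antieigenvalue cos(A) and its sine for an SPD matrix,
    computed from its extreme eigenvalues (Gustafson)."""
    lo, hi = min(eigs), max(eigs)
    cosA = 2.0 * np.sqrt(lo * hi) / (lo + hi)
    sinA = (hi - lo) / (hi + lo)
    return cosA, sinA

# SPD test matrix with known eigenvalues, and an exact solution
eigs = np.array([1.0, 3.0, 10.0, 25.0])
A = np.diag(eigs)
x_true = np.ones(4)
b = A @ x_true

cosA, sinA = antieigenvalue_angle(eigs)

def a_norm(e):
    return np.sqrt(e @ A @ e)

# Steepest descent with exact line search on the quadratic;
# record the per-step A-norm error contraction ratios.
x = np.zeros(4)
ratios = []
for _ in range(30):
    r = b - A @ x
    alpha = (r @ r) / (r @ A @ r)
    e0 = a_norm(x - x_true)
    x = x + alpha * r
    ratios.append(a_norm(x - x_true) / e0)
```

Every observed ratio stays below sin(A), the trigonometric convergence rate, and cos(A)^2 + sin(A)^2 = 1 as the angle interpretation requires.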

  16. High-yield fermentation and a novel heat-precipitation purification method for hydrophobin HGFI from Grifola frondosa in Pichia pastoris.

    PubMed

    Song, Dongmin; Gao, Zhendong; Zhao, Liqiang; Wang, Xiangxiang; Xu, Haijin; Bai, Yanling; Zhang, Xiuming; Linder, Markus B; Feng, Hui; Qiao, Mingqiang

    2016-12-01

    Hydrophobins are proteins produced by filamentous fungi with high natural surfactant activity that can self-assemble at air-water or solid-water interfaces to form amphiphilic membranes. Here, we report a high-yield fermentation method for hydrophobin HGFI from Grifola frondosa in Pichia pastoris, attaining production of 300 mg/L by keeping the dissolved oxygen level at 15%-25% through tuning of the methanol-feeding rate. We also developed a novel HGFI-purification method enabling large-scale purification of HGFI, with >90% recovery. Additionally, we observed that hydrophobin HGFI in fermentation broth precipitated at pH < 7.0 and temperatures >90 °C. We characterized the structure and properties of protein purified by this method using atomic force microscopy, circular dichroism, X-ray photoelectron spectroscopy, and water-contact angle measurement; the results were similar to those for protein purified by ultrafiltration without heat treatment, indicating that our method maintains native HGFI structure and properties. Furthermore, the purification method presented here can be applied to large-scale purification of other class I hydrophobins. Copyright © 2016. Published by Elsevier Inc.

  17. Internal scanning method as unique imaging method of optical vortex scanning microscope

    NASA Astrophysics Data System (ADS)

    Popiołek-Masajada, Agnieszka; Masajada, Jan; Szatkowski, Mateusz

    2018-06-01

    The internal scanning method is specific to the optical vortex microscope. It allows the vortex point to be moved inside the focused vortex beam with nanometer resolution while the whole beam stays in place. Thus the sample illuminated by the focused vortex beam can be scanned just by the vortex point. We show that this method enables high resolution imaging. The paper presents the preliminary experimental results obtained with the first basic image recovery procedure. A prospect of developing more powerful tools for topography recovery with the optical vortex scanning microscope is briefly discussed.

  18. Probabilistic numerical methods for PDE-constrained Bayesian inverse problems

    NASA Astrophysics Data System (ADS)

    Cockayne, Jon; Oates, Chris; Sullivan, Tim; Girolami, Mark

    2017-06-01

    This paper develops meshless methods for probabilistically describing discretisation error in the numerical solution of partial differential equations. This construction enables the solution of Bayesian inverse problems while accounting for the impact of the discretisation of the forward problem. In particular, this drives statistical inferences to be more conservative in the presence of significant solver error. Theoretical results are presented describing rates of convergence for the posteriors in both the forward and inverse problems. This method is tested on a challenging inverse problem with a nonlinear forward model.

  19. Electrostatic similarity of proteins: Application of three dimensional spherical harmonic decomposition

    PubMed Central

    Długosz, Maciej; Trylska, Joanna

    2008-01-01

    We present a method for describing and comparing global electrostatic properties of biomolecules based on the spherical harmonic decomposition of electrostatic potential data. Unlike other approaches, our method does not require any prior three-dimensional structural alignment. The electrostatic potential, given as a volumetric data set from a numerical solution of the Poisson or Poisson-Boltzmann equation, is represented with descriptors that are rotation invariant. The method can be applied to large and structurally diverse sets of biomolecules, enabling them to be clustered according to their electrostatic features. PMID:18624502
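    Such rotation-invariant descriptors can be sketched as the per-degree power of the SH expansion: within each degree l, the coefficient vector transforms under rotation by an orthogonal (Wigner) matrix, so its norm is invariant. A minimal illustration using the l = 1 band, whose real SH coefficients rotate exactly like a 3-vector (names and test values are assumptions, not the authors' implementation):

```python
import numpy as np

def sh_descriptors(coeffs_by_l):
    """Rotation-invariant descriptors: the L2 energy of the SH
    coefficients within each degree l (the per-band power spectrum)."""
    return np.array([np.linalg.norm(c) for c in coeffs_by_l])

# Random SH coefficients for the l = 0 and l = 1 bands
rng = np.random.default_rng(0)
c0 = rng.normal(size=1)          # l = 0: a single coefficient
c1 = rng.normal(size=3)          # l = 1: (c_{1,-1}, c_{1,0}, c_{1,1})

# Rotating the underlying potential rotates the l = 1 coefficients by
# an orthogonal matrix; the descriptor (their norm) does not change.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
d_orig = sh_descriptors([c0, c1])
d_rot = sh_descriptors([c0, R @ c1])
```

Because the descriptors agree before and after rotation, two potentials can be compared without any structural alignment, which is the point of the approach.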

  20. Method for remote detection of trace contaminants

    DOEpatents

    Simonson, Robert J.; Hance, Bradley G.

    2003-09-09

    A method for remote detection of trace contaminants in a target area comprises applying sensor particles that preconcentrate the trace contaminant to the target area and detecting the contaminant-sensitive fluorescence from the sensor particles. The sensor particles can have contaminant-sensitive and contaminant-insensitive fluorescent compounds to enable the determination of the amount of trace contaminant present in the target area by relative comparison of the emission of the fluorescent compounds by a local or remote fluorescence detector. The method can be used to remotely detect buried minefields.

  1. Methods for Presenting Braille Characters on a Mobile Device with a Touchscreen and Tactile Feedback.

    PubMed

    Rantala, J; Raisamo, R; Lylykangas, J; Surakka, V; Raisamo, J; Salminen, K; Pakkanen, T; Hippula, A

    2009-01-01

    Three novel interaction methods were designed for reading six-dot Braille characters from the touchscreen of a mobile device. A prototype device with a piezoelectric actuator embedded under the touchscreen was used to create tactile feedback. The three interaction methods, scan, sweep, and rhythm, enabled users to read Braille characters one at a time either by exploring the characters dot by dot or by sensing a rhythmic pattern presented on the screen. The methods were tested with five blind Braille readers as a proof of concept. The results of the first experiment showed that all three methods can be used to convey information as the participants could accurately (91-97 percent) recognize individual characters. In the second experiment the presentation rate of the most efficient and preferred method, the rhythm, was varied. A mean recognition accuracy of 70 percent was found when the speed of presenting a single character was nearly doubled from the first experiment. The results showed that temporal tactile feedback and Braille coding can be used to transmit single-character information while further studies are still needed to evaluate the presentation of serial information, i.e., multiple Braille characters.

  2. A classification scheme for risk assessment methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods so that the method chosen is optimal for a given situation. This report thus imposes structure on the set of risk assessment methods in order to reveal their relationships and optimize their usage. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated we use the word 'method' in this report to refer to a 'risk assessment method', though we often use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows.
In Section 2 we provide context for this report: what a 'method' is and where it fits. In Section 3 we present background for our classification scheme: what other schemes we have found, the fundamental nature of methods, and their necessary incompleteness. In Section 4 we present our classification scheme in the form of a matrix, then present an analogy that should provide an understanding of the scheme, concluding with an explanation of the two dimensions and the nine types in our scheme. In Section 5 we present examples of each of our classification types. In Section 6 we present conclusions.
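As a toy illustration of the report's two-dimensional scheme, the nine-cell matrix can be sketched as a lookup keyed by abstraction level and approach. The level names, approach names, and type labels below are placeholders for the sketch, not the report's own terms:

```python
# Hypothetical 3x3 classification matrix: rows are abstraction levels,
# columns are approaches. Labels are invented for illustration only.
LEVELS = ("high", "medium", "low")
APPROACHES = ("temporal", "functional", "comparative")

MATRIX = {
    (lvl, app): f"type-{i}-{j}"
    for i, lvl in enumerate(LEVELS, 1)
    for j, app in enumerate(APPROACHES, 1)
}

def classify(level: str, approach: str) -> str:
    """Return the method type label for a (level, approach) cell."""
    return MATRIX[(level, approach)]
```

Choosing a method then reduces to locating the situation's cell and reading off the type best suited to it.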

  3. Compressed domain ECG biometric with two-lead features

    NASA Astrophysics Data System (ADS)

    Lee, Wan-Jou; Chang, Wen-Whei

    2016-07-01

    This study presents a new method to combine ECG biometrics with data compression within a common JPEG2000 framework. We target the two-lead ECG configuration that is routinely used in long-term heart monitoring. Incorporation of compressed-domain biometric techniques enables faster person identification as it by-passes the full decompression. Experiments on public ECG databases demonstrate the validity of the proposed method for biometric identification with high accuracies on both healthy and diseased subjects.

  4. Ex situ and in situ characterization of patterned photoreactive thin organic surface layers using friction force microscopy

    PubMed Central

    Shen, Quan; Edler, Matthias; Griesser, Thomas; Knall, Astrid-Caroline; Trimmel, Gregor; Kern, Wolfgang; Teichert, Christian

    2014-01-01

Photolithographic methods allow easy lateral top-down patterning and tuning of surface properties with photoreactive molecules and polymers. Employing friction force microscopy (FFM), we present here different FFM-based methods that enable the characterization of several photoreactive thin organic surface layers. First, three ex situ methods have been evaluated for the identification of irradiated and non-irradiated zones on the same organosilane sample by irradiation through different types of masks. These approaches are further extended to a time-dependent ex situ FFM measurement, which allows one to study the irradiation-time-dependent evolution of the resulting friction forces by sequential irradiation through differently sized masks in crossed geometry. Finally, a newly designed in situ FFM measurement, which uses a commercial bar-shaped cantilever itself as a noncontact shadow mask, enables the determination of time-dependent effects on the surface modification during the photoreaction. SCANNING 36:590–598, 2014. PMID:25183629

  5. Passing waves from atomistic to continuum

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Diaz, Adrian; Xiong, Liming; McDowell, David L.; Chen, Youping

    2018-02-01

    Progress in the development of coupled atomistic-continuum methods for simulations of critical dynamic material behavior has been hampered by a spurious wave reflection problem at the atomistic-continuum interface. This problem is mainly caused by the difference in material descriptions between the atomistic and continuum models, which results in a mismatch in phonon dispersion relations. In this work, we introduce a new method based on atomistic dynamics of lattice coupled with a concurrent atomistic-continuum method to enable a full phonon representation in the continuum description. This permits the passage of short-wavelength, high-frequency phonon waves from the atomistic to continuum regions. The benchmark examples presented in this work demonstrate that the new scheme enables the passage of all allowable phonons through the atomistic-continuum interface; it also preserves the wave coherency and energy conservation after phonons transport across multiple atomistic-continuum interfaces. This work is the first step towards developing a concurrent atomistic-continuum simulation tool for non-equilibrium phonon-mediated thermal transport in materials with microstructural complexity.

  6. Cameo: A Python Library for Computer Aided Metabolic Engineering and Optimization of Cell Factories.

    PubMed

    Cardoso, João G R; Jensen, Kristian; Lieven, Christian; Lærke Hansen, Anne Sofie; Galkina, Svetlana; Beber, Moritz; Özdemir, Emre; Herrgård, Markus J; Redestig, Henning; Sonnenschein, Nikolaus

    2018-04-20

    Computational systems biology methods enable rational design of cell factories on a genome-scale and thus accelerate the engineering of cells for the production of valuable chemicals and proteins. Unfortunately, the majority of these methods' implementations are either not published, rely on proprietary software, or do not provide documented interfaces, which has precluded their mainstream adoption in the field. In this work we present cameo, a platform-independent software that enables in silico design of cell factories and targets both experienced modelers as well as users new to the field. It is written in Python and implements state-of-the-art methods for enumerating and prioritizing knockout, knock-in, overexpression, and down-regulation strategies and combinations thereof. Cameo is an open source software project and is freely available under the Apache License 2.0. A dedicated Web site including documentation, examples, and installation instructions can be found at http://cameo.bio . Users can also give cameo a try at http://try.cameo.bio .

  7. The influence of averaging procedure on the accuracy of IVIVC predictions: immediate release dosage form case study.

    PubMed

Ostrowski, Michał; Wilkowska, Ewa; Baczek, Tomasz

    2010-12-01

In vivo-in vitro correlation (IVIVC) is an effective tool for predicting the absorption behavior of active substances from pharmaceutical dosage forms. A model for an immediate release dosage form containing amoxicillin was used in the present study to check whether the calculation method for the absorption profiles can influence the final results. The comparison showed that averaging the individual absorption profiles obtained by the Wagner-Nelson (WN) conversion method can cause the model to lose its discrimination properties. An approach based on individual plasma concentration versus time profiles made it possible to average the absorption profiles prior to WN conversion, which in turn made it possible to find differences between dispersible tablets and capsules. It was concluded that, in the case of an immediate release dosage form, the decision to use an averaging method should be based on the individual situation; however, it seems that the influence of such a procedure on the discrimination properties of the model is then more significant. © 2010 Wiley-Liss, Inc. and the American Pharmacists Association
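The Wagner-Nelson conversion mentioned above maps a plasma concentration-time profile to a fraction-absorbed profile. A minimal sketch under a one-compartment assumption (the time points, concentrations, and elimination rate constant in the usage example are invented for illustration):

```python
def trapz_cum(t, c):
    """Cumulative trapezoidal AUC at each sampling time."""
    auc, out = 0.0, [0.0]
    for i in range(1, len(t)):
        auc += 0.5 * (c[i] + c[i - 1]) * (t[i] - t[i - 1])
        out.append(auc)
    return out

def wagner_nelson(t, c, ke):
    """Fraction absorbed F(t) = (C(t) + ke*AUC[0..t]) / (ke*AUC[0..inf]).

    ke is the first-order elimination rate constant; AUC[0..inf] is
    approximated by AUC[0..tlast] + C(tlast)/ke.
    """
    auc = trapz_cum(t, c)
    auc_inf = auc[-1] + c[-1] / ke
    return [(ci + ke * ai) / (ke * auc_inf) for ci, ai in zip(c, auc)]
```

With `ke` chosen to match the terminal slope of the profile, the resulting fractions rise monotonically toward 1 at the last sampling time.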

  8. Integration of cell-free protein coexpression with an enzyme-linked immunosorbent assay enables rapid analysis of protein–protein interactions directly from DNA

    PubMed Central

    Layton, Curtis J; Hellinga, Homme W

    2011-01-01

    Assays that integrate detection of binding with cell-free protein expression directly from DNA can dramatically increase the pace at which protein–protein interactions (PPIs) can be analyzed by mutagenesis. In this study, we present a method that combines in vitro protein production with an enzyme-linked immunosorbent assay (ELISA) to measure PPIs. This method uses readily available commodity instrumentation and generic antibody–affinity tag interactions. It is straightforward and rapid to execute, enabling many interactions to be assessed in parallel. In traditional ELISAs, reporter complexes are assembled stepwise with one layer at a time. In the method presented here, all the members of the reporter complex are present and assembled together. The signal strength is dependent on all the intercomponent interaction affinities and concentrations. Although this assay is straightforward to execute, establishing proper conditions and analysis of the results require a thorough understanding of the processes that determine the signal strength. The formation of the fully assembled reporter sandwich can be modeled as a competition between Langmuir adsorption isotherms for the immobilized components and binding equilibria of the solution components. We have shown that modeling this process provides semiquantitative understanding of the effects of affinity and concentration and can guide strategies for the development of experimental protocols. We tested the method experimentally using the interaction between a synthetic ankyrin repeat protein (Off7) and maltose-binding protein. Measurements obtained for a collection of alanine mutations in the interface between these two proteins demonstrate that a range of affinities can be analyzed. PMID:21674663
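The abstract above models reporter assembly as a competition between Langmuir adsorption isotherms and solution binding equilibria. A deliberately simplified toy version (independent-site assumption, with the signal taken as the product of two occupancies; this is not the authors' full competitive model) looks like:

```python
def langmuir_occupancy(conc: float, kd: float) -> float:
    """Fractional site occupancy for a ligand at concentration `conc`,
    with dissociation constant `kd` in the same units."""
    return conc / (kd + conc)

def sandwich_signal(conc_a, kd_a, conc_b, kd_b):
    """Toy model: signal from a fully assembled reporter sandwich taken
    as the product of the two independent interface occupancies."""
    return langmuir_occupancy(conc_a, kd_a) * langmuir_occupancy(conc_b, kd_b)
```

Even this stripped-down form shows the qualitative behavior described above: the signal depends jointly on every intercomponent affinity and concentration, so weakening either interface suppresses the output.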

  9. Systems modelling methodology for the analysis of apoptosis signal transduction and cell death decisions.

    PubMed

    Rehm, Markus; Prehn, Jochen H M

    2013-06-01

    Systems biology and systems medicine, i.e. the application of systems biology in a clinical context, is becoming of increasing importance in biology, drug discovery and health care. Systems biology incorporates knowledge and methods that are applied in mathematics, physics and engineering, but may not be part of classical training in biology. We here provide an introduction to basic concepts and methods relevant to the construction and application of systems models for apoptosis research. We present the key methods relevant to the representation of biochemical processes in signal transduction models, with a particular reference to apoptotic processes. We demonstrate how such models enable a quantitative and temporal analysis of changes in molecular entities in response to an apoptosis-inducing stimulus, and provide information on cell survival and cell death decisions. We introduce methods for analyzing the spatial propagation of cell death signals, and discuss the concepts of sensitivity analyses that enable a prediction of network responses to disturbances of single or multiple parameters. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Making data matter: Voxel printing for the digital fabrication of data across scales and domains.

    PubMed

    Bader, Christoph; Kolb, Dominik; Weaver, James C; Sharma, Sunanda; Hosny, Ahmed; Costa, João; Oxman, Neri

    2018-05-01

    We present a multimaterial voxel-printing method that enables the physical visualization of data sets commonly associated with scientific imaging. Leveraging voxel-based control of multimaterial three-dimensional (3D) printing, our method enables additive manufacturing of discontinuous data types such as point cloud data, curve and graph data, image-based data, and volumetric data. By converting data sets into dithered material deposition descriptions, through modifications to rasterization processes, we demonstrate that data sets frequently visualized on screen can be converted into physical, materially heterogeneous objects. Our approach alleviates the need to postprocess data sets to boundary representations, preventing alteration of data and loss of information in the produced physicalizations. Therefore, it bridges the gap between digital information representation and physical material composition. We evaluate the visual characteristics and features of our method, assess its relevance and applicability in the production of physical visualizations, and detail the conversion of data sets for multimaterial 3D printing. We conclude with exemplary 3D-printed data sets produced by our method pointing toward potential applications across scales, disciplines, and problem domains.
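The dithering step described above, converting continuous values into binary deposition decisions, can be illustrated with one-dimensional error diffusion. This is a simplification for illustration; the paper's pipeline dithers full 3D voxel volumes across multiple materials:

```python
def dither_row(values):
    """1D error-diffusion dither: map values in [0, 1] to 0/1 deposition
    decisions, carrying the quantization error to the next voxel so that
    local averages of the output approximate the input."""
    out, err = [], 0.0
    for v in values:
        v += err
        bit = 1 if v >= 0.5 else 0
        err = v - bit        # leftover intensity carried forward
        out.append(bit)
    return out
```

A uniform row of 0.5 dithers to an alternating pattern, and the mean of the output tracks the mean of the input, which is what lets dithered material deposition reproduce continuous data.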

  11. Determination of lifetimes and recombination currents in p-n junction solar cells, diodes, and transistors

    NASA Technical Reports Server (NTRS)

    Neugroschel, A.

    1981-01-01

    New methods are presented and illustrated that enable the accurate determination of the diffusion length of minority carriers in the narrow regions of a solar cell or a diode. Other methods now available are inaccurate for the desired case in which the width of the region is less than the diffusion length. Once the diffusion length is determined by the new methods, this result can be combined with measured dark I-V characteristics and with small-signal admittance characteristics to enable determination of the recombination currents in each quasi-neutral region of the cell - for example, in the emitter, low-doped base, and high-doped base regions of the BSF (back-surface-field) cell. This approach leads to values for the effective surface recombination velocity of the high-low junction forming the back-surface field of BSF cells or the high-low emitter junction of HLE cells. These methods are also applicable for measuring the minority-carrier lifetime in thin epitaxial layers grown on substrates with opposite conductivity type.

  12. High-speed bioimaging with frequency-division-multiplexed fluorescence confocal microscopy

    NASA Astrophysics Data System (ADS)

    Mikami, Hideharu; Harmon, Jeffrey; Ozeki, Yasuyuki; Goda, Keisuke

    2017-04-01

We present methods of fluorescence confocal microscopy that enable an unprecedentedly high frame rate of >10,000 fps. The methods are based on a frequency-division multiplexing technique originally developed in the field of communication engineering. Specifically, we achieved a broad detection-signal bandwidth (about 400 MHz) using a dual-AOD method and overcame the frame-rate limitations imposed by the scanning device by using a multi-line focusing method, resulting in a significant increase in frame rate. The methods have potential biomedical applications such as observation of sub-millisecond dynamics in biological tissues, in-vivo three-dimensional imaging, and fluorescence imaging flow cytometry.
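Frequency-division multiplexing assigns each channel its own carrier frequency; a channel is recovered by mixing the composite signal with its carrier and low-pass filtering. In the minimal sketch below, averaging over an integer number of carrier periods stands in for the low-pass filter, and the frequencies and amplitudes are invented for illustration, not taken from the instrument described above:

```python
import math

def fdm_demodulate(signal, carrier_hz, fs_hz):
    """Recover the amplitude carried at `carrier_hz` from a composite
    FDM signal sampled at `fs_hz`: multiply by the carrier and average.
    For a component A*cos(2*pi*f*t) the mixing product averages to A/2,
    so the result is doubled to return A."""
    n = len(signal)
    acc = sum(signal[i] * math.cos(2 * math.pi * carrier_hz * i / fs_hz)
              for i in range(n))
    return 2.0 * acc / n
```

Because distinct carriers are orthogonal over a window containing whole numbers of their periods, each channel's amplitude is recovered without crosstalk from the others.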

  13. Adjoint-Based Sensitivity Kernels for Glacial Isostatic Adjustment in a Laterally Varying Earth

    NASA Astrophysics Data System (ADS)

    Crawford, O.; Al-Attar, D.; Tromp, J.; Mitrovica, J. X.; Austermann, J.; Lau, H. C. P.

    2017-12-01

    We consider a new approach to both the forward and inverse problems in glacial isostatic adjustment. We present a method for forward modelling GIA in compressible and laterally heterogeneous earth models with a variety of linear and non-linear rheologies. Instead of using the so-called sea level equation, which must be solved iteratively, the forward theory we present consists of a number of coupled evolution equations that can be straightforwardly numerically integrated. We also apply the adjoint method to the inverse problem in order to calculate the derivatives of measurements of GIA with respect to the viscosity structure of the Earth. Such derivatives quantify the sensitivity of the measurements to the model. The adjoint method enables efficient calculation of continuous and laterally varying derivatives, allowing us to calculate the sensitivity of measurements of glacial isostatic adjustment to the Earth's three-dimensional viscosity structure. The derivatives have a number of applications within the inverse method. Firstly, they can be used within a gradient-based optimisation method to find a model which minimises some data misfit function. The derivatives can also be used to quantify the uncertainty in such a model and hence to provide understanding of which parts of the model are well constrained. Finally, they enable construction of measurements which provide sensitivity to a particular part of the model space. We illustrate both the forward and inverse aspects with numerical examples in a spherically symmetric earth model.

  14. CMC Technology Advancements for Gas Turbine Engine Applications

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    2013-01-01

    CMC research at NASA Glenn is focused on aircraft propulsion applications. The objective is to enable reduced engine emissions and fuel consumption for more environmentally friendly aircraft. Engine system studies show that incorporation of ceramic composites into turbine engines will enable significant reductions in emissions and fuel burn due to increased engine efficiency resulting from reduced cooling requirements for hot section components. This presentation will describe recent progress and challenges in developing fiber and matrix constituents for 2700 F CMC turbine applications. In addition, ongoing research in the development of durable environmental barrier coatings, ceramic joining integration technologies and life prediction methods for CMC engine components will be reviewed.

  15. Real-time myocardium segmentation for the assessment of cardiac function variation

    NASA Astrophysics Data System (ADS)

    Zoehrer, Fabian; Huellebrand, Markus; Chitiboi, Teodora; Oechtering, Thekla; Sieren, Malte; Frahm, Jens; Hahn, Horst K.; Hennemuth, Anja

    2017-03-01

Recent developments in MRI enable the acquisition of image sequences with high spatio-temporal resolution, so that cardiac motion can be captured without gating and triggering. Image size and contrast relations differ from conventional cardiac MRI cine sequences, requiring newly adapted analysis methods. We suggest a novel segmentation approach utilizing contrast-invariant polar scanning techniques. It has been tested with 20 datasets of arrhythmia patients. The automatic segmentations do not differ from the manual segmentations significantly more than the manual segmentations differ between observers. This indicates that the presented solution could enable clinical applications of real-time MRI for the examination of arrhythmic cardiac motion in the future.

  16. Multiresolution Distance Volumes for Progressive Surface Compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laney, D E; Bertram, M; Duchaineau, M A

    2002-04-18

We present a surface compression method that stores surfaces as wavelet-compressed signed-distance volumes. Our approach enables the representation of surfaces with complex topology and arbitrary numbers of components within a single multiresolution data structure. This data structure elegantly handles topological modification at high compression rates. Our method does not require the costly and sometimes infeasible base mesh construction step required by subdivision surface approaches. We present several improvements over previous attempts at compressing signed-distance functions, including an O(n) distance transform, a zero set initialization method for triangle meshes, and a specialized thresholding algorithm. We demonstrate the potential of sampled distance volumes for surface compression and progressive reconstruction for complex high genus surfaces.
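The O(n) distance transform mentioned above computes, for every grid cell, the distance to the nearest surface sample. A minimal one-dimensional, unsigned sketch of the classic two-pass idea (the paper's transform operates on 3D signed-distance volumes, so this only illustrates the sweep structure):

```python
def distance_transform_1d(occupied):
    """O(n) two-pass distance transform: for each cell, the distance in
    cells to the nearest True entry. The forward sweep propagates
    distances left-to-right, the backward sweep right-to-left."""
    n = len(occupied)
    INF = n + 1                      # larger than any achievable distance
    d = [0 if occ else INF for occ in occupied]
    for i in range(1, n):            # forward pass
        d[i] = min(d[i], d[i - 1] + 1)
    for i in range(n - 2, -1, -1):   # backward pass
        d[i] = min(d[i], d[i + 1] + 1)
    return d
```

Each cell is visited exactly twice, which is what makes the transform linear in the number of grid cells rather than quadratic.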

  17. The evaluation of a novel haptic-enabled virtual reality approach for computer-aided cephalometry.

    PubMed

    Medellín-Castillo, H I; Govea-Valladares, E H; Pérez-Guerrero, C N; Gil-Valladares, J; Lim, Theodore; Ritchie, James M

    2016-07-01

In oral and maxillofacial surgery, conventional radiographic cephalometry is one of the standard auxiliary tools for diagnosis and surgical planning. While contemporary computer-assisted cephalometric systems and methodologies support cephalometric analysis, they tend neither to be practical nor intuitive for practitioners. This is particularly the case for 3D methods since the associated landmarking process is difficult and time consuming. In addition to this, there are no 3D cephalometry norms or standards defined; therefore new landmark selection methods are required which will help facilitate their establishment. This paper presents and evaluates a novel haptic-enabled landmarking approach to overcome some of the difficulties and disadvantages of the current landmarking processes used in 2D and 3D cephalometry. In order to evaluate this new system's feasibility and performance, 21 dental surgeons (comprising 7 Novices, 7 Semi-experts and 7 Experts) performed a range of case studies using haptic-enabled 2D, 2½D and 3D digital cephalometric analyses. The results compared the 2D, 2½D and 3D cephalometric values, errors and standard deviations for each case study and associated group of participants and revealed that 3D cephalometry significantly reduced landmarking errors and variability compared to 2D methods. Through enhancing the process by providing a sense of touch, the haptic-enabled 3D digital cephalometric approach was found to be feasible and more intuitive than its counterparts, as well as effective at reducing errors, the variability of the measurements taken, and the associated task completion times. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. TK3 eBook software to author, distribute, and use electronic course content for medical education.

    PubMed

    Morton, David A; Foreman, K Bo; Goede, Patricia A; Bezzant, John L; Albertine, Kurt H

    2007-03-01

The methods for authoring and distributing course content are undergoing substantial changes due to advancement in computer technology. Paper has been the traditional method to author and distribute course content. Paper enables students to personalize content through highlighting and note taking but does not enable the incorporation of multimedia elements. Computers enable multimedia content but have lacked the capability for the user to personalize the content. Therefore, we investigated TK3 eBooks as a potential solution to incorporate the benefits of both paper and computer technology. The objective of our study was to assess the utility of TK3 eBooks in the context of authoring and distributing dermatology course content for use by second-year medical students at the University of Utah School of Medicine during the spring of 2004. We incorporated all dermatology course content into TK3 eBook format. TK3 eBooks enable students to personalize information through tools such as "notebook," "hiliter," "stickies," mark pages, and keyword search. Students were given the course content in both paper and eBook formats. At the conclusion of the dermatology course, students completed a questionnaire designed to evaluate the effectiveness of the eBooks compared with paper. Students perceived eBooks as an effective way to distribute course content and as a study tool. However, students preferred paper over eBooks to take notes during lecture. In conclusion, the present study demonstrated that eBooks provide a convenient method for authoring, distributing, and using course content but that students preferred paper to take notes during lecture.

  19. GVE-Based Dynamics and Control for Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    Breger, Louis; How, Jonathan P.

    2004-01-01

    Formation flying is an enabling technology for many future space missions. This paper presents extensions to the equations of relative motion expressed in Keplerian orbital elements, including new initialization techniques for general formation configurations. A new linear time-varying form of the equations of relative motion is developed from Gauss Variational Equations and used in a model predictive controller. The linearizing assumptions for these equations are shown to be consistent with typical formation flying scenarios. Several linear, convex initialization techniques are presented, as well as a general, decentralized method for coordinating a tetrahedral formation using differential orbital elements. Control methods are validated using a commercial numerical propagator.

  20. Band structure analysis of leaky Bloch waves in 2D phononic crystal plates.

    PubMed

    Mazzotti, Matteo; Miniaci, Marco; Bartoli, Ivan

    2017-02-01

    A hybrid Finite Element-Plane Wave Expansion method is presented for the band structure analysis of phononic crystal plates with two dimensional lattice that are in contact with acoustic half-spaces. The method enables the computation of both real (propagative) and imaginary (attenuation) components of the Bloch wavenumber at any given frequency. Three numerical applications are presented: a benchmark dispersion analysis for an oil-loaded Titanium isotropic plate, the band structure analysis of a water-loaded Tungsten slab with square cylindrical cavities and a phononic crystal plate composed of Aurum cylinders embedded in an epoxy matrix. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Multidimensional analysis of data obtained in experiments with X-ray emulsion chambers and extensive air showers

    NASA Technical Reports Server (NTRS)

    Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.

    1985-01-01

Nonparametric statistical methods are used to carry out the quantitative comparison of the model and the experimental data. The same methods enable one to select the events initiated by heavy nuclei and to calculate the proportion of such events. For this purpose it is necessary to have artificial-event data that describe the experiment sufficiently well. At present, the model with a small scaling violation in the fragmentation region is the closest to the experiments. Therefore, the treatment of gamma families obtained in the 'Pamir' experiment is currently being carried out with these models.

  2. Speckle correlation method used to measure object's in-plane velocity.

    PubMed

    Smíd, Petr; Horváth, Pavel; Hrabovský, Miroslav

    2007-06-20

We present a measurement of an object's in-plane velocity in one direction by use of the speckle correlation method. Numerical correlations of speckle patterns recorded periodically during motion of the object under investigation give information used to evaluate the object's in-plane velocity. The proposed optical setup uses a detection plane in the image field and enables one to detect the object's velocity within the interval 10-150 μm/s. Simulation analysis shows a way of controlling the measuring range. The presented theory, simulation analysis, and setup are verified through an experiment measuring the velocity profile of an object.
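The core of the speckle correlation method is estimating the displacement that maximizes the correlation between successively recorded patterns; speed then follows from the displacement, the pixel size, and the frame interval. A minimal 1D sketch (actual speckle correlation works on 2D intensity patterns; the profiles and scale factors below are invented):

```python
def best_shift(ref, moved, max_shift):
    """Estimate the integer displacement between two 1D intensity
    profiles by maximizing their correlation over candidate shifts."""
    def corr(s):
        return sum(ref[i] * moved[i + s]
                   for i in range(len(ref))
                   if 0 <= i + s < len(moved))
    return max(range(-max_shift, max_shift + 1), key=corr)

def velocity_from_speckle(ref, moved, pixel_size_um, dt_s, max_shift=10):
    """In-plane speed: displacement (pixels) * pixel size / frame interval."""
    return best_shift(ref, moved, max_shift) * pixel_size_um / dt_s
```

Varying the frame interval `dt_s` (or the magnification, and hence the effective pixel size) changes the displacement per frame, which is one way the measuring range can be controlled.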

  3. A-posteriori error estimation for the finite point method with applications to compressible flow

    NASA Astrophysics Data System (ADS)

    Ortega, Enrique; Flores, Roberto; Oñate, Eugenio; Idelsohn, Sergio

    2017-08-01

    An a-posteriori error estimate with application to inviscid compressible flow problems is presented. The estimate is a surrogate measure of the discretization error, obtained from an approximation to the truncation terms of the governing equations. This approximation is calculated from the discrete nodal differential residuals using a reconstructed solution field on a modified stencil of points. Both the error estimation methodology and the flow solution scheme are implemented using the Finite Point Method, a meshless technique enabling higher-order approximations and reconstruction procedures on general unstructured discretizations. The performance of the proposed error indicator is studied and applications to adaptive grid refinement are presented.

  4. Full three-dimensional isotropic carpet cloak designed by quasi-conformal transformation optics.

    PubMed

    Silva, Daniely G; Teixeira, Poliane A; Gabrielli, Lucas H; Junqueira, Mateus A F C; Spadoti, Danilo H

    2017-09-18

A fully three-dimensional carpet cloak exhibiting invisibility at all viewing angles is theoretically demonstrated. The design is developed using transformation optics and three-dimensional quasi-conformal mapping. A parametrization strategy and numerical optimization of the coordinate transformation using a quasi-Newton method are applied. A discussion of the minimum achievable anisotropy in 3D transformation optics is presented. The method makes it possible to reduce the anisotropy in the cloak so that an isotropic medium can be considered. Numerical simulations confirm the strategy employed, enabling the design of an isotropic reflectionless broadband carpet cloak independent of the incident light direction and polarization.

  5. Method for uniformly bending conduits

    DOEpatents

    Dekanich, S.J.

    1984-04-27

The present invention is directed to a method for bending metal tubing through various radii while maintaining a uniform cross section of the tubing. The present invention is practiced by filling the tubing to a sufficient level with water, freezing the water to ice, and bending the ice-filled tubing in a cooled die to the desired radius. The use of ice as a filler material provides uniform cross-sectional bends of the tubing and, upon removal of the ice, leaves an uncontaminated tubing interior, which enables the tubing to be used in its intended application without encountering residual contaminants due to the presence of the filler material.

  6. Prediction of the effects of propeller operation on the static longitudinal stability of single-engine tractor monoplanes with flaps retracted

    NASA Technical Reports Server (NTRS)

Weil, Joseph; Sleeman, William C., Jr.

    1949-01-01

    The effects of propeller operation on the static longitudinal stability of single-engine tractor monoplanes are analyzed, and a simple method is presented for computing power-on pitching-moment curves for flap-retracted flight conditions. The methods evolved are based on the results of powered-model wind-tunnel investigations of 28 model configurations. Correlation curves are presented from which the effects of power on the downwash over the tail and the stabilizer effectiveness can be rapidly predicted. The procedures developed enable prediction of power-on longitudinal stability characteristics that are generally in very good agreement with experiment.

  7. Putting the environment into the NPV calculation -- Quantifying pipeline environmental costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dott, D.R.; Wirasinghe, S.C.; Chakma, A.

    1996-12-31

Pipeline projects impact the environment through soil and habitat disturbance, noise during construction and compressor operation, river crossing disturbance, and the risk of rupture. Assigning monetary value to these negative project consequences enables the environment to be represented in the project cost-benefit analysis. This paper presents the mechanics and implications of two environmental valuation techniques: (1) the contingent valuation method and (2) the stated preference method. The use of environmental value at the project economic-evaluation stage is explained. A summary of research done on relevant environmental attribute valuation is presented and discussed. Recommendations for further research in the field are made.

  8. Arc Fault Detection & Localization by Electromagnetic-Acoustic Remote Sensing

    NASA Astrophysics Data System (ADS)

    Vasile, C.; Ioana, C.

    2017-05-01

Electrical arc faults that occur in photovoltaic systems represent a danger due to their economic impact on production and distribution. In this paper we propose a complete system, with a focus on the methodology, that enables the detection and localization of an arc fault through an electromagnetic-acoustic sensing system. By exploiting the multiple emissions of the arc fault, in conjunction with a real-time detection signal processing method, we ensure accurate detection and localization. In its final form, this work will describe in greater detail the complete system, the methods employed, and the results and performance, alongside further work to be carried out.

  9. Rational-operator-based depth-from-defocus approach to scene reconstruction.

    PubMed

    Li, Ang; Staunton, Richard; Tjahjadi, Tardi

    2013-09-01

    This paper presents a rational-operator-based approach to depth from defocus (DfD) for the reconstruction of three-dimensional scenes from two-dimensional images, which enables fast DfD computation that is independent of scene textures. Two variants of the approach, one using the Gaussian rational operators (ROs) that are based on the Gaussian point spread function (PSF) and the second based on the generalized Gaussian PSF, are considered. A novel DfD correction method is also presented to further improve the performance of the approach. Experimental results are considered for real scenes and show that both approaches outperform existing RO-based methods.

  10. Neural classifier in the estimation process of maturity of selected varieties of apples

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.

    2015-07-01

    This paper presents methods of neural image analysis aimed at estimating the maturity of selected varieties of apples that are popular in Poland. The degree of maturity of the selected varieties was identified on the basis of information encoded in graphical form in digital photographs. The process applies the BBCH scale, which is used to determine the maturity of apples. This scale is widely used in the EU and has been developed for many species of monocotyledonous and dicotyledonous plants; it enables a detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected varieties of apples, supported by image analysis methods and classification techniques represented by artificial neural networks. The analysis of representative graphical features based on image analysis enabled the assessment of apple maturity. For practical use, the "JabVis 1.1" neural IT system was created in accordance with software-engineering requirements, to support decision-making in the production and processing of apples.

  11. A simple method for panretinal imaging with the slit lamp.

    PubMed

    Gellrich, Marcus-Matthias

    2016-12-01

    Slit lamp biomicroscopy of the retina with a convex lens is a key procedure in clinical practice. The methods presented enable ophthalmologists to adequately image large and peripheral parts of the fundus using a video-slit lamp and freely available stitching software. A routine examination of the fundus with a slit lamp and a +90 D lens is recorded on a video film. Later, sufficiently sharp still images are identified in the video sequence. These still images are imported into a freely available image-processing program (Hugin, for stitching mosaics together digitally) and corresponding points are marked on adjacent still images with some overlap. Using the digital stitching program Hugin, panoramic overviews of the retina can be built that can extend to the equator. This makes it possible to image diseases involving the whole retina or its periphery by performing a structured fundus examination with a video-slit lamp. Similar images with a video-slit lamp based on a fundus examination through a hand-held non-contact lens have not been demonstrated before. The methods presented enable ophthalmologists without high-end imaging equipment to monitor pathological fundus findings. The suggested procedure might even be interesting for retinological departments if peripheral findings are to be documented, which might be difficult with fundus cameras.

  12. Custom implant design for large cranial defects.

    PubMed

    Marreiros, Filipe M M; Heuzé, Y; Verius, M; Unterhofer, C; Freysinger, W; Recheis, W

    2016-12-01

    The aim of this work was to introduce a computer-aided design (CAD) tool that enables the design of large skull defect (>100 [Formula: see text]) implants. Functional and aesthetically correct custom implants are extremely important for patients with large cranial defects. For these cases, preoperative fabrication of implants is recommended to avoid problems of donor site morbidity, sufficiency of donor material and quality. Finally, crafting the correct shape is a non-trivial task increasingly complicated by defect size. We present a CAD tool to design such implants for the neurocranium. A combination of geometric morphometrics and radial basis functions, namely thin-plate splines, allows semiautomatic implant generation. The method uses symmetry and the best fitting shape to estimate missing data directly within the radiologic volume data. In addition, this approach delivers correct implant fitting via a boundary fitting approach. This method generates a smooth implant surface, free of sharp edges, that follows the main contours of the boundary, enabling accurate implant placement in the defect site intraoperatively. The present approach is evaluated and compared to existing methods. On average, 89.29 % of missing landmarks (range 72.64-100 %) were estimated with an error of 1 mm or less. In conclusion, the results show that our CAD tool can generate patient-specific implants with high accuracy.
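
    The thin-plate-spline step can be sketched briefly. Everything below is an illustrative assumption rather than the paper's data: a toy quadratic "skull" height field, random landmark positions, and a hypothetical defect patch. The sketch only shows how scattered landmarks constrain a smooth TPS surface (here via SciPy's RBFInterpolator) that is then evaluated over the missing region.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(4)

# Toy "skull" height field (illustrative only, not real anatomy).
def skull_height(xy):
    return 1.0 - 0.3 * (xy[:, 0] ** 2 + xy[:, 1] ** 2)

# Landmarks sampled on the intact part of the surface.
landmarks = rng.uniform(-1, 1, size=(60, 2))
heights = skull_height(landmarks)

# Thin-plate-spline interpolant constrained by the landmarks.
tps = RBFInterpolator(landmarks, heights, kernel="thin_plate_spline")

# Estimate the surface over a hypothetical defect patch.
defect = np.array([[0.1, 0.1], [0.2, -0.1], [0.0, 0.3]])
estimate = tps(defect)
print(np.abs(estimate - skull_height(defect)).max())
```

    For a smooth surface densely covered by landmarks, the interpolant recovers the missing patch closely; the real tool additionally enforces boundary fitting at the defect rim.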

  13. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml−1/μMol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  14. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml−1/μMol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  15. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1981-01-01

    To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods are presented for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed. Observed gust modulus is drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution, and the zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described, and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representing gust component variables is established.
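
    As a hedged sketch of this kind of distributional testing (not the report's own procedure or data), one can fit a gamma distribution to a synthetic gust-magnitude sample and check the fit with a Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic gust-magnitude sample (illustrative shape/scale, not observed data).
sample = rng.gamma(shape=2.0, scale=3.0, size=2000)

# Maximum-likelihood gamma fit with the location fixed at zero,
# as appropriate for a non-negative magnitude variable.
shape, loc, scale = stats.gamma.fit(sample, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution.
ks_stat, p_value = stats.kstest(sample, "gamma", args=(shape, loc, scale))
print(f"shape={shape:.2f} scale={scale:.2f} KS p={p_value:.3f}")
```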

  16. Feasibility Study of Interstellar Missions Using Laser Sail Probes Ranging in Size from the Nano to the Macro

    NASA Technical Reports Server (NTRS)

    Malroy, Eric T.

    2010-01-01

    This paper presents an analysis of the feasibility of interstellar travel using laser sail probes ranging in size from the nano to the macro. The relativistic differential equations of motion for a laser sail are set up and solved using the Pasic Method. The limitations of the analysis are presented and discussed. The requirements for the laser system are examined, including the thermal analysis of the laser sails. Black holes, plasma fields, atmospheric collisions and sunlight are among the methods discussed for decelerating the interstellar probe. A number of novel mission scenarios are presented, including the embryonic transport of plant life as a precursor to the arrival of space colonies.
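
    The relativistic kinematics can be illustrated with a closed-form special case rather than the paper's Pasic Method: for a constant photon-pressure force F on a sail of mass m, momentum grows as p = F t, so the velocity approaches but never reaches c. The mass and thrust values below are purely illustrative.

```python
import math

c = 299_792_458.0   # speed of light, m/s
m = 1.0             # illustrative sail mass, kg
F = 10.0            # illustrative constant photon-pressure thrust, N

def velocity(t):
    """Speed after time t under constant force: p = F t with p = gamma m v."""
    u = F * t / m                        # Newtonian velocity F t / m
    return u / math.sqrt(1.0 + (u / c) ** 2)

one_year = 365.25 * 24 * 3600.0
v = velocity(one_year)
print(f"v after one year = {v / c:.3f} c")
```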

  17. Velocity profile, water-surface slope, and bed-material size for selected streams in Colorado

    USGS Publications Warehouse

    Marchand, J.P.; Jarrett, R.D.; Jones, L.L.

    1984-01-01

    Existing methods for determining the mean velocity in a vertical sampling section do not address the conditions present in high-gradient, shallow-depth streams common to mountainous regions such as Colorado. The report presents velocity-profile data that were collected for 11 streamflow-gaging stations in Colorado using both a standard Price type AA current meter and a prototype Price Model PAA current meter. Computational results are compiled that will enable mean velocities calculated from measurements by the two current meters to be compared with each other and with existing methods for determining mean velocity. Water-surface slope, bed-material size, and flow-characteristic data for the 11 sites studied also are presented. (USGS)

  18. Reinforced Carbon Nanotubes.

    DOEpatents

    Ren, Zhifen; Wen, Jian Guo; Lao, Jing Y.; Li, Wenzhi

    2005-06-28

    The present invention relates generally to reinforced carbon nanotubes, and more particularly to reinforced carbon nanotubes having a plurality of microparticulate carbide or oxide materials formed substantially on the surface of such reinforced carbon nanotubes composite materials. In particular, the present invention provides reinforced carbon nanotubes (CNTs) having a plurality of boron carbide nanolumps formed substantially on a surface of the reinforced CNTs that provide a reinforcing effect on CNTs, enabling their use as effective reinforcing fillers for matrix materials to give high-strength composites. The present invention also provides methods for producing such carbide reinforced CNTs.

  19. Self-Management of Patient Body Position, Pose, and Motion Using Wide-Field, Real-Time Optical Measurement Feedback: Results of a Volunteer Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parkhurst, James M.; Price, Gareth J., E-mail: gareth.price@christie.nhs.uk; Faculty of Medical and Human Sciences, Manchester Academic Health Sciences Centre, University of Manchester, Manchester

    2013-12-01

    Purpose: We present the results of a clinical feasibility study, performed in 10 healthy volunteers undergoing a simulated treatment over 3 sessions, to investigate the use of a wide-field visual feedback technique intended to help patients control their pose while reducing motion during radiation therapy treatment. Methods and Materials: An optical surface sensor is used to capture wide-area measurements of a subject's body surface with visualizations of these data displayed back to them in real time. In this study we hypothesize that this active feedback mechanism will enable patients to control their motion and help them maintain their setup pose and position. A capability hierarchy of 3 different level-of-detail abstractions of the measured surface data is systematically compared. Results: Use of the device enabled volunteers to increase their conformance to a reference surface, as measured by decreased variability across their body surfaces. The use of visual feedback also enabled volunteers to reduce their respiratory motion amplitude to 1.7 ± 0.6 mm compared with 2.7 ± 1.4 mm without visual feedback. Conclusions: The use of live feedback of their optically measured body surfaces enabled a set of volunteers to better manage their pose and motion when compared with free breathing. The method is suitable to be taken forward to patient studies.

  20. Benchmarking routine psychological services: a discussion of challenges and methods.

    PubMed

    Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick

    2014-01-01

    Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
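
    The benchmarking arithmetic can be sketched as follows. The pre/post PHQ-9 scores are invented, and the uncontrolled pre-post effect size (mean change divided by pre-treatment standard deviation) is one common convention; the paper's exact estimator may differ.

```python
import math

# Hypothetical pre/post PHQ-9 scores for one service (illustrative values).
pre = [18, 15, 20, 12, 17, 19, 14, 16, 21, 13]
post = [10, 9, 14, 8, 11, 12, 9, 10, 15, 7]

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

# Uncontrolled pre-post effect size: mean change over pre-treatment SD.
es = (mean(pre) - mean(post)) / sd(pre)

# Depression (PHQ-9) benchmarks reported in the abstract.
benchmarks = {"high": 0.91, "average": 0.73, "poor": 0.46}
print(f"ES = {es:.2f}")
```

    A service would then place its own effect size against the high/average/poor benchmarks to contextualize performance.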

  1. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike trains (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input and the direct functional connection of two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, a visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying functional connectivity of multiple spike trains. This method can identify accurately all the direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
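
    A minimal sketch of the two ACG features, the CCF peak and its time delay, on synthetic spike trains. The bin width, firing rates, transmission probability, and 5-bin lag are assumptions for illustration; the full method's significance testing and connection classification are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic spike trains as binary time series (1 ms bins): train B
# follows train A with a 5-bin lag plus independent background spikes.
n = 5000
a = (rng.random(n) < 0.05).astype(float)
b = np.roll(a, 5) * (rng.random(n) < 0.8) + (rng.random(n) < 0.01)

max_lag = 20
lags = np.arange(-max_lag, max_lag + 1)
a0, b0 = a - a.mean(), b - b.mean()

# Pairwise cross-correlation function over a window of lags (circular shifts).
ccf = np.array([np.correlate(a0, np.roll(b0, -k))[0] for k in lags])
ccf /= n * a0.std() * b0.std()

# The ACG keeps two features per pair: the CCF peak and its lag.
peak_lag = lags[np.argmax(ccf)]
peak_val = ccf.max()
print(peak_lag, round(peak_val, 3))
```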

  2. Finding Sums for an Infinite Class of Alternating Series

    ERIC Educational Resources Information Center

    Chen, Zhibo; Wei, Sheng; Xiao, Xuerong

    2012-01-01

    Calculus II students know that many alternating series are convergent by the Alternating Series Test. However, they know few alternating series (except geometric series and some trivial ones) for which they can find the sum. In this article, we present a method that enables the students to find sums for infinitely many alternating series in the…

  3. Integrating Teacher- and Peer-Assessments of Group Coursework Assignments in Business Education: Some Innovative Methods

    ERIC Educational Resources Information Center

    Onyia, Okey Peter

    2014-01-01

    This paper is a sequel to an earlier one that examines "the efficacy of two innovative peer-assessment templates ("PET" and "PACT") introduced to enable students provide evidence of their fairness in evaluating peer contributions to group project work" (Onyia, O. P. and Allen, S., 2012). In the present paper, three…

  4. Dechorionation of Zebrafish Embryos on Day 1 Post Fertilization Alters Response to an Acute Chemical Challenge at 6 Days Post Fertilization

    EPA Science Inventory

    Dechorionation is a method used to enable image acquisition in embryonic and larval zebrafish studies. As it is assumed that dechorionation has no long-term effects on fish embryo development, it is important to determine if that assumption is correct. The present study explored ...

  5. Expected Utility Illustrated: A Graphical Analysis of Gambles with More than Two Possible Outcomes

    ERIC Educational Resources Information Center

    Chen, Frederick H.

    2010-01-01

    The author presents a simple geometric method to graphically illustrate the expected utility from a gamble with more than two possible outcomes. This geometric result gives economics students a simple visual aid for studying expected utility theory and enables them to analyze a richer set of decision problems under uncertainty compared to what…

  6. A Computational Method for Enabling Teaching-Learning Process in Huge Online Courses and Communities

    ERIC Educational Resources Information Center

    Mora, Higinio; Ferrández, Antonio; Gil, David; Peral, Jesús

    2017-01-01

    Massive Open Online Courses and e-learning represent the future of the teaching-learning processes through the development of Information and Communication Technologies. They are the response to the new education needs of society. However, this future also presents many challenges such as the processing of online forums when a huge number of…

  7. The Urgent Need for New Approaches in School Evaluation to Enable Scotland's Curriculum for Excellence

    ERIC Educational Resources Information Center

    MacKinnon, Niall

    2011-01-01

    This paper presents observations on the nature of school audit methods in light of the implementation of Scotland's incoming Curriculum for Excellence and the major normative, technological, and cultural changes affecting schools. It points to a mismatch between the concepts and structures of the incoming curriculum and that of the universalistic…

  8. Cell-Type-Specific Optogenetics in Monkeys.

    PubMed

    Namboodiri, Vijay Mohan K; Stuber, Garret D

    2016-09-08

    The recent advent of technologies enabling cell-type-specific recording and manipulation of neuronal activity spurred tremendous progress in neuroscience. However, they have been largely limited to mice, which lack the richness in behavior of primates. Stauffer et al. now present a generalizable method for achieving cell-type specificity in monkeys. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Bayes' theorem application in the measure information diagnostic value assessment

    NASA Astrophysics Data System (ADS)

    Orzechowski, Piotr D.; Makal, Jaroslaw; Nazarkiewicz, Andrzej

    2006-03-01

    The paper presents an application of Bayes' theorem to assessing the diagnostic value of measurement information in a computer-aided diagnosis system. The computer system described here is based on a Bayesian network and is used in Benign Prostatic Hyperplasia (BPH) diagnosis. The graphical diagnostic model enables experts' knowledge to be juxtaposed with data.
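
    The underlying update rule is just Bayes' theorem. The prevalence and test characteristics below are illustrative numbers, not the BPH figures from the paper:

```python
# Bayes' theorem for a single diagnostic finding (illustrative numbers).
prior = 0.30          # P(disease) before the measurement
sensitivity = 0.85    # P(positive | disease)
false_pos = 0.10      # P(positive | no disease)

# Total probability of a positive finding.
evidence = prior * sensitivity + (1 - prior) * false_pos

# Posterior probability of disease given the positive finding.
posterior = prior * sensitivity / evidence
print(f"P(disease | positive) = {posterior:.3f}")
```

    A Bayesian network chains many such updates across dependent findings, which is what makes the diagnostic value of each measurement quantifiable.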

  10. A fuzzy model for achieving lean attributes for competitive advantages development using AHP-QFD-PROMETHEE

    NASA Astrophysics Data System (ADS)

    Roghanian, E.; Alipour, Mohammad

    2014-06-01

    Lean production has become an integral part of the manufacturing landscape, as its link with superior performance and its ability to provide competitive advantage is well accepted among academics and practitioners. Lean production helps producers overcome the challenges organizations face through powerful tools and enablers. However, most companies face restricted resources, such as financial and human resources and time, in using these enablers, and are not capable of implementing all these techniques. Therefore, identifying and selecting the most appropriate and efficient tool can be a significant challenge for many companies. Hence, this paper combines competitive advantages, lean attributes, and lean enablers to determine the most appropriate enablers for the improvement of lean attributes. Quality function deployment in a fuzzy environment and the house of quality matrix are implemented. Throughout the methodology, fuzzy logic is the basis for translating the linguistic judgments required for the relationship and correlation matrices to numerical values. Moreover, for the final ranking of lean enablers, a multi-criteria decision-making method (PROMETHEE) is adopted. Finally, a case study in the automotive industry is presented to illustrate the implementation of the proposed methodology.
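
    The final PROMETHEE II ranking step can be sketched in a few lines. The three enablers, two criteria, scores, and weights are hypothetical, and the "usual" (step) preference function is chosen for simplicity; the paper's fuzzy AHP-QFD stages are omitted.

```python
import numpy as np

# Hypothetical decision matrix: rows are lean enablers, columns are criteria.
scores = np.array([
    [7.0, 5.0],   # enabler A
    [6.0, 8.0],   # enabler B
    [4.0, 6.0],   # enabler C
])
weights = np.array([0.6, 0.4])  # criterion weights, summing to 1

n = len(scores)
# Usual preference function: 1 if a beats b on the criterion, else 0.
pref = (scores[:, None, :] > scores[None, :, :]).astype(float)
# Aggregated preference index pi(a, b).
pi = pref @ weights

phi_plus = pi.sum(axis=1) / (n - 1)    # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
net_flow = phi_plus - phi_minus        # PROMETHEE II net flow

ranking = np.argsort(-net_flow)        # best enabler first
print(net_flow.round(3), ranking)
```

    Net flows always sum to zero, and sorting by net flow gives the complete PROMETHEE II ranking.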

  11. 15N Hyperpolarization by Reversible Exchange Using SABRE-SHEATH

    PubMed Central

    2016-01-01

    NMR signal amplification by reversible exchange (SABRE) is an NMR hyperpolarization technique that enables nuclear spin polarization enhancement of molecules via concurrent chemical exchange of a target substrate and parahydrogen (the source of spin order) on an iridium catalyst. Recently, we demonstrated that conducting SABRE in microtesla fields provided by a magnetic shield enables up to 10% 15N-polarization (Theis, T.; et al. J. Am. Chem. Soc. 2015, 137, 1404). Hyperpolarization on 15N (and heteronuclei in general) may be advantageous because of the long-lived nature of the hyperpolarization on 15N relative to the short-lived hyperpolarization of protons conventionally hyperpolarized by SABRE, in addition to wider chemical shift dispersion and absence of background signal. Here we show that these unprecedented polarization levels enable 15N magnetic resonance imaging. We also present a theoretical model for the hyperpolarization transfer to heteronuclei, and detail key parameters that should be optimized for efficient 15N-hyperpolarization. The effects of parahydrogen pressure, flow rate, sample temperature, catalyst-to-substrate ratio, relaxation time (T1), and reversible oxygen quenching are studied on a test system of 15N-pyridine in methanol-d4. Moreover, we demonstrate the first proof-of-principle 13C-hyperpolarization using this method. This simple hyperpolarization scheme only requires access to parahydrogen and a magnetic shield, and it provides large enough signal gains to enable one of the first 15N images (2 × 2 mm2 resolution). Importantly, this method enables hyperpolarization of molecular sites with NMR T1 relaxation times suitable for biomedical imaging and spectroscopy. PMID:25960823

  12. 15N Hyperpolarization by Reversible Exchange Using SABRE-SHEATH.

    PubMed

    Truong, Milton L; Theis, Thomas; Coffey, Aaron M; Shchepin, Roman V; Waddell, Kevin W; Shi, Fan; Goodson, Boyd M; Warren, Warren S; Chekmenev, Eduard Y

    2015-04-23

    NMR signal amplification by reversible exchange (SABRE) is an NMR hyperpolarization technique that enables nuclear spin polarization enhancement of molecules via concurrent chemical exchange of a target substrate and parahydrogen (the source of spin order) on an iridium catalyst. Recently, we demonstrated that conducting SABRE in microtesla fields provided by a magnetic shield enables up to 10% 15N-polarization (Theis, T.; et al. J. Am. Chem. Soc. 2015, 137, 1404). Hyperpolarization on 15N (and heteronuclei in general) may be advantageous because of the long-lived nature of the hyperpolarization on 15N relative to the short-lived hyperpolarization of protons conventionally hyperpolarized by SABRE, in addition to wider chemical shift dispersion and absence of background signal. Here we show that these unprecedented polarization levels enable 15N magnetic resonance imaging. We also present a theoretical model for the hyperpolarization transfer to heteronuclei, and detail key parameters that should be optimized for efficient 15N-hyperpolarization. The effects of parahydrogen pressure, flow rate, sample temperature, catalyst-to-substrate ratio, relaxation time (T1), and reversible oxygen quenching are studied on a test system of 15N-pyridine in methanol-d4. Moreover, we demonstrate the first proof-of-principle 13C-hyperpolarization using this method. This simple hyperpolarization scheme only requires access to parahydrogen and a magnetic shield, and it provides large enough signal gains to enable one of the first 15N images (2 × 2 mm2 resolution). Importantly, this method enables hyperpolarization of molecular sites with NMR T1 relaxation times suitable for biomedical imaging and spectroscopy.

  13. A Self-Directed Method for Cell-Type Identification and Separation of Gene Expression Microarrays

    PubMed Central

    Zuckerman, Neta S.; Noam, Yair; Goldsmith, Andrea J.; Lee, Peter P.

    2013-01-01

    Gene expression analysis is generally performed on heterogeneous tissue samples consisting of multiple cell types. Current methods developed to separate heterogeneous gene expression rely on prior knowledge of the cell-type composition and/or signatures, which are not available in most public datasets. We present a novel method to identify the cell-type composition, signatures and proportions per sample without need for a priori information. The method was successfully tested on controlled and semi-controlled datasets and performed as accurately as current methods that do require additional information. As such, this method enables the analysis of cell-type specific gene expression using existing large pools of publicly available microarray datasets. PMID:23990767
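
    The paper's algorithm is its own; purely as an illustration of the general idea (recovering signatures and per-sample proportions from mixtures alone), here is a sketch using non-negative matrix factorization, a related unsupervised technique, on synthetic two-cell-type mixtures:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)

# Synthetic data: 50 genes, 20 samples, 2 hypothetical cell types.
genes, samples, k = 50, 20, 2
signatures = rng.gamma(2.0, 1.0, size=(genes, k))         # genes x cell types
proportions = rng.dirichlet(np.ones(k), size=samples).T   # cell types x samples
mixed = signatures @ proportions                          # observed expression

# Factorize the mixtures back into non-negative signatures and weights.
model = NMF(n_components=k, init="nndsvda", max_iter=2000, random_state=0)
w = model.fit_transform(mixed)   # estimated signatures
h = model.components_            # estimated (unnormalized) weights

# Normalize columns of h to obtain per-sample proportion estimates.
est = h / h.sum(axis=0)
print(est.shape)
```

    NMF recovers the factors only up to permutation and scaling, which is one reason dedicated deconvolution methods add further constraints.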

  14. Protein–protein docking by fast generalized Fourier transforms on 5D rotational manifolds

    PubMed Central

    Padhorny, Dzmitry; Kazennov, Andrey; Zerbe, Brandon S.; Porter, Kathryn A.; Xia, Bing; Mottarella, Scott E.; Kholodov, Yaroslav; Ritchie, David W.; Vajda, Sandor; Kozakov, Dima

    2016-01-01

    Energy evaluation using fast Fourier transforms (FFTs) enables sampling billions of putative complex structures and hence revolutionized rigid protein–protein docking. However, in current methods, efficient acceleration is achieved only in either the translational or the rotational subspace. Developing an efficient and accurate docking method that expands FFT-based sampling to five rotational coordinates is an extensively studied but still unsolved problem. The algorithm presented here retains the accuracy of earlier methods but yields at least 10-fold speedup. The improvement is due to two innovations. First, the search space is treated as the product manifold SO(3)×(SO(3)∖S1), where SO(3) is the rotation group representing the space of the rotating ligand, and (SO(3)∖S1) is the space spanned by the two Euler angles that define the orientation of the vector from the center of the fixed receptor toward the center of the ligand. This representation enables the use of efficient FFT methods developed for SO(3). Second, we select the centers of highly populated clusters of docked structures, rather than the lowest energy conformations, as predictions of the complex, and hence there is no need for very high accuracy in energy evaluation. Therefore, it is sufficient to use a limited number of spherical basis functions in the Fourier space, which increases the efficiency of sampling while retaining the accuracy of docking results. A major advantage of the method is that, in contrast to classical approaches, increasing the number of correlation function terms is computationally inexpensive, which enables using complex energy functions for scoring. PMID:27412858
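
    The core FFT trick can be shown in a toy 1D translational analogue (the paper's contribution is extending it to five rotational coordinates on SO(3), which this sketch does not attempt): correlation scores for all shifts are obtained in one pass via the convolution theorem, instead of one dot product per shift.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1D "receptor" and "ligand" score grids (random values, illustrative).
receptor = rng.standard_normal(256)
ligand = rng.standard_normal(256)

# Direct evaluation: one dot product per circular shift, O(N^2) overall.
direct = np.array([np.dot(receptor, np.roll(ligand, s)) for s in range(256)])

# FFT evaluation: all shifts at once via the correlation theorem, O(N log N).
via_fft = np.fft.ifft(np.fft.fft(receptor) * np.conj(np.fft.fft(ligand))).real

print(np.allclose(direct, via_fft))
```

    The same identity, generalized to rotational harmonics, is what lets FFT-based docking score billions of putative poses exhaustively.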

  15. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. 
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular, and can be extended to other helminthic diseases, which together afflict a large part of humankind. PMID:22369037

  16. Close-Packed Silicon Microelectrodes for Scalable Spatially Oversampled Neural Recording

    PubMed Central

    Scholvin, Jörg; Kinney, Justin P.; Bernstein, Jacob G.; Moore-Kochlacs, Caroline; Kopell, Nancy; Fonstad, Clifton G.; Boyden, Edward S.

    2015-01-01

    Objective Neural recording electrodes are important tools for understanding neural codes and brain dynamics. Neural electrodes that are close-packed, such as in tetrodes, enable spatial oversampling of neural activity, which facilitates data analysis. Here we present the design and implementation of close-packed silicon microelectrodes, to enable spatially oversampled recording of neural activity in a scalable fashion. Methods Our probes are fabricated in a hybrid lithography process, resulting in a dense array of recording sites connected to submicron dimension wiring. Results We demonstrate an implementation of a probe comprising 1000 electrode pads, each 9 × 9 μm, at a pitch of 11 μm. We introduce design automation and packaging methods that allow us to readily create a large variety of different designs. Significance Finally, we perform neural recordings with such probes in the live mammalian brain that illustrate the spatial oversampling potential of closely packed electrode sites. PMID:26699649

  17. Drop-on-Demand Single Cell Isolation and Total RNA Analysis

    PubMed Central

    Moon, Sangjun; Kim, Yun-Gon; Dong, Lingsheng; Lombardi, Michael; Haeggstrom, Edward; Jensen, Roderick V.; Hsiao, Li-Li; Demirci, Utkan

    2011-01-01

    Technologies that rapidly isolate viable single cells from heterogeneous solutions have significantly contributed to the field of medical genomics. Challenges remain both in efficiently extracting, isolating, and patterning single cells from heterogeneous solutions and in keeping them alive during the process, given the limited degree of control over single-cell manipulation. Here, we present a microdroplet-based method to isolate and pattern single cells from heterogeneous cell suspensions (10% target cell mixture), preserve viability of the extracted cells (97.0±0.8%), and obtain genomic information from isolated cells compared to the non-patterned controls. The cell encapsulation process is both experimentally and theoretically analyzed. Using the isolated cells, we identified 11 stem cell markers among 1000 genes and compared them to the controls. This automated platform enables high-throughput cell manipulation for subsequent genomic analysis and employs fewer handling steps than existing methods. PMID:21412416

  18. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids by Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
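The complex-variable formulation underlying this approach is the well-known complex-step derivative: evaluating a real-valued function at a complex argument x + ih yields the derivative from the imaginary part, with no subtractive cancellation. A minimal sketch (the test function is arbitrary, not from the NASA solver):

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    # Complex-step rule: f'(x) ~= Im(f(x + i*h)) / h.
    # Unlike finite differences, h can be tiny with no cancellation error.
    return np.imag(f(x + 1j * h)) / h

# A "complicated" real-valued function and its analytic derivative.
f = lambda x: np.exp(x) * np.sin(x)
x0 = 0.7
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))
approx = complex_step_derivative(f, x0)
```

Because the step size can be driven to 1e-30, the result agrees with the analytic derivative to machine precision, which is what makes the automated differentiation of whole residual routines practical.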

  19. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.

  20. A Bookmarking Service for Organizing and Sharing URLs

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Wolfe, Shawn R.; Chen, James R.; Mathe, Nathalie; Rabinowitz, Joshua L.

    1997-01-01

    Web browser bookmarking facilities predominate as the method of choice for managing URLs. In this paper, we describe some deficiencies of current bookmarking schemes and examine an alternative approach. We present WebTagger(TM), an implemented prototype of a personal bookmarking service that provides both individuals and groups with a customizable means of organizing and accessing Web-based information resources. In addition, the service enables users to supply feedback on the utility of these resources relative to their information needs, and provides dynamically updated ranking of resources based on incremental user feedback. Individuals may access the service from anywhere on the Internet, and require no special software. This service greatly simplifies the process of sharing URLs within groups, in comparison with manual methods involving email. The underlying bookmark organization scheme is more natural and flexible than the current hierarchical schemes supported by the major Web browsers, and enables rapid access to stored bookmarks.
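One simple way to realize incremental feedback-based ranking (a hypothetical scheme for illustration; the abstract does not specify WebTagger's actual algorithm) is an exponential moving average of per-bookmark votes:

```python
from collections import defaultdict

# Hypothetical incremental-feedback ranking: each vote nudges a running score.
scores = defaultdict(float)

def feedback(url, useful, alpha=0.3):
    """Fold one user vote (+1 useful / -1 not useful) into the running score."""
    vote = 1.0 if useful else -1.0
    scores[url] = (1 - alpha) * scores[url] + alpha * vote

def ranking():
    """Bookmarks ordered by dynamically updated utility score."""
    return sorted(scores, key=scores.get, reverse=True)

feedback("http://example.org/tutorial", True)
feedback("http://example.org/tutorial", True)
feedback("http://example.org/outdated", False)
```

Each new vote updates the ranking immediately, so the ordering always reflects the group's latest feedback.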

  1. Formal Solutions for Polarized Radiative Transfer. II. High-order Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch

    When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.

  2. Self-assembled monolayers improve protein distribution on holey carbon cryo-EM supports

    PubMed Central

    Meyerson, Joel R.; Rao, Prashant; Kumar, Janesh; Chittori, Sagar; Banerjee, Soojay; Pierson, Jason; Mayer, Mark L.; Subramaniam, Sriram

    2014-01-01

    Poor partitioning of macromolecules into the holes of holey carbon support grids frequently limits structural determination by single particle cryo-electron microscopy (cryo-EM). Here, we present a method to deposit, on gold-coated carbon grids, a self-assembled monolayer whose surface properties can be controlled by chemical modification. We demonstrate the utility of this approach to drive partitioning of ionotropic glutamate receptors into the holes, thereby enabling 3D structural analysis using cryo-EM methods. PMID:25403871

  3. System and method for key generation in security tokens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.

    Functional randomness in security tokens (FRIST) may achieve improved security in two-factor authentication hardware tokens by improving the algorithms used to securely generate random data. In one embodiment according to the present invention, a system and method may base the security of a token on storage cost and computational security. This approach may enable communication whose security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).

  4. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  5. Arbitrary-level hanging nodes for adaptive hp-FEM approximations in 3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavel Kus; Pavel Solin; David Andrs

    2014-11-01

    In this paper we discuss constrained approximation with arbitrary-level hanging nodes in adaptive higher-order finite element methods (hp-FEM) for three-dimensional problems. This technique enables using highly irregular meshes, and it greatly simplifies the design of adaptive algorithms as it prevents refinements from propagating recursively through the finite element mesh. The technique makes it possible to design efficient adaptive algorithms for purely hexahedral meshes. We present a detailed mathematical description of the method and illustrate it with numerical examples.

  6. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have gained increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
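The memory side of the problem can be sketched simply: distance-based indices only need aggregate statistics of the pairwise distances, so the n x n distance matrix never has to be materialized. The sketch below (not the authors' algorithm) accumulates a histogram of all pairwise distances row by row, using O(n) working memory instead of O(n^2):

```python
import numpy as np

def distance_histogram(points, bin_edges):
    """Histogram of all pairwise distances without building the n x n matrix."""
    counts = np.zeros(len(bin_edges) - 1, dtype=np.int64)
    for i in range(len(points) - 1):
        # Distances from point i to all later points: each pair counted once.
        d = np.linalg.norm(points[i + 1:] - points[i], axis=1)
        h, _ = np.histogram(d, bins=bin_edges)
        counts += h
    return counts

rng = np.random.default_rng(0)
pts = rng.random((500, 2))                      # synthetic firm locations
edges = np.linspace(0.0, np.sqrt(2.0), 21)      # distance bins on the unit square
counts = distance_histogram(pts, edges)
```

Kernel-smoothed densities such as the D&O-Index can then be computed from the binned counts rather than from raw pairwise distances.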

  7. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file format in a common platform. Results from a successful low-speed wind tunnel test at WPAFB in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moire interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  8. Multi-dimensional super-resolution imaging enables surface hydrophobicity mapping

    NASA Astrophysics Data System (ADS)

    Bongiovanni, Marie N.; Godet, Julien; Horrocks, Mathew H.; Tosatto, Laura; Carr, Alexander R.; Wirthensohn, David C.; Ranasinghe, Rohan T.; Lee, Ji-Eun; Ponjavic, Aleks; Fritz, Joelle V.; Dobson, Christopher M.; Klenerman, David; Lee, Steven F.

    2016-12-01

    Super-resolution microscopy allows biological systems to be studied at the nanoscale, but has been restricted to providing only positional information. Here, we show that it is possible to perform multi-dimensional super-resolution imaging to determine both the position and the environmental properties of single-molecule fluorescent emitters. The method presented here exploits the solvatochromic and fluorogenic properties of nile red to extract both the emission spectrum and the position of each dye molecule simultaneously, enabling mapping of the hydrophobicity of biological structures. We validated this by studying synthetic lipid vesicles of known composition. We then applied the method to super-resolve both the hydrophobicity of amyloid aggregates implicated in neurodegenerative diseases and the hydrophobic changes in mammalian cell membranes. Our technique is easily implemented by inserting a transmission diffraction grating into the optical path of a localization-based super-resolution microscope, enabling all the information to be extracted simultaneously from a single image plane.
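The spectral dimension can be summarized per molecule by a spectral centroid: nile red emission blue-shifts in hydrophobic environments, so the intensity-weighted mean wavelength serves as a simple hydrophobicity proxy. The spectra below are synthetic Gaussians for illustration, not measured data:

```python
import numpy as np

wavelengths = np.linspace(550.0, 700.0, 151)   # nm, assumed detection window

def emission(mu, sigma=15.0):
    """Toy single-molecule emission spectrum centered at mu nm."""
    return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

spectra = np.vstack([
    emission(600.0),   # molecule in a hydrophobic pocket (blue-shifted)
    emission(650.0),   # molecule in a hydrophilic environment (red-shifted)
])

# Per-molecule spectral centroid: intensity-weighted mean wavelength.
centroids = (spectra * wavelengths).sum(axis=1) / spectra.sum(axis=1)
```

Pairing each centroid with the molecule's localized position yields the hydrophobicity map described in the abstract.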

  9. Enhancing understanding and improving prediction of severe weather through spatiotemporal relational learning.

    PubMed

    McGovern, Amy; Gagne, David J; Williams, John K; Brown, Rodger A; Basara, Jeffrey B

    Severe weather, including tornadoes, thunderstorms, wind, and hail, annually causes significant loss of life and property. We are developing spatiotemporal machine learning techniques that will enable meteorologists to improve the prediction of these events by improving their understanding of the fundamental causes of the phenomena and by building skillful empirical predictive models. In this paper, we present significant enhancements of our Spatiotemporal Relational Probability Trees that enable autonomous discovery of spatiotemporal relationships as well as learning with arbitrary shapes. We focus our evaluation on two real-world case studies using our technique: predicting tornadoes in Oklahoma and predicting aircraft turbulence in the United States. We also discuss how to evaluate success for a machine learning algorithm in the severe weather domain, which will enable new methods such as ours to transfer from research to operations; provide a set of lessons learned for embedded machine learning applications; and discuss how to field our technique.

  10. MEAs and 3D nanoelectrodes: electrodeposition as tool for a precisely controlled nanofabrication.

    PubMed

    Weidlich, Sabrina; Krause, Kay J; Schnitker, Jan; Wolfrum, Bernhard; Offenhäusser, Andreas

    2017-01-31

    Microelectrode arrays (MEAs) are gaining increasing importance for the investigation of signaling processes between electrogenic cells. However, efficient cell-chip coupling for robust and long-term electrophysiological recording and stimulation still remains a challenge. A possible approach for the improvement of the cell-electrode contact is the utilization of three-dimensional structures. In recent years, various 3D electrode geometries have been developed, but we are still lacking a fabrication approach that enables the formation of different 3D structures on a single chip in a controlled manner. This, however, is needed to enable a direct and reliable comparison of the recording capabilities of the different structures. Here, we present a method for a precisely controlled deposition of nanoelectrodes, enabling the fabrication of multiple, well-defined types of structures on our 64 electrode MEAs towards a rapid-prototyping approach to 3D electrodes.

  11. Multi-dimensional super-resolution imaging enables surface hydrophobicity mapping

    PubMed Central

    Bongiovanni, Marie N.; Godet, Julien; Horrocks, Mathew H.; Tosatto, Laura; Carr, Alexander R.; Wirthensohn, David C.; Ranasinghe, Rohan T.; Lee, Ji-Eun; Ponjavic, Aleks; Fritz, Joelle V.; Dobson, Christopher M.; Klenerman, David; Lee, Steven F.

    2016-01-01

    Super-resolution microscopy allows biological systems to be studied at the nanoscale, but has been restricted to providing only positional information. Here, we show that it is possible to perform multi-dimensional super-resolution imaging to determine both the position and the environmental properties of single-molecule fluorescent emitters. The method presented here exploits the solvatochromic and fluorogenic properties of nile red to extract both the emission spectrum and the position of each dye molecule simultaneously, enabling mapping of the hydrophobicity of biological structures. We validated this by studying synthetic lipid vesicles of known composition. We then applied the method to super-resolve both the hydrophobicity of amyloid aggregates implicated in neurodegenerative diseases and the hydrophobic changes in mammalian cell membranes. Our technique is easily implemented by inserting a transmission diffraction grating into the optical path of a localization-based super-resolution microscope, enabling all the information to be extracted simultaneously from a single image plane. PMID:27929085

  12. GNSS CORS hardware and software enabling new science

    NASA Astrophysics Data System (ADS)

    Drummond, P.

    2009-12-01

    GNSS CORS networks are enabling new opportunities for science and for public- and private-sector business. This paper will explore how the newest geodetic monitoring software and GNSS receiver hardware from Trimble Navigation Ltd are enabling new science. Technology trends and science opportunities will be explored. These trends include the installation of active GNSS control, automation of observations and processing, and the advantages of multi-observable and multi-constellation observations, all performed with the use of off-the-shelf products and industry-standard open-source data formats. The possibilities of moving science from an after-the-fact post-processed model to a real-time epoch-by-epoch solution will also be explored. This presentation will also discuss the combination of existing GNSS CORS networks with project-specific installations used for monitoring. Experience shows that GNSS can provide higher-resolution data than previous methods, providing new tools for science, decision makers, and financial planners.

  13. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics.
To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.

  14. Probing the structure of heterogeneous diluted materials by diffraction tomography.

    PubMed

    Bleuet, Pierre; Welcomme, Eléonore; Dooryhée, Eric; Susini, Jean; Hodeau, Jean-Louis; Walter, Philippe

    2008-06-01

    The advent of nanosciences calls for the development of local structural probes, in particular to characterize ill-ordered or heterogeneous materials. Furthermore, because materials properties are often related to their heterogeneity and the hierarchical arrangement of their structure, different structural probes covering a wide range of scales are required. X-ray diffraction is one of the prime structural methods but suffers from a relatively poor detection limit, whereas transmission electron analysis involves destructive sample preparation. Here we show the potential of coupling pencil-beam tomography with X-ray diffraction to examine unidentified phases in nanomaterials and polycrystalline materials. The demonstration is carried out on a high-pressure pellet containing several carbon phases and on a heterogeneous powder containing chalcedony and iron pigments. The present method enables a non-invasive structural refinement with a weight sensitivity of one part per thousand. It enables the extraction of the scattering patterns of amorphous and crystalline compounds with similar atomic densities and compositions. Furthermore, such a diffraction-tomography experiment can be carried out simultaneously with X-ray fluorescence, Compton and absorption tomographies, enabling a multimodal analysis of prime importance in materials science, chemistry, geology, environmental science, medical science, palaeontology and cultural heritage.

  15. Route Network Construction with Location-Direction-Enabled Photographs

    NASA Astrophysics Data System (ADS)

    Fujita, Hideyuki; Sagara, Shota; Ohmori, Tadashi; Shintani, Takahiko

    2018-05-01

    We propose a method for constructing a geometric graph for generating routes that summarize a geographical area and also have visual continuity, using a set of location-direction-enabled photographs. A location-direction-enabled photograph is a photograph that carries information about the location (position of the camera at the time of shooting) and the direction (direction of the camera at the time of shooting). Each node of the graph corresponds to a location-direction-enabled photograph. The location of each node is the location of the corresponding photograph, and a route on the graph corresponds both to a route in the geographic area and to a sequence of photographs. The proposed graph is constructed to represent characteristic spots and the paths linking those spots, and it can be regarded as a kind of spatial summarization of the area with the photographs. We therefore call the routes on the graph spatial summary routes. Each route on the proposed graph also has visual continuity, meaning that the spatial relationship between consecutive photographs on the route can be understood, such as moving forward, moving backward, or turning right. In this study, a route was defined to have visual continuity when the changes in shooting position and shooting direction between consecutive photographs satisfied given thresholds. By presenting the photographs in order along the generated route, information can be presented sequentially while largely maintaining visual continuity.
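The edge-construction rule above can be sketched directly: connect two photographs when both the change in shooting position and the change in shooting direction fall below thresholds. The records and threshold values below are hypothetical, chosen only to illustrate the test:

```python
import math

# Hypothetical records: (x, y, camera direction in degrees).
photos = [(0, 0, 0), (0, 10, 5), (0, 20, 10), (50, 0, 180)]

D_MAX, THETA_MAX = 15.0, 30.0   # assumed continuity thresholds

def angular_diff(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    return abs((a - b + 180) % 360 - 180)

def continuous(p, q):
    """Edge test: small enough change in shooting position and direction."""
    dist = math.hypot(q[0] - p[0], q[1] - p[1])
    return dist <= D_MAX and angular_diff(p[2], q[2]) <= THETA_MAX

# Undirected graph: one edge per photo pair that passes the continuity test.
edges = [(i, j) for i in range(len(photos))
                for j in range(i + 1, len(photos))
                if continuous(photos[i], photos[j])]
```

Routes with visual continuity are then simply paths in this graph; the far-away, opposite-facing fourth photo remains isolated.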

  16. Leveraging AMI data for distribution system model calibration and situational awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peppanen, Jouni; Reno, Matthew J.; Thakkar, Mohini

    The many new distributed energy resources being installed at the distribution system level require increased visibility into system operations, which will be enabled by distribution system state estimation (DSSE) and situational awareness applications. Reliable and accurate DSSE requires both robust methods for managing the big data provided by smart meters and quality distribution system models. This paper presents intelligent methods for detecting and dealing with missing or inaccurate smart meter data, as well as ways to process the data for different applications. It also presents an efficient and flexible parameter estimation method, based on the voltage drop equation and regression analysis, to enhance distribution system model accuracy. Finally, it presents a 3-D graphical user interface for advanced visualization of the system state and events. We demonstrate these methods on a university distribution network with a state-of-the-art infrastructure of real-time and historical smart meter data.
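The voltage-drop-plus-regression idea can be sketched with a linearized single-line model: the measured voltage drop is approximately (R*P + X*Q)/V0, so least squares over many AMI samples recovers the line parameters R and X. All values below are synthetic and illustrative, not the paper's network:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# Synthetic AMI load samples: active and reactive power (arbitrary units).
P = rng.uniform(1.0, 5.0, n)
Q = rng.uniform(0.2, 1.5, n)
R_true, X_true = 0.08, 0.05     # hypothetical line resistance/reactance
V0 = 1.0                        # nominal voltage (per unit)

# Linearized voltage drop equation plus small meter noise.
dV = (R_true * P + X_true * Q) / V0 + rng.normal(0.0, 1e-4, n)

# Regress the observed drops on P and Q to estimate the model parameters.
A = np.column_stack([P, Q]) / V0
(R_est, X_est), *_ = np.linalg.lstsq(A, dV, rcond=None)
```

With enough samples the estimates converge on the true parameters despite meter noise, which is the sense in which AMI data can calibrate a distribution system model.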

  17. Leveraging AMI data for distribution system model calibration and situational awareness

    DOE PAGES

    Peppanen, Jouni; Reno, Matthew J.; Thakkar, Mohini; ...

    2015-01-15

    The many new distributed energy resources being installed at the distribution system level require increased visibility into system operations, which will be enabled by distribution system state estimation (DSSE) and situational awareness applications. Reliable and accurate DSSE requires both robust methods for managing the big data provided by smart meters and quality distribution system models. This paper presents intelligent methods for detecting and dealing with missing or inaccurate smart meter data, as well as ways to process the data for different applications. It also presents an efficient and flexible parameter estimation method, based on the voltage drop equation and regression analysis, to enhance distribution system model accuracy. Finally, it presents a 3-D graphical user interface for advanced visualization of the system state and events. We demonstrate these methods on a university distribution network with a state-of-the-art infrastructure of real-time and historical smart meter data.

  18. Simulation of Thermographic Responses of Delaminations in Composites with Quadrupole Method

    NASA Technical Reports Server (NTRS)

    Winfree, William P.; Zalameda, Joseph N.; Howell, Patricia A.; Cramer, K. Elliott

    2016-01-01

    The application of the quadrupole method for simulating thermal responses of delaminations in carbon fiber reinforced epoxy composite materials is presented. The method solves for the flux at the interface containing the delamination. From the interface flux, the temperature at the surface is calculated. While the results presented are for single-sided measurements with flash heating, expansion of the technique to arbitrary temporal flux heating or through-transmission measurements is simple. The quadrupole method is shown to have two distinct advantages relative to finite element or finite difference techniques. First, it is straightforward to incorporate arbitrarily shaped delaminations into the simulation. Second, the quadrupole method enables calculation of the thermal response at only the times of interest. This, combined with a significant reduction in the number of degrees of freedom for the same simulation quality, results in a reduction of the computation time by at least an order of magnitude. It is therefore a more viable technique for model-based inversion of thermographic data. Results for simulations of delaminations in composites are presented and compared to measurements and finite element method results.
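The thermal quadrupole formalism can be sketched in its simplest 1-D, single-layer form: in the Laplace domain, the slab relates front and rear temperature/flux pairs through the matrix [[cosh(qe), sinh(qe)/(lam*q)], [lam*q*sinh(qe), cosh(qe)]] with q = sqrt(p/alpha). For an adiabatic rear face and a Dirac flash, the front-face temperature transform follows directly and can be inverted numerically at only the times of interest. The material values below are hypothetical, and this single-layer model is far simpler than the paper's delamination solver:

```python
import math

# Illustrative slab properties (not from the paper).
alpha = 1.0e-6       # thermal diffusivity, m^2/s
lam = 0.5            # thermal conductivity, W/(m K)
e = 1.0e-3           # slab thickness, m
Q = 1000.0           # absorbed flash energy per unit area, J/m^2
rho_c = lam / alpha  # volumetric heat capacity, J/(m^3 K)

def theta_laplace(p):
    """Front-face temperature transform from the 1-D quadrupole matrix,
    with zero flux on the rear face (adiabatic) and Dirac flash input."""
    q = math.sqrt(p / alpha)
    return Q / (lam * q * math.tanh(q * e))

def stehfest(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace transform."""
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        Vk = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            Vk += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        total += (-1) ** (k + N // 2) * Vk * F(k * ln2 / t)
    return ln2 / t * total

t = 1.0  # s after the flash
T_num = stehfest(theta_laplace, t)
# Classical series solution for the same configuration, for comparison.
T_ref = Q / (rho_c * e) * (1 + 2 * sum(
    math.exp(-n * n * math.pi ** 2 * alpha * t / e ** 2)
    for n in range(1, 50)))
```

Evaluating the inversion only at measurement times, instead of time-stepping a mesh, is the source of the speed advantage the abstract describes.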

  19. Quantitating Iron in Serum Ferritin by Use of ICP-MS

    NASA Technical Reports Server (NTRS)

    Smith, Scott M.; Gillman, Patricia L.

    2003-01-01

    A laboratory method has been devised to enable measurement of the concentration of iron bound in ferritin from small samples of blood (serum). Derived partly from a prior method that depends on large samples of blood, this method involves the use of an inductively-coupled-plasma mass spectrometer (ICP-MS). Ferritin is a complex of iron with the protein apoferritin. Heretofore, measurements of the concentration of serum ferritin (as distinguished from direct measurements of the concentration of iron in serum ferritin) have been used to assess iron stores in humans. Low levels of serum ferritin could indicate the first stage of iron depletion. High levels of serum ferritin could indicate high levels of iron (for example, in connection with hereditary hemochromatosis, an iron-overload illness that is characterized by progressive organ damage and can be fatal). However, the picture is complicated: a high level of serum ferritin could also indicate stress and/or inflammation instead of (or in addition to) iron overload, and low serum iron concentration could indicate inflammation rather than iron deficiency. Only when concentrations of both serum iron and serum ferritin increase and decrease together can the patient's iron status be assessed accurately. Hence, in enabling accurate measurement of the iron content of serum ferritin, the present method can improve the diagnosis of the patient's iron status. The prior method of measuring the concentration of iron involves the use of an atomic-absorption spectrophotometer with a graphite furnace. The present method incorporates a modified version of the sample-preparation process of the prior method. First, ferritin is isolated; more specifically, it is immobilized by immunoprecipitation with rabbit antihuman polyclonal antibody bound to agarose beads. The ferritin is then separated from other iron-containing proteins and free iron by a series of centrifugation and wash steps.
Next, the ferritin is digested with nitric acid to extract its iron content. Finally, a micronebulizer is used to inject the sample of the product of the digestion into the ICPMS for analysis of its iron content. The sensitivity of the ICP-MS is high enough to enable it to characterize samples smaller than those required in the prior method (samples can be 0.15 to 0.60 mL).

  20. Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework

    NASA Astrophysics Data System (ADS)

    Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.

    2015-12-01

Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s, and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to make use of a modular approach, minimizing the effort of developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, NumPy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project, and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half dimensional) viscoacoustic Helmholtz equation at a range of modelling frequencies, generating forward solutions for a given source behaviour, and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link, enabling full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase. This makes it straightforward to compare results from Zephyr directly with FULLWV.
The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward. This enables comparison and integration of new efforts with existing results.

  1. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    NASA Astrophysics Data System (ADS)

    Chakravarty, T.; Chowdhury, A.; Ghose, A.; Bhaumik, C.; Balamuralidhar, P.

    2014-03-01

Telematics is an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, signatures of vehicle vibration, along with location and time, are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in an office bus that ferries employees to the office and back. Data were collected from a 3-axis accelerometer and GPS, together with speed and time, for all journeys. In this paper, we present initial results of this exercise, applying statistical methods to derive information through systematic analysis of data collected over four months. We demonstrate that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. This observation offers a method to predict future behaviour, where deviations from prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between vehicle speed and the median of the jerk energy samples using regression analysis. These results offer an opportunity to develop a robust method for modelling road-vehicle interaction, enabling prediction of driving behaviour, condition-based maintenance, and the like.
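The two quantitative steps named in the abstract, differentiating Z-axis acceleration to obtain jerk and regressing median jerk energy against speed, can be sketched in a few lines. The data below are synthetic and purely illustrative, and the Weibull-fitting step itself is omitted:

```python
# Sketch of the analysis on synthetic telematics data: (1) differentiate
# Z-axis acceleration to get jerk, (2) regress median jerk energy per
# journey against vehicle speed. All data here are synthetic.
import random

random.seed(42)

def jerk(accel_z, dt):
    """First-order finite-difference derivative of acceleration."""
    return [(a2 - a1) / dt for a1, a2 in zip(accel_z, accel_z[1:])]

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return b, my - b * mx

# Synthetic journeys: rougher ride (larger vibration) at higher speed.
speeds, med_energy = [], []
for v in range(10, 60, 5):
    accel = [0.01 * v * random.gauss(0.0, 1.0) for _ in range(500)]
    j = jerk(accel, dt=0.02)
    med_energy.append(median([ji * ji for ji in j]))
    speeds.append(float(v))

slope, intercept = linfit(speeds, med_energy)
# slope > 0: median jerk energy grows with speed in this synthetic setup
```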

  2. WordSeeker: concurrent bioinformatics software for discovering genome-wide patterns and word-based genomic signatures

    PubMed Central

    2010-01-01

    Background An important focus of genomic science is the discovery and characterization of all functional elements within genomes. In silico methods are used in genome studies to discover putative regulatory genomic elements (called words or motifs). Although a number of methods have been developed for motif discovery, most of them lack the scalability needed to analyze large genomic data sets. Methods This manuscript presents WordSeeker, an enumerative motif discovery toolkit that utilizes multi-core and distributed computational platforms to enable scalable analysis of genomic data. A controller task coordinates activities of worker nodes, each of which (1) enumerates a subset of the DNA word space and (2) scores words with a distributed Markov chain model. Results A comprehensive suite of performance tests was conducted to demonstrate the performance, speedup and efficiency of WordSeeker. The scalability of the toolkit enabled the analysis of the entire genome of Arabidopsis thaliana; the results of the analysis were integrated into The Arabidopsis Gene Regulatory Information Server (AGRIS). A public version of WordSeeker was deployed on the Glenn cluster at the Ohio Supercomputer Center. Conclusion WordSeeker effectively utilizes concurrent computing platforms to enable the identification of putative functional elements in genomic data sets. This capability facilitates the analysis of the large quantity of sequenced genomic data. PMID:21210985
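The enumerate-and-score scheme described in the Methods section can be illustrated in miniature. This is a hypothetical single-process sketch of the idea, not WordSeeker's distributed implementation or API; words are scored by log-odds against an order-1 Markov background estimated from the sequence itself:

```python
# Miniature enumerative motif scoring: enumerate all DNA words of length k,
# count occurrences, and score each observed word against an order-1 Markov
# background model (log of observed/expected count).
import math
from itertools import product

def count_words(seq, k):
    counts = {}
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        counts[w] = counts.get(w, 0) + 1
    return counts

def markov1_expected(seq, word):
    """Expected count of `word` under an order-1 Markov chain."""
    n = len(seq)
    mono = count_words(seq, 1)
    di = count_words(seq, 2)
    p = mono.get(word[0], 0) / n             # P(first symbol)
    for a, b in zip(word, word[1:]):
        p *= di.get(a + b, 0) / max(mono.get(a, 0), 1)  # P(b | a)
    return p * (n - len(word) + 1)

seq = "ACGTACGTACGTAAATACGTACGTACGT"
k = 3
obs = count_words(seq, k)
scores = {}
for w in ("".join(t) for t in product("ACGT", repeat=k)):
    exp = markov1_expected(seq, w)
    if obs.get(w, 0) and exp > 0:
        scores[w] = math.log(obs[w] / exp)   # over-represented words score high

top = max(scores, key=scores.get)
```

A distributed version would partition the word space across workers, exactly as the controller/worker design in the abstract describes.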

  3. Challenges and perspectives of garnet solid electrolytes for all solid-state lithium batteries

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Geng, Zhen; Han, Cuiping; Fu, Yongzhu; Li, Song; He, Yan-bing; Kang, Feiyu; Li, Baohua

    2018-06-01

    Garnet Li7La3Zr2O12 (LLZO) solid electrolytes recently have attracted tremendous interest as they have the potential to enable all solid-state lithium batteries (ASSLBs) owing to high ionic conductivity (10-3 to 10-4 S cm-1), negligible electronic transport, wide potential window (up to 9 V), and good chemical stability. Here we present the key issues and challenges of LLZO in the aspects of ion conduction property, interfacial compatibility, and stability in air. First, different preparation methods of LLZO are reviewed. Then, recent progress about the improvement of ionic conductivity and interfacial property between LLZO and electrodes are presented. Finally, we list some emerging LLZO-based solid-state batteries and provide perspectives for further research. The aim of this review is to summarize the up-to-date developments of LLZO and lead the direction for future development which could enable LLZO-based ASSLBs.

  4. Typograph: Multiscale Spatial Exploration of Text Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Burtner, Edwin R.; Cramer, Nicholas O.

    2013-12-01

Visualizing large document collections using a spatial layout of terms can enable quick overviews of information. However, these metaphors (e.g., word clouds, tag clouds, etc.) often lack interactivity to explore the information, and the location and rendering of the terms are often not based on mathematical models that maintain relative distances from other information based on similarity metrics. Further, transitioning between levels of detail (i.e., from terms to full documents) can be challenging. In this paper, we present Typograph, a multi-scale spatial exploration visualization for large document collections. Building on term-based visualization methods, Typograph enables multiple levels of detail (terms, phrases, snippets, and full documents) within a single spatialization. Further, information is placed based on its relative similarity to other information to create the “near = similar” geography metaphor. This paper discusses the design principles and functionality of Typograph and presents a use case analyzing Wikipedia to demonstrate usage.

  5. Object-based media and stream-based computing

    NASA Astrophysics Data System (ADS)

    Bove, V. Michael, Jr.

    1998-03-01

Object-based media refers to the representation of audiovisual information as a collection of objects - the result of scene-analysis algorithms - and a script describing how they are to be rendered for display. Such multimedia presentations can adapt to viewing circumstances as well as to viewer preferences and behavior, and can provide a richer link between content creator and consumer. With faster networks and processors, such ideas become applicable to live interpersonal communications as well, creating a more natural and productive alternative to traditional videoconferencing. This paper outlines examples of object-based media algorithms and applications developed by my group, and presents new hardware architectures and software methods that we have developed to meet the computational requirements of object-based and other advanced media representations. In particular, we describe stream-based processing, which enables automatic run-time parallelization of multidimensional signal processing tasks even given heterogeneous computational resources.

  6. Superresolution restoration of an image sequence: adaptive filtering approach.

    PubMed

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology suggests least squares (LS) estimators which adapt in time, based on adaptive filters, least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to be of relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
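The LMS building block that the paper adapts can be shown on a toy system-identification problem. This is an illustrative sketch of a plain LMS filter only; the paper applies the same update to space- and time-variant image restoration:

```python
# Illustrative LMS adaptive filter (system identification): the filter
# weights w adapt toward an unknown 3-tap FIR response true_taps.
import random

random.seed(0)
true_taps = [0.5, 0.3, 0.2]          # unknown system to identify
w = [0.0, 0.0, 0.0]                  # adaptive weights
mu = 0.05                            # LMS step size (stable for this input)

x_hist = [0.0, 0.0, 0.0]             # most recent inputs, newest first
for _ in range(5000):
    x = random.gauss(0.0, 1.0)
    x_hist = [x] + x_hist[:2]
    d = sum(t * xi for t, xi in zip(true_taps, x_hist))   # desired output
    y = sum(wi * xi for wi, xi in zip(w, x_hist))         # filter output
    e = d - y                                             # error signal
    w = [wi + mu * e * xi for wi, xi in zip(w, x_hist)]   # LMS update

err = max(abs(a - b) for a, b in zip(w, true_taps))
# err is tiny: the weights have converged to the true taps
```

An RLS variant replaces the scalar step `mu` with a recursively updated inverse-correlation matrix, trading computation for faster convergence, which is the LS/RLS option mentioned in the abstract.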

  7. Simple proof of the quantum benchmark fidelity for continuous-variable quantum devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Namiki, Ryo

    2011-04-15

    An experimental success criterion for continuous-variable quantum teleportation and memory is to surpass the limit of the average fidelity achieved by classical measure-and-prepare schemes with respect to a Gaussian-distributed set of coherent states. We present an alternative proof of the classical limit based on the familiar notions of state-channel duality and partial transposition. The present method enables us to produce a quantum-domain criterion associated with a given set of measured fidelities.

  8. Authenticated communication from quantum readout of PUFs

    NASA Astrophysics Data System (ADS)

    Škorić, Boris; Pinkse, Pepijn W. H.; Mosk, Allard P.

    2017-08-01

    Quantum readout of physical unclonable functions (PUFs) is a recently introduced method for remote authentication of objects. We present an extension of the protocol to enable the authentication of data: A verifier can check if received classical data were sent by the PUF holder. We call this modification QR-d or, in the case of the optical-PUF implementation, QSA-d. We discuss how QSA-d can be operated in a parallel way. We also present a protocol for authenticating quantum states.

  9. Subscale Validation of the Subsurface Active Filtration of Exhaust (SAFE) Approach to the NTP Ground Testing

    NASA Technical Reports Server (NTRS)

    Marshall, William M.; Borowski, Stanley K.; Bulman, Mel; Joyner, Russell; Martin, Charles R.

    2015-01-01

Nuclear thermal propulsion (NTP) has been recognized as an enabling technology for missions to Mars and beyond. However, one of the key challenges of developing a nuclear thermal rocket is conducting verification and development tests on the ground. A number of ground test options are presented, with the Sub-surface Active Filtration of Exhaust (SAFE) method identified as a preferred path forward for the NTP program. The SAFE concept utilizes the natural soil characteristics present at the Nevada National Security Site to provide a natural filter for nuclear rocket exhaust during ground testing. A validation method for the SAFE concept is presented, utilizing a non-nuclear sub-scale hydrogen/oxygen rocket seeded with detectable radioisotopes. Additionally, some alternative ground test concepts, based upon the SAFE concept, are presented. Finally, an overview of the ongoing discussions about developing a ground test campaign is presented.

  10. Computation of transmitted and received B1 fields in magnetic resonance imaging.

    PubMed

    Milles, Julien; Zhu, Yue Min; Chen, Nan-Kuei; Panych, Lawrence P; Gimenez, Gérard; Guttmann, Charles R G

    2006-05-01

    Computation of B1 fields is a key issue for determination and correction of intensity nonuniformity in magnetic resonance images. This paper presents a new method for computing transmitted and received B1 fields. Our method combines a modified MRI acquisition protocol and an estimation technique based on the Levenberg-Marquardt algorithm and spatial filtering. It enables accurate estimation of transmitted and received B1 fields for both homogeneous and heterogeneous objects. The method is validated using numerical simulations and experimental data from phantom and human scans. The experimental results are in agreement with theoretical expectations.
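The paper's Levenberg-Marquardt estimation is not reproduced here. For orientation, the sketch below shows the classic double-angle relation often used as a baseline for transmitted-B1 (flip-angle) mapping; this is a textbook method, not the authors' protocol:

```python
# Double-angle B1 mapping: with signals S1 ~ sin(alpha) and S2 ~ sin(2*alpha)
# acquired at nominal flip angles alpha and 2*alpha, the identity
# sin(2a) = 2*sin(a)*cos(a) gives S2/S1 = 2*cos(alpha), so alpha is
# recovered per voxel as acos(S2 / (2*S1)).
import math

def flip_angle_from_double_angle(s1, s2):
    """Recover the actual flip angle (radians) from the two signals."""
    return math.acos(s2 / (2.0 * s1))

alpha_true = 0.5                      # actual (B1-scaled) flip angle, radians
s1 = math.sin(alpha_true)             # noiseless signal at nominal angle
s2 = math.sin(2.0 * alpha_true)       # noiseless signal at doubled angle
alpha_est = flip_angle_from_double_angle(s1, s2)
# alpha_est recovers alpha_true exactly in this noiseless example
```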

  11. Reflection-Based Learning for Professional Ethical Formation.

    PubMed

    Branch, William T; George, Maura

    2017-04-01

    One way practitioners learn ethics is by reflecting on experience. They may reflect in the moment (reflection-in-action) or afterwards (reflection-on-action). We illustrate how a teaching clinician may transform relationships with patients and teach person-centered care through reflective learning. We discuss reflective learning pedagogies and present two case examples of our preferred method, guided group reflection using narratives. This method fosters moral development alongside professional identity formation in students and advanced learners. Our method for reflective learning addresses and enables processing of the most pressing ethical issues that learners encounter in practice. © 2017 American Medical Association. All Rights Reserved.

  12. Automatic Topography Using High Precision Digital Moire Methods

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Saito, S.

    1983-07-01

    Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.

  13. Generalized method calculating the effective diffusion coefficient in periodic channels.

    PubMed

    Kalinay, Pavol

    2015-01-07

    The method calculating the effective diffusion coefficient in an arbitrary periodic two-dimensional channel, presented in our previous paper [P. Kalinay, J. Chem. Phys. 141, 144101 (2014)], is generalized to 3D channels of cylindrical symmetry, as well as to 2D or 3D channels with particles driven by a constant longitudinal external driving force. The next possible extensions are also indicated. The former calculation was based on calculus in the complex plane, suitable for the stationary diffusion in 2D domains. The method is reformulated here using standard tools of functional analysis, enabling the generalization.
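Kalinay's mapping yields a systematic series for the effective diffusion coefficient; its leading (Fick-Jacobs) order reduces to the well-known estimate D_eff = D0 / (⟨w⟩⟨1/w⟩), with the averages taken over one period of the channel width w(x). A minimal numerical check of that leading-order formula (illustrative only, not the paper's full construction):

```python
# Leading-order (Fick-Jacobs) effective diffusion in a periodic 2D channel:
# D_eff / D0 = 1 / (<w> * <1/w>), averaged over one period of the width w(x).
import math

def d_eff_leading(width, period, n=100000):
    """Leading-order effective diffusion, in units of the bulk D0."""
    xs = [period * (i + 0.5) / n for i in range(n)]   # midpoint samples
    mean_w = sum(width(x) for x in xs) / n
    mean_inv = sum(1.0 / width(x) for x in xs) / n
    return 1.0 / (mean_w * mean_inv)

L = 1.0
flat = lambda x: 1.0                                   # uniform channel
wavy = lambda x: 1.0 + 0.5 * math.cos(2 * math.pi * x / L)

d_flat = d_eff_leading(flat, L)    # exactly 1: no entropic slowdown
d_wavy = d_eff_leading(wavy, L)    # below 1: corrugation slows transport
```

For the cosine-modulated channel the Cauchy-Schwarz inequality guarantees ⟨w⟩⟨1/w⟩ ≥ 1, so D_eff ≤ D0, with equality only for a uniform channel.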

  14. Safe Life Propulsion Design Technologies (3rd Generation Propulsion Research and Technology)

    NASA Technical Reports Server (NTRS)

    Ellis, Rod

    2000-01-01

    The tasks outlined in this viewgraph presentation on safe life propulsion design technologies (third generation propulsion research and technology) include the following: (1) Ceramic matrix composite (CMC) life prediction methods; (2) Life prediction methods for ultra high temperature polymer matrix composites for reusable launch vehicle (RLV) airframe and engine application; (3) Enabling design and life prediction technology for cost effective large-scale utilization of MMCs and innovative metallic material concepts; (4) Probabilistic analysis methods for brittle materials and structures; (5) Damage assessment in CMC propulsion components using nondestructive characterization techniques; and (6) High temperature structural seals for RLV applications.

  15. Nano-ceramics and method thereof

    DOEpatents

    Satcher, Jr., Joe H.; Gash, Alex [Livermore, CA; Simpson, Randall [Livermore, CA; Landingham, Richard [Livermore, CA; Reibold, Robert A [Salida, CA

    2006-08-08

Disclosed herein is a method to produce ceramic materials utilizing the sol-gel process. The method enables the preparation of intimate homogeneous dispersions of materials while offering the ability to control the size of one component within another. It also enables the preparation of materials that will densify at reduced temperature.

  16. 3D printing functional materials and devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    McAlpine, Michael C.

    2017-05-01

    The development of methods for interfacing high performance functional devices with biology could impact regenerative medicine, smart prosthetics, and human-machine interfaces. Indeed, the ability to three-dimensionally interweave biological and functional materials could enable the creation of devices possessing unique geometries, properties, and functionalities. Yet, most high quality functional materials are two dimensional, hard and brittle, and require high crystallization temperatures for maximal performance. These properties render the corresponding devices incompatible with biology, which is three-dimensional, soft, stretchable, and temperature sensitive. We overcome these dichotomies by: 1) using 3D printing and scanning for customized, interwoven, anatomically accurate device architectures; 2) employing nanotechnology as an enabling route for overcoming mechanical discrepancies while retaining high performance; and 3) 3D printing a range of soft and nanoscale materials to enable the integration of a diverse palette of high quality functional nanomaterials with biology. 3D printing is a multi-scale platform, allowing for the incorporation of functional nanoscale inks, the printing of microscale features, and ultimately the creation of macroscale devices. This three-dimensional blending of functional materials and `living' platforms may enable next-generation 3D printed devices.

  17. Local CC2 response method based on the Laplace transform: analytic energy gradients for ground and excited states.

    PubMed

    Ledermüller, Katrin; Schütz, Martin

    2014-04-28

    A multistate local CC2 response method for the calculation of analytic energy gradients with respect to nuclear displacements is presented for ground and electronically excited states. The gradient enables the search for equilibrium geometries of extended molecular systems. Laplace transform is used to partition the eigenvalue problem in order to obtain an effective singles eigenvalue problem and adaptive, state-specific local approximations. This leads to an approximation in the energy Lagrangian, which however is shown (by comparison with the corresponding gradient method without Laplace transform) to be of no concern for geometry optimizations. The accuracy of the local approximation is tested and the efficiency of the new code is demonstrated by application calculations devoted to a photocatalytic decarboxylation process of present interest.

  18. Biodiesel production with special emphasis on lipase-catalyzed transesterification.

    PubMed

    Bisen, Prakash S; Sanodiya, Bhagwan S; Thakur, Gulab S; Baghel, Rakesh K; Prasad, G B K S

    2010-08-01

The production of biodiesel by transesterification employing an acid or base catalyst has been industrially accepted for its high conversion and reaction rates. Downstream processing costs and environmental problems associated with biodiesel production and byproduct recovery have led to the search for alternative production methods. Recently, enzymatic transesterification involving lipases has attracted attention for biodiesel production, as it produces a high-purity product and enables easy separation from the byproduct, glycerol. The use of immobilized lipases and immobilized whole cells may lower the overall cost of biodiesel production while presenting fewer downstream processing problems. The present review gives an overview of biodiesel production technology, analyzes the factors/methods of the enzymatic approach reported in the literature, and suggests a suitable method, on the basis of this evidence, for industrial production of biodiesel.

  19. Speeding up GW Calculations to Meet the Challenge of Large Scale Quasiparticle Predictions.

    PubMed

    Gao, Weiwei; Xia, Weiyi; Gao, Xiang; Zhang, Peihong

    2016-11-11

Although the GW approximation is recognized as one of the most accurate theories for predicting materials' excited-state properties, scaling up conventional GW calculations for large systems remains a major challenge. We present a powerful and simple-to-implement method that can drastically accelerate fully converged GW calculations for large systems, enabling fast and accurate quasiparticle calculations for complex materials systems. We demonstrate the performance of this new method by presenting results for ZnO and MgO supercells. A speed-up factor of nearly two orders of magnitude is achieved for a system containing 256 atoms (1024 valence electrons), with a negligibly small numerical error of ±0.03 eV. Finally, we discuss the application of our method to GW calculations for 2D materials.

  20. J plots: a new method for characterizing structures in the interstellar medium

    NASA Astrophysics Data System (ADS)

    Jaffa, S. E.; Whitworth, A. P.; Clarke, S. D.; Howard, A. D. P.

    2018-06-01

    Large-scale surveys have brought about a revolution in astronomy. To analyse the resulting wealth of data, we need automated tools to identify, classify, and quantify the important underlying structures. We present here a method for classifying and quantifying a pixelated structure, based on its principal moments of inertia. The method enables us to automatically detect, and objectively compare, centrally condensed cores, elongated filaments, and hollow rings. We illustrate the method by applying it to (i) observations of surface density from Hi-GAL, and (ii) simulations of filament growth in a turbulent medium. We limit the discussion here to 2D data; in a future paper, we will extend the method to 3D data.
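The moment-based statistic can be sketched directly from the definitions in the abstract. The sketch below assumes the natural normalization I0 = AM/(4π), the principal moment of a uniform disc of equal area A and mass M, and forms J_i = (I0 - I_i)/(I0 + I_i) from the two principal moments; under that convention a uniform filled disc should return J1 ≈ J2 ≈ 0:

```python
# Principal-moment classification of a pixelated structure: compute the two
# principal moments of inertia about the centroid and normalize them against
# I0 = A*M/(4*pi), the moment of a uniform disc of equal area and mass.
import math

def j_moments(pixels):
    """pixels: list of (x, y, mass) for unit-area pixels. Returns (J1, J2)."""
    M = sum(m for _, _, m in pixels)
    A = float(len(pixels))
    cx = sum(x * m for x, y, m in pixels) / M
    cy = sum(y * m for x, y, m in pixels) / M
    # Second moments about the centroid.
    ixx = sum(m * (y - cy) ** 2 for x, y, m in pixels)
    iyy = sum(m * (x - cx) ** 2 for x, y, m in pixels)
    ixy = sum(m * (x - cx) * (y - cy) for x, y, m in pixels)
    # Principal moments: eigenvalues of the 2x2 inertia tensor.
    tr, det = ixx + iyy, ixx * iyy - ixy ** 2
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    i1, i2 = tr / 2.0 + disc, tr / 2.0 - disc
    i0 = A * M / (4.0 * math.pi)
    return (i0 - i1) / (i0 + i1), (i0 - i2) / (i0 + i2)

# Uniform filled disc of radius 20 pixels: both J values near zero.
disc = [(x, y, 1.0) for x in range(-25, 26) for y in range(-25, 26)
        if x * x + y * y <= 20 * 20]
j1, j2 = j_moments(disc)
```

In this scheme a centrally condensed core pushes both J values positive, an elongated filament gives one positive and one negative value, and a hollow ring drives both negative, which is the classification the abstract describes.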

  1. Galerkin-collocation domain decomposition method for arbitrary binary black holes

    NASA Astrophysics Data System (ADS)

    Barreto, W.; Clemente, P. C. M.; de Oliveira, H. P.; Rodriguez-Mueller, B.

    2018-05-01

We present a new computational framework for the Galerkin-collocation method with a double domain in the context of the ADM 3+1 approach in numerical relativity. This work enables us to perform high-resolution calculations of initial data sets for two arbitrary black holes. We use the Bowen-York method for binary systems and the puncture method to solve the Hamiltonian constraint. The nonlinear numerical code solves the set of equations for the spectral modes using the standard Newton-Raphson method, LU decomposition, and Gaussian quadratures. We show convergence of our code for the conformal factor and the ADM mass. We also display features of the conformal factor for different masses, spins, and linear momenta.
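The solver core named above, Newton-Raphson iteration with an LU-based linear solve for the spectral modes, can be illustrated on a toy two-unknown system. This is a generic sketch, not the Galerkin-collocation code itself:

```python
# Newton-Raphson with the linear step solved by elimination (LU in
# miniature, on a 2x2 system standing in for the spectral-mode equations).
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Gaussian elimination."""
    (a11, a12), (a21, a22) = a
    f = a21 / a11                      # L factor
    u22 = a22 - f * a12                # U pivot
    y2 = b[1] - f * b[0]               # forward substitution
    x2 = y2 / u22                      # back substitution
    x1 = (b[0] - a12 * x2) / a11
    return [x1, x2]

def newton(x, tol=1e-12, itmax=50):
    """Newton-Raphson for F(x) = (x0^2 + x1^2 - 4, x0 - x1) = 0."""
    for _ in range(itmax):
        f = [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]]   # residuals
        jac = [[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]]    # Jacobian
        dx = solve2(jac, [-f[0], -f[1]])                 # Newton step
        x = [x[0] + dx[0], x[1] + dx[1]]
        if max(abs(d) for d in dx) < tol:
            break
    return x

root = newton([1.0, 2.0])   # converges to (sqrt(2), sqrt(2))
```

In the actual framework the residuals come from the Hamiltonian constraint projected onto spectral modes, with integrals evaluated by Gaussian quadrature; the Newton/LU skeleton is the same.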

  2. [A sarcophagus with a surprise: computed tomography of a mummy from the Late Period of ancient Egypt].

    PubMed

    Isidro, Albert; Díez-Santacoloma, Iván; Bagot, Jaume; Milla, Lidón; Gallart, Anna

    2016-01-01

Diagnostic imaging techniques, at present especially computed tomography (CT), have become the most important noninvasive method for the study of mummies because they enable high-resolution images and three-dimensional reconstructions without damaging the mummified subject. We present a sarcophagus with a mummy hidden inside that was acquired by a gallery in Barcelona. The sarcophagus and mummy were examined by CT at the Hospital Universitari Sagrat Cor in Barcelona. A flexible clamp was used to obtain tissue samples for further study. The results showed the presence of an anatomically intact female human subject, albeit with a destructured thorax and upper abdomen. Various metal objects were detected, corresponding to amulets, artificial eyes, and an external wooden brace. CT is an excellent noninvasive imaging technique for the detailed study of mummies, as it enables not only the anatomic identification of the mummified subject but also the collection of tissue samples for complementary analyses. The description of these findings establishes the major radiologic landmarks for the paleopathologic study of mummies. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  3. BBB disruption with unfocused ultrasound alone-A paradigm shift

    NASA Astrophysics Data System (ADS)

    Kyle, Al

    2012-10-01

One paradigm for ultrasound-enabled blood-brain barrier disruption uses image-guided focused ultrasound and preformed microbubble agents to enable drug delivery to the brain. We propose an alternative approach: unguided, unfocused ultrasound with no adjunctive agent. Compared with the focused approach, the proposed method affects a larger region of the brain and is aimed at treatment of regional neurological disease, including glioblastoma multiforme (GBM). Avoiding image guidance and focusing reduces costs for equipment and staff training. Avoiding adjunctive agents also lowers cost and is enabled by a longer exposure time. Since 2004, our group has worked with two animal models and three investigators in four laboratories to safely deliver five compounds, increasing the concentration of large-molecule markers in brain tissue twofold or more. Safety and effectiveness data for four studies were presented at the Ultrasound Industry Association meetings in 2007 and 2010. This paper describes new safety and effectiveness results for a fifth study. We present evidence of delivery of large molecules - including Avastin - to the brains of a large animal model, correlated with acoustic pressure, and summarize the advantages and disadvantages of this novel approach.

  4. Scalable Total Syntheses of N-Linked Tryptamine Dimers by Direct Indole-Aniline Coupling: Psychotrimine and Kapakahines B & F

    PubMed Central

    Newhouse, Timothy; Lewis, Chad A.; Eastman, Kyle J.; Baran, Phil S.

    2010-01-01

    This report details the invention of a method to enable syntheses of psychotrimine, 1, and the kapakahines F and B, 2 & 3, on a gram scale and in a minimum number of steps. Mechanistic inquiries are presented for the key enabling quaternization of indole at the C3 position by electrophilic attack of an activated aniline species. Excellent chemo-, regio-, and diastereoselectivities are observed for reactions with o-iodoaniline, an indole cation equivalent. Additionally, the scope of this reaction is broad with respect to the tryptamine and aniline components. The anti-cancer profiles of psychotrimine, 1, and kapakahines F and B, 2 & 3, have also been evaluated. PMID:20426477

  5. Recent advances in imaging cancer of the kidney and urinary tract.

    PubMed

    Hilton, Susan; Jones, Lisa P

    2014-10-01

    Modern radiologic imaging is an aid to treatment planning for localized renal cancer, enabling characterization of mass lesions. For patients who present with advanced renal cancer, new imaging techniques enable a functional assessment of treatment response not possible using anatomic measurements alone. Multidetector CT urography permits simultaneous assessment of the kidneys and urinary tract for patients with unexplained hematuria. Both CT and MRI play a significant role in staging and follow up of patients treated for urothelial cancer. Newer imaging methods such as diffusion-weighted MRI have shown promising results for improving accuracy of staging and follow up of urothelial cancer. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Healing X-ray scattering images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jiliang; Lhermitte, Julien; Tian, Ye

X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
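The core symmetry-based fill can be sketched in a few lines. This is a minimal illustration assuming pure point symmetry about the image center, without the statistical tests, averaging, and interpolation of the full method:

```python
# Minimal symmetry-based healing: each pixel masked as a gap is copied from
# its point-symmetric partner about the image center, the key assumption
# (point symmetry of reciprocal space) stated in the abstract.
def heal_point_symmetric(img, mask):
    """img: 2D list of values; mask[i][j] True marks a gap. Returns a copy."""
    n, m = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(n):
        for j in range(m):
            if mask[i][j]:
                pi, pj = n - 1 - i, m - 1 - j        # partner pixel
                if not mask[pi][pj]:
                    out[i][j] = img[pi][pj]          # copy symmetric value
    return out

# Point-symmetric test pattern with one masked gap at (0, 2).
img = [[1, 2, 3],
       [4, 5, 4],
       [3, 2, 1]]
mask = [[False, False, True],
        [False, False, False],
        [False, False, False]]
healed = heal_point_symmetric(img, mask)   # gap filled from partner (2, 0)
```

When both partners are masked (e.g., a symmetric beamstop), this simple rule cannot fill the gap, which is where the paper's interpolation along identified structures takes over.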

  7. Healing X-ray scattering images

    DOE PAGES

    Liu, Jiliang; Lhermitte, Julien; Tian, Ye; ...

    2017-05-24

X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.

  8. Rapid Development and Distribution of Mobile Media-Rich Clinical Practice Guidelines Nationwide in Colombia.

    PubMed

    Flórez-Arango, José F; Sriram Iyengar, M; Caicedo, Indira T; Escobar, German

    2017-01-01

    Development and electronic distribution of Clinical Practice Guidelines is costly and challenging. This poster presents a rapid method to represent existing guidelines in an auditable, computer-executable multimedia format. We used a technology that enables a small number of clinicians to develop, in a short period of time, a substantial number of computer-executable guidelines without programming.

  9. Uncertainty relations, zero point energy and the linear canonical group

    NASA Technical Reports Server (NTRS)

    Sudarshan, E. C. G.

    1993-01-01

    The close relationship between the zero point energy, the uncertainty relations, coherent states, squeezed states, and correlated states for one mode is investigated. This group-theoretic perspective enables the parametrization and identification of their multimode generalization. In particular the generalized Schroedinger-Robertson uncertainty relations are analyzed. An elementary method of determining the canonical structure of the generalized correlated states is presented.

  10. Fragon: rapid high-resolution structure determination from ideal protein fragments.

    PubMed

    Jenkins, Huw T

    2018-03-01

    Correctly positioning ideal protein fragments by molecular replacement presents an attractive method for obtaining preliminary phases when no template structure for molecular replacement is available. This has been exploited in several existing pipelines. This paper presents a new pipeline, named Fragon, in which fragments (ideal α-helices or β-strands) are placed using Phaser and the phases calculated from these coordinates are then improved by the density-modification methods provided by ACORN. The reliable scoring algorithm provided by ACORN identifies success. In these cases, the resulting phases are usually of sufficient quality to enable automated model building of the entire structure. Fragon was evaluated against two test sets comprising mixed α/β folds and all-β folds at resolutions between 1.0 and 1.7 Å. Success rates of 61% for the mixed α/β test set and 30% for the all-β test set were achieved. In almost 70% of successful runs, fragment placement and density modification took less than 30 min on relatively modest four-core desktop computers. In all successful runs the best set of phases enabled automated model building with ARP/wARP to complete the structure.

  11. Flow cytometric analysis of microbial contamination in food industry technological lines--initial study.

    PubMed

    Józwa, Wojciech; Czaczyk, Katarzyna

    2012-04-02

    Flow cytometry constitutes an alternative to traditional methods of microorganism identification and analysis, including methods that require a cultivation step. It enables the detection of pathogens and other microbial contaminants without the need to culture microbial cells, meaning that the sample (water, waste or food, e.g. milk, wine, beer) may be analysed directly. This leads to a significant reduction of the time required for analysis, allowing monitoring of production processes and immediate reaction in case contamination or any disruption occurs. Apart from the analysis of raw materials or products at different stages of the manufacturing process, flow cytometry seems to constitute an ideal tool for the assessment of microbial contamination on the surface of technological lines. In the present work, samples comprising smears from 3 different surfaces of technological lines from a fruit and vegetable processing company in Greater Poland were analysed directly with a flow cytometer. The measured parameters were forward and side scatter of laser light, allowing the estimation of microbial cell content in each sample. Flow cytometric analysis of the surface of food industry production lines enables preliminary evaluation of microbial contamination within a few minutes of sample arrival, without sample pretreatment. The presented method of flow cytometric initial evaluation of the microbial state of food industry technological lines demonstrated its potential for developing a robust, routine method for the rapid and labor-saving detection of microbial contamination in the food industry.

  12. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind.
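
    A minimal sketch of the kind of time-series comparison and grouping described above (names hypothetical; the paper evaluates several similarity measures, of which plain Euclidean distance is only the simplest):

```python
def series_distance(a, b):
    """Euclidean distance between two equal-length phenotype time series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster_by_threshold(series, eps):
    """Group series whose pairwise distance is below eps (single-linkage
    grouping via union-find). Returns one integer label per series."""
    n = len(series)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if series_distance(series[i], series[j]) < eps:
                parent[find(i)] = find(j)  # merge clusters
    roots = {}
    return [roots.setdefault(find(i), len(roots)) for i in range(n)]
```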

  13. On the present and future of dissolution-DNP

    NASA Astrophysics Data System (ADS)

    Ardenkjaer-Larsen, Jan Henrik

    2016-03-01

    Dissolution-DNP is a method to create solutions of molecules with nuclear spin polarization close to unity. The signal enhancement of many orders of magnitude has enabled many new applications, in particular in vivo MR metabolic imaging. The method relies on solid-state dynamic nuclear polarization at low temperature followed by a dissolution to produce the room-temperature solution of highly polarized spins. This work describes the present and future of dissolution-DNP in the mind of the author. The article describes some of the current trends in the field as well as outlines some of the areas where new ideas will make an impact. Most certainly, the future will take unpredicted directions, but hopefully the thoughts presented here will stimulate new ideas that can further advance the field.

  14. Determination of alloy content from plume spectral measurements

    NASA Technical Reports Server (NTRS)

    Madzsar, George C.

    1991-01-01

    The mathematical derivation for a method to determine the identities and amounts of alloys present in a flame where numerous alloys may be present is described. This method is applicable if the total number of elemental species from all alloys that may be in the flame is greater than or equal to the total number of alloys. Arranging the atomic spectral line emission equations for the elemental species as a series of simultaneous equations enables solution for identity and amount of the alloy present in the flame. This technique is intended for identification and quantification of alloy content in the plume of a rocket engine. Spectroscopic measurements reveal the atomic species entrained in the plume. Identification of eroding alloys may lead to the identification of the eroding component.
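
    Arranging the emission equations as simultaneous equations, as described, reduces to solving a small linear system: rows correspond to elemental species, columns to candidate alloys, with each entry giving that element's (calibrated) line-intensity contribution per unit of alloy. A generic sketch with hypothetical numbers, not the derivation from the report:

```python
def solve_linear(A, b):
    """Solve the square system A x = b by Gaussian elimination with
    partial pivoting (adequate for the small systems that arise when
    the number of tracked species equals the number of alloys)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # pivot row
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical example: alloy 0 is pure element A; alloy 1 is a 50/50
# mix of elements A and B. Rows of the matrix are species A and B.
composition = [[1.0, 0.5],
               [0.0, 0.5]]
intensities = [2.0, 1.0]          # calibrated line intensities
amounts = solve_linear(composition, intensities)
```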

  15. Robust synthesis and continuous manufacturing of carbon nanotube forests and graphene films

    NASA Astrophysics Data System (ADS)

    Polsen, Erik S.

    Successful translation of the outstanding properties of carbon nanotubes (CNTs) and graphene to commercial applications requires highly consistent methods of synthesis, using scalable and cost-effective machines. This thesis presents robust process conditions and a series of process operations that will enable integrated roll-to-roll (R2R) CNT and graphene growth on flexible substrates. First, a comprehensive study was undertaken to establish the sources of variation in laboratory CVD growth of CNT forests. Statistical analysis identified factors that contribute to variation in forest height and density including ambient humidity, sample position in the reactor, and barometric pressure. Implementation of system modifications and user procedures reduced the variation in height and density by 50% and 54% respectively. With improved growth, two new methods for continuous deposition and patterning of catalyst nanoparticles for CNT forest growth were developed, enabling the diameter, density and pattern geometry to be tailored through the control of process parameters. Convective assembly of catalyst nanoparticles in solution enables growth of CNT forests with density 3-fold higher than using sputtered catalyst films with the same growth parameters. Additionally, laser printing of magnetic ink character recognition toner provides a large scale patterning method, with digital control of the pattern density and tunable CNT density via laser intensity. A concentric tube CVD reactor was conceptualized, designed and built for R2R growth of CNT forests and graphene on flexible substrates helically fed through the annular gap. The design enables downstream injection of the hydrocarbon source, and gas consumption is reduced 90% compared to a standard tube furnace. Multi-wall CNT forests are grown continuously on metallic and ceramic fiber substrates at 33 mm/min. High quality, uniform bi- and multi-layer graphene is grown on Cu and Ni foils at 25 - 495 mm/min. 
A second machine for continuous forest growth and delamination was developed, and forest-substrate adhesion strength was controlled through CVD parameters. Taken together, these methods enable uniform R2R processing of CNT forests and graphene with engineered properties. Last, it is projected that foreseeable improvements in CNT forest quality and density using these methods will result in electrical and thermal properties that exceed state-of-the-art bulk materials.

  16. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldham, Mark, E-mail: mark.oldham@duke.edu; Thomas, Andrew; O'Daniel, Jennifer

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these are presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient's anatomy. The latter step represents an important development that advances the clinical relevance of complex treatment QA.
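
    The gamma analysis behind passing rates like "3%, 2 mm" can be illustrated with a deliberately simplified 1D version: a measured point passes if some planned point lies within unit combined dose-difference/distance-to-agreement radius. This is a toy sketch with hypothetical names; clinical implementations interpolate between points and normalize doses.

```python
def gamma_pass_rate(measured, planned, spacing, dose_tol, dist_tol):
    """Simplified 1D gamma analysis. A measured point i passes if some
    planned point j satisfies
        sqrt(((m_i - p_j)/dose_tol)^2 + ((i-j)*spacing/dist_tol)^2) <= 1.
    Returns the fraction of passing points."""
    passed = 0
    for i, m in enumerate(measured):
        g = min(
            ((m - p) / dose_tol) ** 2
            + ((i - j) * spacing / dist_tol) ** 2
            for j, p in enumerate(planned)
        ) ** 0.5
        if g <= 1.0:
            passed += 1
    return passed / len(measured)
```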

  17. Enhanced Imaging of Corrosion in Aircraft Structures with Reverse Geometry X-ray(registered tm)

    NASA Technical Reports Server (NTRS)

    Winfree, William P.; Cmar-Mascis, Noreen A.; Parker, F. Raymond

    2000-01-01

    The application of Reverse Geometry X-ray to the detection and characterization of corrosion in aircraft structures is presented. Reverse Geometry X-ray is a unique system that utilizes an electronically scanned x-ray source and a discrete detector for real time radiographic imaging of a structure. The scanned source system has several advantages when compared to conventional radiography. First, the discrete x-ray detector can be miniaturized and easily positioned inside a complex structure (such as an aircraft wing) enabling images of each surface of the structure to be obtained separately. Second, using a measurement configuration with multiple detectors enables the simultaneous acquisition of data from several different perspectives without moving the structure or the measurement system. This provides a means for locating the position of flaws and enhances separation of features at the surface from features inside the structure. Data is presented on aircraft specimens with corrosion in the lap joint. Advanced laminographic imaging techniques utilizing data from multiple detectors are demonstrated to be capable of separating surface features from corrosion in the lap joint and locating the corrosion in multilayer structures. Results of this technique are compared to computed tomography cross sections obtained from a microfocus x-ray tomography system. A method is presented for calibration of the detectors of the Reverse Geometry X-ray system to enable quantification of the corrosion to within 2%.

  18. A monolithic Lagrangian approach for fluid-structure interaction problems

    NASA Astrophysics Data System (ADS)

    Ryzhakov, P. B.; Rossi, R.; Idelsohn, S. R.; Oñate, E.

    2010-11-01

    The current work presents a monolithic method for the solution of fluid-structure interaction problems involving flexible structures and free-surface flows. The technique presented is based upon the utilization of a Lagrangian description for both the fluid and the structure. A linear displacement-pressure interpolation pair is used for the fluid, whereas the structure utilizes a standard displacement-based formulation. A slight fluid compressibility is assumed, which makes it possible to relate the mechanical pressure to the local volume variation. The method described features a global pressure condensation which in turn enables the definition of a purely displacement-based linear system of equations. A matrix-free technique is used for the solution of this linear system, leading to an efficient implementation. The result is a robust method which allows dealing with FSI problems involving arbitrary variations in the shape of the fluid domain. The method is completely free of spurious added-mass effects.
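
    The pressure condensation step can be sketched abstractly as a Schur complement on a small block system. This is a generic dense illustration with NumPy (hypothetical names), not the paper's matrix-free implementation:

```python
import numpy as np

def condense_pressure(K, G, C, f):
    """Statically condense the pressure DOFs of the block system
        [ K    G ] [u]   [f]
        [ G^T -C ] [p] = [0],
    where C is invertible thanks to the slight compressibility.
    Eliminating p = C^{-1} G^T u yields the purely displacement-based
    system (K + G C^{-1} G^T) u = f; p is then recovered."""
    Ci = np.linalg.inv(C)
    u = np.linalg.solve(K + G @ Ci @ G.T, f)
    p = Ci @ (G.T @ u)
    return u, p
```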

  19. Fast online deconvolution of calcium imaging data

    PubMed Central

    Zhou, Pengcheng; Paninski, Liam

    2017-01-01

    Fluorescent calcium indicators are a popular means for observing the spiking activity of large neuronal populations, but extracting the activity of each neuron from raw fluorescence calcium imaging data is a nontrivial problem. We present a fast online active set method to solve this sparse non-negative deconvolution problem. Importantly, the algorithm progresses through each time series sequentially from beginning to end, thus enabling real-time online estimation of neural activity during the imaging session. Our algorithm is a generalization of the pool adjacent violators algorithm (PAVA) for isotonic regression and inherits its linear-time computational complexity. We gain remarkable increases in processing speed: more than one order of magnitude compared to currently employed state-of-the-art convex solvers relying on interior point methods. Unlike these approaches, our method can exploit warm starts; therefore optimizing model hyperparameters only requires a handful of passes through the data. A minor modification can further improve the quality of activity inference by imposing a constraint on the minimum spike size. The algorithm enables real-time simultaneous deconvolution of O(10^5) traces of whole-brain larval zebrafish imaging data on a laptop. PMID:28291787
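
    For reference, the classical PAVA that the deconvolution algorithm generalizes — a linear-time least-squares non-decreasing fit — can be written compactly. This is a textbook sketch of PAVA itself, not the paper's deconvolution code:

```python
def pava(y):
    """Pool Adjacent Violators: least-squares non-decreasing fit to y.
    Adjacent pools whose means violate monotonicity are merged into a
    weighted mean; total work is linear in len(y)."""
    pools = []  # each pool: [mean, count]
    for v in y:
        pools.append([float(v), 1])
        # merge while the previous pool's mean exceeds the last one's
        while len(pools) > 1 and pools[-2][0] > pools[-1][0]:
            m2, w2 = pools.pop()
            m1, w1 = pools.pop()
            pools.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for m, w in pools:
        out.extend([m] * w)
    return out
```

The paper's method applies the same pooling idea to an autoregressive model of the calcium signal, which is what makes single-pass online processing possible.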

  20. Company Profile: Selventa, Inc.

    PubMed

    Fryburg, David A; Latino, Louis J; Tagliamonte, John; Kenney, Renee D; Song, Diane H; Levine, Arnold J; de Graaf, David

    2012-08-01

    Selventa, Inc. (MA, USA) is a biomarker discovery company that enables personalized healthcare. Originally founded as Genstruct, Inc., Selventa has undergone significant evolution from a technology-based service provider to an active partner in the development of diagnostic tests, functioning as a molecular dashboard of disease activity using a unique platform. As part of that evolution, approximately 2 years ago the company was rebranded as Selventa to reflect its new identity and mission. The contributions to biomedical research by Selventa are based on in silico, reverse-engineering methods to determine biological causality. That is, given a set of in vitro or in vivo biological observations, which biological mechanisms can explain the measured results? Facilitated by a large and carefully curated knowledge base, these in silico methods generated new insights into the mechanisms driving a disease. As Selventa's methods would enable biomarker discovery and be directly applicable to generating novel diagnostics, the scientists at Selventa have focused on the development of predictive biomarkers of response in autoimmune and oncologic diseases. Selventa is presently building a portfolio of independent, as well as partnered, biomarker projects with the intention to create diagnostic tests that predict response to therapy.

  1. DNP System Output Volume Reduction Using Inert Fluids

    PubMed Central

    Peterson, Eric T; Gordon, Jeremy W; Erickson, Matthew G; Fain, Sean B; Rowland, Ian J

    2011-01-01

    Purpose: To present a method for significantly increasing the concentration of a hyperpolarized compound produced by a commercial DNP polarizer, enabling the polarization process to be more suitable for pre-clinical applications. Materials and Methods: Using a HyperSense® DNP polarizer, we have investigated the combined use of perfluorocarbon and water to warm and dissolve the hyperpolarized material from the polarization temperature of 1.4 K to produce material at temperatures suitable for injection. Results: By replacing 75% of the water in the dissolution volume with a chemically and biologically inert liquid that is immiscible with water, the injection volume can be reduced fourfold. Rapid separation of the water and perfluorocarbon mixture enables the aqueous layer containing polarized material to be easily and rapidly collected. Conclusion: The approach provides a significantly increased concentration of compound in a volume for injection that is more appropriate for small animal studies. This is demonstrated for 13C-labeled pyruvic acid and 13C-labeled succinate, but may be applied to the majority of nuclei and compounds hyperpolarized by the DNP method. PMID:21448970

  2. Will isomalto-oligosaccharides, a well-established functional food in Asia, break through the European and American market? The status of knowledge on these prebiotics.

    PubMed

    Goffin, Dorothee; Delzenne, Nathalie; Blecker, Christophe; Hanon, Emilien; Deroanne, Claude; Paquot, Michel

    2011-05-01

    This critical review article presents the current state of knowledge on isomalto-oligosaccharides (IMOs), well-known functional oligosaccharides in Asia, to evaluate their potential as emergent prebiotics in the American and European functional food market. It includes first a unique inventory of the different families of compounds that have been considered as IMOs and their specific structures. A description is given of the different production methods, including the enzymes involved and their specific activities, the substrates, and the types of IMOs produced. Considering the structural complexity of IMO products, specific characterization methods are described, as well as purification methods that enable removal of digestible oligosaccharides. Finally, an extensive review of their techno-functional and nutritional properties situates IMOs within the growing prebiotic market. This review is of particular interest considering that IMO commercialization in America and Europe is a topical subject due to the recent submission by Bioneutra Inc. (Canada) of a novel food file to the UK Food Standards Agency, as well as several patents for IMO production.

  3. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  4. Compressible, multiphase semi-implicit method with moment of fluid interface representation

    DOE PAGES

    Jemison, Matthew; Sussman, Mark; Arienti, Marco

    2014-09-16

    A unified method for simulating multiphase flows using an exactly mass, momentum, and energy conserving Cell-Integrated Semi-Lagrangian advection algorithm is presented. The deforming material boundaries are represented using the moment-of-fluid method. Our new algorithm uses a semi-implicit pressure update scheme that asymptotically preserves the standard incompressible pressure projection method in the limit of infinite sound speed. The asymptotically preserving attribute makes the new method applicable to compressible and incompressible flows including stiff materials, enabling large time steps characteristic of incompressible flow algorithms rather than the small time steps required by explicit methods. Moreover, shocks are captured and material discontinuities are tracked, without the aid of any approximate or exact Riemann solvers. As a result, simulations of underwater explosions and fluid jetting in one, two, and three dimensions are presented which illustrate the effectiveness of the new algorithm at efficiently computing multiphase flows containing shock waves and material discontinuities with large “impedance mismatch.”

  5. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems, and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step-derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision, leading to quadratically converging schemes. The main advantage of this approach is that, contrary to the classical forward difference scheme, no round-off errors due to floating-point arithmetic exist within the calculation of the tangent stiffness. This enables arbitrarily small perturbation values and therefore leads to robust schemes even when choosing small values. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite-element program. By means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains the performance of the proposed approach is analyzed.
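
    The complex-step derivative approximation at the heart of the scheme is easy to demonstrate in isolation: perturb along the imaginary axis and read the derivative off the imaginary part, with no subtractive cancellation. This is a generic scalar illustration, not the paper's finite-element code:

```python
def complex_step_derivative(f, x, h=1e-30):
    """Complex-step derivative approximation: f'(x) ~ Im(f(x + i*h)) / h.
    Because no difference of nearby values is taken, there is no
    cancellation round-off, so h can be taken extremely small and the
    result is accurate to machine precision."""
    return f(complex(x, h)).imag / h
```

Compare with a forward difference (f(x+h) - f(x)) / h, whose error blows up as h shrinks below about 1e-8 in double precision; the complex step has no such floor.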

  6. Parallelized Stochastic Cutoff Method for Long-Range Interacting Systems

    NASA Astrophysics Data System (ADS)

    Endo, Eishin; Toga, Yuta; Sasaki, Munetaka

    2015-07-01

    We present a method of parallelizing the stochastic cutoff (SCO) method, which is a Monte-Carlo method for long-range interacting systems. After interactions are eliminated by the SCO method, we subdivide a lattice into noninteracting interpenetrating sublattices. This subdivision enables us to parallelize the Monte-Carlo calculation in the SCO method. Such subdivision is found by numerically solving the vertex coloring of a graph created by the SCO method. We use an algorithm proposed by Kuhn and Wattenhofer to solve the vertex coloring by parallel computation. This method was applied to a two-dimensional magnetic dipolar system on an L × L square lattice to examine its parallelization efficiency. The result showed that, in the case of L = 2304, the speed of computation increased about 102 times by parallel computation with 288 processors.
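
    The subdivision step amounts to vertex coloring of the post-SCO interaction graph: each color class is a noninteracting sublattice that can be updated in parallel. The paper uses the distributed Kuhn-Wattenhofer algorithm; a serial greedy coloring (hypothetical names) conveys the idea:

```python
def greedy_color(adjacency):
    """Greedy vertex coloring: sites sharing a surviving interaction
    receive different colors, so each color class forms a
    noninteracting sublattice. adjacency: dict node -> set of
    neighbors. Returns dict node -> color (int)."""
    color = {}
    for v in sorted(adjacency):
        used = {color[u] for u in adjacency[v] if u in color}
        c = 0
        while c in used:  # smallest color unused by neighbors
            c += 1
        color[v] = c
    return color
```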

  7. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, and a prototype tool to support it is described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  8. Breaking Computational Barriers: Real-time Analysis and Optimization with Large-scale Nonlinear Models via Model Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Drohmann, Martin; Tuminaro, Raymond S.

    2014-10-01

    Model reduction for dynamical systems is a promising approach for reducing the computational cost of large-scale physics-based simulations to enable high-fidelity models to be used in many-query (e.g., Bayesian inference) and near-real-time (e.g., fast-turnaround simulation) contexts. While model reduction works well for specialized problems such as linear time-invariant systems, it is much more difficult to obtain accurate, stable, and efficient reduced-order models (ROMs) for systems with general nonlinearities. This report describes several advances that enable nonlinear reduced-order models (ROMs) to be deployed in a variety of time-critical settings. First, we present an error bound for the Gauss-Newton with Approximated Tensors (GNAT) nonlinear model reduction technique. This bound allows the state-space error for the GNAT method to be quantified when applied with the backward Euler time-integration scheme. Second, we present a methodology for preserving classical Lagrangian structure in nonlinear model reduction. This technique guarantees that important properties--such as energy conservation and symplectic time-evolution maps--are preserved when performing model reduction for models described by a Lagrangian formalism (e.g., molecular dynamics, structural dynamics). Third, we present a novel technique for decreasing the temporal complexity--defined as the number of Newton-like iterations performed over the course of the simulation--by exploiting time-domain data. Fourth, we describe a novel method for refining projection-based reduced-order models a posteriori using a goal-oriented framework similar to mesh-adaptive h-refinement in finite elements. The technique allows the ROM to generate arbitrarily accurate solutions, thereby providing the ROM with a 'failsafe' mechanism in the event of insufficient training data. Finally, we present the reduced-order model error surrogate (ROMES) method for statistically quantifying reduced-order-model errors. This enables ROMs to be rigorously incorporated in uncertainty-quantification settings, as the error model can be treated as a source of epistemic uncertainty. This work was completed as part of a Truman Fellowship appointment. We note that much additional work was performed as part of the Fellowship. One salient project is the development of the Trilinos-based model-reduction software module Razor, which is currently bundled with the Albany PDE code and allows nonlinear reduced-order models to be constructed for any application supported in Albany. Other important projects include the following: 1. ROMES-equipped ROMs for Bayesian inference: K. Carlberg, M. Drohmann, F. Lu (Lawrence Berkeley National Laboratory), M. Morzfeld (Lawrence Berkeley National Laboratory). 2. ROM-enabled Krylov-subspace recycling: K. Carlberg, V. Forstall (University of Maryland), P. Tsuji, R. Tuminaro. 3. A pseudo balanced POD method using only dual snapshots: K. Carlberg, M. Sarovar. 4. An analysis of discrete v. continuous optimality in nonlinear model reduction: K. Carlberg, M. Barone, H. Antil (George Mason University). Journal articles for these projects are in progress at the time of this writing.
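
    Projection-based ROMs of the kind discussed start from a reduced basis. The standard POD construction via SVD of a snapshot matrix takes a few lines; this is a generic sketch of basis construction only, independent of GNAT and the other techniques in the report:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Proper Orthogonal Decomposition: the r leading left singular
    vectors of the snapshot matrix (columns = solution snapshots) give
    the reduced basis V; a state u is approximated as V @ (V.T @ u)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]
```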

  9. Voids in cosmological simulations over cosmic time

    NASA Astrophysics Data System (ADS)

    Wojtak, Radosław; Powell, Devon; Abel, Tom

    2016-06-01

    We study the evolution of voids in cosmological simulations using a new method for tracing voids over cosmic time. The method is based on tracking watershed basins (contiguous regions around density minima) of well-developed voids at low redshift, on a regular grid of the density field. It enables us to construct a robust and continuous mapping between voids at different redshifts, from the initial conditions to the present time. We discuss how the new approach eliminates strong spurious effects of numerical origin that arise when void evolution is traced by matching voids between successive snapshots (by analogy to halo merger trees). We apply the new method to a simulation of a standard Λ-cold-dark-matter cosmological model and study the evolution of the basic properties of typical voids (with effective radii 6 h-1 Mpc < Rv < 20 h-1 Mpc at redshift z = 0), such as volumes, shapes, matter density distributions and relative alignments. The final voids at low redshifts appear to retain a significant part of the configuration acquired in the initial conditions. The shapes of voids evolve in a collective way which barely modifies the overall distribution of axial ratios. The evolution appears to have a weak impact on the mutual alignments of voids, implying that the present state is largely set up by the primordial density field. We present the evolution of dark matter density profiles computed on isodensity surfaces that comply with the actual shapes of voids. Unlike spherical density profiles, this approach enables us to demonstrate the development of the theoretically predicted bucket-like shape of the final density profiles, indicating a wide flat core and a sharp transition to high-density void walls.
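    The basin-tracking idea can be sketched in a few lines: assign every cell of a density grid to the watershed basin of the minimum it reaches by steepest descent. This is a minimal hypothetical stand-in for the authors' method, not their implementation:

```python
def watershed_basins(density):
    """Assign each grid cell to the basin of the density minimum it
    reaches by steepest descent -- a simple stand-in for the watershed
    decomposition used to delineate voids."""
    rows, cols = len(density), len(density[0])

    def descend(r, c):
        # Follow the steepest-descent path until a local minimum is reached.
        while True:
            best, best_val = (r, c), density[r][c]
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and density[nr][nc] < best_val:
                    best, best_val = (nr, nc), density[nr][nc]
            if best == (r, c):
                return best          # local minimum
            r, c = best

    minima = {}                      # minimum position -> basin label
    labels = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            m = descend(r, c)
            labels[r][c] = minima.setdefault(m, len(minima))
    return labels

# Two density minima (left and right edges) -> two basins ("voids").
grid = [[1.0, 2.0, 5.0, 2.0, 0.5],
        [1.1, 2.1, 5.0, 2.1, 0.6],
        [1.2, 2.2, 5.0, 2.2, 0.7]]
basins = watershed_basins(grid)
```

    Tracking the same basins on grids at successive redshifts would give the continuous void mapping the abstract describes.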

  10. Social research design: framework for integrating philosophical and practical elements.

    PubMed

    Cunningham, Kathryn Burns

    2014-09-01

    To provide and elucidate a comprehensible framework for the design of social research. An abundance of information exists concerning the process of designing social research. The overall message that can be gleaned is that numerous elements - both philosophical (ontological and epistemological assumptions and theoretical perspective) and practical (issue to be addressed, purpose, aims and research questions) - are influential in the process of selecting a research methodology and methods, and that these elements and their inter-relationships must be considered and explicated to ensure a coherent research design that enables well-founded and meaningful conclusions. There is a lack of guidance concerning the integration of practical and philosophical elements, hindering their consideration and explication. The author's PhD research into loneliness and cancer. This is a methodology paper. A guiding framework that incorporates all of the philosophical and practical elements influential in social research design is presented. The chronological and informative relationships between the elements are discussed. The framework presented can be used by social researchers to consider and explicate the practical and philosophical elements influential in the selection of a methodology and methods. It is hoped that the framework presented will aid social researchers with the design and the explication of the design of their research, thereby enhancing the credibility of their projects and enabling their research to establish well-founded and meaningful conclusions.

  11. Diameter measurement of optical nanofiber based on high-order Bragg reflections using a ruled grating.

    PubMed

    Zhu, Ming; Wang, Yao-Ting; Sun, Yi-Zhi; Zhang, Lijian; Ding, Wei

    2018-02-01

    A convenient method using a commercially available ruled grating for precise and overall diameter measurement of optical nanofibers (ONFs) is presented. We form a composite Bragg reflector with a micron-scale period by dissolving the aluminum coating, slicing the grating along ruling lines, and mounting it on an ONF. The resonant wavelengths of high-order Bragg reflections depend on the fiber diameter, enabling nondestructive measurement of the ONF diameter profile. This method provides an easy and economical diagnostic tool for a wide variety of ONF-based applications.

  12. State criminal justice telecommunications (STACOM). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Fielding, J. E.; Frewing, H. K.; Lee, J. J.; Leflang, W. G.; Reilly, N. B.

    1977-01-01

    Techniques for identifying user requirements and network designs for criminal justice networks on a statewide basis are discussed. Topics covered include methods for determining the data required; data collection and survey; data organization procedures; and methods for forecasting network traffic volumes. The developed network design techniques center on a computerized topology program that enables the user to generate least-cost network topologies satisfying network traffic requirements, response-time requirements and other specified functional requirements. The developed techniques were applied in Texas and Ohio, and the results of these studies are presented.
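    The report's computerized topology program is not public; as a hedged illustration of least-cost topology generation, a greedy (Kruskal) minimum spanning tree over candidate links captures the core idea of connecting all sites at minimum line cost. Site names and costs below are invented:

```python
def least_cost_topology(nodes, links):
    """Kruskal minimum spanning tree over candidate links -- a minimal
    stand-in for a least-cost network topology generator.
    links: list of (cost, a, b) tuples."""
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path halving
            n = parent[n]
        return n

    chosen, total = [], 0.0
    for cost, a, b in sorted(links):        # cheapest links first
        ra, rb = find(a), find(b)
        if ra != rb:                        # link joins two components
            parent[ra] = rb
            chosen.append((a, b))
            total += cost
    return chosen, total

# Four sites with hypothetical line costs.
sites = ["HQ", "A", "B", "C"]
candidates = [(4, "HQ", "A"), (1, "HQ", "B"), (3, "A", "B"),
              (2, "B", "C"), (5, "A", "C")]
topology, cost = least_cost_topology(sites, candidates)
```

    A real design tool would add response-time and traffic-volume constraints on top of this connectivity skeleton.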

  13. Stampless fabrication of sheet bars using disposable templates

    NASA Astrophysics Data System (ADS)

    Smolentsev, V. P.; Safonov, S. V.; Smolentsev, E. V.; Fedonin, O. N.

    2016-04-01

    The article is devoted to a new method of small-scale fabrication of sheet bars. The procedure uses disposable overlay templates, or templates associated with a sheet, whose parameters are obtained directly from the drawing. The proposed method, used as a substitute for die cutting, makes it possible to streamline the preparatory technological process, which is particularly effective when launching market-oriented items into production. It significantly increases the competitiveness of mechanical engineering and creates the conditions for technical support of present-day flexible production systems.

  14. People detection method using graphics processing units for a mobile robot with an omnidirectional camera

    NASA Astrophysics Data System (ADS)

    Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki

    2011-12-01

    This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.

  15. An acoustic on-chip goniometer for room temperature macromolecular crystallography.

    PubMed

    Burton, C G; Axford, D; Edwards, A M J; Gildea, R J; Morris, R H; Newton, M I; Orville, A M; Prince, M; Topham, P D; Docker, P T

    2017-12-05

    This paper describes the design, development and successful use of an on-chip goniometer for room-temperature macromolecular crystallography via acoustically induced rotations. We present for the first time a low cost, rate-tunable, acoustic actuator for gradual in-fluid sample reorientation about varying axes and its utilisation for protein structure determination on a synchrotron beamline. The device enables the efficient collection of diffraction data via a rotation method from a sample within a surface confined droplet. This method facilitates efficient macromolecular structural data acquisition in fluid environments for dynamical studies.

  16. Laser Interferometry Method as a Novel Tool in Endotoxins Research.

    PubMed

    Arabski, Michał; Wąsik, Sławomir

    2017-01-01

    The optical properties of chemical substances are widely used at present for assays in a variety of scientific disciplines. One measurement technique applied in the physical sciences, with a potential for novel applications in biology, is laser interferometry. This method enables the diffusion properties of chemical substances to be recorded. Here we describe a novel application of laser interferometry to chitosan interactions with lipopolysaccharide, via detection of colistin diffusion. The proposed model could be used in simple measurements of polymer interactions with endotoxins and/or biologically active compounds, such as antibiotics.
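    The quantity interferometry reconstructs from fringe shifts is a concentration field; for orientation, the textbook 1-D free-diffusion profile from a constant source is C(x,t) = C0·erfc(x / 2√(Dt)). The diffusion coefficient and geometry below are illustrative assumptions, not measured values:

```python
import math

def concentration(x_mm, t_s, D_mm2_s, c0=1.0):
    """1-D free-diffusion concentration profile from a constant source,
    C(x,t) = C0 * erfc(x / (2*sqrt(D*t))) -- the kind of field that
    laser interferometry reconstructs from refractive-index fringes."""
    return c0 * math.erfc(x_mm / (2.0 * math.sqrt(D_mm2_s * t_s)))

# Hypothetical colistin-like solute, D = 4e-4 mm^2/s, after 10 minutes:
near = concentration(0.1, 600.0, 4e-4)   # close to the source membrane
far = concentration(1.0, 600.0, 4e-4)    # 1 mm into the receiving gel
```

    Comparing measured profiles against this form is one simple way to extract an effective diffusion coefficient.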

  17. Monotonically improving approximate answers to relational algebra queries

    NASA Technical Reports Server (NTRS)

    Smith, Kenneth P.; Liu, J. W. S.

    1989-01-01

    We present here a query processing method that produces approximate answers to queries posed in standard relational algebra. This method is monotone in the sense that the accuracy of the approximate result improves with the amount of time spent producing the result. This strategy enables us to trade the time to produce the result for the accuracy of the result. An approximate relational model that characterizes approximate relations and a partial order for comparing them is developed. Relational operators which operate on and return approximate relations are defined.
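    One common way to make this concrete is to represent an approximate relation as a pair of tuple sets, the tuples certainly in the answer and those not yet ruled out, with a partial order under which operators are monotone. The sketch below illustrates that style of model; it is not the paper's exact definitions:

```python
class ApproxRelation:
    """An approximate relation as a pair of tuple sets: 'certain'
    (known to be in the answer) and 'possible' (not yet ruled out).
    A minimal sketch of a certain/possible approximate model."""
    def __init__(self, certain, possible):
        self.certain = set(certain)
        self.possible = set(possible) | self.certain  # certain is always possible

    def refines(self, other):
        """Partial order: self is at least as accurate as other."""
        return (other.certain <= self.certain and
                self.possible <= other.possible)

def approx_select(rel, pred):
    """Selection lifted to approximate relations; monotone in accuracy."""
    return ApproxRelation({t for t in rel.certain if pred(t)},
                          {t for t in rel.possible if pred(t)})

# More processing time shrinks 'possible' and grows 'certain'.
coarse = ApproxRelation(certain={1}, possible={1, 2, 3, 4})
finer = ApproxRelation(certain={1, 2}, possible={1, 2, 3})
```

    Monotonicity means that if the input approximation improves, the operator's output approximation improves too, which is exactly the property the anytime query strategy relies on.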

  18. Parallel Cartesian grid refinement for 3D complex flow simulations

    NASA Astrophysics Data System (ADS)

    Angelidis, Dionysios; Sotiropoulos, Fotis

    2013-11-01

    A second-order accurate method for discretizing the Navier-Stokes equations on 3D unstructured Cartesian grids is presented. Although the grid generator is based on the oct-tree hierarchical method, a fully unstructured data structure is adopted, enabling robust calculations for incompressible flows and avoiding both the need to synchronize the solution between different levels of refinement and the use of prolongation/restriction operators. The current solver implements a hybrid staggered/non-staggered grid layout, employing the implicit fractional step method to satisfy the continuity equation. The pressure-Poisson equation is discretized using a novel second-order fully implicit scheme for unstructured Cartesian grids and solved using an efficient Krylov subspace solver. The momentum equation is also discretized with second-order accuracy, and the high-performance Newton-Krylov method is used to integrate it in time. Neumann and Dirichlet conditions are used to validate the Poisson solver against analytical functions, and grid refinement results in a significant reduction of the solution error. The effectiveness of the fractional step method ensures the stability of the overall algorithm and enables accurate multi-resolution real-life simulations. This material is based upon work supported by the Department of Energy under Award Number DE-EE0005482.
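    The pressure-Poisson step can be illustrated with a matrix-free Krylov solver. The sketch below runs plain conjugate gradient on a 1-D Laplacian with Dirichlet ends, far simpler than the 3-D unstructured solver described above, but the same class of method:

```python
def conjugate_gradient(apply_A, b, tol=1e-10, max_iter=1000):
    """Minimal Krylov-subspace (conjugate gradient) solver of the kind
    used for a discretized pressure-Poisson equation. apply_A is a
    matrix-free operator, as is natural on unstructured grids."""
    x = [0.0] * len(b)
    r = b[:]                         # residual for x0 = 0
    p = r[:]
    rs = sum(v * v for v in r)
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

def laplacian_1d(v):
    """Matrix-free 1-D Laplacian with homogeneous Dirichlet ends."""
    n = len(v)
    return [2 * v[i] - (v[i - 1] if i else 0.0) - (v[i + 1] if i < n - 1 else 0.0)
            for i in range(n)]

pressure = conjugate_gradient(laplacian_1d, [1.0, 0.0, 0.0, 1.0])
```

    For this right-hand side the exact solution is uniform, and CG recovers it in a handful of iterations.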

  19. Bio++: a set of C++ libraries for sequence analysis, phylogenetics, molecular evolution and population genetics.

    PubMed

    Dutheil, Julien; Gaillard, Sylvain; Bazin, Eric; Glémin, Sylvain; Ranwez, Vincent; Galtier, Nicolas; Belkhir, Khalid

    2006-04-04

    A large number of bioinformatics applications in the fields of bio-sequence analysis, molecular evolution and population genetics typically share input/output methods, data storage requirements and data analysis algorithms. Such common features may be conveniently bundled into re-usable libraries, which enable the rapid development of new methods and robust applications. We present Bio++, a set of Object Oriented libraries written in C++. Available components include classes for data storage and handling (nucleotide/amino-acid/codon sequences, trees, distance matrices, population genetics datasets), various input/output formats, basic sequence manipulation (concatenation, transcription, translation, etc.), phylogenetic analysis (maximum parsimony, markov models, distance methods, likelihood computation and maximization), population genetics/genomics (diversity statistics, neutrality tests, various multi-locus analyses) and various algorithms for numerical calculus. Implementation of methods aims at being both efficient and user-friendly. A special concern was given to the library design to enable easy extension and new methods development. We defined a general hierarchy of classes that allow the developer to implement its own algorithms while remaining compatible with the rest of the libraries. Bio++ source code is distributed free of charge under the CeCILL general public licence from its website http://kimura.univ-montp2.fr/BioPP.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balsa Terzic, Gabriele Bassi

    In this paper we discuss representations of charged-particle densities in particle-in-cell (PIC) simulations, analyze the sources and profiles of the intrinsic numerical noise, and present efficient methods for their removal. We devise two alternative estimation methods for the charged-particle distribution which represent a significant improvement over the Monte Carlo cosine expansion used in the 2d code of Bassi, designed to simulate coherent synchrotron radiation (CSR) in charged particle beams. The improvement is achieved by employing an alternative beam density estimation to the Monte Carlo cosine expansion. The representation is first binned onto a finite grid, after which two grid-based methods are employed to approximate particle distributions: (i) truncated fast cosine transform (TFCT); and (ii) thresholded wavelet transform (TWT). We demonstrate that these alternative methods represent a staggering upgrade over the original Monte Carlo cosine expansion in terms of efficiency, while the TWT approximation also provides an appreciable improvement in accuracy. The improvement in accuracy comes from a judicious removal of the numerical noise enabled by the wavelet formulation. The TWT method is then integrated into Bassi's CSR code and benchmarked against the original version. We show that the new density estimation method provides superior performance in terms of efficiency and spatial resolution, thus enabling high-fidelity simulations of CSR effects, including the microbunching instability.
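    The thresholded-wavelet idea, zeroing small detail coefficients to remove numerical noise while keeping coarse structure, can be sketched with a hand-rolled Haar transform. This is an illustration only; the production code's basis and thresholding rule may differ:

```python
def haar_fwd(x):
    """Full multilevel Haar transform (length must be a power of 2).
    Returns [overall average, detail coefficients coarse-to-fine]."""
    x = list(x)
    out, n = [], len(x)
    while n > 1:
        avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(n // 2)]
        det = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(n // 2)]
        out = det + out
        x, n = avg, n // 2
    return x + out

def haar_inv(c):
    """Invert haar_fwd."""
    x = c[:1]
    k = 1
    while k < len(c):
        det = c[k:2 * k]
        x = [v for a, d in zip(x, det) for v in (a + d, a - d)]
        k *= 2
    return x

def denoise(signal, thresh):
    """Thresholded wavelet transform (TWT) sketch: zero small detail
    coefficients, keeping the coarse structure of the density."""
    c = haar_fwd(signal)
    c = [c[0]] + [v if abs(v) >= thresh else 0.0 for v in c[1:]]
    return haar_inv(c)

# A two-level density with small particle-noise wiggles on top.
noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]
clean = denoise(noisy, thresh=0.2)
```

    The large step between the two density levels survives thresholding, while the small fluctuations, the analogue of PIC shot noise, are removed.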

  1. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of the face. The full source code of the developed application is provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
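    The paper's Matlab source is distributed as an attachment and is not reproduced here; as a hedged sketch of the same kind of fully automated, reproducible measurement, the snippet below thresholds a temperature array into a region of interest and reports summary parameters. The threshold and values are invented for illustration:

```python
def roi_stats(image, threshold):
    """Automated measurement sketch: pixels at or above a temperature
    threshold form the region of interest (ROI); return reproducible
    summary parameters (area in pixels, mean and max temperature)."""
    roi = [t for row in image for t in row if t >= threshold]
    if not roi:
        return {"area": 0, "mean": None, "max": None}
    return {"area": len(roi),
            "mean": sum(roi) / len(roi),
            "max": max(roi)}

# Hypothetical 3x3 thermal image in degrees C; the warm patch is the ROI.
thermal = [[30.1, 30.3, 30.2],
           [30.2, 36.5, 36.9],
           [30.0, 36.7, 30.4]]
stats = roi_stats(thermal, threshold=35.0)
```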

  2. Staining Methods for Normal and Regenerative Myelin in the Nervous System.

    PubMed

    Carriel, Víctor; Campos, Antonio; Alaminos, Miguel; Raimondo, Stefania; Geuna, Stefano

    2017-01-01

    Histochemical techniques enable the specific identification of myelin by light microscopy. Here we describe three histochemical methods for the staining of myelin suitable for formalin-fixed and paraffin-embedded materials. The first is the conventional luxol fast blue (LFB) method, which stains myelin in blue and Nissl bodies and mast cells in purple. The second is an LFB-based method called MCOLL, which specifically stains myelin as well as collagen fibers and cells, giving an integrated overview of the histology and myelin content of the tissue. Finally, we describe the osmium tetroxide method, which consists of the osmication of previously fixed tissues. Osmication is performed prior to the embedding of tissues in paraffin, giving a permanent positive reaction for myelin as well as for other lipids present in the tissue.

  3. Conceptualising forensic science and forensic reconstruction. Part II: The critical interaction between research, policy/law and practice.

    PubMed

    Morgan, R M

    2017-11-01

    This paper builds on the FoRTE conceptual model presented in part I to address the forms of knowledge that are integral to the four components of the model. Articulating the different forms of knowledge within effective forensic reconstructions is valuable. It enables a nuanced approach to the development and use of evidence bases to underpin decision-making at every stage of a forensic reconstruction by enabling transparency in the reporting of inferences. It also enables appropriate methods to be developed to ensure quality and validity. It is recognised that the domains of practice, research, and policy/law intersect to form the nexus where forensic science is situated. Each domain has a distinctive infrastructure that influences the production and application of different forms of knowledge in forensic science. The channels that can enable the interaction between these domains, enhance the impact of research in theory and practice, increase access to research findings, and support quality are presented. The particular strengths within the different domains to deliver problem solving forensic reconstructions are thereby identified and articulated. It is argued that a conceptual understanding of forensic reconstruction that draws on the full range of both explicit and tacit forms of knowledge, and incorporates the strengths of the different domains pertinent to forensic science, offers a pathway to harness the full value of trace evidence for context sensitive, problem-solving forensic applications. Copyright © 2017 The Author. Published by Elsevier B.V. All rights reserved.

  4. Spatiotemporal behaviour of isodiffracting hollow Gaussian pulsed beams

    NASA Astrophysics Data System (ADS)

    Xu, Yanbing; Lü, Baida

    2007-05-01

    A model of isodiffracting hollow Gaussian pulsed beams (HGPBs) is presented. Based on the Fourier transform method, an analytical formula for the HGPBs propagating in free space is derived, which enables us to study the spatiotemporal behaviour of the ultrashort pulsed beams. Some interesting phenomena of ultrashort pulsed beams, such as the symmetrical temporal profiles, the dark rings, etc, are discussed in detail and illustrated numerically.

  5. Characterizing the performance of ecosystem models across time scales: A spectral analysis of the North American Carbon Program site-level synthesis

    Treesearch

    Michael C. Dietze; Rodrigo Vargas; Andrew D. Richardson; Paul C. Stoy; Alan G. Barr; Ryan S. Anderson; M. Altaf Arain; Ian T. Baker; T. Andrew Black; Jing M. Chen; Philippe Ciais; Lawrence B. Flanagan; Christopher M. Gough; Robert F. Grant; David Hollinger; R. Cesar Izaurralde; Christopher J. Kucharik; Peter Lafleur; Shugang Liu; Erandathie Lokupitiya; Yiqi Luo; J. William Munger; Changhui Peng; Benjamin Poulter; David T. Price; Daniel M. Ricciuto; William J. Riley; Alok Kumar Sahoo; Kevin Schaefer; Andrew E. Suyker; Hanqin Tian; Christina Tonitto; Hans Verbeeck; Shashi B. Verma; Weifeng Wang; Ensheng Weng

    2011-01-01

    Ecosystem models are important tools for diagnosing the carbon cycle and projecting its behavior across space and time. Despite the fact that ecosystems respond to drivers at multiple time scales, most assessments of model performance do not discriminate different time scales. Spectral methods, such as wavelet analyses, present an alternative approach that enables the...

  6. Red, green, and blue lasing enabled by single-exciton gain in colloidal quantum dot films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurmikko, Arto V.; Dang, Cuong

    The methods and materials described herein contemplate the use of films of colloidal quantum dots as a gain medium in a vertical-cavity surface-emitting laser. The present disclosure demonstrates a laser with single-exciton gain in the red, green, and blue wavelengths. Leveraging this nanocomposite gain, the results realize a significant step toward full-color single-material lasers.

  7. Robotic tele-existence

    NASA Technical Reports Server (NTRS)

    Tachi, Susumu; Arai, Hirohiko; Maeda, Taro

    1989-01-01

    Tele-existence is an advanced type of teleoperation system that enables a human operator at the controls to perform remote manipulation tasks dexterously with the feeling that he or she exists in the remote anthropomorphic robot in the remote environment. The concept of tele-existence is presented, the principle of the tele-existence display method is explained, some of the prototype systems are described, and its space application is discussed.

  8. Globescope: Student Involvement in Culture Trait Studies as Part of the Social Studies Curriculum in Grades 5-12.

    ERIC Educational Resources Information Center

    Peters, Richard

    The program described in this guide provides a method of researching and comparing diverse cultures for middle and high school students. Teams of students investigate cultures from around the world and present findings to the entire class. The team approach enables the class to be exposed to a variety of materials and gives students experience in…

  9. Improving the Nutritional Value of the Food Served and the Dining Experience in a Primary School

    ERIC Educational Resources Information Center

    Duncan, Sue

    2011-01-01

    The aim of this paper is to demonstrate that it is possible to make major changes in a primary school with limited investment. I wish to present the methods used in this process which enabled me to examine the existing catering, to identify, investigate and research the problems, to explore the literature available and to synthesise my results…

  10. Beyond the New Architectures - Enabling Rapid System Configurations

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2009-01-01

    This presentation slide document reviews attempts to integrate systems and create common standards for missions. A primary example is telemetry and command sets for satellites. The XML Telemetric and Command Exchange (XTCE) format exists, but it is not easy to implement, so there is a need for a new standard. The document proposes a method to achieve the standard and outlines the benefits of using a new standard.

  11. Speeding up GW Calculations to Meet the Challenge of Large Scale Quasiparticle Predictions

    PubMed Central

    Gao, Weiwei; Xia, Weiyi; Gao, Xiang; Zhang, Peihong

    2016-01-01

    Although the GW approximation is recognized as one of the most accurate theories for predicting materials excited states properties, scaling up conventional GW calculations for large systems remains a major challenge. We present a powerful and simple-to-implement method that can drastically accelerate fully converged GW calculations for large systems, enabling fast and accurate quasiparticle calculations for complex materials systems. We demonstrate the performance of this new method by presenting the results for ZnO and MgO supercells. A speed-up factor of nearly two orders of magnitude is achieved for a system containing 256 atoms (1024 valence electrons) with a negligibly small numerical error of ±0.03 eV. Finally, we discuss the application of our method to the GW calculations for 2D materials. PMID:27833140

  12. [Methods for determination of cholinesterase activity].

    PubMed

    Dingová, D; Hrabovská, A

    2015-01-01

    Cholinesterases hydrolyze acetylcholine and thus play a key role in the process of cholinergic neurotransmission. Changes in their activities are linked to many diseases (e.g. Alzheimer's disease, Parkinson's disease, lipid disorders). Thus, it is important to determine their activity in a fast, simple and precise way. In this review, different approaches to studying cholinesterase activities (e.g. pH-dependent, spectrophotometric, radiometric and histochemical methods, or biosensors) are discussed. Comparisons, advantages and disadvantages of selected methods (e.g. the most widely used Ellman's assay, the extremely sensitive Johnson-Russell method, or a modern technique with gold nanoparticles) are presented. This review enables one to choose a suitable method for the determination of cholinesterase activities with respect to laboratory equipment, type of analysis, pH, temperature scale or special conditions.
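    The arithmetic behind the spectrophotometric (Ellman-type) readout is a Beer-Lambert conversion of an absorbance slope into a hydrolysis rate. The extinction coefficient (~13.6 mM⁻¹ cm⁻¹ for TNB at 412 nm) and the assay volumes below are typical illustrative values, not a prescribed protocol:

```python
def ellman_activity(slope_abs_per_min, extinction_mM=13.6, path_cm=1.0,
                    sample_volume_mL=0.1, total_volume_mL=3.0):
    """Cholinesterase activity from an Ellman-style absorbance trace.
    Beer-Lambert: rate (mM/min) = slope / (epsilon * path length);
    scaling by the assay dilution gives micromol of substrate
    hydrolysed per minute per mL of sample."""
    rate_mM_per_min = slope_abs_per_min / (extinction_mM * path_cm)
    # micromol/min produced in the cuvette, then per mL of added sample
    return rate_mM_per_min * total_volume_mL / sample_volume_mL

# An absorbance increase of 0.272 AU/min under the assumed conditions:
activity = ellman_activity(slope_abs_per_min=0.272)
```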

  13. Detection of osmotic damages in GRP boat hulls

    NASA Astrophysics Data System (ADS)

    Krstulović-Opara, L.; Domazet, Ž.; Garafulić, E.

    2013-09-01

    Infrared thermography is a non-destructive testing method that enables visualization and assessment of structural anomalies and differences in a structure's topography. The present paper addresses the problem of osmotic damage in submerged glass-reinforced polymer structures. Osmotic damage can be detected by simple humidity gauging, but the testing methods available for its proper evaluation and estimation are restricted and hardly applicable. It is demonstrated here that infrared thermography, based on estimation of heat-wave propagation, can be used instead. Three methods are addressed: pulsed thermography, the Fast Fourier Transform, and the continuous Morlet wavelet. Additional image processing based on a gradient approach is applied to all three methods. The continuous Morlet wavelet is shown to be the most appropriate method for the detection of osmotic damage.

  14. The application of time series models to cloud field morphology analysis

    NASA Technical Reports Server (NTRS)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.
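    The simplest member of the ARMA family makes the fit-then-synthesize loop concrete: estimate an AR(1) coefficient from a texture row, then generate a surrogate row from it. This is a one-dimensional toy, not the paper's seasonal 2-D scheme:

```python
import random

def fit_ar1(series):
    """Estimate the AR(1) coefficient via lag-1 autocovariance over
    variance -- the simplest ARMA-family parameter estimate."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[t] - mean) * (series[t - 1] - mean) for t in range(1, n))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

def simulate_ar1(phi, n, seed=0):
    """Synthesize a 1-D texture row: x[t] = phi * x[t-1] + noise."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, 1.0))
    return x

row = simulate_ar1(phi=0.8, n=5000)   # "texture" with known correlation
phi_hat = fit_ar1(row)                # recovered model parameter
```

    In the 2-D scheme, a small set of such parameters characterizes the whole cloud field, and surrogates are reconstructed by running the fitted model forward.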

  15. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  16. Quantitative Examination of Corrosion Damage by Means of Thermal Response Measurements

    NASA Technical Reports Server (NTRS)

    Rajic, Nik

    1998-01-01

    Two computational methods are presented that enable a characterization of corrosion damage to be performed from thermal response measurements derived from a standard flash thermographic inspection. The first is based upon a one-dimensional analytical solution to the heat diffusion equation and presumes the lateral extent of damage is large compared to the residual structural thickness, such that lateral heat diffusion effects can be considered insignificant. The second proposed method, based on a finite element optimization scheme, addresses the more general case where these conditions are not met. Results from an experimental application are given to illustrate the precision, robustness and practical efficacy of both methods.
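    For orientation on the one-dimensional case, the textbook flash-method relation of Parker, alpha ≈ 0.1388·L²/t_half, links thermal diffusivity, slab thickness L and the characteristic half-rise time, so a measured time gives a residual-thickness estimate. This is the standard 1-D relation, not the report's exact analytical solution, and the material values are nominal assumptions:

```python
import math

def residual_thickness(alpha_mm2_s, t_half_s):
    """Invert Parker's flash relation alpha = 0.1388 * L**2 / t_half
    for the slab thickness L (mm), given the thermal diffusivity
    (mm^2/s) and the measured half-rise time (s)."""
    return math.sqrt(alpha_mm2_s * t_half_s / 0.1388)

# Aluminium, nominal alpha ~ 97 mm^2/s: a longer characteristic time
# implies a thicker (less corroded) remaining wall.
thin = residual_thickness(97.0, 0.0029)
thick = residual_thickness(97.0, 0.0057)
```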

  17. Methodical approaches to value assessment and determination of the capitalization level of high-rise construction

    NASA Astrophysics Data System (ADS)

    Smirnov, Vitaly; Dashkov, Leonid; Gorshkov, Roman; Burova, Olga; Romanova, Alina

    2018-03-01

    The article presents an analysis of methodological approaches to cost estimation and determination of the capitalization level of high-rise construction objects. Factors determining the value of real estate are considered, and three main approaches for estimating the value of real estate objects are given. The main methods of capitalization estimation are analyzed, and the most reasonable method for determining the level of capitalization of high-rise buildings is proposed. In order to increase the value of real estate objects, the authors propose measures that make it possible to significantly increase the capitalization of the enterprise through more efficient use of intangible assets and goodwill.

  18. Neural dynamic optimization for control systems.II. Theory.

    PubMed

    Seong, C Y; Widrow, B

    2001-01-01

    The paper presents neural dynamic optimization (NDO) as a method of optimal feedback control for nonlinear multi-input-multi-output (MIMO) systems. The main feature of NDO is that it enables neural networks to approximate the optimal feedback solution whose existence dynamic programming (DP) justifies, thereby reducing the computation and storage complexities of classical methods such as DP. This paper mainly describes the theory of NDO, while the two companion papers on this topic explain the background for the development of NDO and demonstrate the method with several applications, including control of autonomous vehicles and of a robot arm, respectively.

  19. Facile generation of cell microarrays using vacuum degassing and coverslip sweeping.

    PubMed

    Wang, Min S; Luo, Zhen; Cherukuri, Sundar; Nitin, Nitin

    2014-07-15

    A simple method to generate cell microarrays with high-percentage well occupancy and well-defined cell confinement is presented. This method uses a synergistic combination of vacuum degassing and coverslip sweeping. The vacuum degassing step dislodges air bubbles from the microwells, which in turn enables the cells to enter the microwells, while the physical sweeping step using a glass coverslip removes the excess cells outside the microwells. This low-cost preparation method provides a simple solution to generating cell microarrays that can be performed in basic research laboratories and point-of-care settings for routine cell-based screening assays. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Power Series Approximation for the Correlation Kernel Leading to Kohn-Sham Methods Combining Accuracy, Computational Efficiency, and General Applicability

    NASA Astrophysics Data System (ADS)

    Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas

    2016-09-01

    A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.

  1. Highly multiplexed and quantitative cell-surface protein profiling using genetically barcoded antibodies.

    PubMed

    Pollock, Samuel B; Hu, Amy; Mou, Yun; Martinko, Alexander J; Julien, Olivier; Hornsby, Michael; Ploder, Lynda; Adams, Jarrett J; Geng, Huimin; Müschen, Markus; Sidhu, Sachdev S; Moffat, Jason; Wells, James A

    2018-03-13

    Human cells express thousands of different surface proteins that can be used for cell classification, or to distinguish healthy and disease conditions. A method capable of profiling a substantial fraction of the surface proteome simultaneously and inexpensively would enable more accurate and complete classification of cell states. We present a highly multiplexed and quantitative surface proteomic method using genetically barcoded antibodies called phage-antibody next-generation sequencing (PhaNGS). Using 144 preselected antibodies displayed on filamentous phage (Fab-phage) against 44 receptor targets, we assess changes in B cell surface proteins after the development of drug resistance in a patient with acute lymphoblastic leukemia (ALL) and in adaptation to oncogene expression in a Myc-inducible Burkitt lymphoma model. We further show PhaNGS can be applied at the single-cell level. Our results reveal that a common set of proteins including FLT3, NCR3LG1, and ROR1 dominate the response to similar oncogenic perturbations in B cells. Linking high-affinity, selective, genetically encoded binders to NGS enables direct and highly multiplexed protein detection, comparable to RNA-sequencing for mRNA. PhaNGS has the potential to profile a substantial fraction of the surface proteome simultaneously and inexpensively to enable more accurate and complete classification of cell states. Copyright © 2018 the Author(s). Published by PNAS.

  2. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU-enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU, with up to a 22x speedup depending on the computational task. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To the best of our knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU-enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633
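
    The co-processing pattern described above can be sketched in Python (a hedged illustration only: the paper uses MATLAB's GPU-enabled functions, whereas CuPy, the fallback logic, and the toy per-pixel correlation task below are my assumptions):

```python
import time
import numpy as np

# Hedged sketch: the paper uses MATLAB's gpuArray; here the same offload
# pattern is mimicked with CuPy, falling back to NumPy when no GPU is
# available (xp is the active array module).
try:
    import cupy as xp          # GPU path (assumption: CuPy installed)
    GPU = True
except ImportError:
    xp = np                    # CPU fallback
    GPU = False

def correlate_maps(stim, resp):
    """Toy post-hoc step: correlate each pixel's response with the stimulus."""
    stim = xp.asarray(stim, dtype=xp.float32)
    resp = xp.asarray(resp, dtype=xp.float32)
    s = (stim - stim.mean()) / stim.std()
    r = (resp - resp.mean(axis=0)) / resp.std(axis=0)
    corr = (s[:, None, None] * r).mean(axis=0)   # per-pixel Pearson correlation
    return np.asarray(corr.get() if GPU else corr)

rng = np.random.default_rng(0)
stim = rng.normal(size=200)                                   # stimulus trace
resp = stim[:, None, None] * 0.5 + rng.normal(size=(200, 16, 16))  # toy 16x16 map
t0 = time.perf_counter()
cmap = correlate_maps(stim, resp)
elapsed = time.perf_counter() - t0
```

    Swapping `xp` between NumPy and CuPy is the design point: the analysis code stays identical while the heavy array math moves to the GPU.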

  3. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
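
    For context, the deterministic Hu-Washizu functional that the PHWVP generalizes treats displacement u, strain ε, and stress σ as independent fields; one standard statement (my addition in common notation, not taken from the report) is:

```latex
\Pi_{HW}(u,\varepsilon,\sigma)
  = \int_{\Omega} \Big[ W(\varepsilon)
      + \sigma : \big(\nabla_{s} u - \varepsilon\big)
      - b \cdot u \Big] \, d\Omega
  - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
```

    Stationarity with respect to σ enforces compatibility (ε = ∇_s u), with respect to ε the constitutive law (σ = ∂W/∂ε), and with respect to u equilibrium; in the PHWVP each of these relations may carry a probabilistic distribution.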

  4. Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.

    2013-12-01

    Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bounds or other criteria.

  5. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The ease of quantification and the absence of any need for prior knowledge about the compounds present in a sample make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and their NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low molecular weight metabolites.

  6. C3: The Compositional Construction of Content: A New, More Effective and Efficient Way to Marshal Inferences from Background Knowledge that will Enable More Natural and Effective Communication with Automomous Systems

    DTIC Science & Technology

    2014-01-06

    products derived from this funding. This includes two proposed activities for Summer 2014: • Deep Semantic Annotation with Shallow Methods; James... process that we need to ensure that words are unambiguous before we read them (present in just the semantic field that is presently active). Publication...Technical Report). MIT Artificial Intelligence Laboratory. Allen, J., Manshadi, M., Dzikovska, M., & Swift, M. (2007). Deep linguistic processing for

  7. Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Rallabhandi, Sriram K.

    2010-01-01

    A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.

  8. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  9. Prediction of turning stability using receptance coupling

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Marcin; Powałka, Bartosz

    2018-01-01

    This paper addresses the prediction of machining stability for the dynamic "lathe - workpiece" system, evaluated using the receptance coupling method. Dynamic properties of the lathe components (the spindle and the tailstock) are assumed to be constant and can be determined experimentally from the results of an impact test. Hence, the variable component of the "machine tool - holder - workpiece" system is the machined part, which can easily be modelled analytically. The receptance coupling method enables a synthesis of the experimental (spindle, tailstock) and analytical (machined part) models, so impact testing of the entire system becomes unnecessary. The paper presents the methodology for synthesizing the analytical and experimental models, the evaluation of the stability lobes, and an experimental validation procedure involving both the determination of the dynamic properties of the system and cutting tests. Finally, the experimental verification results are presented and discussed.
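
    The coupling step can be sketched for the simplest case, a rigid joint at a single coordinate, where the assembled receptance is G = Ha·Hb/(Ha + Hb). This is a hedged toy illustration: the SDOF parameters below are invented, and the paper couples full spindle/tailstock/workpiece models, not two single-degree-of-freedom systems.

```python
import numpy as np

# Hedged sketch of receptance coupling at one coordinate with a rigid joint:
# an experimentally measured receptance Ha (spindle side) is combined with an
# analytically modelled receptance Hb (workpiece) without impact-testing the
# assembled system.  All parameters are illustrative, not from the paper.

def sdof_receptance(w, m, c, k):
    """Receptance (displacement/force FRF) of a single-DOF system."""
    return 1.0 / (k - m * w**2 + 1j * c * w)

w = np.linspace(1.0, 2000.0, 4000) * 2 * np.pi     # rad/s frequency grid
Ha = sdof_receptance(w, m=5.0, c=200.0, k=2.0e7)   # "measured" spindle side
Hb = sdof_receptance(w, m=1.0, c=50.0, k=8.0e6)    # analytic workpiece model
G = Ha * Hb / (Ha + Hb)                            # coupled receptance
```

    At low frequency the coupled receptance reduces to 1/(ka + kb), the static stiffness of the two substructures acting in parallel, which is a quick sanity check on the synthesis.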

  10. Micro-array isolation of circulating tumor cells (CTCs): the droplet biopsy chip

    NASA Astrophysics Data System (ADS)

    Panchapakesan, B.

    2017-08-01

    We present a new method for circulating tumor cell capture based on micro-array isolation from droplets. Called droplet biopsy, our technique uses a 76-element array of carbon nanotube devices functionalized with anti-EpCAM and anti-Her2 antibodies for immunocapture of spiked breast cancer cells in blood. This droplet biopsy chip can enable capture of CTCs based on both positive and negative selection strategies. Negative selection is achieved through depletion of contaminating leukocytes via the differential settling of blood into layers. We report 55%-100% cancer cell capture yield in this first droplet biopsy chip study. Droplet biopsy is an enabling approach in which CTCs can be captured based on multiple biomarkers in a single blood sample.

  11. Enabling fast, stable and accurate peridynamic computations using multi-time-step integration

    DOE PAGES

    Lindsay, P.; Parks, M. L.; Prakash, A.

    2016-04-13

    Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.

  12. Confinement of hydrogen at high pressure in carbon nanotubes

    DOEpatents

    Lassila, David H [Aptos, CA; Bonner, Brian P [Livermore, CA

    2011-12-13

    A high pressure hydrogen confinement apparatus according to one embodiment includes carbon nanotubes capped at one or both ends thereof with a hydrogen-permeable membrane to enable the high pressure confinement of hydrogen and release of the hydrogen therethrough. A hydrogen confinement apparatus according to another embodiment includes an array of multi-walled carbon nanotubes each having first and second ends, the second ends being capped with palladium (Pd) to enable the high pressure confinement of hydrogen and release of the hydrogen therethrough as a function of palladium temperature, wherein the array of carbon nanotubes is capable of storing hydrogen gas at a pressure of at least 1 GPa for greater than 24 hours. Additional apparatuses and methods are also presented.

  13. Developing a multipoint titration method with a variable dose implementation for anaerobic digestion monitoring.

    PubMed

    Salonen, K; Leisola, M; Eerikäinen, T

    2009-01-01

    Determination of metabolites from an anaerobic digester by acid-base titration is considered a superior method for many reasons. This paper describes a practical, at-line-compatible multipoint titration method. The titration procedure was improved in terms of both speed and data quality. A simple and novel control algorithm for estimating a variable titrant dose was derived for this purpose. This non-linear, PI-controller-like algorithm does not require any preliminary information about the sample. Performance of this controller is superior to that of traditional linear PI-controllers. In addition, a simplification representing polyprotic acids as a sum of multiple monoprotic acids is introduced, along with a mathematical error examination. A method for including the ionic-strength effect by stepwise iteration is shown. The titration model is presented in matrix notation, enabling simple computation of all concentration estimates. All methods and algorithms are illustrated in the experimental part. A linear correlation better than 0.999 was obtained for both acetate and phosphate used as model compounds, with slopes of 0.98 and 1.00 and average standard deviations of 0.6% and 0.8%, respectively. Furthermore, the insensitivity of the presented method to overlapping buffer capacity curves was shown.
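
    The matrix-notation idea can be sketched as follows (a hedged toy model, not the paper's algorithm: it ignores water autoprotolysis, activity corrections, and the variable-dose controller, and the pKa values are textbook numbers):

```python
import numpy as np

# Hedged sketch: each monoprotic acid j with constant pKa_j contributes a
# deprotonated fraction alpha_j(pH) = 1/(1 + 10**(pKa_j - pH)) of its total
# concentration, so base consumption at the i-th titration point is
# approximately sum_j A[i, j] * c[j].  Concentrations c then follow from
# linear least squares on the titration curve.

def alpha(pH, pKa):
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

def fit_concentrations(pH_points, base_mol, pKas):
    A = np.column_stack([alpha(pH_points, k) for k in pKas])  # model matrix
    c, *_ = np.linalg.lstsq(A, base_mol, rcond=None)
    return c

# Synthetic data for acetate (pKa 4.76) and dihydrogen phosphate (pKa 7.20)
pKas = [4.76, 7.20]
c_true = np.array([0.010, 0.005])            # mol/L
pH = np.linspace(3.0, 9.0, 25)               # multipoint titration grid
base_added = np.column_stack([alpha(pH, k) for k in pKas]) @ c_true
c_est = fit_concentrations(pH, base_added, pKas)
```

    On noise-free synthetic data the least-squares step recovers the two concentrations exactly, which is the sense in which the matrix form makes "computation of all concentration estimates" simple.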

  14. A new phase encoding approach for a compact head-up display

    NASA Astrophysics Data System (ADS)

    Suszek, Jaroslaw; Makowski, Michal; Sypek, Maciej; Siemion, Andrzej; Kolodziejczyk, Andrzej; Bartosz, Andrzej

    2008-12-01

    The possibility of encoding multiple asymmetric symbols into a single thin binary Fourier hologram would have practical application in the design of simple translucent holographic head-up displays. A Fourier hologram displays the encoded images at infinity, enabling observation without time-consuming eye accommodation. Presenting a set of the most crucial signs to a driver in this way is desirable, especially for older people with various eyesight disabilities. In this paper a method of holographic design is presented that combines spatial segmentation with carrier frequencies. It yields multiple reconstructed images selectable by the angle of the incident laser beam. In order to encode several binary symbols into a single Fourier hologram, a chessboard-shaped segmentation function is used. An optimized sequence of phase encoding steps and a final direct phase binarization enable recording of asymmetric symbols into a binary hologram. The theoretical analysis is presented, verified numerically, and confirmed in an optical experiment. We suggest and describe a practical and highly useful application of such holograms in an inexpensive HUD device for use in the automotive industry. We present two alternative propositions for in-car viewing setups.
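
    The chessboard-shaped segmentation function can be sketched as follows (a hedged illustration: the cell size and the stand-in holograms are my assumptions, not the paper's parameters):

```python
import numpy as np

# Hedged sketch of spatial segmentation: two sub-holograms are interleaved
# on an s x s chessboard, so each occupies half the aperture of the single
# composite hologram.
def chessboard_mask(shape, s):
    yy, xx = np.indices(shape)
    return ((yy // s + xx // s) % 2).astype(bool)

h1 = np.zeros((64, 64))          # stand-in for hologram of symbol 1
h2 = np.ones((64, 64))           # stand-in for hologram of symbol 2
m = chessboard_mask(h1.shape, 8)
combined = np.where(m, h1, h2)   # segmented composite hologram
```

    Combined with per-symbol carrier frequencies, this interleaving is what lets several symbols share one binary element while remaining separately reconstructable.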

  15. Using multimodal information for the segmentation of fluorescent micrographs with application to virology and microbiology.

    PubMed

    Held, Christian; Wenzel, Jens; Webel, Rike; Marschall, Manfred; Lang, Roland; Palmisano, Ralf; Wittenberg, Thomas

    2011-01-01

    In order to improve the reproducibility and objectivity of fluorescence-microscopy-based experiments and to enable the evaluation of large datasets, flexible segmentation methods are required that are able to adapt to different stainings and cell types. This adaptation is usually achieved by manual adjustment of the segmentation method's parameters, which is time-consuming and challenging for biologists with no knowledge of image processing. To avoid this, parameters of the presented methods adapt automatically to user-generated ground truth to determine the best method and the optimal parameter setup. These settings can then be used for segmentation of the remaining images. As robust segmentation methods form the core of such a system, the currently used watershed-transform-based segmentation routine is replaced by a fast-marching level-set-based segmentation routine that incorporates knowledge of the cell nuclei. Our evaluations reveal that incorporation of multimodal information improves segmentation quality for the presented fluorescent datasets.
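
    The ground-truth-driven parameter adaptation can be sketched as a search over a parameter grid (a hedged stand-in: a plain intensity threshold and the Dice coefficient replace the paper's level-set routine and its parameters):

```python
import numpy as np

# Hedged sketch: score each candidate parameter value against user-provided
# ground truth and keep the best-scoring setting for the remaining images.
rng = np.random.default_rng(3)
truth = np.zeros((64, 64), bool)
truth[16:48, 16:48] = True                            # ground-truth mask
image = truth * 1.0 + rng.normal(0, 0.2, truth.shape)  # noisy toy micrograph

def dice(a, b):
    """Dice overlap coefficient between two binary masks."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

candidates = np.linspace(0.1, 0.9, 17)   # segmentation-parameter grid
scores = [dice(image > t, truth) for t in candidates]
best_t = candidates[int(np.argmax(scores))]
best_score = max(scores)
```

    The same loop structure applies whatever the segmentation routine is: the user labels a few images once, and the system picks the parameter setup that maximizes agreement with those labels.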

  16. An Advice Mechanism for Heterogeneous Robot Teams

    NASA Astrophysics Data System (ADS)

    Daniluk, Steven

    The use of reinforcement learning for robot teams has enabled complex tasks to be performed, but at the cost of requiring a large amount of exploration. Exchanging information between robots in the form of advice is one method to accelerate performance improvements. This thesis presents an advice mechanism for robot teams that utilizes advice from heterogeneous advisers via a method guaranteeing convergence to an optimal policy. The presented mechanism can use multiple advisers at each time step and decide when advice should be requested and accepted, such that the use of advice decreases over time. Additionally, collective, collaborative, and cooperative behavioural algorithms are integrated into a robot team architecture to create a new framework that provides fault tolerance and modularity for robot teams.

  17. Generation of binary holograms for deep scenes captured with a camera and a depth sensor

    NASA Astrophysics Data System (ADS)

    Leportier, Thibault; Park, Min-Chul

    2017-01-01

    This work presents binary hologram generation from images of a real object acquired from a Kinect sensor. Since hologram calculation from a point-cloud or polygon model presents a heavy computational burden, we adopted a depth-layer approach to generate the holograms. This method enables us to obtain holographic data of large scenes quickly. Our investigations focus on the performance of different methods, iterative and noniterative, to convert complex holograms into binary format. Comparisons were performed to examine the reconstruction of the binary holograms at different depths. We also propose to modify the direct binary search algorithm to take into account several reference image planes. Then, deep scenes featuring multiple planes of interest can be reconstructed with better efficiency.
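
    The direct binary search (DBS) algorithm named in the abstract can be sketched for a single reference plane (a hedged toy version: a 16 x 16 amplitude target and one pixel sweep, whereas the paper's proposal extends the error metric over several depth planes):

```python
import numpy as np

# Hedged sketch of direct binary search binarization: start from a
# thresholded hologram and flip single pixels, keeping a flip only if it
# reduces the reconstruction error against the target image.
rng = np.random.default_rng(1)
target = np.zeros((16, 16))
target[4:12, 4:12] = 1.0                                 # toy target image

field = np.fft.ifft2(target * np.exp(2j * np.pi * rng.random(target.shape)))
holo = (field.real > 0).astype(float)                    # initial binary hologram

def recon_error(h):
    """Squared error between normalized reconstruction and target."""
    rec = np.abs(np.fft.fft2(h))
    rec = rec / (rec.max() + 1e-12)
    return float(((rec - target) ** 2).sum())

err0 = recon_error(holo)
err = err0
for i in range(holo.shape[0]):
    for j in range(holo.shape[1]):          # one DBS sweep over all pixels
        holo[i, j] = 1.0 - holo[i, j]       # trial flip
        e = recon_error(holo)
        if e < err:
            err = e                         # keep the flip
        else:
            holo[i, j] = 1.0 - holo[i, j]   # revert
```

    The multi-plane variant proposed in the paper would evaluate `recon_error` at several propagation depths and accept a flip only if the combined error decreases.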

  18. System and method for characterizing voiced excitations of speech and acoustic signals, removing acoustic noise from speech, and synthesizing speech

    DOEpatents

    Burnett, Greg C [Livermore, CA; Holzrichter, John F [Berkeley, CA; Ng, Lawrence C [Danville, CA

    2006-08-08

    The present invention is a system and method for characterizing human (or animate) speech voiced excitation functions and acoustic signals, for removing unwanted acoustic noise which often occurs when a speaker uses a microphone in common environments, and for synthesizing personalized or modified human (or other animate) speech upon command from a controller. A low power EM sensor is used to detect the motions of windpipe tissues in the glottal region of the human speech system before, during, and after voiced speech is produced by a user. From these tissue motion measurements, a voiced excitation function can be derived. Further, the excitation function provides speech production information to enhance noise removal from human speech and it enables accurate transfer functions of speech to be obtained. Previously stored excitation and transfer functions can be used for synthesizing personalized or modified human speech. Configurations of EM sensor and acoustic microphone systems are described to enhance noise cancellation and to enable multiple articulator measurements.

  19. System and method for characterizing voiced excitations of speech and acoustic signals, removing acoustic noise from speech, and synthesizing speech

    DOEpatents

    Burnett, Greg C.; Holzrichter, John F.; Ng, Lawrence C.

    2004-03-23

    The present invention is a system and method for characterizing human (or animate) speech voiced excitation functions and acoustic signals, for removing unwanted acoustic noise which often occurs when a speaker uses a microphone in common environments, and for synthesizing personalized or modified human (or other animate) speech upon command from a controller. A low power EM sensor is used to detect the motions of windpipe tissues in the glottal region of the human speech system before, during, and after voiced speech is produced by a user. From these tissue motion measurements, a voiced excitation function can be derived. Further, the excitation function provides speech production information to enhance noise removal from human speech and it enables accurate transfer functions of speech to be obtained. Previously stored excitation and transfer functions can be used for synthesizing personalized or modified human speech. Configurations of EM sensor and acoustic microphone systems are described to enhance noise cancellation and to enable multiple articulator measurements.

  20. System and method for characterizing voiced excitations of speech and acoustic signals, removing acoustic noise from speech, and synthesizing speech

    DOEpatents

    Burnett, Greg C.; Holzrichter, John F.; Ng, Lawrence C.

    2006-02-14

    The present invention is a system and method for characterizing human (or animate) speech voiced excitation functions and acoustic signals, for removing unwanted acoustic noise which often occurs when a speaker uses a microphone in common environments, and for synthesizing personalized or modified human (or other animate) speech upon command from a controller. A low power EM sensor is used to detect the motions of windpipe tissues in the glottal region of the human speech system before, during, and after voiced speech is produced by a user. From these tissue motion measurements, a voiced excitation function can be derived. Further, the excitation function provides speech production information to enhance noise removal from human speech and it enables accurate transfer functions of speech to be obtained. Previously stored excitation and transfer functions can be used for synthesizing personalized or modified human speech. Configurations of EM sensor and acoustic microphone systems are described to enhance noise cancellation and to enable multiple articulator measurements.

  1. System And Method For Characterizing Voiced Excitations Of Speech And Acoustic Signals, Removing Acoustic Noise From Speech, And Synthesizing Speech

    DOEpatents

    Burnett, Greg C.; Holzrichter, John F.; Ng, Lawrence C.

    2006-04-25

    The present invention is a system and method for characterizing human (or animate) speech voiced excitation functions and acoustic signals, for removing unwanted acoustic noise which often occurs when a speaker uses a microphone in common environments, and for synthesizing personalized or modified human (or other animate) speech upon command from a controller. A low power EM sensor is used to detect the motions of windpipe tissues in the glottal region of the human speech system before, during, and after voiced speech is produced by a user. From these tissue motion measurements, a voiced excitation function can be derived. Further, the excitation function provides speech production information to enhance noise removal from human speech and it enables accurate transfer functions of speech to be obtained. Previously stored excitation and transfer functions can be used for synthesizing personalized or modified human speech. Configurations of EM sensor and acoustic microphone systems are described to enhance noise cancellation and to enable multiple articulator measurements.

  2. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    PubMed

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  3. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning.

    PubMed

    Davatzikos, Christos

    2016-10-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. Copyright © 2016. Published by Elsevier B.V.

  4. Lithography Assisted Fiber-Drawing Nanomanufacturing

    PubMed Central

    Gholipour, Behrad; Bastock, Paul; Cui, Long; Craig, Christopher; Khan, Khouler; Hewak, Daniel W.; Soci, Cesare

    2016-01-01

    We present a high-throughput and scalable technique for the production of metal nanowires embedded in glass fibres by taking advantage of thin film properties and patterning techniques commonly used in planar microfabrication. This hybrid process enables the fabrication of single nanowires and nanowire arrays encased in a preform material within a single fibre draw, providing an alternative to costly and time-consuming iterative fibre drawing. This method allows the combination of materials with different thermal properties to create functional optoelectronic nanostructures. As a proof of principle of the potential of this technique, centimetre-long gold nanowires (bulk Tm = 1064 °C) embedded in silicate glass fibres (Tg = 567 °C) were drawn in a single step with high aspect ratios (>10^4); such nanowires can be released from the glass matrix and show relatively high electrical conductivity. Overall, this fabrication method could enable mass manufacturing of metallic nanowires for plasmonics and nonlinear optics applications, as well as the integration of functional multimaterial structures for completely fiberised optoelectronic devices. PMID:27739543

  5. Optimization of infrared two-color multicycle field synthesis for intense-isolated-attosecond-pulse generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lan Pengfei; Takahashi, Eiji J.; Midorikawa, Katsumi

    2010-11-15

    We present the optimization of the two-color synthesis method for generating an intense isolated attosecond pulse (IAP) in the multicycle regime. By mixing an infrared assistant pulse with a Ti:sapphire main pulse, we show that an IAP can be produced using a multicycle two-color pulse with a duration longer than 30 fs. We also discuss the influence of the carrier-envelope phase (CEP) and the relative intensity on the generation of IAPs. By optimizing the wavelength of the assistant field, IAP generation becomes insensitive to the CEP slip. Therefore, the optimized two-color method enables us to relax the requirements of pulse duration and easily produce the IAP with a conventional multicycle laser pulse. In addition, it enables us to markedly suppress the ionization of the harmonic medium. This is a major advantage for efficiently generating intense IAPs from a neutral medium by applying the appropriate phase-matching and energy-scaling techniques.

  6. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
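
    As a toy illustration of copula parameter estimation (not MvCAT's Bayesian MCMC machinery: this uses the closed-form Gaussian-copula relation between Kendall's tau and the correlation parameter, rho = sin(pi*tau/2), on synthetic data):

```python
import numpy as np

# Hedged sketch: draw samples with Gaussian-copula dependence of strength
# rho_true, then recover rho from the rank-based Kendall's tau.
rng = np.random.default_rng(7)
rho_true = 0.6
L = np.linalg.cholesky(np.array([[1.0, rho_true], [rho_true, 1.0]]))
x, y = L @ rng.standard_normal((2, 2000))    # correlated marginals

def kendall_tau(a, b):
    """O(n^2) pairwise-concordance estimate of Kendall's tau."""
    s = np.sign(a[:, None] - a[None, :]) * np.sign(b[:, None] - b[None, :])
    return s[np.triu_indices(len(a), 1)].mean()

tau = kendall_tau(x, y)
rho_hat = np.sin(np.pi * tau / 2.0)          # Gaussian-copula inversion
```

    Because tau depends only on ranks, the same estimate holds under any monotone marginal transformation, which is why rank-based fitting is a common cross-check on likelihood- or MCMC-based copula inference.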

  7. Identity-by-Descent-Based Phasing and Imputation in Founder Populations Using Graphical Models

    PubMed Central

    Palin, Kimmo; Campbell, Harry; Wright, Alan F; Wilson, James F; Durbin, Richard

    2011-01-01

    Accurate knowledge of haplotypes, the combination of alleles co-residing on a single copy of a chromosome, enables powerful gene mapping and sequence imputation methods. Since humans are diploid, haplotypes must be derived from genotypes by a phasing process. In this study, we present a new computational model for haplotype phasing based on pairwise sharing of haplotypes inferred to be Identical-By-Descent (IBD). We apply the Bayesian network based model in a new phasing algorithm, called systematic long-range phasing (SLRP), that can capitalize on the close genetic relationships in isolated founder populations, and show with simulated and real genome-wide genotype data that SLRP substantially reduces the rate of phasing errors compared to previous phasing algorithms. Furthermore, the method accurately identifies regions of IBD, enabling linkage-like studies without pedigrees, and can be used to impute most genotypes with very low error rate. Genet. Epidemiol. 35:853-860, 2011. © 2011 Wiley Periodicals, Inc. PMID:22006673

  8. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
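A minimal illustration of the univariable linear regression described above, using hypothetical (synthetic) clinical data; the variable names and numeric values are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: systolic blood pressure vs. age (illustrative only)
age = rng.uniform(30, 70, size=200)
sbp = 100.0 + 0.8 * age + rng.normal(0.0, 8.0, size=200)

# Ordinary least squares via numpy's least-squares solver
X = np.column_stack([np.ones_like(age), age])   # intercept column + predictor
beta, *_ = np.linalg.lstsq(X, sbp, rcond=None)
intercept, slope = beta

# Coefficient of determination: fraction of variance explained
resid = sbp - X @ beta
r2 = 1.0 - resid.var() / sbp.var()

print(f"sbp ~ {intercept:.1f} + {slope:.2f} * age, R^2 = {r2:.2f}")
```

The fitted slope is the estimated change in outcome per unit of predictor, and R^2 indicates how much of the variability the model captures; interpreting these without checking the residuals is one of the pitfalls the article warns about.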

  9. Structures and Materials Working Group report

    NASA Technical Reports Server (NTRS)

    Torczyner, Robert; Hanks, Brantley R.

    1986-01-01

    The appropriateness of the selection of four issues (advanced materials development, analysis/design methods, tests of large flexible structures, and structural concepts) was evaluated. A cross-check of the issues and their relationship to the technology drivers is presented. Although all of the issues addressed numerous drivers, the advanced materials development issue impacts six out of the seven drivers and is considered to be the most crucial. The advanced materials technology development and the advanced design/analysis methods development were determined to be enabling technologies, with the testing issues and development of structural concepts considered to be of great importance, although not enabling technologies. In addition, and of more general interest and criticality, the need for a Government/Industry commitment, which does not now exist, was established. This commitment would call for the establishment of the required infrastructure to facilitate the development of the capabilities highlighted through the availability of resources and testbed facilities, including a national testbed in space to be in place in ten years.

  10. Lithography Assisted Fiber-Drawing Nanomanufacturing

    NASA Astrophysics Data System (ADS)

    Gholipour, Behrad; Bastock, Paul; Cui, Long; Craig, Christopher; Khan, Khouler; Hewak, Daniel W.; Soci, Cesare

    2016-10-01

    We present a high-throughput and scalable technique for the production of metal nanowires embedded in glass fibres by taking advantage of thin film properties and patterning techniques commonly used in planar microfabrication. This hybrid process enables the fabrication of single nanowires and nanowire arrays encased in a preform material within a single fibre draw, providing an alternative to costly and time-consuming iterative fibre drawing. This method allows the combination of materials with different thermal properties to create functional optoelectronic nanostructures. As a proof of principle of the potential of this technique, centimetre-long gold nanowires (bulk Tm = 1064 °C) embedded in silicate glass fibres (Tg = 567 °C) were drawn in a single step with high aspect ratios (>10^4). Such nanowires can be released from the glass matrix and show relatively high electrical conductivity. Overall, this fabrication method could enable mass manufacturing of metallic nanowires for plasmonics and nonlinear optics applications, as well as the integration of functional multimaterial structures for completely fiberised optoelectronic devices.

  11. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning

    PubMed Central

    Davatzikos, Christos

    2017-01-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. PMID:27514582

  12. Health Technology-Enabled Interventions for Adherence Support and Retention in Care Among US HIV-Infected Adolescents and Young Adults: An Integrative Review.

    PubMed

    Navarra, Ann-Margaret Dunn; Gwadz, Marya Viorst; Whittemore, Robin; Bakken, Suzanne R; Cleland, Charles M; Burleson, Winslow; Jacobs, Susan Kaplan; Melkus, Gail D'Eramo

    2017-11-01

    The objective of this integrative review was to describe current US trends for health technology-enabled adherence interventions among behaviorally HIV-infected youth (ages 13-29 years), and present the feasibility and efficacy of identified interventions. A comprehensive search was executed across five electronic databases (January 2005-March 2016). Of the 1911 identified studies, nine met the inclusion criteria of a quantitative or mixed-methods design and a technology-enabled adherence and/or retention intervention for US HIV-infected youth. The majority were small pilots. Intervention dose varied between studies applying similar technology platforms, with more than half not informed by a theoretical framework. Retention in care was not a reported outcome, and operationalization of adherence was heterogeneous across studies. Despite these limitations, synthesized findings from this review demonstrate feasibility of computer-based interventions, and initial efficacy of SMS texting for adherence support among HIV-infected youth. Moving forward, there is a pressing need for the expansion of this evidence base.

  13. GLO-Roots: an imaging platform enabling multidimensional characterization of soil-grown root systems

    PubMed Central

    Rellán-Álvarez, Rubén; Lobet, Guillaume; Lindner, Heike; Pradier, Pierre-Luc; Sebastian, Jose; Yee, Muh-Ching; Geng, Yu; Trontin, Charlotte; LaRue, Therese; Schrager-Lavelle, Amanda; Haney, Cara H; Nieu, Rita; Maloof, Julin; Vogel, John P; Dinneny, José R

    2015-01-01

    Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes. DOI: http://dx.doi.org/10.7554/eLife.07597.001 PMID:26287479

  14. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend in enabling the control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.

  15. Propagation stability of self-reconstructing Bessel beams enables contrast-enhanced imaging in thick media.

    PubMed

    Fahrbach, Florian O; Rohrbach, Alexander

    2012-01-17

    Laser beams that can self-reconstruct their initial beam profile even in the presence of massive phase perturbations are able to propagate deeper into inhomogeneous media. This ability has crucial advantages for light sheet-based microscopy in thick media, such as cell clusters, embryos, skin or brain tissue or plants, as well as scattering synthetic materials. A ring system around the central intensity maximum of a Bessel beam enables its self-reconstruction, but at the same time illuminates out-of-focus regions and deteriorates image contrast. Here we present a detection method that minimizes the negative effect of the ring system. The beam's propagation stability along one straight line enables the use of a confocal line principle, resulting in a significant increase in image contrast. The axial resolution could be improved by nearly 100% relative to the standard light-sheet techniques using scanned Gaussian beams, while demonstrating self-reconstruction also for high propagation depths.

  16. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.

  17. Optimization of a yeast RNA interference system for controlling gene expression and enabling rapid metabolic engineering.

    PubMed

    Crook, Nathan C; Schmitz, Alexander C; Alper, Hal S

    2014-05-16

    Reduction of endogenous gene expression is a fundamental operation of metabolic engineering, yet current methods for gene knockdown (i.e., genome editing) remain laborious and slow, especially in yeast. In contrast, RNA interference allows facile and tunable gene knockdown via a simple plasmid transformation step, enabling metabolic engineers to rapidly prototype knockdown strategies in multiple strains before expending significant cost to undertake genome editing. Although RNAi is naturally present in a myriad of eukaryotes, it has only been recently implemented in Saccharomyces cerevisiae as a heterologous pathway and so has not yet been optimized as a metabolic engineering tool. In this study, we elucidate a set of design principles for the construction of hairpin RNA expression cassettes in yeast and implement RNA interference to quickly identify routes for improvement of itaconic acid production in this organism. The approach developed here enables rapid prototyping of knockdown strategies and thus accelerates and reduces the cost of the design-build-test cycle in yeast.

  18. GLO-Roots: An imaging platform enabling multidimensional characterization of soil-grown root systems

    DOE PAGES

    Rellan-Alvarez, Ruben; Lobet, Guillaume; Lindner, Heike; ...

    2015-08-19

    Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes.

  19. Bio-recycling of metals: Recycling of technical products using biological applications.

    PubMed

    Pollmann, Katrin; Kutschke, Sabine; Matys, Sabine; Raff, Johannes; Hlawacek, Gregor; Lederer, Franziska L

    2018-03-16

    The increasing demand for different essential metals, a consequence of the development of new technologies, especially the so-called "low carbon technologies", requires the development of innovative technologies that enable economic and environmentally friendly metal recovery from primary and secondary resources. There is serious concern that the demand for some critical elements might exceed the present supply within a few years, thus necessitating the development of novel strategies and technologies to meet the requirements of industry and society. Besides an improvement of the exploitation and processing of ores, the more urgent issue of recycling of strategic metals has to be addressed. However, current recycling rates are very low due to the increasing complexity of products and the low content of certain critical elements, thus hindering economic metal recovery. On the other hand, increasing environmental consciousness as well as limitations of classical methods require innovative recycling methodologies in order to enable a circular economy. Modern biotechnologies can contribute to solving some of the problems related to metal recycling. These approaches use natural properties of organisms, bio-compounds, and biomolecules to interact with minerals, materials, metals, or metal ions, such as surface attachment, mineral dissolution, transformation, and metal complexation. Further, modern genetic approaches, e.g. as realized by synthetic biology, enable the smart design of new chemicals. The article presents some recent developments in the fields of bioleaching, biosorption, bioreduction, and bioflotation, and their use for metal recovery from different waste materials. Currently only a few of these developments are commercialized. Major limitations are high costs in comparison to conventional methods and low element selectivity. The article discusses future trends to overcome these barriers. Especially interdisciplinary approaches, the combination of different technologies, the inclusion of modern genetic methods, as well as the consideration of existing, yet unexplored natural resources will push innovations in these fields. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Blood pulsation measurement using cameras operating in visible light: limitations.

    PubMed

    Koprowski, Robert

    2016-10-03

    The paper presents an automatic method for the analysis and processing of images from a camera operating in visible light. This analysis applies to images containing the human facial area (body) and enables measurement of the blood pulse rate. Special attention was paid to the limitations of this measurement method, taking into account the possibility of using consumer cameras in real conditions (different types of lighting, different camera resolutions, camera movement). The proposed new method of image analysis and processing involves three stages: (1) image pre-processing, allowing for image filtration and stabilization (object location tracking); (2) main image processing, allowing for segmentation of human skin areas and acquisition of brightness changes; (3) signal analysis: filtration, FFT (Fast Fourier Transform) analysis, and pulse calculation. The presented algorithm and method for measuring the pulse rate have the following advantages: (1) they allow for non-contact and non-invasive measurement; (2) the measurement can be carried out using almost any camera, including webcams; (3) the method enables tracking of the subject in the scene, which allows for measurement of the heart rate while the patient is moving; (4) for a minimum of 40,000 pixels, it provides a measurement error of less than ±2 beats per minute for p < 0.01 and sunlight, or a slightly larger error (±3 beats per minute) for artificial lighting; (5) analysis of a single image takes about 40 ms in Matlab Version 7.11.0.584 (R2010b) with Image Processing Toolbox Version 7.1 (R2010b).
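Stage (3) of such a pipeline, estimating the pulse rate from a mean-skin-brightness trace via FFT, can be sketched as follows (synthetic signal; the frame rate, band limits, and amplitudes are assumptions for the example, not values from the paper):

```python
import numpy as np

fs = 30.0                      # assumed camera frame rate (Hz)
t = np.arange(0, 20, 1 / fs)   # 20 s of frames

# Synthetic brightness trace: 72 bpm pulse + slow illumination drift + noise
pulse_hz = 72.0 / 60.0
rng = np.random.default_rng(2)
signal = (0.5 * np.sin(2 * np.pi * pulse_hz * t)
          + 2.0 * np.sin(2 * np.pi * 0.05 * t)   # lighting drift
          + 0.2 * rng.standard_normal(t.size))

# FFT of the mean-removed trace, restricted to plausible heart rates
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
band = (freqs >= 0.7) & (freqs <= 3.0)           # 42-180 bpm
peak_hz = freqs[band][np.argmax(spectrum[band])]

print(round(peak_hz * 60))     # estimated pulse in beats per minute
```

Band-limiting the spectrum is what rejects the slow lighting drift, which would otherwise dominate the FFT peak.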

  1. Chemical signal activation of an organocatalyst enables control over soft material formation.

    PubMed

    Trausel, Fanny; Maity, Chandan; Poolman, Jos M; Kouwenberg, D S J; Versluis, Frank; van Esch, Jan H; Eelkema, Rienk

    2017-10-12

    Cells can react to their environment by changing the activity of enzymes in response to specific chemical signals. Artificial catalysts capable of being activated by chemical signals are rare, but of interest for creating autonomously responsive materials. We present an organocatalyst that is activated by a chemical signal, enabling temporal control over reaction rates and the formation of materials. Using self-immolative chemistry, we design a deactivated aniline organocatalyst that is activated by the chemical signal hydrogen peroxide and catalyses hydrazone formation. Upon activation of the catalyst, the rate of hydrazone formation increases 10-fold almost instantly. The responsive organocatalyst enables temporal control over the formation of gels featuring hydrazone bonds. The generic design should enable the use of a large range of triggers and organocatalysts, and appears a promising method for the introduction of signal response in materials, constituting a first step towards achieving communication between artificial chemical systems. Enzymes regulated by chemical signals are common in biology, but few such artificial catalysts exist. Here, the authors design an aniline catalyst that, when activated by a chemical trigger, catalyses the formation of hydrazone-based gels, demonstrating signal response in a soft material.

  2. Trust-based information system architecture for personal wellness.

    PubMed

    Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd

    2014-01-01

    Modern eHealth, ubiquitous health, and personal wellness systems take place in an unsecure and ubiquitous information space where no predefined trust occurs. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in ubiquitous environments. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider, and to use it to design personal privacy policies for trustworthy use of health and wellness services. For trust calculation, a novel set of measurable context-aware and health information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service provider-specific policies. Focus groups and information modelling were used to develop a wellness information model. The information system architecture was developed using a system analysis method based on sequential steps, which combines the results of the analysis of privacy and trust concerns with the selection of trust and privacy services. Its services (e.g. trust calculation, decision support, policy management and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.

  3. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA s Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  4. DEKOIS: demanding evaluation kits for objective in silico screening--a versatile tool for benchmarking docking programs and scoring functions.

    PubMed

    Vogel, Simon M; Bauer, Matthias R; Boeckler, Frank M

    2011-10-24

    For widely applied in silico screening techniques, success depends on the rational selection of an appropriate method. We herein present a fast, versatile, and robust method to construct demanding evaluation kits for objective in silico screening (DEKOIS). This automated process enables the creation of tailor-made decoy sets for any given set of bioactives. It facilitates a target-dependent validation of docking algorithms and scoring functions, helping to save time and resources. We have developed metrics for assessing and improving decoy set quality and employ them to investigate how decoy embedding affects docking. We demonstrate that screening performance is target-dependent and can be impaired by latent actives in the decoy set (LADS) or enhanced by poor decoy embedding. The presented method allows extending and complementing the collection of publicly available high-quality decoy sets toward new target space. All present and future DEKOIS data sets will be made accessible at www.dekois.com.

  5. Efficient calculation of beyond RPA correlation energies in the dielectric matrix formalism

    NASA Astrophysics Data System (ADS)

    Beuerle, Matthias; Graf, Daniel; Schurkus, Henry F.; Ochsenfeld, Christian

    2018-05-01

    We present efficient methods to calculate beyond random phase approximation (RPA) correlation energies for molecular systems with up to 500 atoms. To reduce the computational cost, we employ the resolution-of-the-identity and a double-Laplace transform of the non-interacting polarization propagator in conjunction with an atomic orbital formalism. Further improvements are achieved using integral screening and the introduction of Cholesky decomposed densities. Our methods are applicable to the dielectric matrix formalism of RPA including second-order screened exchange (RPA-SOSEX), the RPA electron-hole time-dependent Hartree-Fock (RPA-eh-TDHF) approximation, and RPA renormalized perturbation theory using an approximate exchange kernel (RPA-AXK). We give an application of our methodology by presenting RPA-SOSEX benchmark results for the L7 test set of large, dispersion dominated molecules, yielding a mean absolute error below 1 kcal/mol. The present work enables calculating beyond RPA correlation energies for significantly larger molecules than possible to date, thereby extending the applicability of these methods to a wider range of chemical systems.

  6. Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.

    PubMed

    Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N

    2004-01-01

    Unlike the genomics revolution, which was largely enabled by a single technological advance (high-throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for the functional analysis of different types of proteins. In the case of ion channels, a class of membrane proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available indirect high-throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The Nano-Patch-Clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole-cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation, which uses planar patch clamp chips. This approach enables high-quality and high-content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for any higher-throughput electrophysiology instruments.

  7. Thermal imaging application for behavior study of chosen nocturnal animals

    NASA Astrophysics Data System (ADS)

    Pregowski, Piotr; Owadowska, Edyta; Pietrzak, Jan

    2004-04-01

    This paper presents preliminary results of a project undertaken to verify the hypothesis that small nocturnal rodents use common paths which form a common, rather stable system for fast movement. This report concentrates on the results of combining the uniquely good detection capabilities of modern IR thermal cameras with newly elaborated software. Among the final results offered by this method are both thermal movies and single synthetic graphic images of paths traced during a few minutes or hours of investigation, as well as detailed numerical data of the ".txt" type about chosen detected events. Although it is too early to say that the elaborated method will allow us to answer all ecological questions, we can say that we have worked out a new, valuable tool for the next steps of our project. We expect that this method will enable us to solve important ecological problems in the study of nocturnal animals. The supervised, stably settled area can be enlarged by using a few thermal imagers or IR thermographic cameras simultaneously. The presented method can also be applied to other, quite different uses, e.g. the detection of ecological corridors.

  8. Accurate acceleration of kinetic Monte Carlo simulations through the modification of rate constants.

    PubMed

    Chatterjee, Abhijit; Voter, Arthur F

    2010-05-21

    We present a novel computational algorithm called the accelerated superbasin kinetic Monte Carlo (AS-KMC) method that enables a more efficient study of rare-event dynamics than the standard KMC method while maintaining control over the error. In AS-KMC, the rate constants for processes that are observed many times are lowered during the course of a simulation. As a result, rare processes are observed more frequently than in KMC and the time progresses faster. We first derive error estimates for AS-KMC when the rate constants are modified. These error estimates are next employed to develop a procedure for lowering process rates with control over the maximum error. Finally, numerical calculations are performed to demonstrate that the AS-KMC method captures the correct dynamics, while providing significant CPU savings over KMC in most cases. We show that the AS-KMC method can be employed with any KMC model, even when no time scale separation is present (although in such cases no computational speed-up is observed), without requiring the knowledge of various time scales present in the system.
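The core mechanic, lowering the rate constants of frequently observed processes so that rare events fire sooner, can be sketched on a toy three-state system (this shows only the mechanism, without the paper's error estimates that control how far rates may be lowered; the demotion rule below is an arbitrary placeholder):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy system: states A <-> B flicker fast; escape from B to C is rare.
rates = {("A", "B"): 100.0, ("B", "A"): 100.0, ("B", "C"): 0.1}
counts = {k: 0 for k in rates}
N_OBS, ALPHA = 50, 0.5   # after every N_OBS firings, scale a rate by ALPHA (placeholder rule)

state, t, steps = "A", 0.0, 0
while state != "C" and steps < 100000:
    moves = [(s2, r) for (s1, s2), r in rates.items() if s1 == state]
    total = sum(r for _, r in moves)
    t += rng.exponential(1.0 / total)        # standard KMC time advance
    pick = rng.random() * total              # choose a process proportional to its rate
    acc = 0.0
    for s2, r in moves:
        acc += r
        if pick < acc:
            counts[(state, s2)] += 1
            # AS-KMC idea (simplified): demote rates that fire many times,
            # so the A <-> B superbasin stops dominating the simulation
            if counts[(state, s2)] % N_OBS == 0:
                rates[(state, s2)] *= ALPHA
            state = s2
            break
    steps += 1

print(state, steps)   # the rare state C is reached in far fewer steps than plain KMC
```

Note that demoting rates biases the accumulated time t; the paper's contribution is precisely the error estimates that bound this bias, which this sketch omits.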

  9. Inverse full state hybrid projective synchronization for chaotic maps with different dimensions

    NASA Astrophysics Data System (ADS)

    Ouannas, Adel; Grassi, Giuseppe

    2016-09-01

    A new synchronization scheme for chaotic (hyperchaotic) maps with different dimensions is presented. Specifically, given a drive system map with dimension n and a response system with dimension m, the proposed approach enables each drive system state to be synchronized with a linear response combination of the response system states. The method, based on the Lyapunov stability theory and the pole placement technique, presents some useful features: (i) it enables synchronization to be achieved for both cases of n < m and n > m; (ii) it is rigorous, being based on theorems; (iii) it can be readily applied to any chaotic (hyperchaotic) maps defined to date. Finally, the capability of the approach is illustrated by synchronization examples between the two-dimensional Hénon map (as the drive system) and the three-dimensional hyperchaotic Wang map (as the response system), and the three-dimensional Hénon-like map (as the drive system) and the two-dimensional Lorenz discrete-time system (as the response system).
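As a much simpler illustration of synchronization via pole placement (two identical 2D Hénon maps with active control, not the paper's cross-dimensional inverse scheme):

```python
# Active control of a response Henon map so that it synchronizes with a
# drive Henon map: the controller cancels the nonlinear mismatch and
# places both error-dynamics poles at 0.5, inside the unit circle.
a, b = 1.4, 0.3   # classic Henon parameters

def henon(x, y):
    return 1.0 - a * x * x + y, b * x

# Drive and response start from different initial conditions
xd, yd = 0.1, 0.1
xr, yr = -0.5, 0.3

errs = []
for n in range(20):
    e1, e2 = xd - xr, yd - yr
    # With these controls, e1 -> 0.5*e1 and e2 -> 0.5*e2 each step
    u1 = -a * (xd * xd - xr * xr) + e2 - 0.5 * e1
    u2 = b * e1 - 0.5 * e2
    xd, yd = henon(xd, yd)
    xr, yr = 1.0 - a * xr * xr + yr + u1, b * xr + u2
    errs.append(abs(xd - xr) + abs(yd - yr))

print(errs[-1])   # synchronization error shrinks geometrically toward 0
```

Substituting the controls into the error dynamics gives e1(n+1) = 0.5 e1(n) and e2(n+1) = 0.5 e2(n), so the error halves every iteration regardless of the chaotic trajectory of the drive.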

  10. High spatial and temporal resolution cell manipulation techniques in microchannels.

    PubMed

    Novo, Pedro; Dell'Aica, Margherita; Janasek, Dirk; Zahedi, René P

    2016-03-21

    The advent of microfluidics has enabled thorough control of cell manipulation experiments in so called lab on chips. Lab on chips foster the integration of actuation and detection systems, and require minute sample and reagent amounts. Typically employed microfluidic structures have similar dimensions as cells, enabling precise spatial and temporal control of individual cells and their local environments. Several strategies for high spatio-temporal control of cells in microfluidics have been reported in recent years, namely methods relying on careful design of the microfluidic structures (e.g. pinched flow), by integration of actuators (e.g. electrodes or magnets for dielectro-, acousto- and magneto-phoresis), or integrations thereof. This review presents the recent developments of cell experiments in microfluidics divided into two parts: an introduction to spatial control of cells in microchannels followed by special emphasis in the high temporal control of cell-stimulus reaction and quenching. In the end, the present state of the art is discussed in line with future perspectives and challenges for translating these devices into routine applications.

  11. IGA-ADS: Isogeometric analysis FEM using ADS solver

    NASA Astrophysics Data System (ADS)

    Łoś, Marcin M.; Woźniak, Maciej; Paszyński, Maciej; Lenharth, Andrew; Hassaan, Muhamm Amber; Pingali, Keshav

    2017-08-01

    In this paper we present a fast explicit solver for non-stationary problems using L2 projections with the isogeometric finite element method. The solver has been implemented within the GALOIS framework. It enables parallel multi-core simulations of different time-dependent problems in 1D, 2D, or 3D. We have prepared the solver framework in a way that enables direct implementation of the selected PDE and corresponding boundary conditions. In this paper we describe the installation, the implementation of three exemplary PDEs, and the execution of the simulations on multi-core Linux cluster nodes. We consider three case studies, including heat transfer, linear elasticity, as well as non-linear flow in heterogeneous media. The presented package generates output suitable for interfacing with Gnuplot and ParaView visualization software. The exemplary simulations show near perfect scalability on the Gilbert shared-memory node with four Intel® Xeon® CPU E7-4860 processors, each possessing 10 physical cores (for a total of 40 cores).
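The explicit time-stepping that such a solver performs can be illustrated with the simplest of the three case studies, heat transfer. The sketch below is a plain finite-difference stand-in, not the package's actual scheme: IGA-ADS instead represents the new state in B-spline basis functions obtained via fast L2 projections.

```python
import numpy as np

def explicit_heat_step(u, dt, dx, kappa=1.0):
    """One explicit Euler step for the 1D heat equation u_t = kappa * u_xx
    with zero-Dirichlet boundaries (stable for kappa*dt/dx**2 <= 0.5)."""
    un = u.copy()
    un[1:-1] = u[1:-1] + kappa * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un
```

Each step only requires local updates, which is what makes the method explicit and, as the abstract notes, amenable to near-perfect multi-core scaling.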

  12. Parameterized examination in econometrics

    NASA Astrophysics Data System (ADS)

    Malinova, Anna; Kyurkchiev, Vesselin; Spasov, Georgi

    2018-01-01

    The paper presents a parameterization of basic types of exam questions in Econometrics. This algorithm is used to automate and facilitate the process of examination, assessment and self-preparation of a large number of students. The proposed parameterization of testing questions reduces the time required to author tests and course assignments. It enables tutors to generate a large number of different but equivalent dynamic questions (with dynamic answers) on a certain topic, which are automatically assessed. The presented methods are implemented in DisPeL (Distributed Platform for e-Learning) and provide questions in the areas of filtering and smoothing of time-series data, forecasting, building and analysis of single-equation econometric models. Questions also cover elasticity, average and marginal characteristics, product and cost functions, measurement of monopoly power, supply, demand and equilibrium price, consumer and product surplus, etc. Several approaches are used to enable the required numerical computations in DisPeL - integration of third-party mathematical libraries, developing our own procedures from scratch, and wrapping our legacy math codes in order to modernize and reuse them.
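A parameterized question of this kind can be sketched in a few lines. The topic (point price elasticity of demand) and the coefficient ranges below are illustrative, not DisPeL's actual templates: each seed yields a different but equivalent question whose answer is computed automatically.

```python
import random

def make_elasticity_question(seed):
    """Generate one hypothetical parameterized econometrics question:
    random demand coefficients, automatically computed answer."""
    rng = random.Random(seed)
    a = rng.randint(100, 140)   # demand intercept (illustrative range)
    b = rng.randint(2, 5)       # demand slope (illustrative range)
    p = rng.randint(5, 12)      # evaluation price (illustrative range)
    q = a - b * p               # quantity demanded, positive by construction
    elasticity = -b * p / q     # point price elasticity of demand
    text = (f"Demand is Q = {a} - {b}P. "
            f"Compute the point price elasticity at P = {p}.")
    return text, round(elasticity, 3)
```

The same seed always reproduces the same question, so the generated answer can be stored and graded automatically.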

  13. Versatile plasma ion source with an internal evaporator

    NASA Astrophysics Data System (ADS)

    Turek, M.; Prucnal, S.; Drozdziel, A.; Pyszniak, K.

    2011-04-01

    A novel construction of an ion source with an evaporator placed inside a plasma chamber is presented. The crucible is heated to high temperatures directly by arc discharge, which makes the ion source suitable for substances with high melting points. The compact ion source enables production of intense ion beams for a wide spectrum of solid elements, with typical separated beam currents of ~100-150 μA for Al+, Mn+ and As+ (which corresponds to emission current densities of 15-25 mA/cm²) at an extraction voltage of 25 kV. The ion source works for approximately 50-70 h at 100% duty cycle, which enables high ion dose implantation. The typical power consumption of the ion source is 350-400 W. The paper presents detailed experimental data (e.g. dependences of ion currents and anode voltages on discharge and filament currents and magnetic flux densities) for Cr, Fe, Al, As, Mn and In. The discussion is supported by results of a Monte Carlo based numerical simulation of ionisation in the ion source.

  14. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2006-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
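The stochastic ingredient, fiber-strength randomness, typically enters such progressive-failure models as a per-fiber Weibull draw. A hedged sketch via inverse-transform sampling (the scale `sigma0` in MPa and Weibull modulus `m` are illustrative values, not the paper's calibrated fiber parameters):

```python
import math
import random

def weibull_fiber_strengths(n, sigma0=3500.0, m=10.0, seed=0):
    """Sample n fiber strengths from a two-parameter Weibull distribution
    using inverse-transform sampling: if U ~ Uniform(0,1), then
    sigma = sigma0 * (-ln(1 - U))**(1/m) is Weibull(sigma0, m)."""
    rng = random.Random(seed)
    return [sigma0 * (-math.log(1.0 - rng.random())) ** (1.0 / m)
            for _ in range(n)]
```

Assigning an independent draw to each fiber at each scale is what lets failure initiate away from the structural stress risers, as the abstract's results illustrate.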

  15. Typograph: Multiscale Spatial Exploration of Text Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Burtner, Edwin R.; Cramer, Nicholas O.

    2013-10-06

    Visualizing large document collections using a spatial layout of terms can enable quick overviews of information. These visual metaphors (e.g., word clouds, tag clouds, etc.) traditionally show a series of terms organized by space-filling algorithms. However, often lacking in these views is the ability to interactively explore the information to gain more detail, and the location and rendering of the terms are often not based on mathematical models that maintain relative distances from other information based on similarity metrics. In this paper, we present Typograph, a multi-scale spatial exploration visualization for large document collections. Building on term-based visualization methods, Typograph enables multiple levels of detail (terms, phrases, snippets, and full documents) within a single spatialization. Further, information is placed based on its relative similarity to other information to create the “near = similar” geographic metaphor. This paper discusses the design principles and functionality of Typograph and presents a use case analyzing Wikipedia to demonstrate usage.

  16. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  17. Calculation of Debye-Scherrer diffraction patterns from highly stressed polycrystalline materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacDonald, M. J., E-mail: macdonm@umich.edu; SLAC National Accelerator Laboratory, Menlo Park, California 94025; Vorberger, J.

    2016-06-07

    Calculations of Debye-Scherrer diffraction patterns from polycrystalline materials have typically been done in the limit of small deviatoric stresses. Although these methods are well suited for experiments conducted near hydrostatic conditions, more robust models are required to diagnose the large strain anisotropies present in dynamic compression experiments. A method to predict Debye-Scherrer diffraction patterns for arbitrary strains has been presented in the Voigt (iso-strain) limit [Higginbotham, J. Appl. Phys. 115, 174906 (2014)]. Here, we present a method to calculate Debye-Scherrer diffraction patterns from highly stressed polycrystalline samples in the Reuss (iso-stress) limit. This analysis uses elastic constants to calculate lattice strains for all initial crystallite orientations, enabling elastic anisotropy and sample texture effects to be modeled directly. The effects of probing geometry, deviatoric stresses, and sample texture are demonstrated and compared to Voigt limit predictions. An example of shock-compressed polycrystalline diamond is presented to illustrate how this model can be applied and demonstrates the importance of including material strength when interpreting diffraction in dynamic compression experiments.
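The elementary geometric ingredient of such calculations is Bragg's law evaluated at a strained lattice spacing; the full Reuss-limit model then applies the elastic constants to obtain an orientation-dependent strain for every crystallite. A minimal sketch (the spacing and wavelength in the test are illustrative values):

```python
import math

def bragg_two_theta(d_hkl, wavelength):
    # Bragg's law: lambda = 2 d sin(theta); returns the scattering
    # angle 2*theta in degrees.
    return 2.0 * math.degrees(math.asin(wavelength / (2.0 * d_hkl)))

def strained_two_theta(d0, eps, wavelength):
    """Diffraction angle for a crystallite whose lattice spacing is
    strained from d0 to d0*(1 + eps); eps < 0 is compression."""
    return bragg_two_theta(d0 * (1.0 + eps), wavelength)
```

Compression reduces the spacing and shifts the Debye-Scherrer ring to larger angle; mapping that shift around the ring for all orientations is what produces the distorted patterns the paper models.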

  18. National Gender Policy in Public Education in the Russian Empire in the Latter Half of the 19th-Early 20th Centuries

    ERIC Educational Resources Information Center

    Saifullova, Razilia Rauilovna; Maslova, Inga Vladimirovna; Krapotkina, Irina Evgenevna; Kaviev, Airat Farkhatovich; Nasyrova, Liliya Gabdelvalievna

    2016-01-01

    This article presents the national gender policy in public education in the Russian Empire in the latter half of the 19th-early 20th centuries. In the course of this work the authors used specialized historical research methods that enabled them to establish the facts and to approach historical sources from a critical standpoint. The comparative method…

  19. Oxygen radicals as key mediators in neurological disease: fact or fiction?

    PubMed

    Halliwell, B

    1992-01-01

    A free radical is any species capable of independent existence that contains one or more unpaired electrons. Free radicals and other reactive oxygen species are frequently proposed to be involved in the pathology of several neurological disorders. Criteria for establishing such involvement are presented. Development of new methods for measuring oxidative damage should enable elucidation of the precise role of reactive oxygen species in neurological disorders.

  20. Modeling elasticity in crystal growth.

    PubMed

    Elder, K R; Katakowski, Mark; Haataja, Mikko; Grant, Martin

    2002-06-17

    A new model of crystal growth is presented that describes the phenomena on atomic length and diffusive time scales. The former incorporates elastic and plastic deformation in a natural manner, and the latter enables access to time scales much larger than conventional atomic methods. The model is shown to be consistent with the predictions of Read and Shockley for grain boundary energy, and Matthews and Blakeslee for misfit dislocations in epitaxial growth.

  1. Characterizing spatial and temporal variability of dissolved gases in aquatic environments with in situ mass spectrometry.

    PubMed

    Camilli, Richard; Duryea, Anthony N

    2009-07-01

    The TETHYS mass spectrometer is intended for long-term in situ observation of dissolved gases and volatile organic compounds in aquatic environments. Its design maintains excellent low mass range sensitivity and stability during long-term operations, enabling characterization of low-frequency variability in many trace dissolved gases. Results are presented from laboratory trials and a 300-h in situ trial in a shallow marine embayment in Massachusetts, U.S.A. This time series consists of over 15000 sample measurements and represents the longest continuous record made by an in situ mass spectrometer in an aquatic environment. These measurements possess sufficient sampling density and duration to apply frequency analysis techniques for study of temporal variability in dissolved gases. Results reveal correlations with specific environmental periodicities. Numerical methods are presented for converting mass spectrometer ion peak ratios to absolute-scale dissolved gas concentrations across wide temperature regimes irrespective of ambient pressure, during vertical water column profiles in a hypoxic deep marine basin off the coast of California, U.S.A. Dissolved oxygen concentration values obtained with the TETHYS instrument indicate close correlation with polarographic oxygen sensor data across the entire depth range. These methods and technology enable observation of aquatic environmental chemical distributions and dynamics at appropriate scales of resolution.

  2. SimPhospho: a software tool enabling confident phosphosite assignment.

    PubMed

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  3. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
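The core of the analysis, the average tendency to fixate the target as a function of time and the decision moment derived from it, can be sketched as follows. This is a simplified stand-in: the published method's smoothing and statistical details are omitted, and the 0.9 threshold below is an illustrative choice.

```python
import numpy as np

def target_fixation_rate(fix_target, fix_competitor):
    """Average tendency to fixate the target picture over time.
    Inputs are boolean arrays (trials x time samples) marking fixations
    on the target and competitor pictures."""
    looks = fix_target.sum(0).astype(float)
    total = (fix_target | fix_competitor).sum(0)
    # Fraction of picture fixations that land on the target, per sample.
    return np.divide(looks, total, out=np.full(looks.shape, np.nan),
                     where=total > 0)

def decision_moment(rate, threshold=0.9):
    # First time sample at which the target-fixation rate exceeds the
    # threshold: a simplified proxy for the paper's decision moment.
    above = np.nonzero(rate > threshold)[0]
    return int(above[0]) if above.size else None
```

Comparing decision moments across the seven sentence structures is then what exposes the processing-time differences attributable to linguistic complexity.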

  4. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  5. Highly Conductive Thin Uniform Gold-Coated DNA Nanowires.

    PubMed

    Stern, Avigail; Eidelshtein, Gennady; Zhuravel, Roman; Livshits, Gideon I; Rotem, Dvir; Kotlyar, Alexander; Porath, Danny

    2018-06-01

    Over the past decades, DNA, the carrier of genetic information, has been used by researchers as a structural template material. Watson-Crick base pairing enables the formation of complex 2D and 3D structures from DNA through self-assembly. Various methods have been developed to functionalize these structures for numerous utilities. Metallization of DNA has attracted much attention as a means of forming conductive nanostructures. Nevertheless, most of the metallized DNA wires reported so far suffer from irregularity and lack of end-to-end electrical connectivity. An effective technique for formation of thin gold-coated DNA wires that overcomes these drawbacks is developed and presented here. A conductive atomic force microscopy setup, suitable for measuring molecules and wires tens to thousands of nanometers in length, is used to characterize these DNA-based nanowires. The wires reported here are the narrowest gold-coated DNA wires that display long-range conductivity. The measurements presented show that the conductivity is limited by defects, and that thicker gold coating reduces the number of defects and increases the conductive length. This preparation method enables the formation of molecular wires with dimensions and uniformity that are much more suitable for DNA-based molecular electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Infrared-optical transmission and reflection measurements on loose powders

    NASA Astrophysics Data System (ADS)

    Kuhn, J.; Korder, S.; Arduini-Schuster, M. C.; Caps, R.; Fricke, J.

    1993-09-01

    A method is described to determine quantitatively the infrared-optical properties of loose powder beds via directional-hemispherical transmission and reflection measurements. Instead of integrating the powders into a potassium bromide (KBr) or a paraffin oil matrix, which would drastically alter the scattering behavior, the powders are placed onto supporting layers of polyethylene (PE) and KBr. A commercial spectrometer is supplemented by an external optical setup, which enables measurements on horizontally arranged samples. For data evaluation we use a solution of the equation of radiative transfer in the 3-flux approximation under boundary conditions adapted to the PE or KBr/powder system. A comparison with Kubelka-Munk's theory and Schuster's 2-flux approximation is performed, which shows that the 3-flux approximation yields results closest to the exact solution. Equations are developed which correct the transmission and reflection of the samples for the influence of the supporting layer and calculate the specific extinction and the albedo of the powder, and thus enable us to separate the scattering and absorption parts of the extinction spectrum. Measurements on TiO2 powder are presented, which show the influence of preparation techniques and of data evaluation with different methods to obtain the albedo. The specific extinction of various TiO2 powders is presented.

  7. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This Matlab (The Mathworks, Inc.) based software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is also distributed as part of the software. This will enable the side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES, and provides instructions for the installation and operation of the tool.

  8. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require a comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified, but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for the human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.
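The validation metrics the abstract reports (accuracy as a percentage of the nominal concentration, imprecision as a coefficient of variation) follow standard definitions, sketched here for clarity; this is a generic illustration, not the authors' validation code.

```python
import statistics

def accuracy_percent(measured, nominal):
    # Mean measured concentration expressed as a percentage of the
    # nominal (spiked) concentration.
    return 100.0 * statistics.mean(measured) / nominal

def imprecision_cv(measured):
    # Within- or between-run imprecision as the coefficient of
    # variation (CV, %): sample standard deviation over the mean.
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)
```

Against these definitions, the reported figures mean the assay recovered 90.3-112% of nominal for all analytes, with CVs below 20% at the lower limit of quantification and below 14.3% at the higher levels.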

  9. Ultra-Rapid 2-D and 3-D Laser Microprinting of Proteins

    NASA Astrophysics Data System (ADS)

    Scott, Mark Andrew

    When viewed under the microscope, biological tissues reveal an exquisite microarchitecture. These complex patterns arise during development, as cells interact with a multitude of chemical and mechanical cues in the surrounding extracellular matrix. Tissue engineers have sought for decades to repair or replace damaged tissue, often relying on porous scaffolds as an artificial extracellular matrix to support cell development. However, these grafts are unable to recapitulate the complexity of the in vivo environment, limiting our ability to regenerate functional tissue. Biomedical engineers have developed several methods for printing two- and three-dimensional patterns of proteins for studying and directing cell development. Of these methods, laser microprinting of proteins has shown the most promise for printing sub-cellular resolution gradients of cues, but the photochemistry remains too slow to enable large-scale applications for screening and therapeutics. In this work, we demonstrate a novel high-speed photochemistry based on multi-photon photobleaching of fluorescein, and we build the fastest 2-D and 3-D laser microprinter for proteins to date. First, we show that multiphoton photobleaching of a deoxygenated solution of biotin-4-fluorescein onto a PEG monolayer with acrylate end-group can enable print speeds of almost 20 million pixels per second at 600 nanometer resolution. We discovered that the mechanism of fluorescein photobleaching evolves from a 2-photon to 3- and 4-photon regime at higher laser intensities, unlocking faster printing kinetics. Using this 2-D printing system, we develop a novel triangle-ratchet method for directing the polarization of single hippocampal neurons. This ability to determine which neurite becomes an axon, and which neurites become dendrites, is an essential step for developing defined in vitro neural networks. Next, we modify our multiphoton photobleaching system to print in three dimensions.
    For the first time, we demonstrate 3-D printing of full length proteins in collagen, fibrin and gelatin methacrylate scaffolds, as well as printing in agarose and agarose methacrylate scaffolds. We also present a novel method for 3-D printing collagen scaffolds at unprecedented speeds, up to 14 layers per second, generating complex shapes in seconds with sub-micron resolution. Finally, we demonstrate that 3-D printing of scaffold architecture and protein cues inside the scaffold can be combined, for the first time enabling structures with complex sub-micron architectures and chemical cues for directing development. We believe that the ultra-rapid printing technology presented in this thesis will be a key enabler in the development of complex, artificially engineered tissues and organs. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs)

  10. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    PubMed

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and relationships between the true local field and estimated local field in REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP. The REV-SHARP result exhibited the highest correlation between the true local field and estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In human experiments, no obvious errors due to artifacts were present in REV-SHARP. The proposed REV-SHARP is a new method that combines a variable spherical kernel size with Tikhonov regularization. This technique may enable more accurate background field removal and help achieve better QSM accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
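The relative-error figures quoted for the phantom comparison are presumably of the usual l2 form; a minimal sketch of that metric (the exact normalization the paper used is an assumption here):

```python
import numpy as np

def relative_error(estimated, true_field):
    # l2-norm relative error between an estimated local field and the
    # known true local field of the numerical phantom.
    return np.linalg.norm(estimated - true_field) / np.linalg.norm(true_field)
```

Under this definition, a perfect estimate scores 0 and the reported values (0.386 for REV-SHARP vs. 0.838 for PDF) directly rank the methods on the phantom.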

  11. High-Sensitivity Measurement of Density by Magnetic Levitation.

    PubMed

    Nemiroski, Alex; Kumar, A A; Soh, Siowling; Harburg, Daniel V; Yu, Hai-Dong; Whitesides, George M

    2016-03-01

    This paper presents methods that use Magnetic Levitation (MagLev) to measure very small differences in density of solid diamagnetic objects suspended in a paramagnetic medium. Previous work in this field has shown that, while it is a convenient method, standard MagLev (i.e., where the direction of magnetization and gravitational force are parallel) cannot resolve differences in density <10^(-4) g/cm^3 for macroscopic objects (>mm) because (i) objects close in density prevent each other from reaching an equilibrium height due to hard contact and excluded volume, and (ii) using weaker magnets or reducing the magnetic susceptibility of the medium destabilizes the magnetic trap. The present work investigates the use of weak magnetic gradients parallel to the faces of the magnets as a means of increasing the sensitivity of MagLev without destabilization. Configuring the MagLev device in a rotated state (i.e., where the direction of magnetization and gravitational force are perpendicular) relative to the standard configuration enables simple measurements along the axes with the highest sensitivity to changes in density. Manipulating the distance of separation between the magnets or the lengths of the magnets (along the axis of measurement) enables the sensitivity to be tuned. These modifications enable an improvement in the resolution up to 100-fold over the standard configuration, and measurements with resolution down to 10^(-6) g/cm^3. Three examples of characterizing the small differences in density among samples of materials having ostensibly indistinguishable densities (Nylon spheres, PMMA spheres, and drug spheres) demonstrate the applicability of rotated MagLev to measuring the density of small (0.1-1 mm) objects with high sensitivity. This capability will be useful in materials science, separations, and quality control of manufactured objects.
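In MagLev the equilibrium levitation position is, to good approximation, linear in density, so the readout reduces to a linear calibration against beads of known density. A minimal sketch of that calibration (the two-point scheme and function name are illustrative, not the paper's procedure):

```python
def maglev_density(h, h1, rho1, h2, rho2):
    """Infer a sample's density from its levitation position h by linear
    interpolation between two calibration beads of known density
    (rho1 at position h1, rho2 at position h2)."""
    slope = (rho2 - rho1) / (h2 - h1)
    return rho1 + slope * (h - h1)
```

The rotated configuration stretches this scale along the weak-gradient axis: a given density difference maps to a much larger displacement, which is the source of the up-to-100-fold resolution gain.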

  12. Reconfigurable Model Execution in the OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Hwang, John T.

    2017-01-01

NASA's OpenMDAO framework facilitates constructing complex models and computing their derivatives for multidisciplinary design optimization. Decomposing a model into components that follow a prescribed interface enables OpenMDAO to assemble multidisciplinary derivatives from the component derivatives using what amounts to the adjoint method, direct method, chain rule, global sensitivity equations, or any combination thereof, using the MAUD architecture. OpenMDAO also handles the distribution of processors among the disciplines by hierarchically grouping the components, and it automates the data transfer between components that are on different processors. These features have made OpenMDAO useful for applications in aircraft design, satellite design, wind turbine design, and aircraft engine design, among others. This paper presents new algorithms for OpenMDAO that enable reconfigurable model execution. This concept refers to dynamically changing, during execution, one or more of: the variable sizes, solution algorithm, parallel load balancing, or set of variables (i.e., adding and removing components, perhaps to switch to a higher-fidelity sub-model). Any component can reconfigure at any point, even when running in parallel with other components, and the reconfiguration algorithm presented here performs the synchronized updates to all other components that are affected. A reconfigurable software framework for multidisciplinary design optimization enables new adaptive solvers, adaptive parallelization, and new applications such as gradient-based optimization with overset flow solvers and adaptive mesh refinement. Benchmarking results demonstrate the time savings for reconfiguration compared to setting up the model again from scratch, which can be significant in large-scale problems. Additionally, the new reconfigurability feature is applied to a mission profile optimization problem for commercial aircraft in which both the parametrization of the mission profile and the time discretization are adaptively refined, resulting in computational savings of roughly 10% and the elimination of oscillations in the optimized altitude profile.

  13. Complex absorbing potentials within EOM-CC family of methods: Theory, implementation, and benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuev, Dmitry; Jagau, Thomas-C.; Krylov, Anna I.

    2014-07-14

A production-level implementation of equation-of-motion coupled-cluster singles and doubles (EOM-CCSD) for electron attachment and excitation energies augmented by a complex absorbing potential (CAP) is presented. The new method enables the treatment of metastable states within the EOM-CC formalism in a similar manner as bound states. The numeric performance of the method and the sensitivity of resonance positions and lifetimes to the CAP parameters and the choice of one-electron basis set are investigated. A protocol for studying molecular shape resonances based on the use of standard basis sets and a universal criterion for choosing the CAP parameters are presented. Our results for a variety of π* shape resonances of small to medium-size molecules demonstrate that CAP-augmented EOM-CCSD is competitive relative to other theoretical approaches for the treatment of resonances and is often able to reproduce experimental results.

  14. Fitting aerodynamic forces in the Laplace domain: An application of a nonlinear nongradient technique to multilevel constrained optimization

    NASA Technical Reports Server (NTRS)

    Tiffany, S. H.; Adams, W. M., Jr.

    1984-01-01

    A technique which employs both linear and nonlinear methods in a multilevel optimization structure to best approximate generalized unsteady aerodynamic forces for arbitrary motion is described. Optimum selection of free parameters is made in a rational function approximation of the aerodynamic forces in the Laplace domain such that a best fit is obtained, in a least squares sense, to tabular data for purely oscillatory motion. The multilevel structure and the corresponding formulation of the objective models are presented which separate the reduction of the fit error into linear and nonlinear problems, thus enabling the use of linear methods where practical. Certain equality and inequality constraints that may be imposed are identified; a brief description of the nongradient, nonlinear optimizer which is used is given; and results which illustrate application of the method are presented.
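The separation into linear and nonlinear sub-problems can be sketched as follows: for a fixed set of lag roots (the nonlinear design variables), the remaining coefficients of the rational approximation enter linearly and follow from least squares against the tabular oscillatory data. This is a schematic of that inner linear step only, assuming a Roger-type rational form; the paper's actual formulation, constraints, and nongradient optimizer differ:

```python
import numpy as np

def fit_linear_coeffs(k, Q, lags):
    """Least-squares fit of the real coefficients A in
        Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2 + sum_j A_{3+j} * ik/(ik + b_j)
    k    : array of reduced frequencies (real, > 0)
    Q    : complex array of tabulated aerodynamic data at those frequencies
    lags : fixed lag roots b_j (the outer, nonlinear design variables)
    """
    ik = 1j * np.asarray(k, dtype=float)
    cols = [np.ones_like(ik), ik, ik**2] + [ik / (ik + b) for b in lags]
    M = np.stack(cols, axis=1)
    # Stack real and imaginary parts so the fitted A stays real-valued.
    Mri = np.concatenate([M.real, M.imag])
    Qri = np.concatenate([np.asarray(Q).real, np.asarray(Q).imag])
    A, *_ = np.linalg.lstsq(Mri, Qri, rcond=None)
    return A

def eval_fit(k, A, lags):
    """Evaluate the rational approximation at reduced frequencies k."""
    ik = 1j * np.asarray(k, dtype=float)
    out = A[0] + A[1] * ik + A[2] * ik**2
    for Aj, b in zip(A[3:], lags):
        out = out + Aj * ik / (ik + b)
    return out
```

The outer level of the multilevel scheme would then search over the lag roots `b_j` (here held fixed) with a nonlinear, nongradient optimizer, calling this linear fit at each candidate.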

  15. Scalable high-precision tuning of photonic resonators by resonant cavity-enhanced photoelectrochemical etching

    PubMed Central

    Gil-Santos, Eduardo; Baker, Christopher; Lemaître, Aristide; Gomez, Carmen; Leo, Giuseppe; Favero, Ivan

    2017-01-01

    Photonic lattices of mutually interacting indistinguishable cavities represent a cornerstone of collective phenomena in optics and could become important in advanced sensing or communication devices. The disorder induced by fabrication technologies has so far hindered the development of such resonant cavity architectures, while post-fabrication tuning methods have been limited by complexity and poor scalability. Here we present a new simple and scalable tuning method for ensembles of microphotonic and nanophotonic resonators, which enables their permanent collective spectral alignment. The method introduces an approach of cavity-enhanced photoelectrochemical etching in a fluid, a resonant process triggered by sub-bandgap light that allows for high selectivity and precision. The technique is presented on a gallium arsenide nanophotonic platform and illustrated by finely tuning one, two and up to five resonators. It opens the way to applications requiring large networks of identical resonators and their spectral referencing to external etalons. PMID:28117394

  16. THUNDER Piezoelectric Actuators as a Method of Stretch-Tuning an Optical Fiber Grating

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Fox, Robert L.; Froggatt, Mark E.; Childers, Brooks A.

    2000-01-01

    A method of stretching optical fiber holds interest for measuring strain in smart structures where the physical displacement may be used to tune optical fiber lasers. A small, light weight, low power tunable fiber laser is ideal for demodulating strain in optical fiber Bragg gratings attached to smart structures such as the re-usable launch vehicle that is being developed by NASA. A method is presented for stretching optical fibers using the THUNDER piezoelectric actuators invented at NASA Langley Research Center. THUNDER actuators use a piezoelectric layer bonded to a metal backing to enable the actuators to produce displacements larger than the unbonded piezoelectric material. The shift in reflected optical wavelength resulting from stretching the fiber Bragg grating is presented. Means of adapting THUNDER actuators for stretching optical fibers is discussed, including ferrules, ferrule clamp blocks, and plastic hinges made with stereo lithography.

  17. Method for Identification of Results of Dynamic Overloads in Assessment of Safety Use of the Mine Auxiliary Transportation System

    NASA Astrophysics Data System (ADS)

    Tokarczyk, Jarosław

    2016-12-01

A method for identifying the effects of dynamic overloads acting on people, which may occur in an emergency state of a suspended monorail, is presented in the paper. The braking curve was determined using MBS (Multi-Body System) simulation. For this purpose a computational MBS model of the suspended monorail was developed and two different variants of numerical calculations were carried out. An algorithm for conducting numerical simulations to assess the effects of dynamic overloads acting on suspended monorail users is also presented in the paper. An example of an FEM (Finite Element Method) computational model composed of the technical means and the anthropometric ATB (Articulated Total Body) model is shown. The simulation results are presented: a graph of the HIC (Head Injury Criterion) parameter and successive phases of displacement of the ATB model. A generator of computational models for the safety criterion, which enables preparation of input data and remote starting of the simulation, is proposed.
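For reference, the HIC parameter graphed in such analyses is conventionally defined as the maximum over time windows [t1, t2] of (t2 − t1) times the mean resultant head acceleration (in g) raised to the power 2.5. A brute-force sketch of that textbook definition; the window cap and units are assumptions, and the paper's exact post-processing may differ:

```python
import numpy as np

def hic(accel_g, dt, max_window=0.036):
    """Head Injury Criterion from a sampled resultant head acceleration trace.

    accel_g    : acceleration samples in units of g
    dt         : sample interval, s
    max_window : cap on t2 - t1 (0.036 s corresponds to HIC36)
    """
    a = np.asarray(accel_g, dtype=float)
    # Cumulative integral of a(t) so any window average costs O(1).
    cum = np.concatenate([[0.0], np.cumsum(a) * dt])
    n = len(a)
    max_steps = int(round(max_window / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_steps, n) + 1):
            T = (j - i) * dt
            avg = (cum[j] - cum[i]) / T
            if avg > 0.0:
                best = max(best, T * avg ** 2.5)
    return best
```

For a constant acceleration trace the maximum is attained at the longest allowed window, which gives a quick sanity check on any implementation.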

  18. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. In using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange correlation functionals within density functional theory as two levels of fidelities, we demonstrate the excellent learning performance of the method against actual high fidelity quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high throughput property predictions in a significant way.

  19. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE PAGES

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

    2016-12-28

Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. In using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange correlation functionals within density functional theory as two levels of fidelities, we demonstrate the excellent learning performance of the method against actual high fidelity quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high throughput property predictions in a significant way.
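A simplified two-level sketch in the spirit of co-kriging (not the authors' actual formulation): fit a Gaussian process to plentiful low-fidelity data, then fit a second GP to the high-fidelity correction observed at a few expensive points. The functions, sample sizes, and kernel lengthscales below are synthetic stand-ins for the semi-local/hybrid bandgap pair:

```python
import numpy as np

def rbf(x1, x2, ls):
    """RBF kernel matrix for 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_predict(X, y, Xs, ls, noise=1e-6):
    """Posterior mean of a zero-mean GP with an RBF kernel."""
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    return rbf(Xs, X, ls) @ np.linalg.solve(K, y)

def f_lo(x):  # synthetic stand-in for the cheap (low-fidelity) property
    return np.sin(2 * np.pi * x)

def f_hi(x):  # synthetic stand-in for the expensive (high-fidelity) property
    return 1.2 * np.sin(2 * np.pi * x) + 0.3 * x

rng = np.random.default_rng(0)
X_lo = rng.uniform(0, 1, 40)       # many cheap evaluations
X_hi = np.linspace(0, 1, 8)        # few expensive evaluations
X_test = np.linspace(0, 1, 50)

mu_lo = gp_predict(X_lo, f_lo(X_lo), X_test, ls=0.2)
# Correction GP: learn the discrepancy between fidelities at the few hi points.
delta = f_hi(X_hi) - gp_predict(X_lo, f_lo(X_lo), X_hi, ls=0.2)
mu_delta = gp_predict(X_hi, delta, X_test, ls=0.3)
y_pred = mu_lo + mu_delta          # multi-fidelity prediction
```

Because the low-fidelity GP captures the overall shape, the correction GP only needs a handful of high-fidelity points, which is the cost advantage the abstract describes; full co-kriging additionally propagates a joint covariance for uncertainty estimates.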

  20. Descriptive and Computer Aided Drawing Perspective on an Unfolded Polyhedral Projection Surface

    NASA Astrophysics Data System (ADS)

    Dzwierzynska, Jolanta

    2017-10-01

The aim of the present study is to develop a method for the direct and practical mapping of perspective onto an unfolded prismatic polyhedral projection surface. The considered perspective representation is a rectilinear central projection onto a surface composed of several flat elements. In the paper two descriptive methods of drawing perspective are presented: direct and indirect. The graphical mapping of the effects of the representation is realized directly on the unfolded flat projection surface, owing to the projective and graphical connection between points displayed on the polyhedral background and their counterparts received on the unfolded flat surface. To significantly improve line construction, analytical algorithms are formulated. They draw the perspective image of a line segment passing through two different points determined by their coordinates in a spatial coordinate system with axes x, y, z. Compared to other perspective construction methods used in computer vision and computer aided design, which rely on information about points, our algorithms utilize data about lines, which appear very often in architectural forms. The ability to draw lines in the considered perspective enables drawing an edge perspective image of an architectural object. The application of changeable base elements of the perspective, such as the horizon height and the station point location, enables drawing perspective images from different viewing positions. The analytical algorithms for drawing perspective images are formulated in Mathcad software; however, they can be implemented in the majority of computer graphics packages, which can make drawing perspective more efficient and easier. The representation presented in the paper and the way of its direct mapping on the flat unfolded projection surface can find application in the presentation of architectural space in advertisement and art.
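The elementary operation beneath both descriptive methods can be sketched as a central projection onto a single flat picture plane; the paper chains several such planes and unfolds them. The station point at the origin and the picture plane z = d are illustrative assumptions:

```python
import numpy as np

def project_point(p, d):
    """Central projection of a 3-D point onto the picture plane z = d,
    with the station point (eye) at the origin looking along +z."""
    p = np.asarray(p, dtype=float)
    if p[2] <= 0:
        raise ValueError("point must lie in front of the station point")
    t = d / p[2]
    return p[:2] * t  # 2-D image coordinates on the picture plane

def project_segment(p1, p2, d):
    """Perspective image of a line segment: project both endpoints and
    join them (a straight segment maps to a straight segment)."""
    return project_point(p1, d), project_point(p2, d)
```

Projecting endpoints and joining them is exactly why line-based input data suffices for edge images of architectural objects, as the abstract notes.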

  1. Audio Watermark Embedding Technique Applying Auditory Stream Segregation: "G-encoder Mark" Able to Be Extracted by Mobile Phone

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

We are developing audio watermarking techniques that enable extraction of embedded data by mobile phones. This requires embedding data in frequency ranges where the auditory response is prominent, so data embedding causes considerable audible noise. We previously proposed exploiting a two-channel stereo playback feature, in which the noise generated by the data-embedded left-channel signal is reduced by the right-channel signal. However, that proposal has the practical problem of restricting the location of the extracting terminal. In this paper, we propose synthesizing the noise-reducing right-channel signal with the left signal, cancelling the noise completely by inducing an auditory stream segregation phenomenon in listeners. This new proposal makes a separate noise-reducing right channel unnecessary and supports monaural playback. Moreover, we propose a wide-band embedding method that causes dual auditory stream segregation phenomena, which enables data embedding across the whole public telephone frequency range and stable extraction with 3G mobile phones. With these proposals, extraction precision becomes higher than with the previously proposed method, while the quality degradation of the embedded signals becomes smaller. We present an overview of the newly proposed method and experimental results compared with those of the previously proposed method.

  2. Proteomics goes forensic: Detection and mapping of blood signatures in fingermarks.

    PubMed

    Deininger, Lisa; Patel, Ekta; Clench, Malcolm R; Sears, Vaughn; Sammon, Chris; Francese, Simona

    2016-06-01

A bottom-up in situ proteomic method has been developed enabling the mapping of multiple blood signatures on the intact ridges of blood fingermarks by Matrix Assisted Laser Desorption Ionisation Mass Spectrometry Imaging (MALDI-MSI). This method, at a proof of concept stage, builds upon recently published work demonstrating the opportunity to profile and identify multiple blood signatures in bloodstains via a bottom-up proteomic approach. The present protocol addresses the limitation of the previously developed profiling method with respect to destructivity; destructivity should be avoided for evidence such as blood fingermarks, where the ridge detail must be preserved in order to provide the associative link between the biometric information and the events of bloodshed. Using a blood mark reference model, trypsin concentration and spraying conditions have been optimised within the technical constraints of the depositor eventually employed; the application of MALDI-MSI and Ion Mobility MS has enabled the detection, confirmation and visualisation of blood signatures directly onto the ridge pattern. These results are to be considered a first insight into a method eventually informing investigations (and judicial debates) of violent crimes in which the reliable and non-destructive detection and mapping of blood in fingermarks is paramount to reconstruct the events of bloodshed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Aerodynamic levitation, supercooled liquids and glass formation

    DOE PAGES

    Benmore, C. J.; Weber, J. K. R.

    2017-05-04

Containerless processing or ‘levitation’ is a valuable tool for the synthesis and characterization of materials, particularly at extreme temperatures and under non-equilibrium conditions. The method enables formation of novel glasses, amorphous phases, and metastable crystalline forms that are not easily accessed when nucleation and growth can readily occur at a container interface. Removing the container enables the use of a wide variety of process atmospheres to modify a material's structure and properties. In the past decade levitation methods, including acoustic, aerodynamic, electromagnetic, and electrostatic, have become well established sample environments at X-ray synchrotron and neutron sources. This article briefly reviews the methods and then focuses on the application of aerodynamic levitation to synthesize and study new materials. This is presented in conjunction with non-contact probes used to investigate the atomic structure and to measure the properties of materials at extreme temperatures. The use of aerodynamic levitation in research using small- and wide-angle X-ray diffraction, XANES, and neutron scattering is discussed in the context of technique development. The use of containerless methods to investigate thermophysical properties is also considered. We argue that structural motifs present in the liquid state can potentially lead to the fabrication of materials whose properties would differ substantially from their well-known crystalline forms.

  4. A semi-automatic method for positioning a femoral bone reconstruction for strict view generation.

    PubMed

    Milano, Federico; Ritacco, Lucas; Gomez, Adrian; Gonzalez Bernaldo de Quiros, Fernan; Risk, Marcelo

    2010-01-01

    In this paper we present a semi-automatic method for femoral bone positioning after 3D image reconstruction from Computed Tomography images. This serves as grounding for the definition of strict axial, longitudinal and anterior-posterior views, overcoming the problem of patient positioning biases in 2D femoral bone measuring methods. After the bone reconstruction is aligned to a standard reference frame, new tomographic slices can be generated, on which unbiased measures may be taken. This could allow not only accurate inter-patient comparisons but also intra-patient comparisons, i.e., comparisons of images of the same patient taken at different times. This method could enable medical doctors to diagnose and follow up several bone deformities more easily.

  5. A robust bayesian estimate of the concordance correlation coefficient.

    PubMed

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2015-01-01

    A need for assessment of agreement arises in many situations including statistical biomarker qualification or assay or method validation. Concordance correlation coefficient (CCC) is one of the most popular scaled indices reported in evaluation of agreement. Robust methods for CCC estimation currently present an important statistical challenge. Here, we propose a novel Bayesian method of robust estimation of CCC based on multivariate Student's t-distribution and compare it with its alternatives. Furthermore, we extend the method to practically relevant settings, enabling incorporation of confounding covariates and replications. The superiority of the new approach is demonstrated using simulation as well as real datasets from biomarker application in electroencephalography (EEG). This biomarker is relevant in neuroscience for development of treatments for insomnia.
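The quantity being estimated is Lin's concordance correlation coefficient; the classical (non-robust) sample plug-in estimate that the Bayesian method generalizes is a standard textbook formula, shown here as a reference point:

```python
import numpy as np

def ccc(x, y):
    """Sample concordance correlation coefficient (Lin, 1989):
    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2).
    Equals 1 only for perfect agreement (y identically equal to x);
    it penalizes both location and scale shifts, unlike Pearson's r."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```

A constant offset between the two raters lowers the CCC even when the Pearson correlation stays at 1, which is why CCC is preferred for agreement studies.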

  6. [Differentiation of Staphylococcus aureus isolates based on phenotypical characters].

    PubMed

    Miedzobrodzki, Jacek; Małachowa, Natalia; Markiewski, Tomasz; Białecka, Anna; Kasprowicz, Andrzej

    2008-06-30

Typing of Staphylococcus aureus isolates is a necessary procedure for monitoring the transmission of S. aureus among carriers and in epidemiology. Evaluation of the degree of relationship among isolates relies on epidemiological markers and is possible because of the clonal character of the S. aureus species. Effective typing reveals the pattern of transmission of infection in a selected area, enables identifying the reservoir of the microorganism, and may enhance effective eradication. A set of typing methods for use in analyses of epidemiological correlations and the identification of S. aureus isolates is presented. The following methods of typing are described: biotyping, serotyping, antibiogram, protein electrophoresis, cell protein profiles (proteome), immunoblotting, multilocus enzyme electrophoresis (MLEE), zymotyping, and standard species identification of S. aureus in the diagnostic laboratory. Phenotyping methods for S. aureus isolates used in the past and today in epidemiological investigations and in analyses of correlations among S. aureus isolates are presented in this review. The presented methods use morphological characteristics, physiological properties, and chemical structures of the bacteria as criteria for typing. The precision of these standard methods is not always satisfactory, as S. aureus strains with atypical biochemical characters have evolved recently. Therefore, it is essential to introduce additional typing procedures using molecular biology methods without neglecting phenotypic methods.

  7. Safety leadership and systems thinking: application and evaluation of a Risk Management Framework in the mining industry.

    PubMed

    Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim

    2017-10-01

    Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.

  8. Adaptive model reduction for continuous systems via recursive rational interpolation

    NASA Technical Reports Server (NTRS)

    Lilly, John H.

    1994-01-01

    A method for adaptive identification of reduced-order models for continuous stable SISO and MIMO plants is presented. The method recursively finds a model whose transfer function (matrix) matches that of the plant on a set of frequencies chosen by the designer. The algorithm utilizes the Moving Discrete Fourier Transform (MDFT) to continuously monitor the frequency-domain profile of the system input and output signals. The MDFT is an efficient method of monitoring discrete points in the frequency domain of an evolving function of time. The model parameters are estimated from MDFT data using standard recursive parameter estimation techniques. The algorithm has been shown in simulations to be quite robust to additive noise in the inputs and outputs. A significant advantage of the method is that it enables a type of on-line model validation. This is accomplished by simultaneously identifying a number of models and comparing each with the plant in the frequency domain. Simulations of the method applied to an 8th-order SISO plant and a 10-state 2-input 2-output plant are presented. An example of on-line model validation applied to the SISO plant is also presented.
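The abstract's MDFT can be sketched with the textbook sliding-DFT recursion, which updates selected frequency bins in O(1) per bin as each new sample arrives: X_k <- e^{j2πk/N} (X_k + x_new - x_old). The paper's exact MDFT variant may differ in details; this illustrates the mechanism of continuously monitoring discrete frequency points:

```python
import numpy as np
from collections import deque

class MovingDFT:
    """Sliding DFT over a length-N window, tracking only the bins of interest
    (the designer-chosen frequencies at which the model is matched)."""

    def __init__(self, N, bins):
        self.N = N
        self.bins = list(bins)
        self.twiddle = {k: np.exp(2j * np.pi * k / N) for k in self.bins}
        self.X = {k: 0.0 + 0.0j for k in self.bins}
        self.window = deque([0.0] * N, maxlen=N)  # oldest sample at index 0

    def push(self, x):
        """Incorporate one new sample; O(1) work per tracked bin."""
        x_old = self.window[0]       # sample sliding out of the window
        self.window.append(x)
        for k in self.bins:
            self.X[k] = self.twiddle[k] * (self.X[k] + x - x_old)
        return self.X
```

After N pushes starting from a zero window, each tracked bin equals the corresponding bin of the full DFT of the last N samples, so the frequency-domain profile is always current without recomputing an FFT.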

  9. Taming Many-Parameter BSM Models with Bayesian Neural Networks

    NASA Astrophysics Data System (ADS)

    Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.

    2017-09-01

    The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.

  10. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments.

    PubMed

    Magnusson, R; Nordlander, T; Östin, A

    2016-01-15

Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Magnified reconstruction of digitally recorded holograms by Fresnel-Bluestein transform.

    PubMed

    Restrepo, John F; Garcia-Sucerquia, Jorge

    2010-11-20

    A method for numerical reconstruction of digitally recorded holograms with variable magnification is presented. The proposed strategy allows for smaller, equal, or larger magnification than that achieved with Fresnel transform by introducing the Bluestein substitution into the Fresnel kernel. The magnification is obtained independent of distance, wavelength, and number of pixels, which enables the method to be applied in color digital holography and metrological applications. The approach is supported by experimental and simulation results in digital holography of objects of comparable dimensions with the recording device and in the reconstruction of holograms from digital in-line holographic microscopy.
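The core trick is Bluestein's substitution nk = (n² + k² − (k − n)²)/2, which rewrites a DFT as a chirp-modulated convolution whose output grid can then be rescaled independently of the input grid; the paper folds the same substitution into the Fresnel kernel to decouple magnification from distance, wavelength, and pixel count. Shown here for the plain DFT only, as a hedged sketch of the mechanism rather than the paper's full reconstruction:

```python
import numpy as np

def bluestein_dft(x):
    """DFT of x evaluated via Bluestein's chirp decomposition:
    X_k = W^(k^2/2) * sum_n (x_n W^(n^2/2)) W^(-(k-n)^2/2),  W = e^{-2j*pi/N}.
    Works for any length N (not just powers of two)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    n = np.arange(N)
    chirp = np.exp(-1j * np.pi * n**2 / N)   # W^(n^2/2)
    a = x * chirp                             # pre-chirped input
    # Circular convolution with the conjugate chirp via zero-padded FFTs.
    L = 1
    while L < 2 * N - 1:
        L *= 2
    b = np.zeros(L, dtype=complex)
    b[:N] = np.conj(chirp)                    # h[m] for m = 0..N-1
    b[L - N + 1:] = np.conj(chirp[1:])[::-1]  # h[m] for m = -(N-1)..-1
    conv = np.fft.ifft(np.fft.fft(np.append(a, np.zeros(L - N))) * np.fft.fft(b))
    return chirp * conv[:N]                   # post-chirp
```

Replacing the unit-circle chirp with a scaled one evaluates the transform on a finer or coarser output grid, which is precisely the degree of freedom that yields variable magnification in the Fresnel-Bluestein reconstruction.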

  12. A data mining method to facilitate SAR transfer.

    PubMed

    Wassermann, Anne Mai; Bajorath, Jürgen

    2011-08-22

    A challenging practical problem in medicinal chemistry is the transfer of SAR information from one chemical series to another. Currently, there are no computational methods available to rationalize or support this process. Herein, we present a data mining approach that enables the identification of alternative analog series with different core structures, corresponding substitution patterns, and comparable potency progression. Scaffolds can be exchanged between these series and new analogs suggested that incorporate preferred R-groups. The methodology can be applied to search for alternative analog series if one series is known or, alternatively, to systematically assess SAR transfer potential in compound databases.

  13. Scalable web services for the PSIPRED Protein Analysis Workbench.

    PubMed

    Buchan, Daniel W A; Minneci, Federico; Nugent, Tim C O; Bryson, Kevin; Jones, David T

    2013-07-01

    Here, we present the new UCL Bioinformatics Group's PSIPRED Protein Analysis Workbench. The Workbench unites all of our previously available analysis methods into a single web-based framework. The new web portal provides a greatly streamlined user interface with a number of new features to allow users to better explore their results. We offer a number of additional services to enable computationally scalable execution of our prediction methods; these include SOAP and XML-RPC web server access and new HADOOP packages. All software and services are available via the UCL Bioinformatics Group website at http://bioinf.cs.ucl.ac.uk/.

  14. Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.

    2011-01-01

    We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
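A minimal sketch of eigenbasis-based novelty scoring, the general idea behind the adaptive-eigenbasis approach (not the authors' exact online algorithm): build an orthonormal basis from known or uninteresting signals, then score a new observation by the fraction of its energy left outside that subspace:

```python
import numpy as np

def build_basis(X, rank):
    """Orthonormal basis (top `rank` left singular vectors) of the
    column-stacked known/uninteresting signals X (shape: dim x n_signals)."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank]

def novelty_score(basis, x):
    """Fraction of the signal's energy not captured by the eigenbasis;
    near 0 for familiar signals, near 1 for signals unlike anything known."""
    residual = x - basis @ (basis.T @ x)
    return np.linalg.norm(residual) / np.linalg.norm(x)
```

In an online setting the basis would additionally be updated as new data arrive, combining the prior basis of known false alarms with current data statistics, which is the adaptive element the abstract describes.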

  15. Evaluating a normalized conceptual representation produced from natural language patient discharge summaries.

    PubMed Central

    Zweigenbaum, P.; Bouaud, J.; Bachimont, B.; Charlet, J.; Boisvieux, J. F.

    1997-01-01

    The Menelas project aimed to produce a normalized conceptual representation from natural language patient discharge summaries. Because of the complex and detailed nature of conceptual representations, evaluating the quality of the output of such a system is difficult. We present the method designed to measure the quality of Menelas output, and its application to the state of the French Menelas prototype as of the end of the project. We examine this method in the framework recently proposed by Friedman and Hripcsak. We also propose two conditions that reduce the evaluation preparation workload. PMID:9357694

  16. A Sensor System for Detection of Hull Surface Defects

    PubMed Central

    Navarro, Pedro; Iborra, Andrés; Fernández, Carlos; Sánchez, Pedro; Suardíaz, Juan

    2010-01-01

    This paper presents a sensor system for detecting defects in ship hull surfaces. The sensor was developed to enable a robotic system to perform grit blasting operations on ship hulls. To achieve this, the proposed sensor system captures images with the help of a camera and processes them in real time using a new defect detection method based on thresholding techniques. What makes this method different is its efficiency in the automatic detection of defects from images recorded in variable lighting conditions. The sensor system was tested under real conditions at a Spanish shipyard, with excellent results. PMID:22163590

  17. Fabricating optical lenses by inkjet printing and heat-assisted in situ curing of polydimethylsiloxane for smartphone microscopy.

    PubMed

    Sung, Yu-Lung; Jeang, Jenn; Lee, Chia-Hsiung; Shih, Wei-Chuan

    2015-04-01

    We present a highly repeatable, lithography-free and mold-free method for fabricating flexible optical lenses by in situ curing liquid polydimethylsiloxane droplets on a preheated smooth surface with an inkjet printing process. This method enables us to fabricate lenses with a focal length as short as 5.6 mm, which can be controlled by varying the droplet volume and the temperature of the preheated surface. Furthermore, the lens can be attached to a smartphone camera without any accessories and can produce high-resolution (1 μm) images for microscopy applications.
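
    The reported dependence of focal length on droplet geometry can be approximated with thin-lens optics. The sketch below treats the cured droplet as a plano-convex spherical cap and assumes a typical refractive index of about 1.41 for cured PDMS; this is a rough illustrative model, not the authors' calibration:

```python
def cap_radius_of_curvature(a, h):
    """Radius of curvature R of a spherical cap with base radius a and height h."""
    return (a * a + h * h) / (2.0 * h)

def plano_convex_focal_length(a, h, n=1.41):
    """Thin-lens focal length f = R / (n - 1) of a plano-convex droplet lens.

    n = 1.41 is an assumed typical refractive index for cured PDMS.
    Lengths are in whatever unit a and h are given in (e.g. mm).
    """
    return cap_radius_of_curvature(a, h) / (n - 1.0)
```

    A flatter droplet (larger base radius, smaller height) gives a larger radius of curvature and hence a longer focal length, consistent with the abstract's statement that droplet volume and surface temperature control the focal length.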

  18. Matrices pattern using FIB; 'Out-of-the-box' way of thinking.

    PubMed

    Fleger, Y; Gotlib-Vainshtein, K; Talyosef, Y

    2017-03-01

    Focused ion beam (FIB) is an extremely valuable tool for high-resolution nanopatterning and nanofabrication, particularly in the case of He ion beam microscopy. The work presented here demonstrates an 'out-of-the-box' method of writing with FIB that enables the creation of very large matrices, up to the beam-shift limitation, in short times and with an accuracy unachievable by other writing techniques. The new method allows different shapes to be combined at nanometric dimensions and high resolution over wide ranges. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  19. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  20. Driven-dissipative quantum Monte Carlo method for open quantum systems

    NASA Astrophysics Data System (ADS)

    Nagy, Alexandra; Savona, Vincenzo

    2018-05-01

    We develop a real-time full configuration-interaction quantum Monte Carlo approach to model driven-dissipative open quantum systems with Markovian system-bath coupling. The method enables stochastic sampling of the Liouville-von Neumann time evolution of the density matrix thanks to a massively parallel algorithm, thus providing estimates of observables on the nonequilibrium steady state. We present the underlying theory and introduce an initiator technique and importance sampling to reduce the statistical error. Finally, we demonstrate the efficiency of our approach by applying it to the driven-dissipative two-dimensional XYZ spin-1/2 model on a lattice.

  1. Biomechanical properties of wheat grains: the implications on milling.

    PubMed

    Hourston, James E; Ignatz, Michael; Reith, Martin; Leubner-Metzger, Gerhard; Steinbrecher, Tina

    2017-01-01

    Millennia of continuous innovation have driven ever increasing efficiency in the milling process. Mechanically characterizing wheat grains and discerning the structure and function of the wheat bran layers can contribute to continuing innovation. We present novel shear force and puncture force testing regimes to characterize different wheat grain cultivars. The forces endured by wheat grains during the milling process can be quantified, enabling us to measure the impact of commonly applied grain pretreatments, such as microwave heating, extended tempering, enzyme and hormone treatments on grains of different 'hardness'. Using these methods, we demonstrate the importance of short tempering phases prior to milling and identify ways in which our methods can detect differences in the maximum force, energy and breaking behaviours of hard and soft grain types. We also demonstrate, for the first time, endosperm weakening in wheat through hormone stratification on single bran layers. The modern milling process is highly refined, meaning that small, cultivar-specific adjustments can result in large increases in downstream profits. We believe that methods such as these, which enable rapid testing of milling pretreatments and material properties, can help to drive an innovation process that has been core to our industrial efforts since prehistory. © 2017 The Authors.

  2. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
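
    The sharding step described above can be illustrated with a toy partitioner: co-locating every time step of a given spatial tile on one shard keeps long time-series reads node-local. The tile IDs and shard count below are made up for illustration; real deployments would rely on the partitioners built into stores such as Cassandra or SciDB:

```python
import zlib
from collections import defaultdict

def shard_for(tile_id, n_shards):
    """Stable tile -> shard assignment (CRC32 is deterministic across runs)."""
    return zlib.crc32(tile_id.encode()) % n_shards

def shard_records(records, n_shards):
    """Group (tile_id, time, value) records so each tile's full time series
    lands on a single shard, enabling fine-grained per-tile parallelism."""
    shards = defaultdict(list)
    for tile_id, t, value in records:
        shards[shard_for(tile_id, n_shards)].append((tile_id, t, value))
    return shards
```

    Because a tile's whole time series sits on one node, a long-time-series analysis never has to gather data across shards, which is the property the benchmark cases above depend on.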

  3. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  4. JOINING DISSIMILAR MATERIALS USING FRICTION STIR SCRIBE TECHNIQUE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Hovanski, Yuri; Jana, Saumyadeep

    2016-09-01

    Development of a robust and cost-effective method of joining dissimilar materials can provide a critical pathway to enable widespread use of multi-material design and components in mainstream industrial applications. The use of multi-material components such as steel-aluminum and aluminum-polymer allows design engineers to optimize material utilization based on service requirements and often leads to weight and cost reductions. However, producing an effective joint between materials with vastly different thermal, microstructural and deformation responses is highly problematic using conventional joining and/or fastening methods. This is especially challenging in cost-sensitive, high-volume markets that largely rely on low-cost joining solutions. Friction Stir Scribe technology was developed to meet the demands of joining materials with drastically different properties and melting regimes. The process enables joining of light metals like magnesium and aluminum to high-temperature materials like steels and titanium. Additionally, viable joints between polymer composites and metal can also be made using this method. This paper will present the state of the art, progress made and challenges associated with this innovative derivative of Friction Stir welding in reference to joining dissimilar metals and polymer/metal combinations.

  5. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  6. Modular microfluidic systems using reversibly attached PDMS fluid control modules

    NASA Astrophysics Data System (ADS)

    Skafte-Pedersen, Peder; Sip, Christopher G.; Folch, Albert; Dufva, Martin

    2013-05-01

    The use of soft lithography-based poly(dimethylsiloxane) (PDMS) valve systems is the dominating approach for high-density microscale fluidic control. Integrated systems enable complex flow control and large-scale integration, but lack modularity. In contrast, modular systems are attractive alternatives to integration because they can be tailored for different applications piecewise and without redesigning every element of the system. We present a method for reversibly coupling hard materials to soft lithography defined systems through self-aligning O-ring features, thereby enabling easy interfacing of complex valve-based systems with simpler detachable units. Using this scheme, we demonstrate the seamless interfacing of a PDMS-based fluid control module with hard polymer chips. In our system, 32 self-aligning O-ring features protruding from the PDMS fluid control module form chip-to-control module interconnections which are sealed by tightening four screws. The interconnection method is robust and supports complex fluidic operations in the reversibly attached passive chip. In addition, we developed a double-sided molding method for fabricating PDMS devices with integrated through-holes. The versatile system facilitates a wide range of applications due to the modular approach, where application specific passive chips can be readily attached to the flow control module.

  7. Biomechanical properties of wheat grains: the implications on milling

    PubMed Central

    Reith, Martin

    2017-01-01

    Millennia of continuous innovation have driven ever increasing efficiency in the milling process. Mechanically characterizing wheat grains and discerning the structure and function of the wheat bran layers can contribute to continuing innovation. We present novel shear force and puncture force testing regimes to characterize different wheat grain cultivars. The forces endured by wheat grains during the milling process can be quantified, enabling us to measure the impact of commonly applied grain pretreatments, such as microwave heating, extended tempering, enzyme and hormone treatments on grains of different ‘hardness’. Using these methods, we demonstrate the importance of short tempering phases prior to milling and identify ways in which our methods can detect differences in the maximum force, energy and breaking behaviours of hard and soft grain types. We also demonstrate, for the first time, endosperm weakening in wheat through hormone stratification on single bran layers. The modern milling process is highly refined, meaning that small, cultivar-specific adjustments can result in large increases in downstream profits. We believe that methods such as these, which enable rapid testing of milling pretreatments and material properties, can help to drive an innovation process that has been core to our industrial efforts since prehistory. PMID:28100826

  8. Joining Dissimilar Materials Using Friction Stir Scribe Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Hovanski, Yuri; Jana, Saumyadeep

    2016-10-03

    Development of a robust and cost-effective method of joining dissimilar materials could provide a critical pathway to enable widespread use of multi-material designs and components in mainstream industrial applications. The use of multi-material components such as steel-aluminum and aluminum-polymer would allow design engineers to optimize material utilization based on service requirements and could often lead to weight and cost reductions. However, producing an effective joint between materials with vastly different thermal, microstructural, and deformation responses is highly problematic using conventional joining and/or fastening methods. This is especially challenging in cost sensitive, high volume markets that largely rely on low cost joining solutions. Friction stir scribe technology was developed to meet the demands of joining materials with drastically different properties and melting regimes. The process enables joining of light metals like magnesium and aluminum to high temperature materials like steel and titanium. Viable joints between polymer composites and metal can also be made using this method. This paper will present the state of the art, progress made, and challenges associated with this innovative derivative of friction stir welding in reference to joining dissimilar metals and polymer/metal combinations.

  9. New Developments in Cathodoluminescence Spectroscopy for the Study of Luminescent Materials

    PubMed Central

    den Engelsen, Daniel; Fern, George R.; Harris, Paul G.; Ireland, Terry G.; Silver, Jack

    2017-01-01

    Herein, we describe three advanced techniques for cathodoluminescence (CL) spectroscopy that have recently been developed in our laboratories. The first is a new method to accurately determine the CL-efficiency of thin layers of phosphor powders. When a wide band phosphor with a band gap (Eg > 5 eV) is bombarded with electrons, charging of the phosphor particles will occur, which eventually leads to erroneous results in the determination of the luminous efficacy. To overcome this problem of charging, a comparison method has been developed, which enables accurate measurement of the current density of the electron beam. The study of CL from phosphor specimens in a scanning electron microscope (SEM) is the second subject to be treated. A detailed description of a measuring method to determine the overall decay time of single phosphor crystals in a SEM without beam blanking is presented. The third technique is based on the unique combination of microscopy and spectrometry in the transmission electron microscope (TEM) of Brunel University London (UK). This combination enables the recording of CL-spectra of nanometre-sized specimens and determining spatial variations in CL emission across individual particles by superimposing the scanning TEM and CL-images. PMID:28772671

  10. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    NASA Astrophysics Data System (ADS)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real time sensor observations to CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps.
A custom inverse distance weighting method has been developed to combine volunteer snow reports, cross-country ski track reports and station measurements to fill cloud gaps in the MODIS snow cover product. The method is demonstrated by producing a continuous daily time step snow presence probability map dataset for the Czech Republic region. The ability of the presented methodology to reconstruct MODIS snow cover under cloud is validated by simulating cloud cover datasets and comparing estimated snow cover to actual MODIS snow cover. The percent correctly classified indicator showed accuracy between 80 and 90% using this method. Using crowdsourcing data (volunteer snow reports and ski tracks) improves the map accuracy by 0.7--1.2%. The output snow probability map data sets are published online using web applications and web services. Keywords: crowdsourcing, image analysis, interpolation, MODIS, R statistical software, snow cover, snowpack probability, Tethys platform, time series, WaterML, web services, winter sports.
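
    The gap-filling step rests on inverse distance weighting. A generic IDW estimator is sketched below; the dissertation's custom method additionally blends station, volunteer and ski-track observations, which is not reproduced here:

```python
def idw_estimate(target, observations, power=2.0):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) points.

    With snow-presence values in [0, 1], the result can be read as a snow
    probability for a cloud-obscured pixel.
    """
    tx, ty = target
    num = den = 0.0
    for x, y, value in observations:
        d2 = (x - tx) ** 2 + (y - ty) ** 2
        if d2 == 0.0:
            return value                     # exact hit: return the observation
        weight = 1.0 / d2 ** (power / 2.0)   # weight = 1 / distance**power
        num += weight * value
        den += weight
    return num / den
```

    A point midway between a snow-covered (1.0) and a snow-free (0.0) observation receives a probability of 0.5; nearer observations dominate as the power parameter grows.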

  11. Estimation of gene induction enables a relevance-based ranking of gene sets.

    PubMed

    Bartholomé, Kilian; Kreutz, Clemens; Timmer, Jens

    2009-07-01

    In order to handle and interpret the vast amounts of data produced by microarray experiments, the analysis of sets of genes with a common biological functionality has been shown to be advantageous compared to single gene analyses. Some statistical methods have been proposed to analyse the differential gene expression of gene sets in microarray experiments. However, most of these methods either require threshold values to be chosen for the analysis, or they need some reference set for the determination of significance. We present a method that estimates the number of differentially expressed genes in a gene set without requiring a threshold value for significance of genes. The method is self-contained (i.e., it does not require a reference set for comparison). In contrast to other methods which are focused on significance, our approach emphasizes the relevance of the regulation of gene sets. The presented method measures the degree of regulation of a gene set and is a useful tool to compare the induction of different gene sets and place the results of microarray experiments into the biological context. An R-package is available.

  12. Genetic sex determination assays in 53 mammalian species: Literature analysis and guidelines for reporting standardization.

    PubMed

    Hrovatin, Karin; Kunej, Tanja

    2018-01-01

    Historically, sex was determined by observation, which is not always feasible. Nowadays, genetic methods are prevailing due to their accuracy, simplicity, low costs, and time-efficiency. However, there is no comprehensive review enabling an overview of the field and supporting its development. The studies are heterogeneous, lacking a standardized reporting strategy. Therefore, our aim was to collect genetic sexing assays for mammals and assemble them in a catalogue with unified terminology. Publications were extracted from online databases using key words such as sexing and molecular. The collected data were supplemented with species and gene IDs and the type of sex-specific sequence variant (SSSV). We developed a catalogue and graphic presentation of diagnostic tests for molecular sex determination of mammals, based on 58 papers published from 2/1991 to 10/2016. The catalogue consists of five categories: species, genes, SSSVs, methods, and references. Based on the analysis of published literature, we propose minimal requirements for reporting, consisting of: species scientific name and ID, genetic sequence with name and ID, SSSV, methodology, genomic coordinates (e.g., restriction sites, SSSVs), amplification system, and description of detected amplicon and controls. The present study summarizes vast knowledge that has up to now been scattered across databases, representing the first step toward standardization regarding molecular sexing, enabling a better overview of existing tests and facilitating planned designs of novel tests. The project is ongoing; collecting additional publications, optimizing field development, and standardizing data presentation are needed.

  13. Computationally Efficient Clustering of Audio-Visual Meeting Data

    NASA Astrophysics Data System (ADS)

    Hung, Hayley; Friedland, Gerald; Yeo, Chuohao

    This chapter presents novel computationally efficient algorithms to extract semantically meaningful acoustic and visual events related to each of the participants in a group discussion using the example of business meeting recordings. The recording setup involves relatively few audio-visual sensors, comprising a limited number of cameras and microphones. We first demonstrate computationally efficient algorithms that can identify who spoke and when, a problem in speech processing known as speaker diarization. We also extract visual activity features efficiently from MPEG4 video by taking advantage of the processing that was already done for video compression. Then, we present a method of associating the audio-visual data together so that the content of each participant can be managed individually. The methods presented in this article can be used as a principal component that enables many higher-level semantic analysis tasks needed in search, retrieval, and navigation.

  14. The DEVELOP National Program's Strategy for Communicating Applied Science Outcomes

    NASA Astrophysics Data System (ADS)

    Childs-Gleason, L. M.; Ross, K. W.; Crepps, G.; Favors, J.; Kelley, C.; Miller, T. N.; Allsbrook, K. N.; Rogers, L.; Ruiz, M. L.

    2016-12-01

    NASA's DEVELOP National Program conducts rapid feasibility projects that enable the future workforce and current decision makers to collaborate and build capacity to use Earth science data to enhance environmental management and policy. The program communicates its results and applications to a broad spectrum of audiences through a variety of methods: "virtual poster sessions" that engage the general public through short project videos and interactive dialogue periods, a "Campus Ambassador Corps" that communicates about the program and its projects to academia, scientific and policy conference presentations, community engagement activities and end-of-project presentations, project "hand-offs" providing results and tools to project partners, traditional publications (both gray literature and peer-reviewed), an interactive website project gallery, targeted brochures, and through multiple social media venues and campaigns. This presentation will describe the various methods employed by DEVELOP to communicate the program's scientific outputs, target audiences, general statistics, community response and best practices.

  15. Patient Portals as a Means of Information and Communication Technology Support to Patient-Centric Care Coordination – the Missing Evidence and the Challenges of Evaluation

    PubMed Central

    Georgiou, Andrew; Hyppönen, Hannele; Ammenwerth, Elske; de Keizer, Nicolette; Magrabi, Farah; Scott, Philip

    2015-01-01

    Objectives: To review the potential contribution of Information and Communication Technology (ICT) to enable patient-centric and coordinated care, and in particular to explore the role of patient portals as a developing ICT tool, to assess the available evidence, and to describe the evaluation challenges. Methods: Reviews of IMIA, EFMI, and other initiatives, together with literature reviews. Results: We present the progression from care coordination to care integration, and from patient-centric to person-centric approaches. We describe the different roles of ICT as an enabler of the effective presentation of information as and when needed. We focus on the patient’s role as a co-producer of health as well as the focus and purpose of care. We discuss the need for changing organisational processes as well as the current mixed evidence regarding patient portals as a logical tool, and the reasons for this dichotomy, together with the evaluation principles supported by theoretical frameworks so as to yield robust evidence. Conclusions: There is expressed commitment to coordinated care and to putting the patient in the centre. However to achieve this, new interactive patient portals will be needed to enable peer communication by all stakeholders including patients and professionals. Few portals capable of this exist to date. The evaluation of these portals as enablers of system change, rather than as simple windows into electronic records, is at an early stage and novel evaluation approaches are needed. PMID:26123909

  16. Analytical Approach to the Fuel Optimal Impulsive Transfer Problem Using Primer Vector Method

    NASA Astrophysics Data System (ADS)

    Fitrianingsih, E.; Armellin, R.

    2018-04-01

    One of the objectives of mission design is selecting an optimum orbital transfer, which is often translated as a transfer that requires minimum propellant consumption. To ensure the selected trajectory meets this requirement, the optimality of the transfer should first be analyzed, either by directly calculating the ΔV of the candidate trajectories and selecting the one that gives the minimum value, or by evaluating the trajectory against certain criteria of optimality. The second approach analyzes the profile of the modulus of the thrust direction vector, which is known as the primer vector. Both methods come with their own advantages and disadvantages. However, the primer vector method can be used to verify whether the result from the direct method is truly optimal or whether the ΔV can be reduced further by applying a correction maneuver to the reference trajectory. In addition to evaluating transfer optimality without the need to calculate the transfer ΔV, the primer vector also enables us to identify the time and position at which to apply a correction maneuver in order to optimize a non-optimum transfer. This paper presents the analytical approach to the fuel-optimal impulsive transfer using the primer vector method. The validity of the method is confirmed by comparing its results to those from the numerical method. The investigation of the optimality of direct transfers is used as an example application of the method. The case under study is prograde elliptic transfers from Earth to Mars. The study enables us to identify the optimality of all the possible transfers.

  17. TSSer: an automated method to identify transcription start sites in prokaryotic genomes from differential RNA sequencing data.

    PubMed

    Jorjani, Hadi; Zavolan, Mihaela

    2014-04-01

    Accurate identification of transcription start sites (TSSs) is an essential step in the analysis of transcription regulatory networks. In higher eukaryotes, cap analysis of gene expression (CAGE) technology enabled comprehensive annotation of TSSs in genomes such as those of mice and humans. In bacteria, an equivalent approach, termed differential RNA sequencing (dRNA-seq), has recently been proposed, but the application of this approach to a large number of genomes is hindered by the paucity of computational analysis methods. With few exceptions, when the method has been used, annotation of TSSs has been largely done manually. In this work, we present a computational method called 'TSSer' that enables the automatic inference of TSSs from dRNA-seq data. The method rests on a probabilistic framework for identifying both genomic positions that are preferentially enriched in the dRNA-seq data as well as preferentially captured relative to neighboring genomic regions. Evaluating our approach for TSS calling on several publicly available datasets, we find that TSSer achieves high consistency with the curated lists of annotated TSSs, but identifies many additional TSSs. Therefore, TSSer can accelerate genome-wide identification of TSSs in bacterial genomes and can aid in further characterization of bacterial transcription regulatory networks. TSSer is freely available under GPL license at http://www.clipz.unibas.ch/TSSer/index.php
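
    The enrichment idea behind dRNA-seq TSS calling can be illustrated with a toy, threshold-based caller over per-position 5'-end read counts from a treated and an untreated library. TSSer itself is probabilistic; the thresholds and the local-maximum rule below are illustrative stand-ins, not the published algorithm:

```python
def call_tss(treated, untreated, min_ratio=3.0, min_reads=10, flank=2):
    """Report positions whose treated/untreated read-start ratio exceeds
    min_ratio and which are local maxima of the treated counts within
    +/- flank nucleotides (both thresholds are illustrative)."""
    tss = []
    n = len(treated)
    for i, count in enumerate(treated):
        if count < min_reads:
            continue
        if count / max(untreated[i], 1) < min_reads / min_reads * min_ratio:
            continue
        lo, hi = max(0, i - flank), min(n, i + flank + 1)
        if count == max(treated[lo:hi]):
            tss.append(i)
    return tss
```

    A position with many read starts in the treated library but few in the untreated control is a TSS candidate; a position with similar counts in both libraries is treated as a processing site rather than a TSS.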

  18. Continuous Shape Estimation of Continuum Robots Using X-ray Images

    PubMed Central

    Lobaton, Edgar J.; Fu, Jinghua; Torres, Luis G.; Alterovitz, Ron

    2015-01-01

    We present a new method for continuously and accurately estimating the shape of a continuum robot during a medical procedure using a small number of X-ray projection images (e.g., radiographs or fluoroscopy images). Continuum robots have curvilinear structure, enabling them to maneuver through constrained spaces by bending around obstacles. Accurately estimating the robot’s shape continuously over time is crucial for the success of procedures that require avoidance of anatomical obstacles and sensitive tissues. Online shape estimation of a continuum robot is complicated by uncertainty in its kinematic model, movement of the robot during the procedure, noise in X-ray images, and the clinical need to minimize the number of X-ray images acquired. Our new method integrates kinematics models of the robot with data extracted from an optimally selected set of X-ray projection images. Our method represents the shape of the continuum robot over time as a deformable surface which can be described as a linear combination of time and space basis functions. We take advantage of probabilistic priors and numeric optimization to select optimal camera configurations, thus minimizing the expected shape estimation error. We evaluate our method using simulated concentric tube robot procedures and demonstrate that obtaining between 3 and 10 images from viewpoints selected by our method enables online shape estimation with errors significantly lower than using the kinematic model alone or using randomly spaced viewpoints. PMID:26279960
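
    The surface representation described above can be sketched directly: the robot's shape over arc length u and time t is expanded as s(u, t) = Σᵢⱼ c[i][j]·Bᵢ(u)·Tⱼ(t), separable in space and time. Monomial bases and the coefficients below are placeholders, not the paper's choices:

```python
# Evaluate a deformable-surface expansion with separable space/time bases.

def monomial_basis(x, n):
    return [x ** k for k in range(n)]

def eval_shape(coeffs, u, t):
    """coeffs[i][j] weights space basis B_i and time basis T_j."""
    B = monomial_basis(u, len(coeffs))
    T = monomial_basis(t, len(coeffs[0]))
    return sum(coeffs[i][j] * B[i] * T[j]
               for i in range(len(coeffs)) for j in range(len(coeffs[0])))

# A shape that bends linearly along arc length u and drifts linearly in t:
c = [[0.0, 0.0],   # constant terms
     [1.0, 0.5]]   # u term and u*t cross-term
print(eval_shape(c, u=1.0, t=2.0))  # 1.0*1 + 0.5*1*2 = 2.0
```

Estimating the coefficients c from X-ray projections is then a (regularized) fitting problem, which is where the probabilistic priors and the optimal choice of viewpoints enter.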

  19. Continuous Shape Estimation of Continuum Robots Using X-ray Images.

    PubMed

    Lobaton, Edgar J; Fu, Jinghua; Torres, Luis G; Alterovitz, Ron

    2013-05-06

    We present a new method for continuously and accurately estimating the shape of a continuum robot during a medical procedure using a small number of X-ray projection images (e.g., radiographs or fluoroscopy images). Continuum robots have curvilinear structure, enabling them to maneuver through constrained spaces by bending around obstacles. Accurately estimating the robot's shape continuously over time is crucial for the success of procedures that require avoidance of anatomical obstacles and sensitive tissues. Online shape estimation of a continuum robot is complicated by uncertainty in its kinematic model, movement of the robot during the procedure, noise in X-ray images, and the clinical need to minimize the number of X-ray images acquired. Our new method integrates kinematics models of the robot with data extracted from an optimally selected set of X-ray projection images. Our method represents the shape of the continuum robot over time as a deformable surface which can be described as a linear combination of time and space basis functions. We take advantage of probabilistic priors and numeric optimization to select optimal camera configurations, thus minimizing the expected shape estimation error. We evaluate our method using simulated concentric tube robot procedures and demonstrate that obtaining between 3 and 10 images from viewpoints selected by our method enables online shape estimation with errors significantly lower than using the kinematic model alone or using randomly spaced viewpoints.

  20. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortages and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variation. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. For inter-ward analysis, ward-specific crude measures were standardized into relative measures by dividing observed by expected values. Two control charts were defined: acceptable and tolerable daily nurse workload intensity. Appropriate staffing indicators were defined as exceeding predefined rates within the acceptable and tolerable limits (50 percent and 80 percent, respectively). A total of 42 percent of days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards for acceptable nurse workload intensity and in only 45 percent of wards for tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take patient severity into account, since crude bed occupancy was used. The use of double statistical process control charts and particular staffing indicators is open to debate. Wards that met the appropriate staffing indicators demonstrate the method's feasibility; wards that did not demonstrate the importance of, and need for, process evaluation and monitoring. The methods presented for monitoring daily staffing appropriateness are simple to implement, either for intra-ward day-to-day variation using nurse workload capacity statistical process control charts, or for inter-ward evaluation using a standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions, such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method for evaluating daily staffing appropriateness using available data, easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization, by converting crude into relative measures, enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. The method is not limited to crude measures; it can also use weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
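
    The standardized check described above can be sketched as follows: each day's workload is expressed as an observed/expected ratio, counted against acceptable and tolerable control limits, and a ward passes an indicator if enough days fall inside those limits. The 50%/80% indicator thresholds follow the abstract; the limits and daily ratios below are illustrative:

```python
# Standardized staffing check: observed/expected daily ratios against
# acceptable and tolerable control limits, plus pass/fail indicators.

def staffing_indicators(ratios, acceptable=(0.9, 1.1), tolerable=(0.8, 1.2)):
    n = len(ratios)
    in_acc = sum(acceptable[0] <= r <= acceptable[1] for r in ratios)
    in_tol = sum(tolerable[0] <= r <= tolerable[1] for r in ratios)
    return {
        "acceptable_rate": in_acc / n,
        "tolerable_rate": in_tol / n,
        "meets_acceptable": in_acc / n >= 0.50,  # indicator: >= 50% of days
        "meets_tolerable": in_tol / n >= 0.80,   # indicator: >= 80% of days
    }

# One ward's observed/expected workload ratios over eight days:
ratios = [0.95, 1.05, 1.15, 0.85, 1.0, 0.7, 1.1, 1.0]
print(staffing_indicators(ratios))
```

Because the inputs are relative (observed/expected) measures, the same chart and indicators apply across wards of different sizes, which is the point of the standardization step.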

  1. NMR-based automated protein structure determination.

    PubMed

    Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter

    2017-08-15

    NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Image Reconstruction from Highly Undersampled (k, t)-Space Data with Joint Partial Separability and Sparsity Constraints

    PubMed Central

    Zhao, Bo; Haldar, Justin P.; Christodoulou, Anthony G.; Liang, Zhi-Pei

    2012-01-01

    Partial separability (PS) and sparsity have been previously used to enable reconstruction of dynamic images from undersampled (k, t)-space data. This paper presents a new method to use PS and sparsity constraints jointly for enhanced performance in this context. The proposed method combines the complementary advantages of PS and sparsity constraints using a unified formulation, achieving significantly better reconstruction performance than using either of these constraints individually. A globally convergent computational algorithm is described to efficiently solve the underlying optimization problem. Reconstruction results from simulated and in vivo cardiac MRI data are also shown to illustrate the performance of the proposed method. PMID:22695345
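
    The two constraints being combined can be sketched as operators (this is not the paper's solver, only the building blocks): partial separability says the space-by-time Casorati matrix has low rank L, enforced here by SVD truncation, while sparsity is promoted by soft-thresholding. Sizes, the rank, and the threshold below are illustrative:

```python
import numpy as np

# PS constraint: project the Casorati matrix C (space x time) to rank L.
def rank_L_projection(C, L):
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    s[L:] = 0.0
    return U @ np.diag(s) @ Vt

# Sparsity constraint: soft-threshold (the proximal operator of the l1 norm).
def soft_threshold(X, lam):
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

rng = np.random.default_rng(0)
# A rank-2 "dynamic object" (2 spatial x temporal factors) plus noise:
C = rng.standard_normal((32, 2)) @ rng.standard_normal((2, 20))
C_noisy = C + 0.01 * rng.standard_normal(C.shape)
C_ps = rank_L_projection(C_noisy, L=2)       # PS constraint applied
C_sparse = soft_threshold(C_ps, lam=0.005)   # sparsity constraint applied
print(np.linalg.matrix_rank(C_ps))  # 2
```

A joint PS-plus-sparsity reconstruction alternates steps like these inside a globally convergent optimization over the undersampled (k, t)-space data, rather than applying them once.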

  3. Singer product apertures-A coded aperture system with a fast decoding algorithm

    NASA Astrophysics Data System (ADS)

    Byard, Kevin; Shutler, Paul M. E.

    2017-06-01

    A new type of coded aperture configuration that enables fast decoding of the coded aperture shadowgram data is presented. Because they are based on the products of incidence vectors generated from Singer difference sets, we call these Singer product apertures. For a range of aperture dimensions, we experimentally compare the performance of three decoding methods: standard decoding, induction decoding and direct vector decoding. In all cases the induction and direct vector methods are several orders of magnitude faster than the standard method, with direct vector decoding significantly faster than induction decoding. For apertures of the same dimensions, the speed advantage of direct vector decoding over induction decoding is greater for lower-throughput apertures.
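
    For context, the "standard decoding" baseline is cyclic cross-correlation of the shadowgram with a decoding array built from the aperture. The tiny 1-D example below uses the Singer (planar) difference set {1, 2, 4} mod 7 rather than a Singer product aperture, and is only a sketch of the baseline the paper's faster methods replace:

```python
# Standard correlation decoding of a cyclic coded aperture.

def cyclic_correlate(shadow, G):
    n = len(shadow)
    return [sum(shadow[(k - i) % n] * G[k] for k in range(n))
            for i in range(n)]

A = [0, 1, 1, 0, 1, 0, 0]        # open (1) / closed (0); ones at {1, 2, 4}
G = [2 * a - 1 for a in A]       # balanced decoding array G = 2A - 1

# Shadowgram of a single point source = cyclic shift of the aperture pattern.
source_pos = 3
shadow = [A[(i + source_pos) % len(A)] for i in range(len(A))]
decoded = cyclic_correlate(shadow, G)
print(decoded.index(max(decoded)))  # 3: the source position is recovered
```

Each output pixel costs O(n) multiply-adds here (O(n²) total), which is why structured decoders that exploit the difference-set construction can win by orders of magnitude.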

  4. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method, pioneered by Kaufman, has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  5. Driver behavior profiling: An investigation with different smartphone sensors and machine learning

    PubMed Central

    Ferreira, Jair; Carvalho, Eduardo; Ferreira, Bruno V.; de Souza, Cleidson; Suhara, Yoshihiko; Pentland, Alex

    2017-01-01

    Driver behavior impacts traffic safety, fuel/energy consumption and gas emissions. Driver behavior profiling tries to understand and positively impact driver behavior. Usually, driver behavior profiling tasks involve automated collection of driving data and application of computer models to generate a classification that characterizes the driver's aggressiveness profile. Different sensors and classification methods have been employed in this task; however, low-cost solutions and high performance are still research targets. This paper presents an investigation with different Android smartphone sensors and classification algorithms in order to assess which sensor/method assembly enables classification with higher performance. The results show that specific combinations of sensors and intelligent methods allow classification performance improvement. PMID:28394925
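
    The front end of such a pipeline is typically feature extraction over sliding sensor windows; the summary statistics and the toy accelerometer trace below are illustrative (the paper compares several sensors and learning algorithms, not this particular feature set):

```python
# Windowed summary statistics over accelerometer samples: the usual input
# representation for driver-behavior classifiers.

def window_features(samples, size=4):
    feats = []
    for i in range(0, len(samples) - size + 1, size):
        w = samples[i:i + size]
        mean = sum(w) / size
        var = sum((x - mean) ** 2 for x in w) / size
        feats.append({"mean": mean, "std": var ** 0.5, "max": max(w)})
    return feats

# Lateral acceleration (m/s^2): calm driving, then an aggressive swerve.
accel = [0.1, 0.2, 0.1, 0.0, 2.5, 3.1, 2.8, 2.2]
f = window_features(accel)
print(f[0]["max"], f[1]["max"])  # 0.2 3.1
```

Feature vectors like these (often concatenated across accelerometer, gyroscope and magnetometer axes) are then fed to the candidate classifiers for comparison.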

  6. Enhancing the Empathic Connection: Using Action Methods to Understand Conflicts in End-of-Life Care

    PubMed Central

    Tanzi, Silvia; Biasco, Guido; Baile, Walter F.

    2014-01-01

    Empathy is a core feature of patient-centered care. It enables practitioners to better understand the patient and family concerns that are key to patient and family satisfaction, prevention of anxiety and depression, and provider empowerment. Current methods of teaching communication skills do not specifically focus on enhancing the ability to “stand in the patient's shoes” as a way of connecting with the patient and/or family experience and understanding feelings that may be a source of conflict with providers. In this paper, we present a model for deepening empathic understanding based upon action methods (role-reversal and doubling) derived from psychodrama and sociodrama. We describe these techniques and illustrate how they can be used to identify hidden emotions and attitudes and reveal that which the patient and family member may be thinking or feeling but be afraid to say. Finally, we present data showing that these methods were valuable to participants in enhancing their professional experience and skills. PMID:28725796

  7. Ex vivo and in vivo label-free imaging of lymphatic vessels using OCT lymphangiography (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Gong, Peijun; Es'haghian, Shaghayegh; Karnowski, Karol; Rea, Suzanne; Wood, Fiona M.; Yu, Dao-Yi; McLaughlin, Robert A.; Sampson, David D.

    2017-02-01

    We have been developing an automated method to image lymphatic vessels both ex vivo and in vivo with optical coherence tomography (OCT), using their optical transparency. Our method compensates for the OCT signal attenuation for each A-scan in combination with the correction of the confocal function and sensitivity fall-off, enabling reliable thresholding of lymphatic vessels from the OCT scans. Morphological image processing with a segment-joining algorithm is also incorporated into the method to mitigate partial-volume artifacts, which are particularly evident with small lymphatic vessels. Our method is demonstrated for two different clinical application goals: the monitoring of conjunctival lymphatics for surgical guidance and assessment of glaucoma treatment; and the longitudinal monitoring of human burn scars undergoing laser ablation treatment. We present examples of OCT lymphangiography ex vivo on porcine conjunctivas and in vivo on human burn scars, showing the visualization of the lymphatic vessel network and their longitudinal changes due to treatment.
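
    One common form of per-A-scan attenuation compensation, of the general kind such a pipeline builds on (the authors' method also corrects the confocal function and sensitivity fall-off, which this sketch omits), normalizes each depth sample by the signal energy remaining below it, flattening the decay so a single threshold can separate transparent, lymph-filled regions:

```python
# Sketch of depth-resolved attenuation compensation for one OCT A-scan:
# divide each sample by (twice) the tail sum of the signal beneath it.

def compensate_ascan(intensity, eps=1e-9):
    remaining = sum(intensity)
    out = []
    for v in intensity:
        remaining -= v
        out.append(v / (2.0 * remaining + eps))
    return out

# Exponentially attenuated A-scan through uniform tissue (ratio r = 0.7):
ascan = [0.7 ** z for z in range(20)]
comp = compensate_ascan(ascan)
# Interior samples flatten toward (1 - r) / (2 r) ~ 0.214; the last few
# samples are unstable because the tail sum is truncated.
print(round(comp[0], 3), round(comp[5], 3))
```

After compensation, a uniform medium maps to a near-constant profile, so low-signal (transparent) lymphatic lumens stand out against it for thresholding.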

  8. Single-Shot X-Ray Phase-Contrast Computed Tomography with Nonmicrofocal Laboratory Sources

    NASA Astrophysics Data System (ADS)

    Diemoz, P. C.; Hagen, C. K.; Endrizzi, M.; Minuti, M.; Bellazzini, R.; Urbani, L.; De Coppi, P.; Olivo, A.

    2017-04-01

    We present a method that enables performing x-ray phase-contrast imaging (XPCI) computed tomography with a laboratory setup using a single image per projection angle, eliminating the need to move optical elements during acquisition. Theoretical derivation of the method is presented, and its validity conditions are provided. The object is assumed to be quasihomogeneous, i.e., to feature a ratio between the refractive index and the linear attenuation coefficient that is approximately constant across the field of view. The method is experimentally demonstrated on a plastics phantom and on biological samples using a continuous rotation acquisition scheme achieving scan times of a few minutes. Moreover, we show that such acquisition times can be further reduced with the use of a high-efficiency photon-counting detector. Thanks to its ability to substantially simplify the image-acquisition procedure and greatly reduce collection times, we believe this method represents a very important step towards the application of XPCI to real-world problems.

  9. Enhancing the Empathic Connection: Using Action Methods to Understand Conflicts in End-of-Life Care.

    PubMed

    Tanzi, Silvia; Biasco, Guido; Baile, Walter F

    2014-05-01

    Empathy is a core feature of patient-centered care. It enables practitioners to better understand the patient and family concerns that are key to patient and family satisfaction, prevention of anxiety and depression, and provider empowerment. Current methods of teaching communication skills do not specifically focus on enhancing the ability to "stand in the patient's shoes" as a way of connecting with the patient and/or family experience and understanding feelings that may be a source of conflict with providers. In this paper, we present a model for deepening empathic understanding based upon action methods (role-reversal and doubling) derived from psychodrama and sociodrama. We describe these techniques and illustrate how they can be used to identify hidden emotions and attitudes and reveal that which the patient and family member may be thinking or feeling but be afraid to say. Finally, we present data showing that these methods were valuable to participants in enhancing their professional experience and skills.

  10. Droplet microfluidics with a nanoemulsion continuous phase.

    PubMed

    Gu, Tonghan; Yeap, Eunice W Q; Somasundar, Ambika; Chen, Ran; Hatton, T Alan; Khan, Saif A

    2016-07-05

    We present the first study of a novel, generalizable method that uses a water-in-oil nanoemulsion as the continuous phase to generate uniform aqueous micro-droplets in a capillary-based microfluidic system. We first study the droplet generation mechanism in this system and compare it to the more conventional case where a simple oil/solvent (with surfactant) is used as the continuous phase. Next, we present two versatile methods - adding demulsifying chemicals and heat treatment - to allow active online chemical interaction between the continuous and dispersed phases. These methods allow each generated micro-droplet to act as a well-mixed micro-reactor with walls that are 'permeable' to the nanoemulsion droplets and their contents. Finally, we demonstrate an application of this system in the fabrication of uniform hydrogel (alginate) micro-beads with control over particle properties such as size and swelling. Our work expands the toolbox of droplet-based microfluidics, enabling new opportunities and applications involving active colloidal continuous phases carrying chemical payloads, both in advanced materials synthesis and droplet-based screening and diagnostic methods.

  11. Voxel classification based airway tree segmentation

    NASA Astrophysics Data System (ADS)

    Lo, Pechin; de Bruijne, Marleen

    2008-03-01

    This paper presents a voxel classification based method for segmenting the human airway tree in volumetric computed tomography (CT) images. In contrast to standard methods that use only voxel intensities, our method uses a more complex appearance model based on a set of local image appearance features and K-nearest-neighbor (KNN) classification. The optimal set of features for classification is selected automatically from a large set of features describing the local image structure at several scales. The use of multiple features enables the appearance model to differentiate between airway tree voxels and other voxels of similar intensities in the lung, thus making the segmentation robust to pathologies such as emphysema. The classifier is trained on imperfect segmentations that can easily be obtained using region growing with a manual threshold selection. Experiments show that the proposed method results in a more robust segmentation that can grow into the smaller airway branches without leaking into emphysematous areas, and is able to segment many branches that are not present in the training set.
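
    The classification step can be sketched with a minimal KNN over toy feature vectors (the paper's features are multi-scale local image structure measures selected automatically; the 2-D features and labels below are stand-ins). Note how a second feature separates an emphysema voxel whose intensity alone is airway-like:

```python
# Minimal KNN voxel classifier: majority vote among the K nearest
# training feature vectors (squared Euclidean distance).

def knn_classify(train_feats, train_labels, feat, K=3):
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(tf, feat)), lab)
        for tf, lab in zip(train_feats, train_labels)
    )
    votes = [lab for _, lab in dists[:K]]
    return max(set(votes), key=votes.count)

# Toy 2-D features per voxel (e.g. intensity in HU, a derivative measure):
train = [(-950, 10), (-940, 60), (-930, 55), (-300, 5), (-280, 8), (-920, 12)]
labels = ["airway", "airway", "airway", "tissue", "tissue", "emphysema"]
print(knn_classify(train, labels, (-935, 50)))  # airway
```

The emphysema example at (-920, 12) has an airway-like intensity but a different local-structure value, so the vote still lands on the correct class, which mirrors the robustness claim above.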

  12. Method for high-precision multi-layered thin film deposition for deep and extreme ultraviolet mirrors

    DOEpatents

    Ruffner, Judith Alison

    1999-01-01

    A method for coating (flat or non-flat) optical substrates with high-reflectivity multi-layer coatings for use at Deep Ultra-Violet ("DUV") and Extreme Ultra-Violet ("EUV") wavelengths. The method results in a product with minimum feature sizes of less than 0.10 µm for the shortest wavelength (13.4 nm). The present invention employs a computer-based modeling and deposition method to enable lateral and vertical thickness control by scanning the position of the substrate with respect to the sputter target during deposition. The thickness profile of the sputter targets is modeled before deposition and then an appropriate scanning algorithm is implemented to produce any desired, radially-symmetric thickness profile. The present invention offers the ability to predict and achieve a wide range of thickness profiles on flat or figured substrates, i.e., account for the 1/R² factor in a model, and the ability to predict and accommodate changes in deposition rate as a result of plasma geometry, i.e., over figured substrates.

  13. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment

    PubMed Central

    Khan, Md. Ashfaquzzaman; Herbordt, Martin C.

    2011-01-01

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations. PMID:21822327
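
    The serial event-driven core that makes DMD hard to scale can be sketched as a priority-queue loop (the paper's contribution, speculatively processing several queue entries in parallel while committing them in timestamp order, is not shown here; events and handlers below are toy placeholders):

```python
import heapq

# Serial discrete-event loop: pop the earliest event, handle it, push any
# events it spawns. Each commit can change future events, which is exactly
# the dependency that speculation + in-order commitment works around.

def run_events(initial_events, handlers, t_end):
    queue = list(initial_events)          # entries: (time, event_name)
    heapq.heapify(queue)
    committed = []
    while queue:
        t, name = heapq.heappop(queue)
        if t > t_end:
            break
        committed.append((t, name))
        for new in handlers.get(name, lambda t: [])(t):
            heapq.heappush(queue, new)
    return committed

# A toy "collision" at t schedules a follow-up collision at t + 1.5:
handlers = {"collision": lambda t: [(t + 1.5, "collision")]}
log = run_events([(0.0, "collision"), (1.0, "collision")], handlers, t_end=3.0)
print([t for t, _ in log])  # [0.0, 1.0, 1.5, 2.5, 3.0]
```

In the parallel scheme, workers would process several of the earliest queue entries speculatively; a speculative result is committed only when it becomes the earliest uncommitted event and no earlier commit has invalidated it, otherwise it is squashed and redone.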

  14. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment.

    PubMed

    Khan, Md Ashfaquzzaman; Herbordt, Martin C

    2011-07-20

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations.

  15. Material-controlled dynamic vacuum insulation

    DOEpatents

    Benson, D.K.; Potter, T.F.

    1996-10-08

    A compact vacuum insulation panel is described, comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls. 25 figs.

  16. Variably insulating portable heater/cooler

    DOEpatents

    Potter, Thomas F.

    1998-01-01

    A compact vacuum insulation panel comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls.

  17. Material-controlled dynamic vacuum insulation

    DOEpatents

    Benson, David K.; Potter, Thomas F.

    1996-10-08

    A compact vacuum insulation panel comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls.

  18. Radiation-controlled dynamic vacuum insulation

    DOEpatents

    Benson, David K.; Potter, Thomas F.

    1995-01-01

    A compact vacuum insulation panel comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls.

  19. Radiation-controlled dynamic vacuum insulation

    DOEpatents

    Benson, D.K.; Potter, T.F.

    1995-07-18

    A compact vacuum insulation panel is described, comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls. 25 figs.

  20. Variably insulating portable heater/cooler

    DOEpatents

    Potter, T.F.

    1998-09-29

    A compact vacuum insulation panel is described, comprising a chamber enclosed by two sheets of metal, glass-like spaces disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber. The panel includes apparatus and methods for enabling and disabling, or turning "on" and "off", its thermal insulating capability. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths or thermal short circuits between the metal sidewalls. 25 figs.
